"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

April 29, 2017

Day #68 - CNN / RNN and Language Modelling Notes

At the end of every class, I feel there is a lot more to learn. People in the industry tend to know things only at the application level, while the depth of the topics and mathematics discussed in class is very extensive. I always carry a feeling of guilt that I "need to learn more". Every new piece of learning needs a breakpoint to correlate and understand it end to end, to see the concept from a more familiar perspective. Always keep learning and keep growing.

CNN Notes
  • In a CNN, the lower layers learn generic features like edges and shapes and feed them to the higher layers
  • Earlier layers - Generic features
  • Later layers - Features specific to the problem at hand
  • For related problems we can leverage an existing network such as VGG16, VGG19 or AlexNet and modify the higher layers based on our need (see the transfer-learning sketch after this list)
  • ReLU passes an activation through only where it is > 0; negative values are zeroed out (a small sketch follows this list)
  • Vanishing gradient problem - Gradients shrink as they are propagated back through many layers, so the weights of the early layers stagnate over time
  • ∂E/∂W - Gradient of the error with respect to the weights
  • ∂E/∂I - Gradient of the error with respect to the input image
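
A minimal transfer-learning sketch of the reuse idea above, assuming Keras with the bundled ImageNet VGG16 weights and a hypothetical 10-class problem (not the exact network from class): the convolutional base is frozen to keep its generic features, and only new higher layers are trained for our specific problem.

# Rough sketch: reuse VGG16's frozen convolutional base, train only a new top
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # hypothetical number of classes for our own problem

# Load VGG16 without its original classifier head
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # earlier layers keep their generic edge/shape features

# Add new higher layers specific to our problem
model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()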
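
And a tiny NumPy sketch of the ReLU point, just to make it concrete: the forward pass keeps only the positive values, and in the backward pass the gradient flows only through the positions that were > 0 (everything else contributes zero gradient, which also ties into the vanishing-gradient discussion).

import numpy as np

def relu(x):
    # Forward pass: keep values > 0, zero out the rest
    return np.maximum(0, x)

def relu_grad(x, upstream_grad):
    # Backward pass: gradient flows only where the input was > 0
    return upstream_grad * (x > 0)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))                        # [0.  0.  0.  1.5 3. ]
print(relu_grad(x, np.ones_like(x)))  # [0. 0. 0. 1. 1.]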
RNN
  • The main idea is that the same weights are shared across all time steps of an RNN
  • The weights between successive time steps (the unrolled layers) are the same
  • Document classification, data generation, chatbots, time series - RNNs can be used for all of these (see the sketch after the LSTM note below)
LSTM - Long Short-Term Memory, an RNN variant that uses gated memory cells to cope with the vanishing gradient problem
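
A minimal sketch of document classification with an LSTM, one of the applications listed above, assuming Keras; the vocabulary size, sequence length and class count are hypothetical placeholders. The same LSTM cell (and hence the same weights) is applied at every time step of the input sequence.

# Rough sketch of document classification with an LSTM
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # hypothetical vocabulary size
SEQ_LEN = 200        # hypothetical padded document length
NUM_CLASSES = 5      # hypothetical number of document categories

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN,)),
    layers.Embedding(input_dim=VOCAB_SIZE, output_dim=128),
    # One LSTM layer: the same cell weights are reused at every time step
    layers.LSTM(64),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()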

Topics from Language Modelling class


Happy Learning!!!
