"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your gut and don't follow the herd" ; "Validate direction not destination" ;

January 21, 2017

Day #52 - Deep Learning Class #1 Notes

AI - Reverse engineering the brain (curated knowledge)
ML - Machine Learning is a subset of AI: teaching machines to learn

Deep Learning - Rebirth of Neural Networks
  • Multiple layers of neurons
  • Directed graph
  • First layer is the input layer
  • Last layer is the output layer
  • Intermediate layers are hidden layers
  • Deep Learning is inspired by the human brain
  • In Deep Learning, features are learnt rather than hand-engineered
  • Gradient Descent - the process of making weight updates in a NN
  • Neural networks are a discriminative approach
  • Neurons in neural networks end up becoming feature selectors
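The gradient-descent bullet above can be sketched concretely. A minimal sketch, assuming a toy one-parameter loss f(w) = (w - 3)^2; the loss function, learning rate, and step count are illustrative, not from the class:

```python
# Minimal gradient descent sketch: minimize f(w) = (w - 3)^2.
# The gradient is f'(w) = 2 * (w - 3); each update moves w a small
# step against the gradient, which is the core of how NN weights
# get updated during training.

def gradient_descent(lr=0.1, steps=100):
    w = 0.0  # arbitrary starting weight
    for _ in range(steps):
        grad = 2 * (w - 3)  # derivative of the loss at the current w
        w -= lr * grad      # update rule: w = w - lr * gradient
    return w

w = gradient_descent()
print(round(w, 4))  # converges toward the minimum at w = 3
```

In a real network the same update is applied to every edge weight, with the gradients supplied by backpropagation.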
Discriminative Classifiers - Logistic Regression, SVM (uses kernels for non-linear classification), Decision Trees
Generative Model - Naive Bayes
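As a quick illustration of the generative model named above, a minimal Naive Bayes sketch on toy word data (the spam/ham examples and counts are illustrative, not from the class):

```python
import math
from collections import defaultdict

# Minimal Naive Bayes sketch: model P(class) and P(word | class)
# from counts, then pick the class maximizing the (log) joint
# probability. Laplace smoothing avoids zero probabilities for
# words unseen in a class.

train = [
    (["buy", "cheap", "pills"], "spam"),
    (["cheap", "offer", "now"], "spam"),
    (["meeting", "tomorrow", "team"], "ham"),
    (["lunch", "with", "team"], "ham"),
]

class_counts = defaultdict(int)
word_counts = defaultdict(lambda: defaultdict(int))
vocab = set()
for words, label in train:
    class_counts[label] += 1
    for w in words:
        word_counts[label][w] += 1
        vocab.add(w)

def predict(words):
    best, best_score = None, float("-inf")
    total = sum(class_counts.values())
    for label, count in class_counts.items():
        # log P(class) + sum of log P(word | class), Laplace-smoothed
        score = math.log(count / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

print(predict(["cheap", "pills"]))   # -> spam
print(predict(["team", "meeting"]))  # -> ham
```

Unlike a discriminative classifier, which learns the decision boundary directly, this models how each class generates its data and classifies via Bayes' rule.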

Types of Neural Networks
  • Autoencoders - for dimensionality reduction
  • CNN - Convolutional Neural Network
  • RNN - Recurrent Neural Network
Interesting Deep Learning Demo Sites Discussed
imsitu.org
cloudcv.org

Happy Learning!!!

January 18, 2017

Neural Networks - Learning Resources

Happy Learning!!!

Interesting Data Science Projects

Happy Learning!!!

January 13, 2017

Day #51 - Neural Networks

Happy New Year 2017. This post is on Neural Networks.

Neural Networks
  • ANN - inspired by biological neural networks; modelling a network based on neurons
  • Key layers - input layer, hidden layer, output layer
  • Neural networks that can learn - perceptrons, backpropagation networks, Boltzmann machines, recurrent networks
  • In the example below, XOR is implemented using backpropagation

Implementation overview
  • Initialize the edge weights at random (we do not know the exact weights up front, so we choose them randomly; training finds the right values)
  • Calculate the error - this is supervised learning: we have training data with known results, and the error term is the difference between the calculated output and the expected output
  • Calculate the changes to the edge weights and update them accordingly (the backpropagation process)
  • The algorithm terminates when the error rate is small
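The steps above can be sketched as a small backpropagation network learning XOR. A minimal sketch: the layer sizes (2 inputs, 4 hidden units, 1 output), learning rate, and iteration count are my illustrative choices, not values from the class:

```python
import numpy as np

# Backpropagation on XOR, following the steps above:
# random initial weights -> forward pass -> error -> weight updates,
# repeated until the error is small.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# Step 1: initialize the edge weights at random
W1 = rng.uniform(-1, 1, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.uniform(-1, 1, size=(4, 1))
b2 = np.zeros((1, 1))
lr = 0.5

for _ in range(30000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Step 2: error between calculated and expected output
    err = y - out
    # Step 3: backpropagate the error and update the weights
    d_out = err * out * (1 - out)       # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)  # hidden-layer delta
    W2 += lr * h.T @ d_out
    b2 += lr * d_out.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ d_h
    b1 += lr * d_h.sum(axis=0, keepdims=True)

# Step 4: by now the error should be small; threshold the outputs
pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

XOR is the classic example here because a single-layer perceptron cannot separate it; the hidden layer is what makes it learnable.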

Happy Pongal & Happy Learning!!!