Mini-batch train your brain
Mini-batch Gradient Descent: a variant of the gradient descent algorithm for training neural networks that splits the training data into small batches, giving more frequent model updates than full-batch gradient descent and more robust convergence than single-example stochastic updates.
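To make the definition concrete, here is a minimal sketch of mini-batch gradient descent fitting a one-variable linear model with NumPy. The data, learning rate, and batch size are all illustrative choices, not taken from any post on this blog:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 2 plus a little noise (illustrative only)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0           # model parameters
lr, batch_size = 0.1, 16  # hyperparameters chosen for this sketch

for epoch in range(100):
    # Shuffle, then split the training data into mini-batches
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradient of mean squared error over this mini-batch only,
        # so the parameters update once per batch, not once per epoch
        w -= lr * 2 * np.mean(err * xb)
        b -= lr * 2 * np.mean(err)

print(w, b)  # should land near the true values 3 and 2
```

Each pass over the data performs many small updates (one per batch) rather than a single update on the full dataset, which is the trade-off the definition above describes.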
This blog will help you iteratively learn AI through small code snippets and short practical tutorials. We also occasionally post about small projects that apply AI.
If you have comments, suggestions, or requests for posts, get in touch here.