Mini-batch train your brain

Mini-batch Gradient Descent: a variant of gradient descent for training neural networks that splits the training data into small batches and updates the model after each batch, yielding more frequent updates than full-batch gradient descent and more stable convergence than pure stochastic gradient descent (one example at a time).
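
To make that concrete, here is a minimal sketch of one mini-batch training loop: linear regression fit with plain NumPy. The synthetic data, batch size, and learning rate are illustrative choices, not prescriptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise (illustrative values)
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=1000)

w, b = 0.0, 0.0           # model parameters
lr, batch_size = 0.1, 32  # learning rate and mini-batch size

for epoch in range(20):
    # Shuffle once per epoch so each mini-batch is a random subset
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]

        # Gradient of mean squared error over this mini-batch only
        err = w * xb + b - yb
        grad_w = 2 * np.mean(err * xb)
        grad_b = 2 * np.mean(err)

        # One parameter update per mini-batch: frequent, slightly noisy steps
        w -= lr * grad_w
        b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}  (true values: 3, 2)")
```

Note how the parameters move after every batch of 32 examples rather than once per pass over all 1,000, which is exactly the trade-off the definition describes.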

This blog will help you learn AI iteratively, through small code snippets and short practical tutorials.