Mini-batch train your brain
Mini-batch Gradient Descent: a variant of gradient descent for training neural networks that splits the training data into small batches, yielding more frequent model updates than full-batch training and smoother convergence than updating on single examples.
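In the spirit of the name, here is a minimal sketch of mini-batch gradient descent fitting a linear model with NumPy. The data, batch size, and learning rate are illustrative assumptions, not values from any particular post.

```python
import numpy as np

# Synthetic regression data (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # 200 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(3)        # model weights, initialized to zero
lr = 0.1               # learning rate (assumed)
batch_size = 32        # mini-batch size (assumed)

for epoch in range(50):
    # Shuffle indices so each epoch visits the batches in a new order
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        # Gradient of mean squared error on this mini-batch only
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
        w -= lr * grad

print(w)  # should land close to true_w
```

Each weight update uses only one mini-batch, so the model takes many small steps per pass over the data instead of one large step.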
This blog will help you learn AI iteratively through small code snippets and short practical tutorials. We also occasionally post about small projects that apply AI.
You can get in touch here with comments, suggestions, or requests for posts.
We also offer consulting services for AI projects. If you are interested, contact us.