ADAM, Learning Rate & Convergence

When you start coding simple neural networks yourself, you need patience; training can take a while. I probably persevered longer than most, being the stubborn person I am, but it's 2025, time for a change and to find out which improvements I have been missing out on.

I asked generative AI to suggest some improvements. To investigate their validity, I chose a simple linear regression problem.
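
For context, here is a minimal sketch of the kind of baseline I mean: a two-parameter linear model (slope and intercept) trained by plain gradient descent with a fixed learning rate. The data, parameter names and hyperparameters are illustrative, not taken from the actual experiment:

```python
import numpy as np

# Illustrative data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=100)

w, b = 0.0, 0.0   # model: y_hat = w * x + b
lr = 0.1          # fixed learning rate

for epoch in range(500):
    error = (w * x + b) - y
    # Gradients of mean squared error with respect to w and b
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should end up close to 3 and 2
```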

The suggested improvements were two distinct techniques:

  • Implement ADAM, which adapts the step size of each individual weight using running estimates of the gradient's mean and variance, so each parameter responds to error at its own rate (see the first sketch below)
  • Decay the learning rate over time rather than keep it at a constant value (see the second sketch below)

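Here is a minimal sketch of the first technique, the ADAM update, applied to the same illustrative two-parameter model as above. The hyperparameter values are the common ADAM defaults, not necessarily what works best for every problem:

```python
import numpy as np

# Same illustrative data as the plain-gradient-descent sketch: y = 3x + 2 + noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=100)

lr = 0.05                              # base step size
beta1, beta2, eps = 0.9, 0.999, 1e-8   # common ADAM defaults

params = np.zeros(2)                   # [w, b]
m = np.zeros_like(params)              # first-moment (mean) estimate
v = np.zeros_like(params)              # second-moment (uncentred variance) estimate

for t in range(1, 501):
    w, b = params
    error = (w * x + b) - y
    grads = np.array([2.0 * np.mean(error * x),   # d(MSE)/dw
                      2.0 * np.mean(error)])      # d(MSE)/db

    # Update biased moment estimates
    m = beta1 * m + (1 - beta1) * grads
    v = beta2 * v + (1 - beta2) * grads ** 2
    # Correct the bias introduced by the zero initialisation
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Each parameter gets its own effective step size
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)

print(params)  # should end up close to [3, 2]
```
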
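The second technique can be as simple as shrinking the learning rate each epoch instead of keeping it fixed. Exponential decay is just one common choice of schedule, and the decay factor here is illustrative:

```python
initial_lr = 0.1
decay = 0.99  # multiplicative decay per epoch (illustrative value)

# Exponentially decaying learning rate: large steps early, small steps later
schedule = [initial_lr * decay ** epoch for epoch in range(500)]

print(schedule[0], schedule[100], schedule[499])  # 0.1, ~0.037, ~0.0007
```
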
Health warning: your experiences/mileage may vary; results might be different for other problems.
