We’ve developed a simple meta-learning algorithm called Reptile, which works by repeatedly sampling a task, performing stochastic gradient descent on it, and updating the initial parameters towards the final parameters learned on that task. Reptile is the application of the Shortest Descent algorithm to the meta-learning setting. It is mathematically similar to first-order MAML, a version of the well-known MAML algorithm that only needs black-box access to an optimizer such as SGD or Adam, and it achieves similar computational efficiency and performance.
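The loop described above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's implementation: the quadratic tasks, hyperparameters, and helper names (`QuadraticTask`, `inner_sgd`, `reptile`) are all illustrative assumptions, chosen so the example is self-contained and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

class QuadraticTask:
    """Toy task: loss(p) = 0.5 * ||p - target||^2, so grad(p) = p - target."""
    def __init__(self, target):
        self.target = target
    def grad(self, p):
        return p - self.target

def sample_task():
    # Illustrative task distribution: optima scattered around [1.0, -1.0].
    return QuadraticTask(np.array([1.0, -1.0]) + 0.1 * rng.standard_normal(2))

def inner_sgd(params, task, steps=5, lr=0.1):
    """Inner loop: a few plain SGD steps on one sampled task."""
    p = params.copy()
    for _ in range(steps):
        p -= lr * task.grad(p)
    return p

def reptile(params, meta_iters=200, meta_lr=0.5):
    """Outer loop: nudge the initialization toward the task-adapted weights."""
    p = params.copy()
    for _ in range(meta_iters):
        adapted = inner_sgd(p, sample_task())
        p += meta_lr * (adapted - p)  # move initial params toward final task params
    return p

init = np.zeros(2)
meta = reptile(init)
print(meta)  # should land near the center of the task distribution, [1.0, -1.0]
```

Note that the outer update only uses the difference between the adapted and initial parameters, so the inner optimizer really is treated as a black box: any first-order optimizer could replace `inner_sgd` without changing the meta-update.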
Originally published on OpenAI News.