There’s a Humongous Problem With AI Models

A new study highlights a glaring hole in AI models' ability to learn new information: it turns out that, once trained, they largely can't.

In other words, if you want to teach an existing deep learning model something new, you'll likely have to retrain it from the ground up. Otherwise, according to the research, many of the artificial neurons in its proverbial mind will sink to a value of zero and stop responding, resulting in a loss of "plasticity," the network's ability to keep learning at all.
"If you think of it like your brain, then it'll be like 90 percent of the neurons are dead," University of Alberta computer scientist and lead study author Shibhansh Dohare told New Scientist. "There's just not enough left for you to learn."
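To see why those zeroed-out neurons are such a problem, here's a minimal sketch (a hypothetical single-unit illustration, not the study's actual code): a ReLU neuron whose pre-activation is negative for every input outputs zero everywhere, so its gradient is also zero everywhere, and gradient descent can never revive it.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
inputs = rng.normal(size=(1000, 4))  # stand-in data, just for illustration

w = rng.normal(size=4)
b = -100.0  # a large negative bias pushes the pre-activation below zero for all inputs

pre = inputs @ w + b             # pre-activation for every input
out = relu(pre)                  # post-activation: all zeros -> the unit is "dead"
grad = (pre > 0).astype(float)   # dReLU/dpre: zero wherever the unit is inactive

print(out.max())   # the unit never fires on any input
print(grad.sum())  # zero gradient everywhere: no weight update can reach it
```

Once enough units end up in this state during continued training, the network has effectively lost the capacity Dohare describes, which is why the current workaround is a full retrain.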

"When the network is a large language model and the data are a substantial portion of the internet," reads the study, "then each retraining may cost millions of dollars in computation."
Obstacle Course
This phenomenon of plasticity loss is also a major barrier separating current AI models from the imagined "artificial general intelligence," a theoretical AI that would be roughly as generally intelligent as a human.
After all, in human terms, this would be like if we had to fully reboot our brains from scratch every time we took a new college course, lest we nuke most of our neurons.

As it stands, though, a practical solution remains out of reach.
"A solution to continual learning is literally a billion-dollar question," Dohare told New Scientist. "A real, comprehensive solution that would allow you to continuously update a model would reduce the cost of training these models significantly."