AltNet
AltNet is a new approach to addressing plasticity loss over the course of learning. Unlike prior approaches, it does not interfere with the learning steps themselves, it completely resets both the network and the optimizer state, and it maintains performance across resets. We achieve this by maintaining a passive network in the background while an active network interacts with the environment; at a fixed interval, the passive network is switched into the active role, and the network entering the passive role is reset. This lets us completely reset the weights of learning networks while ensuring that an untrained network never interacts with the environment, which we find experimentally eliminates both the plasticity loss and the performance drops associated with other resetting methods.
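The swap-and-reset loop above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: `Network`, `train_step`, and `SWAP_INTERVAL` are hypothetical placeholders, and the background training of the passive network is reduced to the same toy update as the active one.

```python
SWAP_INTERVAL = 3  # steps between role swaps (hypothetical value)


class Network:
    """Toy stand-in for a learnable network: a single scalar 'weight'."""

    def __init__(self):
        self.weight = 0.0

    def reset(self):
        # Full reset of the network's parameters (and, in practice,
        # its optimizer state as well).
        self.weight = 0.0


def train_step(net):
    # Placeholder update so that training progress is observable.
    net.weight += 1.0


def run(total_steps):
    active, passive = Network(), Network()
    for step in range(1, total_steps + 1):
        # Only the active network interacts with the environment...
        train_step(active)
        # ...while the passive network trains in the background,
        # sketched here as the same update.
        train_step(passive)
        if step % SWAP_INTERVAL == 0:
            # Swap roles: the background-trained passive network becomes
            # active, and the network entering the passive role is reset.
            active, passive = passive, active
            passive.reset()
    return active, passive
```

Because the swap happens before the reset, the network stepping into the active role has always been trained in the background for a full interval, so an untrained network never touches the environment.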
A version was presented as a workshop paper at CoLLAs 2025, and another is under submission to AAMAS 2026.
Read the paper on OpenReview!