1. This article proposes a novel approach called Active Forgetting with Synaptic Expansion-Convergence (AFEC) to actively forget old knowledge that limits the learning of new tasks in continual learning.
2. AFEC dynamically expands parameters to learn each new task and then selectively combines them, which is formally consistent with the underlying mechanism of biological active forgetting.
3. The proposed method was evaluated on a variety of continual learning benchmarks, including CIFAR-10 regression tasks, visual classification tasks, and Atari reinforcement learning tasks, where AFEC effectively improved the learning of new tasks and achieved state-of-the-art performance in a plug-and-play manner.
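The expansion-then-combination idea in point 2 can be illustrated with a toy objective. The sketch below is an assumption-laden simplification: it uses plain isotropic quadratic penalties rather than the Fisher-weighted penalties of the actual method, and the names `lambda_old`, `lambda_exp`, `theta_old`, and `theta_exp` are illustrative, not taken from the paper. The idea is that the combined parameters are pulled both toward the old-task weights (to retain old knowledge) and toward freshly expanded weights trained on the new task alone (to actively forget old knowledge that conflicts with the new task).

```python
import numpy as np

def new_task_loss(theta, target):
    # Toy new-task loss: squared distance to the new task's optimum.
    return np.sum((theta - target) ** 2)

def afec_loss(theta, theta_old, theta_exp, target,
              lambda_old=1.0, lambda_exp=1.0):
    # "retain" pulls toward the old-task weights (forgetting prevention);
    # "forget" pulls toward the expanded weights trained on the new task
    # alone (active forgetting). Both penalties are simplified here:
    # the paper weights them per-parameter, this sketch does not.
    retain = lambda_old / 2 * np.sum((theta - theta_old) ** 2)
    forget = lambda_exp / 2 * np.sum((theta - theta_exp) ** 2)
    return new_task_loss(theta, target) + retain + forget

def afec_minimizer(theta_old, theta_exp, target,
                   lambda_old=1.0, lambda_exp=1.0):
    # Closed-form minimizer of the quadratic objective above:
    # setting the gradient to zero gives a weighted average of the
    # new-task optimum, the old weights, and the expanded weights.
    return (2 * target + lambda_old * theta_old + lambda_exp * theta_exp) / (
        2 + lambda_old + lambda_exp)
```

In this toy setting the learned solution is simply a weighted average, which makes the trade-off explicit: larger `lambda_old` favors stability on old tasks, larger `lambda_exp` favors plasticity on the new one.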
The article appears generally trustworthy and reliable: it describes the proposed approach in detail and evaluates it on a range of continual learning benchmarks. The authors support their claims by citing relevant prior work and by reporting experimental results on multiple datasets. They also discuss potential risks of the approach, such as overfitting and catastrophic forgetting.
However, some points could be improved. The authors could provide more detail about the datasets used for evaluation and engage with counterarguments to their claims. They could also offer deeper insight into how the approach compares with existing methods in terms of performance and scalability. Finally, a discussion of potential real-world applications of the approach would be beneficial.