Latent spectral regularization for continual learning / Frascaroli, Emanuele; Benaglia, Riccardo; Boschini, Matteo; Moschella, Luca; Fiorini, Cosimo; Rodolà, Emanuele; Calderara, Simone. - In: PATTERN RECOGNITION LETTERS. - ISSN 0167-8655. - 184:(2024), pp. 119-125. [10.1016/j.patrec.2024.06.020]

Latent spectral regularization for continual learning

Frascaroli, Emanuele; Benaglia, Riccardo; Boschini, Matteo; Fiorini, Cosimo; Calderara, Simone
2024

Abstract

While biological intelligence grows organically as new knowledge is gathered throughout life, Artificial Neural Networks forget catastrophically whenever they face a changing training data distribution. Rehearsal-based Continual Learning (CL) approaches have been established as a versatile and reliable solution to overcome this limitation; however, sudden input disruptions and memory constraints are known to alter the consistency of their predictions. We study this phenomenon by investigating the geometric characteristics of the learner’s latent space and find that replayed data points of different classes increasingly mix up, interfering with classification. Hence, we propose a geometric regularizer that enforces weak requirements on the Laplacian spectrum of the latent space, promoting a partitioning behavior. Our proposal, called Continual Spectral Regularizer for Incremental Learning (CaSpeR-IL), can be easily combined with any rehearsal-based CL approach and improves the performance of SOTA methods on standard benchmarks.
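The abstract describes regularizing the Laplacian spectrum of the latent-space graph so that replayed samples of different classes stay partitioned. As a rough illustration of the underlying idea (not the paper's actual CaSpeR-IL loss), the sketch below builds a k-NN graph over latent embeddings and sums the leading eigenvalues of its normalized Laplacian: this sum is (near) zero exactly when the graph splits into the desired number of well-separated components. The helper names `knn_adjacency` and `spectral_partition_penalty` are illustrative assumptions, as is the specific graph construction.

```python
import numpy as np

def knn_adjacency(Z, k=3):
    # Symmetric k-nearest-neighbor adjacency from latent embeddings Z (n x d).
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    n = len(Z)
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # skip self at distance 0
        A[i, nbrs] = 1.0
    return np.maximum(A, A.T)

def spectral_partition_penalty(Z, num_classes, k=3):
    # Sum of the first `num_classes` eigenvalues of the normalized
    # Laplacian; it vanishes when the latent graph has `num_classes`
    # connected components, i.e. classes occupy separate regions.
    A = knn_adjacency(Z, k)
    d = A.sum(axis=1)
    Dinv = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(len(Z)) - Dinv @ A @ Dinv
    evals = np.sort(np.linalg.eigvalsh(L))
    return float(evals[:num_classes].sum())
```

In a rehearsal-based setting, a penalty of this kind would be computed on the embeddings of the current batch plus replayed memory samples and added to the classification loss; a differentiable (e.g. PyTorch) version would be needed for actual training.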
Year: 2024
Volume: 184
Pages: 119-125
Frascaroli, Emanuele; Benaglia, Riccardo; Boschini, Matteo; Moschella, Luca; Fiorini, Cosimo; Rodolà, Emanuele; Calderara, Simone
Files for this product:

File: CASPER.pdf (Open access)
Type: Version published by the publisher
Size: 1.37 MB
Format: Adobe PDF

Creative Commons License

Metadata in IRIS UNIMORE are released under the Creative Commons CC0 1.0 Universal license, while publication files are released under the Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright infringement, contact Iris Support.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1354226
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science (ISI): 0