Digitala Vetenskapliga Arkivet

FedCluLearn: Federated Continual Learning Using Stream Micro-cluster Indexing Scheme
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0009-0004-5241-6961
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0003-3128-191X
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-3010-8798
Ericsson AB, Stockholm, Sweden.
2026 (English). In: Machine Learning and Knowledge Discovery in Databases. Research Track / [ed] Ribeiro R.P., Jorge A.M., Soares C., Gama J., Pfahringer B., Japkowicz N., LarraƱaga P., Abreu P.H., Springer Science+Business Media B.V., 2026, p. 331-349. Conference paper, Published paper (Refereed)
Abstract [en]

Artificial Neural Networks (NNs) are unable to learn tasks continually with a single model: training on new tasks overwrites previously learned knowledge, a phenomenon known as catastrophic forgetting. This is one of the shortcomings that commonly plague intelligent systems based on NN models. Federated Learning (FL) is a decentralized approach to training machine learning models across multiple local clients without exchanging raw data. The paradigm that handles model learning in both settings, federated and continual, is known as Federated Continual Learning (FCL). In this work, we propose a novel FCL algorithm, called FedCluLearn, which uses a stream micro-cluster indexing scheme to deal with catastrophic forgetting. FedCluLearn interprets the federated training process as a stream clustering scenario. It stores statistics about the learned concepts at the server, similar to the micro-clusters used in stream clustering algorithms, and updates them at each training round to reflect the clients' current local updates. FedCluLearn uses only the active concepts in each training round to build the global model, i.e., it temporarily forgets knowledge that is not relevant to the current situation. In addition, the proposed algorithm is flexible in that it can take the age of local updates into account, giving greater importance to more recent data. The proposed FCL approach has been benchmarked against three baseline algorithms in several controlled and real-world data experiments. The implementation of FedCluLearn and the experimental results are available at https://github.com/milenaangelova1/FedCluLearn.
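The abstract describes the server-side mechanics only in outline: concept statistics akin to stream micro-clusters, per-round updates, aggregation over active concepts, and optional age weighting. The following is a minimal sketch of that general idea, not the paper's actual method — the statistics kept (count, linear sum, timestamp, in the style of cluster-feature vectors), the distance-threshold matching, and the exponential age decay are all illustrative assumptions; the `MicroCluster`, `Server`, and parameter names are hypothetical.

```python
import math

class MicroCluster:
    """Cluster-feature-style summary of a learned concept (assumed form;
    the paper's exact per-concept statistics are not given in the abstract)."""
    def __init__(self, update_vec, now):
        self.n = 1                          # number of absorbed client updates
        self.linear_sum = list(update_vec)  # per-dimension sum of updates
        self.last_update = now              # round index, for age weighting

    def centroid(self):
        return [s / self.n for s in self.linear_sum]

    def absorb(self, update_vec, now):
        self.n += 1
        self.linear_sum = [s + u for s, u in zip(self.linear_sum, update_vec)]
        self.last_update = now

    def age_weight(self, now, decay=0.1):
        # more recently updated concepts receive higher weight (assumed decay)
        return math.exp(-decay * (now - self.last_update))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class Server:
    def __init__(self, radius=1.0):
        self.clusters = []      # indexed concept summaries
        self.radius = radius    # assumed matching threshold

    def receive(self, client_update, now):
        """Index an incoming local update: absorb into the nearest existing
        concept if close enough, otherwise open a new micro-cluster."""
        if self.clusters:
            best = min(self.clusters,
                       key=lambda c: distance(c.centroid(), client_update))
            if distance(best.centroid(), client_update) <= self.radius:
                best.absorb(client_update, now)
                return best
        new = MicroCluster(client_update, now)
        self.clusters.append(new)
        return new

    def aggregate(self, active, now):
        """Build the global model from *active* concepts only, weighted by
        recency; inactive concepts are temporarily forgotten this round."""
        weights = [c.age_weight(now) for c in active]
        total = sum(weights)
        dims = len(active[0].centroid())
        return [sum(w * c.centroid()[i] for w, c in zip(weights, active)) / total
                for i in range(dims)]
```

As a usage sketch: feeding the server two nearby updates and one distant one yields two concepts; aggregating over only the first (active) concept returns its recency-weighted centroid while the second is ignored for that round.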

Place, publisher, year, edition, pages
Springer Science+Business Media B.V., 2026. p. 331-349
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349
Keywords [en]
Catastrophic forgetting, Concept drift, Data stream clustering, Federated continual learning, Time series data, Cluster analysis, Cluster computing, Intelligent systems, Learning systems, Neural networks, Concept drifts, Continual learning, Indexing scheme, Micro-clusters, Neural-networks, Stream clustering, Time-series data, Clustering algorithms
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:bth-28834
DOI: 10.1007/978-3-032-05981-9_20
Scopus ID: 2-s2.0-105019303763
ISBN: 9783032059802 (print)
OAI: oai:DiVA.org:bth-28834
DiVA, id: diva2:2010837
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2025, Porto, Sept 15-19, 2025
Part of project
HINTS - Human-Centered Intelligent Realities
Funder
Knowledge Foundation, 20220068
Available from: 2025-11-03 Created: 2025-11-03 Last updated: 2025-11-04
Bibliographically approved

Open Access in DiVA

The full text will be freely available from 2026-11-04 07:53

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Angelova, Milena; Boeva, Veselka; Abghari, Shahrooz
By organisation
Department of Computer Science
Computer Sciences
