Digitala Vetenskapliga Arkivet

Decentralized and Incentivized Federated Learning: A Blockchain-Enabled Framework Utilising Compressed Soft-Labels and Peer Consistency
Tsinghua University, Beijing 100190, People's Republic of China; Fraunhofer Heinrich Hertz Institute, Department of Artificial Intelligence, D-10587 Berlin, Germany.
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology.
Fraunhofer Heinrich Hertz Institute, Department of Artificial Intelligence, D-10587 Berlin, Germany.
2024 (English). In: IEEE Transactions on Services Computing, E-ISSN 1939-1374, Vol. 17, no. 4, p. 1449-1464. Article in journal (Refereed). Published.
Abstract [en]

Federated Learning (FL) has emerged as a powerful paradigm in Artificial Intelligence, facilitating the parallel training of Artificial Neural Networks on edge devices while safeguarding data privacy. Nonetheless, to encourage widespread adoption, Federated Learning Frameworks (FLFs) must tackle (i) the power imbalance between a central authority and its participants, and (ii) the challenge of equitably measuring and incentivizing contributions. Existing approaches to decentralize and incentivize FL processes are hindered by (i) computational overhead and (ii) uncertainty in contribution assessment (Witt et al. 2023), limiting FL's scalability beyond use cases where trust between participants and the server is established. This work introduces a cutting-edge, blockchain-enabled federated learning framework that incorporates Federated Knowledge Distillation (FD) with compressed 1-bit soft-labels, aggregated through a smart contract. Furthermore, we present the Peer Truth Serum for Federated Distillation (PTSFD), which cultivates an incentive-compatible ecosystem by rewarding honest participation based on an implicit yet effective comparison of worker contributions. The primary innovation stems from its lightweight architecture that simultaneously promotes decentralization and incentivization, addressing critical challenges in contemporary FL approaches.
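The 1-bit soft-label compression and majority aggregation described in the abstract can be sketched as follows. This is an illustrative assumption-based sketch, not the paper's exact method: the thresholding rule (class probability above the per-sample mean), the function names, and the majority-vote tally are all choices made here for illustration; the smart contract in the actual framework would perform the tally on-chain.

```python
import numpy as np

def compress_soft_labels(probs: np.ndarray) -> np.ndarray:
    """Quantize per-sample soft-labels (probability vectors) to 1 bit per class.

    A class bit is set when its probability exceeds the sample's mean
    class probability -- a simple 1-bit thresholding choice assumed here;
    the paper's exact quantization rule may differ.
    """
    return (probs > probs.mean(axis=1, keepdims=True)).astype(np.uint8)

def aggregate_majority(bit_labels: list[np.ndarray]) -> np.ndarray:
    """Aggregate workers' 1-bit labels by per-class majority vote.

    This mimics the kind of cheap integer tally a smart contract could
    run over compressed uploads (hypothetical aggregation rule).
    """
    stacked = np.stack(bit_labels)   # shape: (workers, samples, classes)
    votes = stacked.sum(axis=0)      # per-class vote counts
    return (votes * 2 > stacked.shape[0]).astype(np.uint8)

# Toy run: three workers predict on two shared samples, three classes.
w1 = compress_soft_labels(np.array([[0.7, 0.2, 0.1], [0.1, 0.1, 0.8]]))
w2 = compress_soft_labels(np.array([[0.6, 0.3, 0.1], [0.2, 0.1, 0.7]]))
w3 = compress_soft_labels(np.array([[0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]))
consensus = aggregate_majority([w1, w2, w3])
# consensus agrees with the majority on each sample: [[1, 0, 0], [0, 0, 1]]
```

Because each worker uploads only one bit per class per sample, communication and on-chain storage costs stay small, which is what makes the decentralized aggregation and the peer-consistency comparison of contributions tractable.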

Place, publisher, year, edition, pages
IEEE, 2024. Vol. 17, no 4, p. 1449-1464
Keywords [en]
Blockchains, Servers, Predictive models, Training, Computational modeling, Computer architecture, Smart contracts, Federated learning, blockchain, reward mechanism, federated distillation, decentralized machine learning
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:uu:diva-537263
DOI: 10.1109/TSC.2023.3336980
ISI: 001290231100016
OAI: oai:DiVA.org:uu-537263
DiVA, id: diva2:1893799
Available from: 2024-08-30. Created: 2024-08-30. Last updated: 2024-08-30. Bibliographically approved.

Open Access in DiVA

fulltext (3639 kB), 170 downloads
File information
File name: FULLTEXT01.pdf
File size: 3639 kB
Checksum (SHA-512): 6d0b6b81f9be9a7328c3739175962b6e0fb40d28ca2a248c1555b3822639ced0956bf2104c0d68b7634b412e843d1ac50445205e130372f27c6f221643e7585e
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor: Zafar, Usama
By organisation: Department of Information Technology
In the same journal: IEEE Transactions on Services Computing
National category: Computer Sciences

Total: 170 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
