Thanks to our contributors
We would like to give our special thanks to all the contributors who made the new version of Flower possible (in `git shortlog` order):
Adam Narozniak, Audris, Charles Beauville, Chong Shen Ng, Daniel J. Beutel, Daniel Nata Nugraha, Heng Pan, Javier, Jiahao Tan, Julian Rußmeyer, Mohammad Naseri, Ray Sun, Robert Steiner, Yan Gao, xiliguguagua
What's new?
- **Introduce SuperExec log streaming** (#3577, #3584, #4242, #3611, #3613)

  Flower now supports log streaming from a remote SuperExec using the `flwr log` command. This new feature allows you to monitor logs from the SuperExec in real time via `flwr log <run-id>` (or `flwr log <run-id> <app-dir> <federation>`).
- **Improve `flwr new` templates** (#4291, #4292, #4293, #4294, #4295)

  The `flwr new` command templates for MLX, NumPy, sklearn, JAX, and PyTorch have been updated to improve usability and consistency across frameworks.
- **Migrate ID handling to use unsigned 64-bit integers** (#4170, #4237, #4243)

  Node IDs, run IDs, and related fields have been migrated from signed 64-bit integers (`sint64`) to unsigned 64-bit integers (`uint64`). To support this change, the `uint64` type is fully supported in all communications. You may now use `uint64` values in config and metric dictionaries. For Python users, that means using `int` values larger than the maximum value of `sint64` but less than the maximum value of `uint64`, as illustrated in the sketch below.
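  The following is a pure-Python sketch of the newly supported value range; the `metrics` dictionary and its key are hypothetical and only illustrate the bounds:

  ```python
  # Maximum values of signed and unsigned 64-bit integers
  SINT64_MAX = 2**63 - 1  # 9_223_372_036_854_775_807
  UINT64_MAX = 2**64 - 1  # 18_446_744_073_709_551_615

  # A value in the newly supported range: above the sint64 maximum,
  # yet still representable as a uint64
  large_value = SINT64_MAX + 1

  # Hypothetical metrics dictionary; with uint64 support, integer values
  # up to UINT64_MAX can now be carried in config and metric dictionaries
  metrics = {"processed_samples": large_value}
  assert 0 <= metrics["processed_samples"] <= UINT64_MAX
  ```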
- **Add Flower architecture explanation** (#3270)

  A new Flower architecture explainer page introduces Flower components step-by-step. Check out the EXPLANATIONS section of the Flower documentation if you're interested.
- **Introduce FedRep baseline** (#3790)

  FedRep is a federated learning algorithm that learns shared data representations across clients while allowing each client to maintain personalized local models, balancing collaboration and individual adaptation. Read all the details in the paper: "Exploiting Shared Representations for Personalized Federated Learning" (arXiv).
- **Improve FlowerTune template and LLM evaluation pipelines** (#4286, #3769, #4272, #4257, #4220, #4282, #4171, #4228, #4258, #4296, #4287, #4217, #4249, #4324, #4219, #4327)

  Refined evaluation pipelines, metrics, and documentation for the upcoming FlowerTune LLM Leaderboard across multiple domains, including Finance, Medical, and general NLP. Stay tuned for the official launch; we welcome all federated learning and LLM enthusiasts to participate in this exciting challenge!
- **Enhance Docker support and documentation** (#4191, #4251, #4190, #3928, #4298, #4192, #4136, #4187, #4261, #4177, #4176, #4189, #4297, #4226)

  Upgraded the Ubuntu base image to 24.04, added SBOM and gcc to Docker images, and comprehensively updated the Docker documentation, including quickstart guides and distributed Docker Compose instructions.
- **Introduce Flower glossary** (#4165, #4235)

  Added the Federated Learning glossary to the Flower repository, located under the `flower/glossary/` directory. This resource aims to provide clear definitions and explanations of key FL concepts. Community contributions are highly welcomed to help expand and refine this knowledge base; this is probably the easiest way to become a Flower contributor!
- **Implement Message Time-to-Live (TTL)** (#3620, #3596, #3615, #3609, #3635)

  Added comprehensive TTL support for messages in Flower's SuperLink. Messages are now automatically expired and cleaned up based on configurable TTL values, available through the low-level API (and used by default in the high-level API). A sketch of the low-level usage follows below.
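  The following is a minimal, hedged sketch of setting a per-message TTL through the low-level ServerApp API. It assumes that `Driver.create_message` accepts a `ttl` argument (in seconds) and that `MessageType`, `RecordSet`, and `ConfigsRecord` are importable from `flwr.common`; exact names and signatures may differ between Flower versions.

  ```python
  from flwr.common import ConfigsRecord, Context, MessageType, RecordSet
  from flwr.server import Driver, ServerApp

  app = ServerApp()


  @app.main()
  def main(driver: Driver, context: Context) -> None:
      # Build one query message per connected node, each carrying a TTL.
      # Messages that are not answered within the TTL are expected to be
      # expired and cleaned up by the SuperLink.
      content = RecordSet(configs_records={"config": ConfigsRecord({"ask": "status"})})
      messages = [
          driver.create_message(
              content=content,
              message_type=MessageType.QUERY,
              dst_node_id=node_id,
              group_id="1",
              ttl=60.0,  # assumed to be seconds; omit to fall back to the default TTL
          )
          for node_id in driver.get_node_ids()
      ]

      # Send the messages and collect whatever replies arrive before expiry
      replies = driver.send_and_receive(messages)
      print(f"Received {len(list(replies))} replies")
  ```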
- **Improve FAB handling** (#4303, #4264, #4305, #4304)

  An 8-character hash is now appended to the FAB file name. The `flwr install` command installs FABs with a more flattened folder structure, reducing it from 3 levels to 1.
- **Update documentation** (#3341, #3338, #3927, #4152, #4151, #3993)

  Updated quickstart tutorials (PyTorch Lightning, TensorFlow, Hugging Face, Fastai) to use the new `flwr run` command and removed the default title from the documentation base template. A new blockchain example has been added to the FAQ.
- **Update example projects** (#3716, #4007, #4130, #4234, #4206, #4188, #4247, #4331)

  Refreshed multiple example projects, including vertical FL, PyTorch (advanced), Pandas, Secure Aggregation, and XGBoost examples. Optimized the Hugging Face quickstart with a smaller language model and removed legacy simulation examples.
- **Update translations** (#4070, #4316, #4252, #4256, #4210, #4263, #4259)
- **General improvements** (#4239, #4276, #4204, #4184, #4227, #4183, #4202, #4250, #4267, #4246, #4240, #4265, #4238, #4275, #4318, #4178, #4315, #4241, #4289, #4290, #4181, #4208, #4225, #4314, #4174, #4203, #4274, #3154, #4201, #4268, #4254, #3990, #4212, #2938, #4205, #4222, #4313, #3936, #4278, #4319, #4332, #4333)

  As always, many parts of the Flower framework and quality infrastructure were improved and updated.
Incompatible changes
- **Drop Python 3.8 support and update minimum version to 3.9** (#4180, #4213, #4193, #4199, #4196, #4195, #4198, #4194)

  Python 3.8 support was deprecated in Flower 1.9, and this release removes it. Flower now requires Python 3.9 or later (Python 3.11 is recommended). CI and documentation were updated to use Python 3.9 as the minimum supported version. Flower now supports Python 3.9 to 3.12.