Protocol Models: Scaling Decentralized Training with Communication-Efficient Model Parallelism
Pluralis Research
> The research demonstrates that a large language model can be split and trained across consumer devices connected over the internet, with no loss in training speed or final performance.
Full Paper: https://arxiv.org/pdf/2506.01260