Fluence has launched GPU compute for AI workloads at up to 85% lower cost than centralized cloud providers.
GPU containers are live now on the Fluence Platform, with GPU virtual machines and bare metal support coming soon. This launch is supported by a partnership with Spheron Network as one of our key compute providers.
Fluence is expanding from CPU-based virtual servers into GPUs to meet rising demand for open, low-cost, and flexible AI compute. Our decentralized infrastructure already supports thousands of blockchain nodes, generates over $1M in annual recurring revenue, and has saved customers $3.5M compared to centralized clouds.
The partnership with Spheron strengthens Fluence’s growing provider network, which also includes Kabat, Piknik, and other top data center operators.
Developers can start deploying at fluence.network/gpu and review documentation at fluence.dev/docs.
Fluence's entry into GPUs marks a significant step forward for decentralized compute and DePIN, enabling cost-efficient AI infrastructure at scale.