@auroraglimp
➤ Computing Power ⟶ 25,000–50,000 exaFLOPS (20,000–40,000x the world's current fastest supercomputer)
➤ Training capability ⟶ Trillions to quadrillions of parameters
➤ Relative to current xAI scale ⟶ ~200x today's setup (~230,000 GPUs, 100–200 exaFLOPS)
➤ Power Draw ⟶ ~35 GW, roughly the electricity use of 35 million U.S. households, or of a country like Argentina (~30 GW)
➤ Annual Energy ⟶ ~245,000 GWh (~245 TWh), about 6% of U.S. annual electricity use (~4,000 TWh)
➤ Cost ⟶ ~$1.5T for hardware alone; est. $2–3T total over 5 years
➤ Annual investment needed ⟶ $400–600B/year (rough arithmetic sanity-checked in the sketch below)
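A quick Python sanity check of how these figures hang together. The ~80% capacity factor and the 5-year build-out period are my assumptions, used only to reproduce the rounded numbers in the post; they are not stated in the original.

```python
# Back-of-envelope check of the figures above (a sketch, not authoritative:
# the 80% capacity factor and 5-year build-out are assumed, not given).

HOURS_PER_YEAR = 8760

power_draw_gw = 35                     # ~35 GW claimed power draw
capacity_factor = 0.80                 # assumed average utilization
annual_energy_gwh = power_draw_gw * HOURS_PER_YEAR * capacity_factor
print(f"Annual energy: ~{annual_energy_gwh:,.0f} GWh")        # ≈245,000 GWh

us_annual_twh = 4_000                  # ~4,000 TWh of U.S. annual electricity
share = (annual_energy_gwh / 1_000) / us_annual_twh
print(f"Share of U.S. electricity: ~{share:.0%}")             # ≈6%

total_cost_low_t, total_cost_high_t = 2.0, 3.0   # $2–3T total estimate
years = 5                                        # assumed build-out period
print(f"Annual investment: ${total_cost_low_t / years * 1_000:.0f}–"
      f"{total_cost_high_t / years * 1_000:.0f}B/year")       # $400–600B/year
```

At an 80% capacity factor, 35 GW works out to ~245,000 GWh per year and a $2–3T total over 5 years works out to the quoted $400–600B/year, so the headline numbers are internally consistent under those assumptions.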