Channel: https://warpcast.com/~/channel/pinata

Steve pfp
Steve
@stevedylandev.eth
These two images tell a fascinating story. On the left, we have a server running on a GPU-enabled machine that can run local AI models through Ollama, and with x402 monetize that usage. On the right, we have a request being made to that server using the OpenAI SDK and x402-fetch. Distributed AI is near, and so is the blog post on this experiment.
2 replies
0 recast
15 reactions
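The setup Steve describes can be sketched from the client side: wrap `fetch` with x402-fetch so HTTP 402 responses are paid automatically, then hand that fetch to the OpenAI SDK pointed at the server's OpenAI-compatible endpoint. This is a minimal sketch under stated assumptions, not Steve's actual code: the server URL and private key are hypothetical placeholders, and it assumes the Ollama server exposes the standard OpenAI-compatible `/v1` API behind an x402 paywall.

```typescript
import OpenAI from "openai";
import { wrapFetchWithPayment } from "x402-fetch";
import { privateKeyToAccount } from "viem/accounts";

// Wallet that funds the per-request payments (hypothetical key placeholder)
const account = privateKeyToAccount("0x<your-private-key>");

// Wrap fetch: on a 402 response, x402-fetch constructs and attaches
// the payment, then retries the request
const fetchWithPayment = wrapFetchWithPayment(fetch, account);

// Point the OpenAI SDK at the x402-gated Ollama server (hypothetical URL);
// Ollama's OpenAI-compatible endpoint does not check the API key
const client = new OpenAI({
  baseURL: "https://gpu-server.example.com/v1",
  apiKey: "ollama",
  fetch: fetchWithPayment,
});

const res = await client.chat.completions.create({
  model: "llama3.1",
  messages: [{ role: "user", content: "Hello from an x402-paid request" }],
});
console.log(res.choices[0].message.content);
```

The design choice worth noting is that the payment logic lives entirely in the transport layer: the OpenAI SDK is unaware of x402, so any OpenAI-compatible client code works against a monetized endpoint unchanged.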

osama pfp
osama
@osama
why would you distribute the inference infra? what's the forcing function? it's def not economical/private. what is it then? how does on-device come into play here? genuine question.
1 reply
0 recast
0 reaction

Steve pfp
Steve
@stevedylandev.eth
Admittedly it’s not a complete solution but conceptually it sets up a world where higher compute hardware is more accessible outside of large central providers, and for that hardware to be monetized.
2 replies
0 recast
0 reaction

Royal pfp
Royal
@royalaid.eth
Yeah, this feels like a sketch of what automatic and self-service AI looks like: not so much "decentralized" but rather something between federated and distributed.
2 replies
0 recast
3 reactions

Kyle Tut pfp
Kyle Tut
@kyletut
I feel like we are stumbling around in the dark trying to figure out what room we are in. We've stubbed our toe on a chair so far but don't know what else is around us. If AI is going to be as dynamic as people think, access to computing definitely needs to have fewer boundaries, but we don't have good examples of that today. In 10 years, we will be able to point at Steve's example as a primitive example of everyday computing, but we are probably missing some key components right now.
2 replies
0 recast
3 reactions

osama pfp
osama
@osama
an asic that runs a full cnn, including finetuning (of sorts, for new objects), and w/ a camera costs $9. compute is more accessible than ever before and will continue to be. same will happen to large models. they won't remain "large" and will run on asics ($100-200, later $10) within 18mos or so
1 reply
0 recast
0 reaction

Kyle Tut pfp
Kyle Tut
@kyletut
You're saying just use crypto rails for agentic payments, not for compute then?
1 reply
0 recast
0 reaction

osama pfp
osama
@osama
i'm just saying that trying to solve for accessible/distributed/decentralized is a tar pit crypto bro/vc narrative, similar to how self-sovereign identity/data was hijacked by crypto before. the agentic commerce on crypto rails thesis is tbd. given stables' proliferation across fintech as a settlement layer and the push for agentic commerce, there is something there. agentic commerce will happen (high prob). stablecoins are happening (stripe, airbnb, coinbase, banking consortiums, etc etc). unsure the rest of the narrative crypto twitter wants is where things are headed. as a founder, understanding market structures helps. the economics doesn't work for distributed inference, else togetherai and/or sfcompute and 10s of other inference layer startups from 2023 would've done it. their talent knows crypto and ai as tech fairly well
2 replies
0 recast
0 reaction