downshift. đŸŽī¸đŸ’¨ pfp
downshift. đŸŽī¸đŸ’¨
@downshift.eth
what hardware should i buy to run LLMs locally? naive question, i know, but i don't have a good sense of where to start. are there useful public guides for deciding? the state of the art is changing rapidly, so figured i'd ask here first...
4 replies
0 recast
5 reactions

shoni.eth
@alexpaden
I use a maxed-out Mac Studio. iirc it's the best non-GPU option, and the cheapest RAM per GB too
0 reply
0 recast
0 reaction
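The practical first step in the hardware question above is working out how much memory a model actually needs, since that decides between a unified-memory Mac and a discrete GPU. A minimal back-of-envelope sketch, assuming a common rule of thumb (weights at the chosen quantization, plus a rough multiplier for KV cache and runtime buffers); the helper name and the overhead factor are illustrative, not from any library:

```python
def estimate_memory_gb(params_billion: float,
                       bits_per_weight: int = 4,
                       overhead: float = 1.2) -> float:
    """Rough RAM/VRAM estimate to run an LLM locally.

    params_billion: model size in billions of parameters (e.g. 70)
    bits_per_weight: quantization level (16 = fp16; 8 and 4 are common quants)
    overhead: rough multiplier for KV cache and runtime buffers (assumption)
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9


# A 70B model at 4-bit comes out around 42 GB: too big for a single
# 24 GB consumer GPU, but it fits in a 64 GB unified-memory Mac Studio.
print(f"{estimate_memory_gb(70, 4):.0f} GB")
```

This is why high-RAM unified-memory machines come up in these threads: total memory capacity, not raw GPU speed, is usually the binding constraint for which models you can load at all.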