https://warpcast.com/~/channel/privacy
meatballs
@meatballs
My GPU doesn't have enough VRAM to get decent results from Ollama running locally. Any suggestions for how best to spend funds on a private LLM setup?
1 reply
1 recast
2 reactions
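Before spending on hardware, it can help to estimate how much VRAM a given model actually needs: roughly parameter count × bits per weight, plus some headroom for the KV cache and activations. A minimal sketch of that arithmetic (the 20% overhead factor is a rough assumption; real usage varies with context length and runtime):

```python
def approx_vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weights only, scaled by an assumed
    ~20% overhead for KV cache and activations."""
    weight_gb = params_billions * bits_per_weight / 8  # 1B params at 8 bits = 1 GB
    return weight_gb * overhead

# A 7B model at 4-bit quantization needs roughly 4.2 GB,
# so it fits on common 6-8 GB consumer GPUs; the same model
# at 16-bit needs ~16.8 GB and does not.
print(round(approx_vram_gb(7, 4), 1))
print(round(approx_vram_gb(7, 16), 1))
```

By this estimate, dropping to a heavier quantization (or a smaller model tag in Ollama) is often cheaper than new hardware.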
Phil Cockfield
@pjc
I am hanging out for the Nvidia DIGITS mini desktop box. Not out yet. Keen to hear what others might be reaching for. https://nvidianews.nvidia.com/news/nvidia-puts-grace-blackwell-on-every-desk-and-at-every-ai-developers-fingertips?utm_source=chatgpt.com
1 reply
0 recast
1 reaction