Lokp Ray
@lokpray
I was planning to code a bit on the plane and wanted to use a local LLM, only to realize that my laptop's RAM can only run these two DeepSeek models: DeepSeek-Coder-V2-Lite and DeepSeek-R1-Distill-Qwen-1.5B. I thought the future was local LLMs with privacy 🥲 PS: the peak spec for the MacBook Pro Max has 128GB, but still...
1 reply
0 recast
2 reactions
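(As a rough back-of-envelope for why RAM is the limit here: weight memory is roughly parameter count × bytes per parameter, plus overhead for the KV cache and the OS. The parameter counts below are assumptions based on the models' published sizes, not figures from this thread.)

```python
def weights_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate memory needed just for model weights, in GB."""
    return params_billions * 1e9 * (bits_per_param / 8) / 1e9

# Assumed sizes: DeepSeek-Coder-V2-Lite ~16B total params (MoE),
# DeepSeek-R1-Distill-Qwen-1.5B ~1.5B params.
for name, params in [("DeepSeek-Coder-V2-Lite", 16.0),
                     ("DeepSeek-R1-Distill-Qwen-1.5B", 1.5)]:
    for bits in (16, 4):  # fp16 vs a 4-bit quant
        print(f"{name} @ {bits}-bit: ~{weights_gb(params, bits):.1f} GB")
```

So a 16B model at fp16 wants ~32GB for weights alone, while a 4-bit quant fits in ~8GB, which is why quantized small models are about all a stock laptop can hold.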

Kasra Rahjerdi
@jc4p
i know the deepseek distills sound enticing but for local stuff you really can't beat the gemma / llama coding fine-tunes
1 reply
0 recast
0 reactions