Daniel - Bountycaster pfp
Daniel - Bountycaster
@pirosb3
I just tried Llama 3 today, and I'm amazed it's able to run locally on a regular Mac without internet access. The future looks incredibly bright! https://llama.meta.com/llama3/
3 replies
1 recast
17 reactions

Pretty Blocks pfp
Pretty Blocks
@prettyblocks
What are the specs of your regular Mac? I'm pre-M1. I haven't tried it yet, but most of the good Ollama models run at a pretty low t/s. All this local LLM magic makes me want to upgrade.
0 replies
0 recasts
0 reactions