Daniel - Bountycaster
@pirosb3
I just tried Llama 3 today, and I'm amazed it's able to run locally on a regular Mac without internet access. The future looks incredibly bright! https://llama.meta.com/llama3/
3 replies
1 recast
17 reactions
avi
@avichalp.eth
will try it now. do you see it replacing ChatGPT or others as your primary LLM anytime soon? my experience with Llama 2 was that it could compete with ChatGPT 3.5 but was nowhere close to what we expect these days
1 reply
0 recast
0 reaction
Manan
@manan
Crazy! What're you planning to use it for?
1 reply
0 recast
0 reaction
0xChris
@0xchris
L3 is so good!
0 reply
0 recast
0 reaction
Pretty Blocks
@prettyblocks
What are the specs of your regular Mac? I'm pre-M1. Haven't tried it yet, but most of the good Ollama models have a pretty low t/s. All this local LLM magic makes me want to upgrade.
0 reply
0 recast
0 reaction
Wguider 💎🎩🍖🦅
@wguider
what is this?? same 10 $DEGEN ??
0 reply
0 recast
0 reaction