https://warpcast.com/~/channel/farhack
COGNITION
@cognition
Need some high-performance compute for your FarHack project? We will be bringing a COGNITION PRO with 48GB of GPU VRAM to Chapter One today. We will also host a local, blazingly fast, custom-trained llama3 model accessible via a public URL. (We will post the link here before 1pm)
1 reply
2 recasts
2 reactions
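The links bruno shares below are temporary *.gradio.live URLs, which suggests the local model was exposed through Gradio's built-in share tunnel. A minimal sketch of that setup, assuming a local Llama-family checkpoint served with the transformers pipeline (the actual custom-trained model and serving code are not described in the posts):

```python
import gradio as gr
from transformers import pipeline

# Hypothetical local checkpoint path; the real custom-trained llama3
# model used at Chapter One is not specified in these posts.
generator = pipeline(
    "text-generation",
    model="./llama3-custom",
    device_map="auto",  # place the model on the available GPU(s)
)

def respond(message, history):
    # Generate a reply from the local model for each chat turn.
    output = generator(message, max_new_tokens=256, do_sample=True)
    return output[0]["generated_text"]

demo = gr.ChatInterface(fn=respond)

# share=True asks Gradio to open a temporary public tunnel and print a
# URL of the form https://xxxxxxxxxxxx.gradio.live pointing at this machine.
demo.launch(share=True)
```

Share links like these expire after a limited time, which is consistent with a second link being posted later in the thread.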
bruno
@brunostefoni
Here is the link to our live LLM demo: 70ee76bd73cc936db8.gradio.live
0 reply
0 recast
0 reaction
bruno
@brunostefoni
New link in case you want to try it out: https://3c529f5658b6700d3e.gradio.live You can also come try it in person; we are right here at the 2nd St location, in the back, playing with this awesome computer.
0 reply
0 recast
0 reaction