Nandit Mehra
@nanditmehra
It's fun running Ollama locally on your system and connecting to your favourite open-source LLM. It's fast, local, and private!
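For anyone curious what that looks like in practice, here's a minimal sketch, assuming Ollama is installed and serving on its default port (11434) and that an example model like "llama3" has already been pulled — it just queries the local REST API:

import requests

# Ask the locally running model a question via Ollama's HTTP API.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # example model name; swap in your favourite
        "prompt": "Why run LLMs locally?",
        "stream": False,     # return one JSON object instead of a token stream
    },
)
print(response.json()["response"])  # the model's generated text

Everything stays on your machine — no API keys, no data leaving your system.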