MetaEnd🎩
@metaend.eth
Listen: @fileverse but with @venice-ai or OpenRouter API support, and MCP for mem0
1 reply
1 recast
3 reactions
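
A minimal sketch of the OpenRouter half of that ask, assuming a TypeScript web app; the API key and model slug below are placeholders, not anything Fileverse ships. OpenRouter exposes an OpenAI-compatible chat completions endpoint:

```ts
// Hedged sketch: calling OpenRouter's OpenAI-compatible chat endpoint from a web app.
// OPENROUTER_API_KEY and the model slug are placeholders you'd supply yourself.
const OPENROUTER_API_KEY = "<your-openrouter-key>";

async function chatViaOpenRouter(prompt: string): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "meta-llama/llama-3.1-8b-instruct", // any slug from the OpenRouter catalog
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`OpenRouter request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}
```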

Fileverse
@fileverse
👩‍🏭👀 Halooooooo @metaend.eth Here's a lil overview of our LLM roadmap 👇
Stage 1: Ollama, local LLM <> webApp (https) with no extra download
Stage 2: local storage WebLLM and/or third-party models (API)
Stage 3: suuuurrrpriiiiise 🧁
Let us know if u have any feedback or feature requests!!
1 reply
0 recast
1 reaction
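
For reference, a minimal sketch of what Stage 1 could look like from the web app's side, assuming Ollama is running locally on its default port 11434 with a model already pulled (the model name is just an example); browser calls would also need the page's origin allowed via OLLAMA_ORIGINS:

```ts
// Hedged sketch: a web page talking to a locally running Ollama server over plain HTTP.
async function askLocalOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // whichever model you've pulled locally
      messages: [{ role: "user", content: prompt }],
      stream: false,     // one JSON response instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // /api/chat returns { message: { role, content }, ... }
}
```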

MetaEnd🎩
@metaend.eth
And MCP? mem0 can be run locally and would add global memory for all connected LLMs
0 reply
0 recast
0 reaction
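
To illustrate the MCP idea: a minimal sketch of an MCP server exposing memory tools over stdio, using the @modelcontextprotocol/sdk TypeScript package. The in-memory array is only a stand-in for a locally running mem0 instance (mem0 itself is not wired up here), and the tool names are made up for the example:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stand-in store; a real setup would call a locally running mem0 instance instead.
const memories: { userId: string; text: string }[] = [];

const server = new McpServer({ name: "local-memory", version: "0.1.0" });

// Tool for saving a memory shared across any LLM client connected via MCP.
server.tool(
  "add_memory",
  { userId: z.string(), text: z.string() },
  async ({ userId, text }) => {
    memories.push({ userId, text });
    return { content: [{ type: "text", text: "stored" }] };
  }
);

// Tool for naive substring recall; mem0 would do semantic search here.
server.tool(
  "search_memory",
  { userId: z.string(), query: z.string() },
  async ({ userId, query }) => {
    const hits = memories
      .filter((m) => m.userId === userId && m.text.includes(query))
      .map((m) => m.text);
    return { content: [{ type: "text", text: hits.join("\n") || "no matches" }] };
  }
);

// Serve over stdio so any MCP-aware client (and its LLMs) can connect.
await server.connect(new StdioServerTransport());
```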