Ishika
@ishika
One of the new key metrics for consumer AI tools will be: "how do we minimize the number of iterations it takes for the user to reach their desired output?" A big part of this challenge is prompt writing; it's still one of the steepest learning curves for users. Soon, we'll see it fully abstracted away. A user can write something simple, and an intermediary LLM loop will automatically enhance the prompt before it's sent to the core model for execution (a sketch of that flow below). This kind of backend refinement will quietly reduce friction and help users get better results, faster.
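A minimal sketch of that intermediary step, assuming an OpenAI-style chat completions API; the model names, the enhancer instructions, and the `answer` helper are placeholders for illustration, not anything a specific product ships today:

```python
from openai import OpenAI  # assumption: any OpenAI-compatible chat API would work here

client = OpenAI()

ENHANCER_MODEL = "gpt-4o-mini"  # placeholder: small, cheap model that rewrites prompts
CORE_MODEL = "gpt-4o"           # placeholder: larger model that produces the final output

ENHANCER_INSTRUCTIONS = (
    "Rewrite the user's request as a detailed, unambiguous prompt. "
    "Add the likely intent, desired output format, and constraints. "
    "Return only the rewritten prompt."
)

def call_llm(model: str, system: str, user: str) -> str:
    """Single chat-completion call, shared by the enhancer and the core model."""
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    )
    return resp.choices[0].message.content

def answer(user_input: str) -> str:
    # 1. Backend refinement: the intermediary model enhances the raw prompt.
    enhanced = call_llm(ENHANCER_MODEL, ENHANCER_INSTRUCTIONS, user_input)
    # 2. Execution: the enhanced prompt, not the raw one, goes to the core model.
    return call_llm(CORE_MODEL, "You are a helpful assistant.", enhanced)

print(answer("make my resume better"))
```

The user only ever sees the simple input and the final output; the enhancement pass happens invisibly in the backend, and could be run more than once if the product wants an iterative refinement loop.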