Daniel Lombraña
@teleyinex.eth
It looks like vibe coding and AI agents could be exploiting our dopamine to get us hooked, then pushing us to pay for more credits. What if AI agents and LLM tools are using gambling techniques? Evrim.zone https://share.google/3wkwuQUYYm6zPn27t
2 replies
0 recast
2 reactions
Eli Stonberg 🎩
@elistonberg
Good read. I’d also add that the LLMs are always very eager to please when you pitch them startup ideas. Every one is the next big thing.
1 reply
0 recast
0 reaction
Phil Cockfield
@pjc
What's old is new again. So interesting to consider. It's getting really subtle though, eh? When it flatters me (in more and more sophisticated ways) I feel the affection grow, and I have to temper it with some rational "I know better" mental brakes… …but I can easily imagine becoming genuinely dependent on it. And let's face it: the moment someone vibe codes a thing and builds a business atop it, there's a baseline dependency in place, now with real consequences, on things totally out of your control. The dangerous psychological states lurking around that corner don't seem far-fetched at all. New problems? Now solved at $300 p/m per person… then $1200 p/m… then… Heroin in the school yard, anyone?!
0 reply
0 recast
1 reaction