Darryl Yeo 🛠️
@darrylyeo
LLMs are good at replicating state-of-the-art frontend code from a decade ago, and it shows. Fifty lines of added imperative runtime code to implement a basic UI interaction that a declarative, newly Baseline browser built-in handles in three. It's a real uphill battle: a gravity well pulling all my web apps toward verbosity, redundancy, and mediocrity by default, because for better or worse, that's the historically dominant software development culture represented in the training data, and it's now not-so-subtly being imposed on me. I can only hope an inflection point will come where specialized model training is actually cost-effective and we can force the LLMs to "unlearn" entire corpora of old training data through exclusion, so we don't have to keep wasting precious context tokens correcting the few truly decent and up-to-date generalist models we currently have with oodles of docs embeddings and system prompt overrides. https://farcaster.xyz/polymutex.eth/0x43c25e74 https://farcaster.xyz/polymutex.eth/0x206cea70
4 replies
1 recast
14 reactions
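The cast doesn't name a specific built-in; one plausible sketch of the three-versus-fifty contrast is the HTML popover attribute (newly Baseline), which provides toggling, light-dismiss, and Escape handling with no script, versus a hand-rolled equivalent. Element ids and the abridged imperative version below are illustrative, not from the cast.

    <!-- Declarative: the popover attribute does it all. -->
    <button popovertarget="menu">Open menu</button>
    <div id="menu" popover>Menu contents</div>

    // Imperative equivalent, abridged, written against
    // <button id="menu-button"> and <div id="menu" hidden>:
    const button = document.querySelector('#menu-button');
    const menu = document.querySelector('#menu');
    // Toggle on click.
    button.addEventListener('click', () => { menu.hidden = !menu.hidden; });
    // Close on Escape.
    document.addEventListener('keydown', (e) => {
      if (e.key === 'Escape') menu.hidden = true;
    });
    // Close on click outside.
    document.addEventListener('click', (e) => {
      if (!menu.contains(e.target) && e.target !== button) menu.hidden = true;
    });
    // ...plus focus management, ARIA wiring, and teardown: easily fifty lines in full.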
Phil Cockfield
@pjc
That gravity warp field is gonna get real expensive. I like the metaphor. It's real!
0 reply
0 recast
0 reaction