https://warpcast.com/~/channel/papers-please
0 reply
0 recast
0 reaction
Varun Srinivasan
@v
Is there a credible solution to the LLM hallucination problem? Any interesting research papers or discussions on this?
9 replies
2 recasts
25 reactions
Stephan
@stephancill
How prevalent is it in coding applications? Feels like I never experience LLM hallucination but coding is pretty much my only use case
1 reply
0 recast
0 reaction
typeof.eth
@typeof.eth
I get a lot of hallucinations for Solidity specifically, and the odd hallucination with newer libraries
2 replies
0 recast
0 reaction
typeof.eth
@typeof.eth
Like, this isn't how this works at all. Cursor IDE
1 reply
0 recast
1 reaction
Stephan
@stephancill
Actually you're right, I also experience this. I think I've got a pretty decent idea of what the AI probably wouldn't be good at these days, which I just avoid asking it about in the first place. Recency bias in my original reply
0 reply
0 recast
2 reactions