What if LLM hallucinations aren’t a bug, but a feature? Are the answers to open scientific problems simply stored as deep hallucinations inside LLMs, and we just need to design clever prompts to retrieve them?
LLMs interpolate in the sheets but can’t extrapolate in the streets
Money as a Schelling point is a powerful framework. US treasuries are currently the Schelling point for ‘safest money’. Bitcoin has the best opportunity to become a new Schelling point for money because of how powerful and salient its memes are, combined with its technical properties.
