https://warpcast.com/~/channel/p-doom
assayer
@assayer
The first AI alignment idea that makes sense to me. It is simple: we can't prevent superintelligent AI from emerging, and we can't control something smarter than us. However, we might be able to align ourselves with ASI, and in turn, ASI might align with us, effectively making us part of its "tribe". https://www.youtube.com/watch?v=_3m2cpZqvdw
Abdullah🎩🍖🔮Ⓜ️💜
@zeeshanali1
This actually resonates deeply. Instead of fighting the inevitable, becoming part of its value system might be our only real shot at survival, or even at thriving in a post-ASI world.