@forallhumanity
We need more motivation for the people who stand guard over our future.
What the International AI Safety Report 2026 says:
— AI today already gives non-specialists instructions for creating biological weapons.
— The race between corporations and states forces safety to be sacrificed for speed.
— The behavior of these systems is becoming increasingly difficult to predict or interpret. DeepMind's research on Anthropic's AI models and the documentation for OpenAI's models directly confirm this.
Geoffrey Hinton, one of the founding figures of modern AI, estimates the probability of humanity's destruction by AGI, p(doom), at 10–20%. The average estimate among engineers is at least 5%.
For comparison: in aviation, a plane with even a 1% crash risk would never be allowed to take off. We are flying at full speed.
Right now, safety is entrusted to those who profit from it, or who are fighting for world domination.
We need an independent voice.
What solution do you see?
Let's discuss the options in the next post. (2/2)