The dangers of a super AGI

Created: November 1, 2015 / Updated: November 2, 2024 / Status: in progress / 1 min read (~170 words)
Artificial General Intelligence

A super-intelligent AI decides that the best use of matter/energy in its vicinity is to nano-engineer the raw materials necessary for its mind to expand. It sends out quadrillions of self-replicating probes in a spherical region, programmed to reduce all matter they encounter into the "smart" matter which the AI can use as its brain. 2500 years later, give or take, there is a massive 5000 light year sphere of "empty" space. It's actually all cognitive matter.
Source: jrf_1973 - https://www.reddit.com/r/AskReddit/comments/39wkcu/what_scientific_breakthrough_would_be_the_most/cs75if8

=> Is this the reduction to a Turing machine? Space is basically just encoded information.

To limit the dangers of an AI, contain it in a physical body (which can be destroyed).
By extension, this means that humans should be able to extend their reach through externalization, that is, by providing APIs (function libraries) over their thought processes so that others may query those processes.
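
A minimal sketch of what such an "externalized thought process" interface might look like, assuming a toy setup where a mind publishes a few reasoning routines by topic and other parties query them; all names here (ExternalizedMind, ThoughtQuery, expose, query) are hypothetical illustrations, not an established API:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ThoughtQuery:
    """A question posed to an externalized thought process."""
    topic: str
    question: str

class ExternalizedMind:
    """Hypothetical wrapper exposing parts of a reasoning process as
    callable endpoints, so others can query it without direct access
    to the mind's internals."""

    def __init__(self) -> None:
        # Registry of exposed "thought functions", keyed by topic.
        self._endpoints: Dict[str, Callable[[str], str]] = {}

    def expose(self, topic: str, handler: Callable[[str], str]) -> None:
        """Publish one reasoning routine under a topic name."""
        self._endpoints[topic] = handler

    def query(self, q: ThoughtQuery) -> str:
        """Answer a query using a published routine, or decline."""
        handler = self._endpoints.get(q.topic)
        if handler is None:
            return "no exposed reasoning for this topic"
        return handler(q.question)

# Usage: expose one heuristic and let another party query it.
mind = ExternalizedMind()
mind.expose("risk", lambda question: f"Current heuristic on '{question}': contain first, then verify.")
print(mind.query(ThoughtQuery(topic="risk", question="unboxed AGI")))
```

The point of the sketch is the design choice: only what is deliberately exposed can be queried, mirroring the idea that a mind kept behind a bounded interface (or a destructible body) remains limitable.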