Major global catastrophic risks (GCRs) include global warming, nuclear war, pandemics, and emerging technologies such as artificial intelligence and biotechnology. How do the different risks affect each other? And above all, what are the best ways to reduce these risks?
These are the questions that guide our research. We also recognize that research alone does not reduce the risks.

If there are no extra-terrestrial intelligences (ETIs) in our light cone, then either the Great Filter is behind us, in which case some periodic sterilizing natural catastrophe, such as a gamma-ray burst, should be given a higher probability estimate, or the Great Filter is ahead of us, in which case a future global catastrophe is highly probable. An ETI may be traveling at near-light speed as an explosive wave of intelligence in the form of nano-mechanisms that could reach us at any moment; it may send us potentially dangerous messages with AI-virus payloads; or it may be near or on Earth in some dormant form.
This dormant form could be nanobots acting as berserkers, triggered when humans unwittingly cross some threshold. Even extinct ETIs could have left dangerous remnants in the form of AI systems that our civilization may encounter. If our existence is a simulation, it could be run by ETIs. Even a false belief in the existence of ETIs could fuel millennial sects.
If humanity is the first intelligence to emerge, it may kill or prevent the emergence of all future potential ETIs, which would itself be a different type of catastrophe.
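The Great Filter inference above is essentially a Bayesian update: the observation "no ETIs in our light cone" is improbable if ETIs are common, so probability mass shifts toward both the filter-behind-us and filter-ahead-of-us hypotheses. A minimal sketch, using purely illustrative priors and likelihoods (none of these numbers come from the text):

```python
# Toy Bayesian update illustrating the Great Filter argument.
# All priors and likelihoods below are illustrative assumptions, not estimates.

def normalize(weights):
    """Rescale a dict of non-negative weights so they sum to 1."""
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

# Hypotheses about why we observe no ETIs in our light cone.
priors = {
    "filter_behind_us": 0.3,    # e.g. periodic sterilizing events such as gamma-ray bursts
    "filter_ahead_of_us": 0.3,  # most civilizations face a future global catastrophe
    "etis_are_common": 0.4,     # many ETIs exist and should be observable
}

# Likelihood of the observation "no ETIs observed" under each hypothesis.
likelihoods = {
    "filter_behind_us": 1.0,
    "filter_ahead_of_us": 1.0,
    "etis_are_common": 0.05,  # silence is very unlikely if ETIs are common
}

posterior = normalize({h: priors[h] * likelihoods[h] for h in priors})
for hypothesis, p in posterior.items():
    print(f"{hypothesis}: {p:.2f}")
```

Under these toy numbers, the posterior probability of both filter hypotheses rises (from 0.30 to roughly 0.48 each), while the common-ETI hypothesis collapses, mirroring the argument that cosmic silence raises the estimated probability of both past sterilizing catastrophes and a future global catastrophe.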
Several options for preventing catastrophic risks are connected with ETIs: sending requests for help, using random strategies to escape the Fermi paradox, predicting the Great Filter, or hoping that ETIs will find the remains of our extinct civilization and resurrect it.