"In any moment of decision the best thing you can do is the right thing, the next best thing is the wrong thing, and the worst thing you can do is nothing."– Theodore Roosevelt (attrib.)
A problem is a doubtful or difficult matter requiring a solution — something hard to understand or deal with. The decision-making process itself is simple:
1. Define and clarify the problem: Does it warrant action? If so, when and how important is it?
2. Gather all the facts and understand their causes.
3. Think about the best possible options and solutions for your life.
4. Consider and compare the pros and cons of each option.
5. Select the best option: avoid vagueness and weak compromises in trying to optimize your happiness.
6. Explain your decision to those involved and affected.
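Steps 4 and 5 above can be sketched as a simple weighted pros-and-cons tally: assign each pro and con a weight, score every option by its net weight, and pick the highest scorer. The options, factors, and weights below are purely illustrative placeholders, not part of any formal method.

```python
def score(option):
    """Net score: sum of weighted pros minus sum of weighted cons."""
    return sum(option["pros"].values()) - sum(option["cons"].values())

# Hypothetical options with illustrative weights (higher = stronger factor).
options = [
    {"name": "Option A",
     "pros": {"low cost": 2, "fast": 3},
     "cons": {"risky": 4}},
    {"name": "Option B",
     "pros": {"reliable": 5},
     "cons": {"slow": 1, "costly": 2}},
]

# Step 5: select the option with the highest net score.
best = max(options, key=score)
print(best["name"])
```

Of course, real decisions rarely reduce to arithmetic — the weights themselves encode your values, which is exactly where the hard thinking happens.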
While you are watching this 18-minute video, which covers wide-ranging risks and opportunities, think about the following:
- How do you make decisions? How do you make the decision that is optimal for you — one that aligns with your values and also takes into account the uncertainties you face?
- What are existential risks? How many can you identify?
- Why is AI potentially a problem?
"In the Chinese language, the word "crisis" is composed of two characters,
one representing danger and the other, opportunity." – John F. Kennedy
What Are Existential Risks + Extreme Opportunities?
They are the risks that threaten the very survival of the human species, or that could dramatically curtail its potential. There are many, from asteroid impact to engineered pandemic to artificial intelligence (AI), and almost all of them are understudied. AI risk is the least understood, but potentially the deadliest of all, as AIs could be extremely powerful agents with insufficiently safe motivations and goals. The problem is very difficult, both philosophically and programmatically. If these obstacles are overcome, however, humanity can look forward to a world of dramatic abundance in health, wealth, and happiness.
In this riveting introduction, Stuart Armstrong takes us through the top 12 existential risks and extreme opportunities that now exist. The top four risks may surprise you, especially since climate change is not among them.
In addition, we will introduce the idea of living in exponential times, along with a preliminary look at the quality of problem solving and decision making.
Stuart Armstrong’s research at Oxford University's Future of Humanity Institute centers on the risks and possibilities of Artificial Intelligence, the long-term potential for intelligent life (us), and anthropic (self-locating) probability. He aims to improve the understanding of the different types and natures of uncertainties surrounding human progress in the mid-to-far future.
Armstrong has a wry sense of humor about existential risks.