Full Transcript

From the TED Talk by Max Tegmark: How to get empowered, not overpowered, by AI

Unscramble the Blue Letters

It's much better to be proactive rather than reactive; plan ahead and get things right the first time because that might be the only time we'll get. But it is funny because sometimes peploe tell me, "Max, shhh, don't talk like that. That's Luddite scaremongering." But it's not scaremongering. It's what we at MIT call safety engineering. Think about it: before NASA launched the allopo 11 mission, they systematically thought through everything that could go wrong when you put people on top of explosive fuel tanks and luncah them somewhere where no one could help them. And there was a lot that could go wrong. Was that scaremongering? No. That was precisely the safety eireiengnng that ereunsd the scesucs of the mission, and that is precisely the strategy I think we should take with AGI. Think through what can go wrong to make sure it goes right.

Open Cloze

It's much better to be proactive rather than reactive; plan ahead and get things right the first time because that might be the only time we'll get. But it is funny because sometimes ______ tell me, "Max, shhh, don't talk like that. That's Luddite scaremongering." But it's not scaremongering. It's what we at MIT call safety engineering. Think about it: before NASA launched the ______ 11 mission, they systematically thought through everything that could go wrong when you put people on top of explosive fuel tanks and ______ them somewhere where no one could help them. And there was a lot that could go wrong. Was that scaremongering? No. That was precisely the safety ___________ that _______ the _______ of the mission, and that is precisely the strategy I think we should take with AGI. Think through what can go wrong to make sure it goes right.

Solution

  1. launch
  2. engineering
  3. apollo
  4. people
  5. success
  6. ensured

Original Text

It's much better to be proactive rather than reactive; plan ahead and get things right the first time because that might be the only time we'll get. But it is funny because sometimes people tell me, "Max, shhh, don't talk like that. That's Luddite scaremongering." But it's not scaremongering. It's what we at MIT call safety engineering. Think about it: before NASA launched the Apollo 11 mission, they systematically thought through everything that could go wrong when you put people on top of explosive fuel tanks and launch them somewhere where no one could help them. And there was a lot that could go wrong. Was that scaremongering? No. That was precisely the safety engineering that ensured the success of the mission, and that is precisely the strategy I think we should take with AGI. Think through what can go wrong to make sure it goes right.

Frequently Occurring Word Combinations

ngrams of length 2

collocation frequency
ai researchers 7
crushed human 3
artificial intelligence 2
human ai 2
sea level 2
human intelligence 2
ai progress 2
making ai 2
safety engineering 2
lethal autonomous 2
autonomous weapons 2
ai safety 2
human extinction 2

ngrams of length 3

collocation frequency
crushed human ai 2
human ai researchers 2
lethal autonomous weapons 2

ngrams of length 4

collocation frequency
crushed human ai researchers 2
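
The counts above are two-column rows: the word combination (collocation) followed by how often it occurs. A minimal Python sketch of how such n-gram counts could be produced is shown below; the exact tokenization, lowercasing, and frequency cutoff used for these tables are not specified on this page, and the tables were presumably computed over the complete talk rather than just the excerpt shown here, so results may differ.

# Minimal sketch, assuming lowercased text and simple word tokenization.
import re
from collections import Counter

def ngram_counts(text: str, n: int = 2) -> Counter:
    """Count n-grams of consecutive words in lowercased text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(
        " ".join(words[i:i + n]) for i in range(len(words) - n + 1)
    )

# Example using a sentence from the excerpt on this page; counts over the
# full talk transcript would be larger than what this short sample yields.
excerpt = (
    "It's much better to be proactive rather than reactive; plan ahead "
    "and get things right the first time because that might be the only "
    "time we'll get."
)
for phrase, freq in ngram_counts(excerpt, 2).most_common(5):
    print(phrase, freq)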

Important Words

  1. agi
  2. apollo
  3. call
  4. engineering
  5. ensured
  6. explosive
  7. fuel
  8. funny
  9. launch
  10. launched
  11. lot
  12. luddite
  13. mission
  14. mit
  15. nasa
  16. people
  17. plan
  18. precisely
  19. proactive
  20. put
  21. safety
  22. scaremongering
  23. shhh
  24. strategy
  25. success
  26. systematically
  27. talk
  28. tanks
  29. thought
  30. time
  31. top
  32. wrong