Episode Transcript: #1350 - Nick Bostrom
And here we go.
Alright Nick, this is one of the things that scares people more than anything.
It's the idea that we're creating something or someone's going to create something.
It's going to be smarter than us. It's going to replace us.
Is that something we should really be concerned about?
I presume you're referring to babies.
I'm referring to artificial intelligence.
Yes. Well, it's the big fear and the big hope, I think.
Both?
At the same time, yeah.
How is it the big hope?
Well, there are a lot of things wrong with the world as it is now.
Try to pull this up to your face if you would.
All the problems we have, most of them could be solved if we were smarter,
or if we had somebody on our side who was a lot smarter, with better technology and so forth.
Also, I think if we want to imagine some really grand future where humanity or our descendants
go out and colonize the universe, that's likely to happen,
if it's going to happen at all, after we have superintelligence that then develops the technology to make that possible.
The real question is whether or not we would be able to harness this intelligence or whether it would dominate.
Yeah, that certainly is one question.