Ok. Now *that* is a fair argument for me to try to come up with a rebuttal for. You were sounding a little nutty for a bit but I’ll see if I can work with that. If I can’t, I may be forced to concede. Let’s see what I can do, if anything.
Well, first off: The population has increased along with the weaponry.
So, there’s a lot more people that have to be killed as time marches on.
I don’t know what the maximal population of Earth is; we haven’t reached its capacity. Also, “things seem to happen” which reduce populations from time to time, such as wars and pestilence, although humans do seem to increase like a virus on the planet.
So how likely is it that I’d end up alone in the basement? Not terribly.
I also contend (which supports your argument somewhat) that our dependence upon singular systems without redundancy *can* result in bad situations arising, such as our worldwide dependence upon an electric grid run off of steam, which leaves it vulnerable.
So, we do have a number of very fragile systems that we’ve been able to contain so far with patching and such.
We’re already in a fragile state.
That being said, not everybody plays by the same rules. There are already people living underground, educating themselves, learning about new technologies and ways to counter them. An educated Luddite is not a bad thing and, oddly enough, I share a lot of your concerns about the fragility of humanity in general.
We sometimes follow a path towards “progress” so quickly, we neglect maintenance or consequence until the problems start.
Will the day come when AI could destroy the planet? By then, it likely won’t matter if it does. We probably won’t be here. :P
I think the fears of the signers of the Life document are extremely premature, that’s all. AI is _far_ from something to be concerned with at this point.
It seems _more likely_ to me that the people of that future you project, should it come to pass, will be better qualified to deal with such questions, being far more educated than we are about these things. Our best technology will be less than a pocket calculator to them. We’re capable of dealing with the threats we have today. We couldn’t handle a future AI threat, but again, I still have no reason to believe that future people won’t be capable of handling it.
We’re in Ancient Egypt, worrying about the Internet by saying we’ll run out of stone with all the chiseling going on.