Humans Need Not Apply

Started by Professor_Fennec, September 01, 2014, 07:03:01 PM

Quote from: MrBogosity on October 04, 2014, 07:46:59 AM
If we wanted to live like we did 200 years ago, we'd only need to work a few hours a week. But, of course, we'd have to do without electricity, running water, etc. As time goes on, the concept of the things we "need" changes. What were once pie-in-the-sky desires become necessities. Think of how necessary we consider vaccines to be (and rightly so), but how recently it's been that we've actually been able to make them well enough to eradicate the world's worst diseases!

And if we were content to live like we did in the 1950's (with electricity, running water, and primitive TV, but no personal computers, internet, cell phones, etc.) we'd probably only need to work 20 hours a week at the most.

As soon as a new technology comes along, it produces new desires to be met.

For instance, once reliable transistors were developed (the very early ones were less reliable than vacuum tubes), people not only started making everything electronic; entirely new devices were developed that could only be practically made using electronics.  The cost benefits of high-volume mass production mean almost everything these days is operated by a microcontroller (I've seen electric toothbrushes driven by microcontrollers).  Ubiquitous computing isn't coming, it's BEEN HERE for years already.  What people are really talking about is ubiquitous NETWORKING, and that could make a lot of current economic practices and strategies as obsolete as the current level of networking has made so many business and regulatory models.

Quote from: evensgrey on October 04, 2014, 09:40:52 AM
And if we were content to live like we did in the 1950's (with electricity, running water, and primitive TV, but no personal computers, internet, cell phones, etc.) we'd probably only need to work 20 hours a week at the most.

As soon as a new technology comes along, it produces new desires to be met.

For instance, once reliable transistors were developed (the very early ones were less reliable than vacuum tubes), people not only started making everything electronic; entirely new devices were developed that could only be practically made using electronics.  The cost benefits of high-volume mass production mean almost everything these days is operated by a microcontroller (I've seen electric toothbrushes driven by microcontrollers).  Ubiquitous computing isn't coming, it's BEEN HERE for years already.  What people are really talking about is ubiquitous NETWORKING, and that could make a lot of current economic practices and strategies as obsolete as the current level of networking has made so many business and regulatory models.

It's funny how that never ends up with society crashing itself, isn't it? So far, every time technology has advanced to the point that it made some business, regulatory, industrial, or social model obsolete, it has resulted in new opportunities and generally better conditions.

Quote from: dallen68 on October 04, 2014, 12:59:24 PM
It's funny how that never ends up with society crashing itself, isn't it? So far, every time technology has advanced to the point that it made some business, regulatory, industrial, or social model obsolete, it has resulted in new opportunities and generally better conditions.

While lots of societies have crashed (a lot of them quite hard), it's almost always been because they were essentially stagnant and ran into some problem they couldn't overcome because of that stagnation, usually triggered by the things they were doing while stagnant.  The ones that come to mind immediately are the Mayan city-building culture, the Indus Valley city-building culture, and Imperial Rome. The first two appear to have collapsed when the environment changed in such a way that their agricultural systems failed (a common problem for cultures that don't innovate in how they produce food), and Rome had an ultimately fatal set of economic problems that didn't kill it directly only because outside forces finished it off once it was sufficiently weakened.

I don't think people are understanding my fear.  This isn't an economic collapse I am fearing.  What I am fearing is a mass of people, save for a select few, being shut out of the economy altogether.  With no economic value, the average human will not have anything to offer to justify being taken care of by sophisticated machines owned and operated by an elite few.

If we are lucky, this transition will be gradual, allowing the human population to slowly decline as human labor is phased out of the economy and more and more resources, which would otherwise be freely available, shift to a new ruling class of aristocratic oligarchs.  It will be the oligarchs and their respective families that will live in the exclusive utopia where the machines see to their every need, not the masses of people who no longer have an economic value.  They will be the ones who pass on the legacy of humanity.  The rest of us will go extinct.

I'm not mincing words here.  Even users of computers and networks will be replaced by even more sophisticated machines.  Anything we can do, a machine can potentially do better, and probably will, given enough time.

Quote from: Professor_Fennec on October 06, 2014, 03:27:47 AM
I don't think people are understanding my fear.  This isn't an economic collapse I am fearing.  What I am fearing is a mass of people, save for a select few, being shut out of the economy altogether.

How does that happen? The economy IS people!

Quote
With no economic value, the average human will not have anything to offer to justify being taken care of by sophisticated machines owned and operated by an elite few.

Wait, wait, where did this "owned and operated by an elite few" come from? That's new!

Anyway, you have people who aren't being taken care of by the machines (have unmet desires), and people who need work (untapped resources)...your problem solves itself!

Quote from: MrBogosity on October 06, 2014, 06:38:29 AM
How does that happen? The economy IS people!

The economy being people is the current paradigm because people exchange resources.  In a fully automated economy, resources will be exchanged between machines, not people. 

Quote from: MrBogosity on October 06, 2014, 06:38:29 AM
Wait, wait, where did this "owned and operated by an elite few" come from? That's new!

The notion of the "elite few" comes from the idea that sophisticated machines that take care of your needs will be too expensive for the masses, who become increasingly unemployed because of their own obsolescence.  Because momentum plays a big role in markets, those who own the machines (let's assume these machines are sophisticated enough to be self-replicating and self-maintaining) will undoubtedly be the wealthiest and most politically powerful people of their day, because they will be the ones wealthy enough to become early adopters of this new technology.

If the wealthy have their needs met, and can rely upon machines to get their work done and build their empires, what point is there in spending capital on human resources?  Even jobs like research and development positions are not safe from automation and smarter-than-human AI.

Quote from: MrBogosity on October 06, 2014, 06:38:29 AM
Anyway, you have people who aren't being taken care of by the machines (have unmet desires), and people who need work (untapped resources)...your problem solves itself!

But it takes more than a need for work and a labor force to do that work in order to have a running economy.  Like any system, economies are not immune to physics.  People need various kinds of matter and energy in order to sustain an economy.  The old human-based economy will lose access to its resources because it won't be able to compete with the automated, AI-based economy.

Quote from: Professor_Fennec on October 07, 2014, 05:16:42 AM
But it takes more than a need for work and a labor force to do that work in order to have a running economy.  Like any system, economies are not immune to physics.  People need various kinds of matter and energy in order to sustain an economy.  The old human-based economy will lose access to its resources because it won't be able to compete with the automated, AI-based economy.

What are you basing this on?

Quote from: MrBogosity on October 07, 2014, 07:54:57 AM
What are you basing this on?

It's become quite clear he's basing it on paranoia.  His argument is something like "Humans won't be able to operate in the economy because they won't be able to compete with AIs, which will prevent them from being able to access products and services from the AIs, which will (somehow) prevent them from trading with each other as well."

I know you're trying to get him to go into the "somehow" part or see that it doesn't work.  He's been quite clear he won't.

Quote from: evensgrey on October 07, 2014, 05:01:23 PM
It's become quite clear he's basing it on paranoia.  His argument is something like "Humans won't be able to operate in the economy because they won't be able to compete with AIs, which will prevent them from being able to access products and services from the AIs, which will (somehow) prevent them from trading with each other as well."

I know you're trying to get him to go into the "somehow" part or see that it doesn't work.  He's been quite clear he won't.

Also, what's supposed to be preventing them from accessing the products and services from the AIs? So far, when a technology has come along that produces something better than what would otherwise have existed, that product has become more widely available. I don't see any reason it would be any different here.

If he's going to say it's because the humans can't produce what the AIs are producing, I'm going to say that the humans would then produce something else, even if that production amounts to consuming what the AIs are producing.

Quote from: dallen68 on October 07, 2014, 06:01:43 PM
Also, what's supposed to be preventing them from accessing the products and services from the AIs? So far, when a technology has come along that produces something better than what would otherwise have existed, that product has become more widely available. I don't see any reason it would be any different here.

If he's going to say it's because the humans can't produce what the AIs are producing, I'm going to say that the humans would then produce something else, even if that production amounts to consuming what the AIs are producing.

Anything we can do, a machine will figure out how to do better on its own.  It's only a matter of time before machines become as smart as us, but what about when they become smarter than us?  Don't you guys think that makes humans an obsolete technology?  I don't see that as paranoid so much as visionary.  Perhaps you may have noticed the lack of ape-men running around, and a dwindling number of apes in the wild.  Don't think for a second that we couldn't go extinct, too, replaced by a superior intelligence that came from us.

Quote from: Professor_Fennec on October 18, 2014, 06:02:19 AM
Anything we can do, a machine will figure out how to do better on its own.  It's only a matter of time before machines become as smart as us, but what about when they become smarter than us?  Don't you guys think that makes humans an obsolete technology?  I don't see that as paranoid so much as visionary.  Perhaps you may have noticed the lack of ape-men running around, and a dwindling number of apes in the wild.  Don't think for a second that we couldn't go extinct, too, replaced by a superior intelligence that came from us.

Last I checked, The Terminator franchise was science fiction, not a documentary series.

October 18, 2014, 02:54:01 PM #26 Last Edit: October 18, 2014, 03:16:22 PM by Altimadark
Where and how will this True AI get the energy and materials needed to function and propagate? How do you know it'll win out against competing technologies, e.g. brain augmentation or brain uploading? If it tries to take over, how will it deal with countermeasures such as nukes and EMPs? What necessitates that True AI become antagonistic in the first place?

Remember that natural selection is about survival of the fittest, not necessarily the fastest, or the strongest, or, no, not even the smartest. We may perhaps produce an AI well and truly beyond the capabilities of the human brain, but if it can't get the energy it needs to function, what does it matter? What if it requires very rare materials? It can't propagate if the materials it needs simply aren't available.

I could go on, but the point is that we simply don't know enough about True AI to make any accurate predictions about it. Forget "when" or "how," we don't even know if True AI will come about at all. How can we hope to make any accurate predictions about what will happen once it does? Your fears are based entirely on speculation.
Failing to clean up my own mistakes since the early 80s.

Isn't "True AI" an oxymoron anyway?

Quote from: MrBogosity on October 19, 2014, 09:05:56 AM
Isn't "True AI" an oxymoron anyway?

Only in the sense that everything we know about intelligence indicates any particular system either does or does not display any particular aspect of it.

As to whether or not you can build an intelligent system, I think it follows inevitably that it can be done provided only that mind-body duality is false, as is indicated by all reliable data.

Quote from: MrBogosity on October 19, 2014, 09:05:56 AM
Isn't "True AI" an oxymoron anyway?
I honestly don't know, and all of the examples that come to mind are fictional anyway. I'm thinking about the factory-built robots from Terminator and The Matrix as much as the Neuro AIs from Metal Gear Rising, which are basically artificial brains and, as such, can't have their experiences copy-pasted over to another system.