Humans Need Not Apply

Started by Professor_Fennec, September 01, 2014, 07:03:01 PM

Quote from: Altimadark on October 19, 2014, 12:47:17 PM
I honestly don't know, and all of the examples which come to mind are fictional anyway. I'm thinking about the factory-built robots from Terminator and The Matrix as much as I am the Neuro AIs from Metal Gear Rising, which are basically artificial brains, and as such can't have their experiences copy/pasted over to another system.

Yeah, but they weren't "true AI" like the kid in the movie A.I. was. In fact, in Terminator 2 we learned that their brains were specifically inhibited to prevent them from thinking for themselves, hence the "reprogramming" scene where they remove the inhibitor. The kid in A.I. was truly intelligent, so in that case I'm wondering how the word "artificial" would apply to it (it would apply to the makeup of his body, but not his intelligence).

October 19, 2014, 03:18:32 PM #31 Last Edit: October 19, 2014, 03:22:26 PM by Altimadark
Okay, I think I understand what you're saying; I unwittingly applied a broad definition to a specific term. My bad.
Failing to clean up my own mistakes since the early 80s.

Quote from: Altimadark on October 19, 2014, 03:18:32 PM
Okay, I think I understand what you're saying; I unwittingly applied a broad definition to a specific term. My bad.

Oh, it's not just you, it's how EVERYONE is using the term A.I.!

Quote from: MrBogosity on October 20, 2014, 06:57:00 AM
Oh, it's not just you, it's how EVERYONE is using the term A.I.!

That makes me feel marginally better. Marginally.
Failing to clean up my own mistakes since the early 80s.

Quote from: Professor_Fennec on October 18, 2014, 06:02:19 AM
Anything we can do, a machine will figure out how to do better on its own. It's only a matter of time before machines become as smart as us, but what about when they become smarter than us? Don't you guys think that makes humans an obsolete technology? I don't see that as paranoid so much as visionary. Perhaps you've noticed the lack of ape men running around, and a dwindling number of apes in the wild. Don't think for a second that we couldn't go extinct, too, because of a superior intelligence that came from us.

That's a completely separate thing. The species we are now is not the same as the one from 40,000 years ago, and it's not the same as the one that will exist 40,000 years from now. We will become extinct. Whether that's because our descendants evolve into a better-suited species or because we create a mechanical one is irrelevant to the question. Either way, we will be unaware of the difference.

I'm not sure what "True AI" even means, but you could break AI down into a spectrum.

On one end, you would have basic AI, where a program makes decisions based on a pre-programmed decision tree. More advanced versions might have the ability to modify the decisions they make in the future based on past events. Modern AI simulates neurons, creating virtual neural networks that allow for "thinking" more akin to an actual brain, including features such as pattern recognition and situation analysis. The more complex the neural network, the more complex the tasks an AI can complete. Imagine what one of these neural networks could do if, say, it were built from more than 100 billion virtual neurons, roughly the number of biological neurons in a human brain. Once we get to that point, I think we will be well past the usefulness of the term "AI"; "synthetic intelligence" would probably be more apt.
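
To make the distinction concrete, here's a minimal sketch in plain Python (no libraries; the inputs, weights, and thresholds are made up purely for illustration): a decision tree is just hand-written if/else branches, while a "virtual neuron" takes a weighted sum of its inputs and squashes it through an activation function, and a network is just neurons feeding neurons.

import math

def decision_tree(temp_c, humidity):
    # Basic AI: a pre-programmed decision tree, every branch written by hand.
    if temp_c > 30:
        return "cool"
    elif humidity > 0.8:
        return "dehumidify"
    return "idle"

def virtual_neuron(inputs, weights, bias):
    # One simulated neuron: weighted sum of inputs plus a bias,
    # passed through a sigmoid so the output falls between 0 and 1.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# A two-neuron "network": no rules are written down anywhere;
# the behaviour lives entirely in the (here arbitrary) weights.
hidden = virtual_neuron([0.9, 0.2], weights=[1.5, -2.0], bias=0.1)
output = virtual_neuron([hidden], weights=[2.2], bias=-1.0)
print(decision_tree(32, 0.5), round(output, 3))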

Now imagine Moore's Law still in effect, allowing computing power to double every 18 months. That means that every 18 months, computers would double the number of virtual neurons they could support. That would be an exponential increase in synthetic intelligence, easily outstripping human intelligence with all of its biological limitations. At that point, it becomes impossible for the human brain to out-evolve the speed at which computer-based intelligence can advance.
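
Back-of-the-envelope, assuming that 18-month doubling rate (the one-billion-neuron starting point is a made-up figure, not a real benchmark):

def doublings_needed(start, target):
    # Count how many doublings it takes to grow from `start` to `target`.
    n = 0
    while start < target:
        start *= 2
        n += 1
    return n

d = doublings_needed(1e9, 100e9)  # hypothetical: 1 billion -> 100 billion virtual neurons
print(d, "doublings =", d * 1.5, "years at 18 months each")  # 7 doublings = 10.5 years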

The problem is, Moore's Law is predicted to run out mid-2020s: that's when transistors become so small that quantum effects make them useless (and even then, the fact that Dennard Scaling's already petered out means that we just won't get the same effect from them anyway). So further advances will have to wait for quantum computing, and who knows if there's some corollary of Moore's Law that will apply to that? It could stop AI development in its tracks, or give it just the push it needs!

Quote from: MrBogosity on October 23, 2014, 06:52:38 AM
The problem is, Moore's Law is predicted to run out mid-2020s: that's when transistors become so small that quantum effects make them useless (and even then, the fact that Dennard Scaling's already petered out means that we just won't get the same effect from them anyway). So further advances will have to wait for quantum computing, and who knows if there's some corollary of Moore's Law that will apply to that? It could stop AI development in its tracks, or give it just the push it needs!

It won't entirely run out; we'll just stop being able to squeeze more into less space. Improvements in process technology will still allow larger devices for a while yet, but eventually a maximum practical die size will be reached. If we can get a superconductor we can layer in there, and devise a practical means to build up as well as out, then we've got a long way to go yet.

Quote from: evensgrey on October 23, 2014, 07:47:24 AM
It won't entirely run out; we'll just stop being able to squeeze more into less space.

That's what Moore's Law is!

Quote from: MrBogosity on October 23, 2014, 10:18:10 AM
That's what Moore's Law is!

Ah, no, it isn't. Moore's Law just gives the rate at which the number of transistors that can be put on a single die increases. The progressive shrinking of transistors and diodes has been a major mechanism behind this (along with process improvements that have raised wafer quality and size, improving yields and allowing practical die sizes to grow).

If we could, for instance, add additional layers of transistors on top of existing layers, we could continue to add transistors to dies, allowing Moore's Law to continue longer. At this time it is not viable to add layers over existing ones, but when we can no longer make smaller junctions or bigger dies, this will be the only way forward.
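
A toy calculation of what stacking buys you (the per-layer transistor count is a made-up round number; only the shape of the argument matters): die area and feature size stay fixed, yet the per-die transistor count keeps climbing with each added layer.

transistors_per_layer = 2_000_000_000  # hypothetical count for a fixed die area
for layers in (1, 2, 4, 8):
    # Same die, same junction size -- only the number of stacked layers changes.
    print(f"{layers} layer(s): {transistors_per_layer * layers:,} transistors per die")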

It may be that we can't get computers much smaller than they already are, but you may have noticed that clock speeds haven't been increasing much as of late, either.

If manufacturers can successfully transition from silicon to graphene or diamond wafers, then computing power could increase dramatically, going from just a few GHz to THz speeds, since those materials can withstand temperatures and voltages that silicon wafers simply can't handle without extreme cooling.

We may still see great increases in computing power before the quantum computing revolution takes hold.

Quote from: Professor_Fennec on October 26, 2014, 05:24:16 PM
It may be that we can't get computers much smaller than they already are, but you may have noticed that clock speeds haven't been increasing much as of late, either.

Yes, due to the petering out of Dennard Scaling that I mentioned.

Quote
If manufacturers can successfully transition from silicon to graphene or diamond wafers, then computing power could increase dramatically, going from just a few GHz to THz speeds, since those materials can withstand temperatures and voltages that silicon wafers simply can't handle without extreme cooling.

How fast is their switching power, though?

Quote from: MrBogosity on October 26, 2014, 06:01:32 PM
How fast is their switching power, though?

I'm not sure what you mean by power.  Power is usually a term that references the combination of speed and strength.

As I understand it, one of the limitations of silicon is that it burns at lower temperatures than graphene and most certainly diamond, so to achieve higher clock speeds you need expensive cooling setups to handle the voltages required.  Also, silicon transistors are more fragile, meaning that they have a tendency to break over time.  The faster the clock speed, the more stress the transistors are put under. 

Graphene and diamond are much stronger than silicon, so their transistors should be much more durable at high clock speeds.  Plus, they can withstand much higher temperatures, meaning you can crank up the voltage to extreme levels without elaborate cooling systems.

Quote from: Professor_Fennec on October 30, 2014, 05:38:28 AM
I'm not sure what you mean by power.  Power is usually a term that references the combination of speed and strength.

As I understand it, one of the limitations of silicon is that it burns at lower temperatures than graphene and most certainly diamond, so to achieve higher clock speeds you need expensive cooling setups to handle the voltages required.  Also, silicon transistors are more fragile, meaning that they have a tendency to break over time.  The faster the clock speed, the more stress the transistors are put under. 

Graphene and diamond are much stronger than silicon, so their transistors should be much more durable at high clock speeds.  Plus, they can withstand much higher temperatures, meaning you can crank up the voltage to extreme levels without elaborate cooling systems.

The current race appears to be between the materials people working on carbon-based semiconductors and those working on new superconductors that operate at typical chip temperatures. (A superconductor that works at typical chip operating temperatures would be the ultimate heat pipe, and would allow for less waste heat production in the first place.)