Moore's Law and the Eschaton

 
 
Cookie H. Monster
13:55 / 19.05.02
Just caught this on Slashdot - an article in EE Times discusses how CMOS, the transistor technology used in computer processors - virtually all electronic chips, really - is reaching physical limits for miniaturization. Manufacturers will have run out of ways to pack more and more transistors into a chip, which in turn implies that they can't make them any faster or more powerful. Moore's Law - which states that the maximum number of transistors per square inch achievable by the industry doubles roughly every 18 months, an empirical observation that has been remarkably accurate so far - seems to be hitting a ceiling.

The projected year for hitting this ceiling is 2012. *Ba-dump-tisssh*
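
For concreteness, the doubling claim is easy to play with numerically. A minimal sketch of the projection (the starting transistor count and the exact 18-month period are illustrative assumptions, not figures from the article):

```python
# Naive Moore's Law projection: transistor counts doubling every 18 months.
# The starting figure (~42 million transistors, roughly a 2002 desktop CPU)
# is an assumption for illustration, not a number from the EE Times article.

DOUBLING_PERIOD_YEARS = 1.5
START_YEAR = 2002
START_TRANSISTORS = 42e6

def projected_transistors(year):
    """Transistor count projected by straight doubling from the start year."""
    elapsed = year - START_YEAR
    return START_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for year in (2002, 2006, 2012, 2020):
        print(year, f"{projected_transistors(year):.2e}")
```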
 
 
Thjatsi
02:12 / 20.05.02
This statement rests on the rather large assumption that no one is going to find a better material for making computers during the next ten years.
 
 
Lurid Archive
09:11 / 20.05.02
How far are we from the theoretical limits set by quantum and relativistic effects?
 
 
KING FELIX
12:29 / 20.05.02
>This statement rests on the rather large assumption that no one is going to find a better material for making computers during the next ten years.

Well, isn't that one of the interesting things about it? They will have to go to nanotech, DNA chips or something really weird, regardless of how many users they abandon in the process.

I think it's similar to what would happen if oil or fuel disappeared and car manufacturers were forced to come up with cars that didn't run on regular fuel.
 
 
Ariadne
13:24 / 20.05.02
Why would they 'abandon' users, though? If the next-generation Intel chip were to be based on carbon nanotubes instead of silicon, your old silicon-based PC would still carry on working - it'd just be large and slow compared to the new product.

So the car/fuel analogy doesn't really fit because there's no pressing need to shift to nano (or anything else) if we're happy to stay with the silicon-based speeds we have today. But of course the manufacturers wouldn't make any money that way, so on we go down the smaller-and-faster path.
 
 
Lurid Archive
13:31 / 20.05.02
True, to some extent. But despite the profit motive of Intel and others, isn't there something to be genuinely gained by having increasingly faster computers? Doesn't it open up possibilities for applications that we can only imagine now?
 
 
Ariadne
13:33 / 20.05.02
Oh, undoubtedly, and I'm all for it 'cause I like new toys! I'm just writing about carbon nanotubes right now and it's all very groovy.

I was just saying that it's not an either/or situation - silicon will still be around for a long time, and a move to other technology isn't really that different to the way Moore's Law has operated so far.
 
 
Ariadne
13:35 / 20.05.02
Oh, and to answer your question above, Lurid, IBM reckons we'll hit the physical barrier, where current chip features can't be made any smaller, in about 10 to 15 years. Hence all the researchers running about looking for alternatives.
 
 
KING FELIX
17:19 / 20.05.02
Upon rereading it I discovered that my post wasn't all that clear.

What I was aiming at was that when manufacturers are forced to look at different ways of accelerating computers, there's a big possibility that applications that weren't thought of before, because they simply weren't possible due to inadequate speeds, might spring into existence.

I am thinking about more advanced applications - not really making Word or Quake run faster, but cellular automata, AI, and applications more concerned with massive number crunching.

What I meant by abandoning users was more that if you had to come up with something *entirely* different, a lot of applications might have to be rewritten from scratch. Which I think would be a good thing, since there is too much bloatware nowadays anyway.

Anyway, I am just ranting....I have no idea what will happen.
 
 
Ariadne
17:41 / 20.05.02
Oh well, yeah, that's true, and all really quite exciting, if a bit scary to think what might happen. The possibilities I've read about for nanotechnology are amazing - good things, like tiny bots tracking your health, and bad things, like those same bots being used to control and spy on people.
And of course, as you say, all the things we can't even dream of yet. I'd be interested in what Mordant Carnival thinks of all this, where she thinks it'll go, 'cause she actually knows what she's talking about (as opposed to journalist-me, who just knows odd bits and pieces).
 
 
Lurid Archive
18:23 / 20.05.02
But it's also important to remember that the hardware is only one aspect of this, though it is an important aspect. For instance, when it comes to actual AI - and I mean something that could actually be regarded as intelligent, not just something that mimics some intelligent behaviours - no one has a clue about how to make a mind, as far as I can see. People are still arguing about whether it is theoretically possible. My view is that it is, but it's a long way into the future.

Other technologies require imagination, as well as an advance in knowledge/technology. Did many people envisage the internet in the 60s? Dunno. I read about lots of wonderful ideas in New Scientist, some of which are bollocks, but some of which will be very exciting.
 
 
cusm
19:24 / 20.05.02
There's another solution. I don't recall the article now, but I read a piece on building vertically stacked chips. By that I mean someone figured out a process to lay down multiple layers of silicon, so you can basically stack chips on top of each other, building up rather than out. Makes the whole issue quite a bit less critical, as chips begin to resemble cubes rather than wafers.
 
 
netbanshee
20:19 / 20.05.02
...also remember that many of the steps that got us to the uses we put technology to today were guesses. Just applying the same answer in a more robust way is only one way of making work more efficient and speedier. As Nick was saying... once bloatware sheds some pounds and AI routines build better programs, there will be gains there as well.

Every time the process of technology has reached a ceiling, there were other means available just in time. True...people are talking about proteins, nanotubes, etc. but I'm sure that it's only a small part of the overall equation.
 
 
Cookie H. Monster
20:34 / 20.05.02

This statement rests on the rather large assumption that no one is going to find a better material for making computers during the next ten years.


From what I gather, the bottleneck is not so much replacing silicon tech as it is getting the new tech ready for commodity production - widespread, standardized and cheap. From the original article: "And whatever is to replace CMOS must be invented now, because it will take at least a decade to develop into a commercial technology, speakers said."
 
 
We're The Great Old Ones Now
09:02 / 21.05.02
Of course, you have to ask what a regular user could possibly want to do which would require more powerful chips...
 
 
Ariadne
13:29 / 21.05.02
Well, games, I suppose? They'll develop to take advantage of whatever advances in speed the manufacturers come up with. And graphics packages need lots of processing power. Plus there will no doubt be other apps developed to make use of the power you have.

Plus the smaller the chips, the smaller your phone/video/computer needs to be. Maybe your phone could just sit behind your ear (so long as you fancy a nice tumour to go with it). Or your computer could be mounted on the wall?

Away from home PCs, supercomputers currently take up enormous amounts of space, so the faster and smaller the chips, the faster and smaller the supercomputers.

Though I agree to an extent - I doubt most people use all the processing power of their PCs already. I certainly don't.
 
 
cusm
15:02 / 21.05.02
"No one will ever need more than 640k of memory" - Bill Gates, sometime in the early 1980s
 
 
tSuibhne
17:19 / 21.05.02
re: AI, gut feeling says some big steps will be taken in the not too distant future. People are doing some interesting things with information processing and knowledge management. That's just a gut feeling though, and I could be way off.

re: future of computers. Personally, I see a fragmenting of the market. I see the fields that need the faster processor speeds moving away from silicon-based machines. But I think your average desktop will likely stay silicon-based for a while longer. People just don't need the speeds they already have.

Course if that happens, that might mean I'll have to actually buy a computer at some point in the future, instead of my current plan of just cannibalizing parts from tossed machines for the rest of my life.
 
 
tSuibhne
17:29 / 21.05.02
oh, and a note on cusm's quote. It's not that people will never need the faster processor speeds. It's that they do not need them at the rate that they are getting them. The average user uses a computer for email, web surfing, and some office stuff (word processor, spreadsheets, etc.). Until about 2 years ago, I did all this, with no problems, using my old 486. I then scored a 100 MHz machine, and continued with that until very recently, again with no problems. I just recently traded some stuff to a local used shop for a 300 MHz machine, so I could get a cable modem hook-up. I may try to upgrade the chip to 400 MHz, so I can get a CD-R. But if I don't do that, this 300 MHz machine will likely last me several more years. And even the 100 MHz machine could still have some life in it, if the parts hadn't started to go on me, and I hadn't lost my free ISP access (I was piggybacking on my folks' provider, but they moved).
 
 
Sensual Cobra
18:57 / 21.05.02
cusm - The silicon stacking you're talking about was mentioned in an article about 3-ish months ago in Scientific American. It seems promising, and closer to being ready for prime time than the other solutions. I think for raw computing power, it's going to be quantum computing, though who knows when that will be. I agree with the idea of a coming divide between improvements on silicon/electric processors and more esoteric, perhaps niche, solutions like quantum and bio-computing.
 
 
Sensual Cobra
18:59 / 21.05.02
Man, it's like, every time, with the italics.
 
 
tSuibhne
19:59 / 21.05.02
Ok, I just chatted with our nano guy here at work. I'll spare the full details because I'd probably get them wrong, but the basic gist of his feelings is:

A quantum dot will likely be used to replace the traditional transistor. These dots are 1/5 the size of a transistor, so you can put five of them in the space of one transistor, giving you five times the power. This move will be coupled with a basic change in architecture, but probably not a sudden one; instead the change will precede the switch-over to the use of quantum mechanics.

He feels that the average user will only notice an increase in speed, as the look and feel of the computer (the case, monitor, etc.) will likely stay the same. He also informed me that with the currently proposed timelines, we should be able to have these new chips in place by the 2015ish deadline.

NOTE: There is a difference between nanotechnology and molecular computing. Nano could possibly hit the market by 2011; molecular computing isn't even on any near-term timelines.

Also, our nano guy brought up an interesting point. Moore's Law is as much an economic law as it is a technological one. The key is that processor speeds go up while costs remain constant. It's his opinion that Intel, Athlon, etc. will do everything they can to make sure Moore's Law holds true, including swallowing the loss.
 
 
cusm
20:08 / 21.05.02
We run a server farm here at work, and I can say faster machines are always a blessing, even in the 2 GHz range. The faster the machine, the more users we can put on it, and the more economical production is. But as for desktops, there has been a gap lately in need, though never for long. 3D gaming never takes very long to use up all available processor power to run bigger games with more polys. There's your bleeding edge. Of course, for that, PCs are like sporting equipment: you always want the best so you can stay competitive. There's one economic sector that will keep up with the growth.
 
 
tSuibhne
20:19 / 21.05.02
True, but I was referring to the majority of users. Gamers are still a minority. Gamers would likely move into a niche area. It's already starting to happen somewhat. I don't know any hardcore gamers who buy their machines from Dell or HP or Gateway, etc. They're all buying their machines from vendors who cater to the gaming market.

The theory does need to be revisited, though, after getting the scoop on how nano will address the current problems.
 
 
dlotemp
23:27 / 21.05.02
I'd like to note that quantum computing is already a reality. An IBM Almaden researcher noted in August 2000 that "quantum computing begins where Moore's Law ends - about the year 2020, when circuit features are predicted to be the size of atoms and molecules." In 2000, IBM made a quantum computer with five qubits - no, not the Harry Potter thing! - qubit meaning quantum bit.

Check out www.sciencedaily.com/releases/2000/08/000817081121.htm

Great site and a great article.
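
As a rough illustration of what "five qubits" means computationally: n qubits are described by 2^n complex amplitudes, which is why even small quantum registers are costly to simulate classically. A minimal state-vector sketch (purely illustrative - this is not how IBM's hardware works):

```python
import numpy as np

# Classical state-vector simulation of an n-qubit register.
# n qubits need 2**n complex amplitudes, which is why simulating even
# modest quantum registers gets expensive quickly. Illustrative only.

def zero_state(n_qubits):
    """Return the |00...0> state as a vector of 2**n amplitudes."""
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[0] = 1.0
    return state

def apply_hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit, putting it into superposition."""
    n = int(np.log2(state.size))
    h = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    op = np.array([[1]], dtype=complex)
    for q in range(n):
        op = np.kron(op, h if q == qubit else np.eye(2, dtype=complex))
    return op @ state

if __name__ == "__main__":
    state = zero_state(5)                 # 2**5 = 32 amplitudes
    for q in range(5):
        state = apply_hadamard(state, q)  # uniform superposition over all 32 states
    print(state.size, "amplitudes, each with probability", abs(state[0]) ** 2)
```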
 
 
netbanshee
04:36 / 22.05.02
Like it was said before... Moore's Law is not really scientific per se, but more of an observable outcome and trend that appears in technology. And there's plenty of playing room to get high marks everywhere else. I've been doing web production and a little video/interactive... sure, my machine can't keep step with new ones (it's almost 6 years old), but I learned how to use it in a more efficient way. Lots of factors are involved.

My question is... as far as new technologies are concerned, how will quantum computing be physically implemented? What materials or setup are needed to bring it to a usable, consumer level? I've seen projections with managing molecules floating on liquid gases, etc., but I doubt any average person will be able to implement it realistically. In a sense, what will the guts look like? I'm sure it'll be a reality when it needs to be, but what will it be exactly... there seems to be a lot of projection without giving an idea of how it'll be engineered...
 
 
captain piss
10:52 / 22.05.02
Coming a bit late to this thread, but just to add a few points, based on what people have said to me (as an occasional techy journalist):
The processing power and chip complexity envelope is pushed mainly by graphics, with companies like Nvidia - which produced the graphics chip for the Xbox - making what are really the biggest, densest and fastest chips today. Communications routers are probably next on the list. These feature in the equipment that telephone companies use to route all the different signals through the network. There's a lot of pressure to integrate more and more on a chip because of space and power constraints.
But yes, as a few people have said, the bigger concern, at least for those companies making the electronics, is how to get people to use all the technology that is available. The industry is desperately searching for a ‘next big thing’, to take the place of mobile phones and PCs. A currently popular idea for what this will be is roughly summarised as ‘ambient intelligence’. The idea is that we will be surrounded by intelligent, but invisible, computing and networking technology embedded in clothes, furniture, vehicles and buildings.
There’s interesting stuff on this at MIT’s Media Lab (www.media.mit.edu/wearables/) and stuff on wearables at Carnegie-Mellon (www.wearablegroup.org).

Bizanchee: Haven’t a clue on the question of what quantum computers will look like when they’re in everyday use.
 
 
tSuibhne
14:58 / 22.05.02
biz, quick question, then I'll run off and bug the nano guy for your answer. Are you talking about quantum computing, or molecular computing? Molecular computing is where you get stuff like Grant's nano swarms, etc. That's a long way off from being in production (though I was told yesterday that IBM has a 300 gig hard drive the size of a quarter in prototype form). I've seen the ideas blurred, so I want to make sure I understand the question.

Off hand, from what I remember from yesterday, to most people there will be little to no difference between machines using quantum computing and the current machines. The innards might look a little different, and there will need to be modifications to the current architecture, but it doesn't sound like anything incredibly drastic.

As for the 'next big thing' question. If anyone knew the answer to that question, they'd be a very rich man.
 
 
netbanshee
18:22 / 22.05.02
Since there's a wide variety of possibilities pertaining to which technologies will be implemented in the next generations, I'm curious about all applications. Molecular seems a bit off for now, but I guess nano mechanisms and quantum usage are what I'm getting at.

How is quantum computing harnessed physically? What materials will be used? When speaking of quantum switches (both 1 and 0 simultaneously), they refer to the properties of what, exactly... electrons and atoms? If so, how does one set up these atoms in a physical space? Is it going to be a solid that changes structure, with the data stream being translated into usable info?

It's just that right now it is very much an experiment and a projection of use. But what does, say, that IBM 5-qubit computer look like internally? Does it function like a machine, or is it 5 bits that some scientist is changing in a lab and producing positive results?
 
 
Lurid Archive
08:47 / 23.05.02
I was reading an article in New Scientist a few weeks back about how quantum computers will break the "Turing barrier". That is, they won't just be faster, they will be able to do more things. It sounded like bollocks to me, but does anyone know?
 
 
netbanshee
15:32 / 23.05.02
...there's some speculation as to "how far machines will go", but some scientists agree that machines will meet and surpass humans in intelligence as the complexity these systems can handle grows exponentially. The Turing test was sort of a Blade Runner (or better yet, PKD) style test - testing the intelligence of a machine and seeing if it rivals human ability by examining different attributes and seeing if they are comparable. It's thought that by 2030, machines will pass with flying colors.

Also to mention... the "...Spiritual Machines" book has tons of references in it regarding most of the conversation here. The back has a good weblink reference section. Just another reason to pick it up (again, or for the first time) and check it out. I'll see if I can find anything to post up in regards...
 
 
netbanshee
15:41 / 23.05.02
So far...

IBM's research division

IBM's research division: Quantum area

Laboratory for Theoretical and Quantum Computing
 
 
Lurid Archive
11:23 / 26.05.02
It's thought that by 2030, machines will pass with flying colors.

Just thirty years in the future. Is that the same thirty years that it has always been? The same thirty years we're always told it will take before we have widely available fusion power?

I'm no expert, but I try to keep up. Computer power is certainly increasing exponentially, but there are reasons to believe it may not continue doing so for decades. More importantly, the implicit assumption is that increased power necessarily equates to the production of an AI.

Now, there certainly have been some interesting advances, like the discovery that simple behaviour patterns adopted by machines working together produce an insect-like intelligence. But I've yet to see an idea that actually explains how to produce an intelligence close to that of a dog, say. You can get a machine to act a bit like a dog, but dogs can survive by themselves and adapt - the Sony dog AIBO has an impressive but limited set of behaviours.

As far as I can see there are people in the field who make claims that we will build vastly superior intelligences in just a few years - as they've said for decades now - and there are those who doubt it is even possible to produce actual intelligence. I tend to believe it is possible but I think the current understanding is overplayed and the depth of our ignorance makes a shaky foundation for the optimism of some.
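
A minimal sketch of the "simple local rules produce group behaviour" idea mentioned above, in the spirit of boids-style swarms. Every number here is an arbitrary illustration value, not a real robotics controller:

```python
import random

# Toy swarm in one dimension: each agent nudges toward the group's average
# position and adds some individual random jitter. No central controller,
# yet the agents end up clustered together.

NUM_AGENTS = 20
STEPS = 50
COHESION = 0.05     # how strongly each agent moves toward the group average
JITTER = 0.5        # random individual wandering per step

def step(positions):
    centre = sum(positions) / len(positions)
    return [
        p + COHESION * (centre - p) + random.uniform(-JITTER, JITTER)
        for p in positions
    ]

if __name__ == "__main__":
    positions = [random.uniform(-50.0, 50.0) for _ in range(NUM_AGENTS)]
    initial_spread = max(positions) - min(positions)
    for _ in range(STEPS):
        positions = step(positions)
    final_spread = max(positions) - min(positions)
    print(f"spread: {initial_spread:.1f} -> {final_spread:.1f}")  # typically shrinks a lot
```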
 
 
tSuibhne
02:59 / 27.05.02
In the first book of his Ware series, Rudy Rucker mentions a theory about the creation of AI. It uses some philosophical arguments that I can't remember, but the basic gist is: man cannot create AI, but AI can exist. How to get around the paradox? Create a program that is capable of evolving, take it as close as possible, and let it evolve into an AI. Evolution through reproduction will allow a multitude of systems, with different "specialities," like humans.

Interestingly enough, while I was reading this book, researchers announced that they had, in fact, created a program that evolved through reproduction.

Factor in the fact that modern knowledge management has created systems that have moved beyond the standard "converting data into information" mode of computers, into systems that are able to take data, turn it into information, and then turn that information into knowledge - by acting on that information, observing the outcome of that action, and then readjusting future actions based on what was observed.

Finally, factor in the fact that this movement in knowledge management is one of the main things fueling the current "Information Revolution" and you've got a recipe for an interesting near future.

Will this mean that AI is a "few years away"? Probably not. Have we just taken a big step closer? Yup. Will molecular computing help in bringing AI about? Probably (since it means A LOT more processing power, and we are talking about a lot). In the end, only time will tell.

NOTE: I was somewhat skeptical myself of the ability to create AI, until I started working in the knowledge management field. That's when I started getting excited about it.
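
A toy illustration of the "write something that can evolve" idea above: a minimal genetic algorithm breeding bit-strings toward a fixed target. It is nothing like real AI, and all the parameters are made up, but it shows the reproduce-select-mutate loop being described:

```python
import random

# Minimal genetic algorithm: evolve random bit-strings toward a target
# string purely through selection, crossover and mutation.

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
POP_SIZE = 60
MUTATION_RATE = 0.02
GENERATIONS = 200

def fitness(genome):
    """Number of positions matching the target."""
    return sum(1 for g, t in zip(genome, TARGET) if g == t)

def crossover(a, b):
    """Splice two parents at a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            return gen, population[0]
        parents = population[: POP_SIZE // 2]   # the fittest half survives to reproduce
        population = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(POP_SIZE)
        ]
    return GENERATIONS, population[0]

if __name__ == "__main__":
    generations, best = evolve()
    print(f"best genome after {generations} generations: {best}")
```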
 
 
tSuibhne
03:07 / 27.05.02
Oh, and just to throw out a bit of information that kind of pertains to things here: researchers have created molecular computers (your nano swarms, etc., not to be confused with quantum computing) but have not been able to do so reliably. Sometimes it works, sometimes it doesn't. Our nano guy here at work pointed out the problem that there really aren't any tools that small to build these machines with - something that needs to be rectified if they are going to go to the assembly line.

The ability to reliably create molecular machines is the main stumbling block now.
 
  