

IRIS - the next wave of P2P, brought to you by the US Government

 
 
tSuibhne
18:11 / 02.10.02
article

"A team of government-funded US scientists is building a Peer-2-Peer (P2P) network that they say will solve technical problems with existing P2P networks, such as Gnutella and Kazaa, and might even one day supersede the web.

The network, dubbed the Infrastructure for Resilient Internet Systems (IRIS), will speed up searches and information transfer over the internet, and aims to foil "Denial of Service" attacks by hackers - in which a web server is swamped with requests for a page until it crashes.
"

"IRIS is being designed specifically to solve these problems. Its three design criteria are to guarantee: • that as long as there is no physical break in the network the target file will always be found; • that adding more information to the network will not affect its performance; • that machines can be added and removed from the network without any noticeable adverse affects.

"There is no single network that meets all these three properties as yet," Balakrishnan told New Scientist.
"

"The project will be developed over the next five years by researchers from five institutions, including MIT and the University of California at Berkley, who have jointly received a $12 million grant from the National Science Foundation to develop IRIS."

"The first application will be a distributed version of the web. This raises the prospect of it being very easy to published information anonymously, for example, pirated music and video.

But he does not believe this should curtail his research. "How do you prevent people from doing bad things? I don't think this is a technical problem," says Balakrishnan.

In fact his team is developing algorithms precisely to thwart the censorship or control of information on IRIS. "People are working in our team to prevent removal of information," he says. "I am not interested in censoring the publishing of information."
"

Talk about aiming high. This is going to prove VERY interesting if it gets pulled off.
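
The article doesn't say how they plan to meet those three guarantees, but my guess is it's built on the distributed-hash-table stuff these research groups have been publishing - something in the spirit of consistent hashing. A rough sketch in Python of the basic idea (all the names and numbers here are mine, not theirs):

# Toy consistent-hashing ring - a guess at the sort of machinery behind IRIS,
# not anything from the actual project.
import hashlib
from bisect import bisect_right

def h(value):
    """Hash a string onto a fixed circular ID space."""
    return int(hashlib.sha1(value.encode()).hexdigest(), 16) % 2**32

class Ring:
    def __init__(self):
        self.node_ids = []   # sorted node positions on the ring
        self.nodes = {}      # node id -> machine name

    def add_node(self, name):
        nid = h(name)
        self.node_ids.insert(bisect_right(self.node_ids, nid), nid)
        self.nodes[nid] = name

    def remove_node(self, name):
        nid = h(name)
        self.node_ids.remove(nid)
        del self.nodes[nid]

    def lookup(self, key):
        # A key always lives on the first node clockwise of its hash, so as
        # long as the ring is connected, the lookup always finds its home.
        i = bisect_right(self.node_ids, h(key)) % len(self.node_ids)
        return self.nodes[self.node_ids[i]]

ring = Ring()
for machine in ("alice", "bob", "carol"):
    ring.add_node(machine)
print(ring.lookup("some_file.txt"))  # same answer whoever you ask
ring.add_node("dave")                # only keys near dave's position move

The point being: a machine joining or leaving only shuffles the keys between it and its neighbour on the ring, which is roughly how you'd get the "add and remove machines without noticeable adverse effects" property. Whether IRIS actually works like this, I don't know.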
 
 
Elijah, Freelance Rabbi
01:08 / 03.10.02
I really don't think you can make a network that doesn't lose quality as more machines are added. Unless they come up with some way to create insane bandwidth directly (like running fibre optic to EVERYONE), I don't think they can do it.
 
 
tSuibhne
02:42 / 03.10.02
That's definitely one of the issues I'm wondering about. The only thing I can think of is a system where information is distributed to the point that, instead of a single network like we have now with the internet, you actually have several smaller networks, with the majority of high-demand information residing in each of them. If each of these smaller networks is fed by a large pipe, then that MIGHT address the issue. For instance, if you work in a data center that has a T-1 connection for the building, another office adding a few more computers will have little effect on your connection speeds, because there is enough room in the pipe to accommodate the increased bandwidth needs.

I believe telcos are currently (or were, before their industry went to shit - not sure now) looking into setups where a fiber line is run into a residential area and the people in that area share that pipe. I know a year or two ago this was being thrown around as the answer to the distance restrictions of DSL: instead of measuring your distance to the local phone company office (where the DSL line usually originates), you measure the distance between you and the nearest Point of Service (POS). In my mind this kind of architecture could greatly reduce the number of clients sharing a pipe, giving each user a higher average bandwidth. We ran some simulations for a contract at work on something similar to this, and found that with average traffic patterns (email, browsing the web, and some small interactive web services) it actually takes a lot of people before you start to see degradation of throughput.
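
To put some very rough numbers on that last point (every figure below is a made-up assumption for illustration, not from the simulations we ran):

# Back-of-envelope: how many 'average' users fit on a shared pipe before it
# saturates. All numbers are assumptions for illustration only.
pipe_kbps = 1544          # one T-1 worth of bandwidth
avg_user_kbps = 15        # email plus light browsing, averaged over time
burst_factor = 4          # headroom for bursts above the average

users_before_it_hurts = pipe_kbps / (avg_user_kbps * burst_factor)
print(f"roughly {users_before_it_hurts:.0f} users before throughput degrades")

Scale the pipe up to fiber and the same arithmetic buys you a lot more neighbours before anyone notices.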

And something that just occurred to me: perhaps this will result in a shift in the business model for residential broadband. Currently, residential broadband is sold per connection - most user agreements say that you'll only use one computer at a time on the line. Businesses with high throughput needs, though, pay according to a promised average bandwidth: the telco promises to deliver a certain amount of bandwidth on average, and the company can put as many connections on it as they want. The telcos then sell unpromised bandwidth to other companies, and when they have too many clients in a certain area, they increase the size of the pipe to accommodate new business.

Course, I might be completely full of shit. It's been a very long day of running around, and I'm just thinking off the top of my head.

Also, I'm assuming what they meant was not that there is 'no' effect, but rather that the effect is so minimal that the average user won't notice it.
 
 
captain piss
15:03 / 03.10.02
I have never quite grasped what all the excitement is about with P2P. It's good for music sharing and... research, like the SETI business and that DNA screen-saver thingy. But... is that it? The video-sharing DiVX malarkey hasn't really got going, it's very difficult to get the decoders for that - obviously cos content providers are twitchy about web sites that make it available.

I dunno - I still can't see what's so potentially ground-breaking about this IRIS thing - will read your posts again and get back
 
 
tSuibhne
18:18 / 03.10.02
The video-sharing DiVX malarkey hasn't really got going, it's very difficult to get the decoders for that - obviously cos content providers are twitchy about web sites that make it available.

Um, care to clarify? I had no trouble whatsoever finding the DivX codec when I started pulling down fansubs.

One of the reasons that P2P is a big thing is that it pushes the internet back toward its roots: the free and open exchange of information. Say you've got a file that you want to send to a friend (let's ignore the possible legal issues here for the purpose of this discussion - there are a lot of perfectly legal ways to use P2P). If the file is too large to send via email, you're left trying to find other ways to send it: either through the regular mail on disk, or going through the hassle of setting up an FTP site or web site that the person can download from. With P2P the person can just download directly from you.

It also provides a more technical advantage when discussed in the scope that IRIS is discussing it: a completely distributed environment. Right now, for the most part, if a server goes down for some reason, that information is inaccessible. In a distributed P2P environment, like the one I think IRIS is working toward, that information would still be available, just from another source. In a manner of speaking, you're making information easier to gather, because it never lives in only one place.
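
A tiny sketch of what I mean by "still available from another source" - the peers and files here are invented, it's just to show the shape of it:

# Hypothetical sketch: in a distributed P2P network a file has several
# holders, so one dead machine just means you fetch the next copy.
import random

# Pretend network state: which peers are up, and what they hold (made up).
peers = {
    "alice": {"up": True,  "files": {"zine.pdf": b"contents"}},
    "bob":   {"up": False, "files": {"zine.pdf": b"contents"}},
    "carol": {"up": True,  "files": {}},
}

def fetch(filename):
    holders = [name for name, p in peers.items() if filename in p["files"]]
    for name in random.sample(holders, len(holders)):
        if peers[name]["up"]:              # a down peer isn't fatal,
            return peers[name]["files"][filename]
    raise FileNotFoundError(filename)      # fails only if every copy is gone

print(fetch("zine.pdf"))  # still works even though bob is offline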
 
 
captain piss
13:38 / 04.10.02
hmm, fair enough - thanks for explaining that. The advantages of these things don't seem all that bowel-shattering to me, but then I guess people have barely engaged with the possible advantages of having easy and more democratic data-sharing, and I'm also a bit naive as to the possibilities at this stage. What sort of future applications do you guys see as potentially exciting on the P2P front?

Oh yeah - you're right about DivX (the recently-hyped MP3-for-video scheme, for the uninitiated) - there's no hassle getting the codec - don't know what I was on about there. Can't seem to get hold of any free films, though.
 
 
tSuibhne
17:55 / 06.10.02
DivX - not sure about "free films", but fansubs (anime that's yet to be licensed in a country, so translations are only available through fan-created subtitled versions) basically run on DivX these days.

Possible applications for P2P - not quite sure, actually. Off the top of my head I'm thinking less about new apps and more about current apps that work better. I'll ask around, though.
 
 
Tom Coates
08:52 / 07.10.02
Well, the most important thing about P2P file-sharing is that it's essentially ungovernable, in that (when done properly) it doesn't pass through any centralised servers. It's therefore a really good way of getting information past censorship and spreading it quickly. And it might not be the most efficient use of computing space to replicate things computer to computer, but it does mean that it scales well - the more popular something is, the easier it is to get hold of.
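
To put toy numbers on the scaling point (the per-peer upload figure is just an assumption):

# Toy arithmetic: every downloader who keeps a copy becomes another source,
# so the total capacity for serving a file grows with its popularity.
upload_kbps_per_peer = 128   # assumed upload speed of a single peer
for holders in (1, 10, 1000):
    print(f"{holders} peers holding the file -> ~{holders * upload_kbps_per_peer} kbps on offer")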
 
 
BioDynamo
20:04 / 11.10.02

At least in my country, if I've understood things correctly, there is a legal difference in how file-sharing P2P is treated, as compared to "mass distribution". Basically, sharing a whole movie P2P is not criminal, whereas placing it on the net for general download, or distributing pirated DVDs or something, would be. So it does open up big possibilities in the area of freedom of information.
 
 
Cloned Christ on a HoverDonkey
01:24 / 15.10.02
Just out of interest, for anyone who does want to try to get hold of any illicit, royalty-leeching DivX movies: the eMule client connects to the eDonkey network, where there's a plethora of very high-quality movies available.

Not that I'm encouraging this form of behaviour, of course.

It's illegal, don't you know?
 
 
tSuibhne
22:38 / 17.10.02
Of course - I forgot about one of the most promising aspects of P2P software: grid computing.

The basic idea of grid computing is several machines working together to amplify their processing power. In a little more detail: a grid computer will use any unused processor power on your machine to run other people's jobs. An early example of this is the @Home projects, the most famous being SETI@home, where people helped process radio signals from outer space. That program ran as a screen saver and was only active when the machine was dormant. I believe current attempts at grid computing will be able to run while you are using your machine, but don't quote me.
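
Roughly how I understand those @Home-style clients to work - a sketch only; the "work", the server, and the idleness check here are stand-ins I've made up:

# Rough sketch of a volunteer/grid-computing client: grab a chunk of work,
# crunch it with spare cycles, hand back the result.
import time

def get_work_unit(n):
    """Stand-in for asking a central server for the next chunk of data."""
    return range(n * 1000, (n + 1) * 1000)

def process(chunk):
    """Stand-in for the real computation (signal analysis, protein folding...)."""
    return sum(x * x for x in chunk)

def machine_is_idle():
    """Real clients watch for keyboard/mouse activity or run as a screensaver."""
    return True

results = []
for unit in range(3):          # a real client would loop indefinitely
    if machine_is_idle():
        results.append(process(get_work_unit(unit)))
    time.sleep(0.1)            # be polite; leave cycles for the user
print(results)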

Another example of the idea of grid computing is the "Cell" chip that's supposed to be in the PS3. That chip will work with the other chips it's connected to in order to amplify the PS3's power - for example, the PS3 working with any chips that might be in your TV to deliver even better video quality.

At the heart of this is the fact that, with the possible exception of some video games and high-end graphics packages, there's never any software on the market that fully utilizes a new chip's processing power. For instance, Intel just released their new 3GHz chips, but, again with the possible exception of some video games or high-end graphics packages, there are no programs that even require a 2GHz chip to run. Obviously, software makers aren't going to create software that requires a chip that either hasn't been released or has only just been released; economics restricts them to processing speeds that are at least 2-3 years old. This results in machines that have processing power to spare. Grid computing proposes to use that spare processing power for other jobs. What jobs, exactly, is still up in the air. Theoretically, there are no limits.
 
  