A Future Internet Might Not Use Servers

You’d think that given how pervasive the internet is, we’d be stuck with the fundamental architecture it uses: servers that many devices connect to for their information fix. But a team of Cambridge University scientists wants to shake things up — and remove servers altogether.

A project named Pursuit aims to make the internet faster, safer and more social by implementing a completely new architecture. The system does away with the need for computers to connect directly to servers, instead letting individual computers copy and re-publish content on receipt. Other computers could then access data, or at least fragments of it, from many locations at once.
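As a very rough illustration of that “copy and re-publish on receipt” behaviour, the toy Python model below is our own sketch, not anything from Pursuit itself: it simply keeps a registry of which machines currently hold a copy of a named piece of content, and every machine that receives the content adds itself to that registry, so later requests have many places to go.

```python
# Toy model of "copy and re-publish on receipt". Nothing here comes from
# Pursuit itself; the names and data structures are illustrative only.

# content name -> set of machines currently able to serve a copy
holders: dict[str, set[str]] = {}
# machine name -> local store of {content name: data}
stores: dict[str, dict[str, bytes]] = {}


def publish(machine: str, name: str, data: bytes) -> None:
    """Store the content locally and advertise this machine as a source."""
    stores.setdefault(machine, {})[name] = data
    holders.setdefault(name, set()).add(machine)


def receive(machine: str, name: str, data: bytes) -> None:
    """On receipt, keep a copy and re-publish it from this machine too."""
    publish(machine, name, data)


# The original publisher puts a show online...
publish("origin-server", "popular-show", b"...video bytes...")
# ...and every viewer who fetches it becomes another place to get it from.
receive("alice-laptop", "popular-show", stores["origin-server"]["popular-show"])
receive("bob-phone", "popular-show", stores["alice-laptop"]["popular-show"])

print(sorted(holders["popular-show"]))
# ['alice-laptop', 'bob-phone', 'origin-server']
```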

If that sounds like peer-to-peer sharing, it’s because it is. But the difference here is scale: it would be rolled out across the entire internet, something that has never been attempted before. Dirk Trossen, one of the researchers from the University of Cambridge Computer Lab, explains why that’s a good thing:

“Our system focuses on the way in which society itself uses the internet to get hold of that content. It puts information first. One colleague asked me how, using this architecture, you would get to the server. The answer is: you don’t. The only reason we care about web addresses and servers now is because the people who designed the network tell us that we need to. What we are really after is content and information.”

Essentially, the idea is to remove the concept of the URL from the internet. In fact, the researchers explain that online searches would stop looking for URLs (Uniform Resource Locators) and start looking for URIs (Uniform Resource Identifiers). A URL points to where data lives and how to reach it; a URI instead names what the data is, so it can be fetched from any copy rather than from a single point of call. Trossen explains what that means for the user:

“Under our system, if someone near you had already watched [a] video or show, then in the course of getting it their computer or platform would republish the content. That would enable you to get the content from their network, as well as from the original server… Widely used content that millions of people want would end up being widely diffused across the network. Everyone who has republished the content could give you some, or all of it. So essentially we are taking dedicated servers out of the equation.”
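Putting the URI idea and that quote together, here is an equally rough sketch of the retrieval side. It is our own illustration, with a SHA-256 hash standing in for a content identifier, which is not necessarily how Pursuit names information: a request names the content it wants, pulls whichever fragments each republisher happens to hold, and checks the reassembled result against the identifier.

```python
# Illustrative only: the hash-as-identifier choice and the fragment layout
# are assumptions for the sake of the example, not details of Pursuit.
import hashlib

# Three republishers, each holding only some fragments of the same item.
republishers = {
    "alice-laptop": {0: b"The ", 2: b"t is"},
    "bob-phone": {1: b"conten", 3: b" here"},
    "origin-server": {0: b"The ", 1: b"conten", 2: b"t is", 3: b" here"},
}

wanted = hashlib.sha256(b"The content is here").hexdigest()  # what, not where


def fetch(content_id: str, total_fragments: int) -> bytes:
    """Gather each fragment from any machine that has it, then verify."""
    parts: dict[int, bytes] = {}
    for index in range(total_fragments):
        for machine, store in republishers.items():
            if index in store:
                parts[index] = store[index]
                break
    data = b"".join(parts[i] for i in range(total_fragments))
    if hashlib.sha256(data).hexdigest() != content_id:
        raise ValueError("reassembled content does not match its identifier")
    return data


print(fetch(wanted, 4))  # b'The content is here'
```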

The upshot? Speed, efficiency, and reliability, with no central server to buckle under the load of demand. Admittedly, it’s a very bold aim — but if it can be pulled off, it could radically change our experience of the internet. [Pursuit via PhysOrg]

Picture: nrkbeta

