Updated: 1/1/2003; 9:17:03 AM.
Mark Oeltjenbruns' Radio Weblog
The glass isn't half full or half empty, it's too big!
        

Monday, December 09, 2002

A legitimate use for P2P technology

 

            I was reading a Slashdot entry on Advances in Decentralized Peer Networks.  One of the responses made a good point: there are very few legitimate uses for P2P technologies.  Sure, there are token uses like free speech under repressive governments, but I’ve never run across them.  Don’t get me wrong, I think they can serve a valid purpose.  At the same time, I think that 99.999% of their use is for “pirating” material.  I’m not here to discuss that use of P2P, though.  I’m here to briefly describe what I feel is a legitimate use for the technology. 

            Server overload.  That would be a perfect use for the vast array of computers out on the Internet.  What I envision is what is typically done at a local level with load balancing of servers, except that instead of the main servers handling the overflow themselves, the overload is shared by distributing it to a P2P network.  That way, when the demand for content skyrockets, the P2P network helps out. 

            Some examples: election results bogging down a news site, where the server could provide a pointer to a P2P network link; a site getting “slashdotted” from a mention on the net; or some new, large RocketCam movies being discussed, where the huge demand would be too great a load on one server, or too expensive for a metered web site.

            Although in theory I like the idea, I do find a few flaws with it: 

·        Most P2P networks are static and file-centric.  They don’t cope well with dynamic data.

·        Some sites don’t want the load eased.  Portals would not want to see their “customers” served by someone else.

·        It would be slower in some cases.

·        It is nothing new(?).

Static and file centric:

            The underlying format for any data distributed in this manner would be files; the problem is in how they are accessed.  What would be needed is some way to search for content that isn’t file-based. 

For instance, if I want election results, I want the most current ones out there, not some archived version from this morning.  I want to be able to hit www.portal.com/election2004/results.html, and if the server can’t handle the request, it returns a query ID that would be used to query the P2P network for the information.  The returned information would be a web page with everything needed for display.  In addition, behind the scenes, updates would be floating through the network, refreshing every copy that is attempting to mirror the data, triggered by the underlying web server being mirrored.  The network could even send out kill requests when the load goes back to normal levels or when a query ID hasn’t been accessed for a while.  Garbage collection.
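To make the idea concrete, here is a rough sketch of that hand-off in Python.  Everything here is hypothetical — the class names, the query-ID scheme, and the TTL-based garbage collection are all made up to illustrate the flow described above, not any real P2P protocol:

```python
import time

QUERY_TTL = 300  # hypothetical: seconds a query ID lives without being accessed


class OverloadedServer:
    """Web server that hands out query IDs instead of pages when swamped."""

    def __init__(self, max_load):
        self.max_load = max_load
        self.current_load = 0
        self.queries = {}   # query ID -> (url, last_access_time)
        self.next_id = 0

    def request(self, url):
        if self.current_load < self.max_load:
            self.current_load += 1
            return ("page", f"<html>content of {url}</html>")
        # Too busy: mint a query ID the client can take to the P2P network.
        qid = f"q{self.next_id}"
        self.next_id += 1
        self.queries[qid] = (url, time.time())
        return ("query-id", qid)

    def garbage_collect(self):
        """Kill off query IDs that haven't been accessed for a while."""
        now = time.time()
        self.queries = {qid: (url, t) for qid, (url, t) in self.queries.items()
                        if now - t < QUERY_TTL}


class PeerCache:
    """One node in the P2P network, mirroring pages keyed by query ID."""

    def __init__(self):
        self.pages = {}

    def push_update(self, qid, html):
        # Triggered by the origin server whenever the page changes, so the
        # mirror never serves this morning's archived election results.
        self.pages[qid] = html

    def query(self, qid):
        return self.pages.get(qid)
```

The point of the sketch is the shape of the exchange: the overloaded server answers with a query ID rather than content, and the origin pushes updates out to the mirrors so the P2P copies stay current.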

“Customer” Service:

            What I mean here is that sometimes no response is better than someone else responding.  Portal sites are not in the business of providing data to surfers; they are in it to make money, usually through some form of advertising.  If people can go someplace else, the portal can’t count them from an advertising perspective.  Some of this could be eased by a feedback mechanism that would inform the server of the requests the network served on its behalf.  Even that, though, would only be a partial solution, since whatever ads were being served when the mirroring took place would get additional air time that would be hard to track: lost revenue, in other words. 

            What is good for the surfer isn’t good for the server.

Slower:

            Having to wait for a P2P network query to take place could be slower than just hitting refresh until the server responds, assuming it is still able to respond. 

The benefit would be greater if it were made more transparent to the end user.  If I have to fire up some application to get onto the network, set up a query, and wait, then it might be too much of a hassle. 

One solution could be smart web server error messages.  An error message like “I’m swamped, leave me alone!” could be made dynamic, listing query IDs for the pages that are currently being asked for the most.  Clicking on a link would automatically invoke the query against the P2P network, serving the page. 
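A sketch of what that smart error page might look like, continuing in the same hypothetical vein — the `p2p://query/` link scheme is invented here purely to show the idea of an overload page whose links carry query IDs:

```python
def overload_page(hot_pages):
    """Build a 'smart' overload error page.

    hot_pages: mapping of URL -> query ID for the pages currently in
    heaviest demand (a hypothetical structure the server would maintain).
    """
    links = "\n".join(
        f'  <li><a href="p2p://query/{qid}">{url}</a></li>'
        for url, qid in sorted(hot_pages.items())
    )
    return (
        "<html><body>\n"
        "<p>I'm swamped, leave me alone! Try these mirrors instead:</p>\n"
        f"<ul>\n{links}\n</ul>\n"
        "</body></html>"
    )
```

Instead of a dead-end error, the surfer gets a page of clickable escape hatches into the P2P network.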

A webmaster might even want to do that deliberately.  For instance, if he knows that some page or information he is hosting will see a sudden surge in usage, he could make the web server display just the query-ID page for any requests for that information, forcing the entire load onto the P2P network.  End users would still be able to get the information, and other pages on the site that aren’t being slammed so hard could still be served.
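That deliberate offload mode is even simpler to sketch: the webmaster pins the URLs he expects to be slammed, and the server answers those with a query ID while serving everything else normally.  Again, all the names here are invented for illustration:

```python
def make_handler(pinned, mint_query_id, render_page):
    """Return a request handler for the deliberate-offload mode.

    pinned:        set of URLs the webmaster has forced onto the P2P network
    mint_query_id: hypothetical function mapping a URL to its query ID
    render_page:   normal page rendering for everything else
    """
    def handle(url):
        if url in pinned:
            # Forced offload: always answer with the query-ID page.
            return ("query-id", mint_query_id(url))
        return ("page", render_page(url))
    return handle
```

The rest of the site stays responsive because the server never spends cycles rendering the pinned page at all.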

Nothing New:

            Some of this is already being done with things like caches, server-side load balancing, etc.  However, I think a better solution is out there: P2P networking. 


7:44:39 PM

© Copyright 2003 Mark Oeltjenbruns.
 

