Peer review

Peer-to-peer networking is all the rage, but is it business-worthy?

There probably are some LAN administrators who did a double take when they saw certain headlines in technology publications over the past year. For example, “P2P: Wave of the Future” might have come as a shock to anyone who thought they had seen the last of LANtastic 10 years ago.

That’s just one indication of how drastically the meaning of peer-to-peer networking has changed. Basic networking courses describe peer-to-peer (P2P) as a network design that eschews dedicated servers in favor of workstations doing double duty, sharing disk directories and attached printers with other users while simultaneously running local applications.

Back in the early 1990s, Microsoft embraced the concept with Windows for Workgroups (3.11), attacking veterans of the field like Artisoft’s LANtastic. That was then; before long, the dominance of Novell’s server-based NetWare had rendered peer LANs obsolete. P2P always compromised the performance of both applications and resource-sharing functions, but its greatest weakness was that administration, particularly security, could not be centralized, and so was necessarily either weak or time-consuming. Another factor was the emergence of powerful dedicated server platforms from hardware makers.

Whatever the weaknesses of earlier P2P systems, the current systems bearing the P2P label are considerably unlike their older namesakes. They are similar enough, however, that network administrators preparing for the coming P2P wave should bear in mind why the earlier incarnations dropped from prominence.

P2P II: Proofs of concept

Current P2P technology is yet another child of the Internet, dating back only to the 1997 founding of Hotline Communications, whose software was designed to let Net users set up their PCs as download sites accessible to other users. Internet P2P became famous, or infamous, in 1999 with Shawn Fanning’s Northeastern University programming project, Napster www.napster.com.

From a technology standpoint, Napster proved to be a great demonstration of the potential power of P2P, with thousands of individual users sharing files located on their own PCs, forming an expansive catalog of data. The fact that the files being shared were MP3 music tracks was not especially significant, aside from verifying that relatively large files could be managed through such a system. The music industry felt that the copyright status of those tracks was rather important, but that’s another story.

While Napster has garnered the most notoriety, it is not alone among P2P systems used for music distribution. Gnutella www.gnutella.com, developed by a former subsidiary of America Online, is just over one year old, but it has become the leader of the guerrilla challenge to the music industry now that Napster has agreed to become an “enhanced membership service” under a royalty agreement with BMG Music. In some regards, Gnutella is also the more powerful technology demonstration, at least from a peer purist’s point of view. In the view of at least one of those purists, Freenet founder Ian Clarke, “Gnutella is peer-to-peer, but Napster is not.” While Napster allows shared files to be stored on many decentralized nodes (users’ computers), access to those locations is managed through a centralized database that runs on dedicated servers.

Napster is a site and a company; as its press release explaining its settlement with BMG states, “First and foremost, Napster is a business.” As a business, of course, it has an address. Gnutella, on the other hand, is that most ethereal of entities in the world of IT: a standard. It doesn’t require the coordination of directory or login servers, just one of many free client programs (Bearshare www.bearshare.com is currently one of the best) that let anyone become a peer on the network.

(Of course, this makes Gnutella a far more elusive target for the Recording Industry Association of America, which has won an injunction to shut down Napster until it gets its paying service act together.)
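The architectural distinction Clarke draws is easy to see in miniature. Below is a minimal Python sketch, with hypothetical class names that bear no relation to either system’s actual wire protocol: a Napster-style lookup is one query against a central index, while a Gnutella-style search floods from neighbor to neighbor with no server involved.

```python
class NapsterStyleIndex:
    """Central-server model: peers register their files with one database,
    and every search is a single query against that database."""
    def __init__(self):
        self.index = {}                          # filename -> set of peer names

    def register(self, peer, filenames):
        for name in filenames:
            self.index.setdefault(name, set()).add(peer)

    def search(self, filename):
        return self.index.get(filename, set())  # one lookup, one point of failure


class GnutellaStylePeer:
    """Pure-peer model: a search floods from neighbor to neighbor until its
    time-to-live runs out; no node ever sees the whole network."""
    def __init__(self, name, files):
        self.name = name
        self.files = set(files)
        self.neighbors = []                      # other GnutellaStylePeer objects

    def search(self, filename, ttl=4, seen=None):
        seen = set() if seen is None else seen
        if self.name in seen or ttl == 0:
            return set()
        seen.add(self.name)
        hits = {self.name} if filename in self.files else set()
        for peer in self.neighbors:              # forward the query outward
            hits |= peer.search(filename, ttl - 1, seen)
        return hits


# The central index answers instantly but has a single address to serve
# papers to; the flooded search has no such target.
index = NapsterStyleIndex()
index.register("alice", ["song.mp3"])
print(index.search("song.mp3"))                  # {'alice'}

a = GnutellaStylePeer("a", [])
b = GnutellaStylePeer("b", ["song.mp3"])
a.neighbors = [b]
print(a.search("song.mp3"))                      # {'b'}
```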

In this regard, Gnutella is quite similar to Clarke’s Freenet freenet.sourceforge.net/, which has not become known as a music-sharing service but conceivably could be used to exchange files of any type. Clarke has positioned Freenet as a movement as well as a technology, however. His comments on P2P are laced with such phrases as “outdated copyright laws and media empires,” which clearly reveal that he has an agenda beyond technology development. “Hopefully, peer-to-peer will accelerate the democratization of the media,” he says on his Web site.

No analysis of Internet P2P developments should ignore the impact of such issues as the RIAA lawsuits and other copyright disputes. The future of P2P has two distinct faces, however. One is the movement arising around people like Clarke, who hope to use decentralized network tools to thwart commercial control of the Internet. The other deals only with business technology, and includes companies hoping to use P2P to enhance network operations in all forms. Many of those efforts will focus on private, internal LANs as well as the Internet.

One other P2P pioneer in the public arena warrants consideration before we turn to the pragmatic world of business. Napster, Gnutella, and Freenet distribute data sharing, but network servers also run applications too demanding for most workstations. This role, too, has been explored as an area where P2P system design could prove valuable.

The Search for Extraterrestrial Intelligence (SETI), operating within the University of California system, may seem the most impractical of undertakings, but it’s a long-odds gamble that has nonetheless attracted scientists for 40 years. Some of the most meticulous participants in the search analyze radio “noise” on literally billions of potential frequencies, looking for structured signals. This analysis requires a great deal of computer processing, and SETI has not always enjoyed the funding to procure extensive supercomputer time.

Since 1999, however, the SETI project has tapped a new source of computing power through SETI@home setiathome.ssl.berkeley.edu/. Volunteer participants download processing software that includes a SETI screensaver; when a user has been inactive for a predetermined period, the software activates and starts chewing through analysis tasks; results are uploaded, and new tasks are downloaded periodically from the SETI@home Web site. Currently, the distributed-analysis software generates 15 teraflops for SETI. To put that into perspective, the world’s most powerful computer, IBM’s ASCI White, generates 12 teraflops.
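The cycle-scavenging pattern described above is simple enough to sketch. The loop below is a hypothetical Python illustration, not the actual SETI@home client; all four callables (fetch_task, process, upload, is_idle) are stand-ins for whatever a real volunteer-computing agent would provide.

```python
import time

def volunteer_worker(fetch_task, process, upload, is_idle, poll_seconds=60):
    """Idle-time work loop in the SETI@home mold: wait for the machine to go
    idle, download a work unit, analyze it locally, upload the result, and
    repeat. All parameters are hypothetical stand-ins."""
    while True:
        if not is_idle():              # only scavenge unused CPU time
            time.sleep(poll_seconds)
            continue
        task = fetch_task()            # download a new work unit
        result = process(task)         # the expensive local analysis
        upload(result)                 # return results for aggregation
```

The design lives or dies on that idle check: the moment volunteers feel the client slowing down their own work, they uninstall it.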

Business uses for P2P

Now comes the tricky part: turning the innovations of P2P technology into something of practical use in the near future. There are already a few commercial ventures trying to use P2P on the Internet, several companies trying to market peer-centric software to corporate network users, and ambitious dreams by such industry heavyweights as Intel and Microsoft.

The candidates closest to fruition are several companies trying to emulate the Napster and SETI@home models with paying customers. A pair of companies, Porivo www.porivo.com and Centrata www.centrata.com, plan to enlist users willing to let someone else use their PCs for processing when they don’t need them, in the manner of SETI@home.

Porivo’s immediate plans involve using the P2P model to gain insight into the performance of Web applications and the Internet itself. Porivo’s first application, PeerReview, can analyze the performance of specific Web sites or business applications, providing real-world performance data from the end-user’s perspective. PeerReview analyzes Internet use and reports on a potentially wide cross-section of user profiles, geographies, demographics, technographics, and connection types.
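It is easy to imagine what one of those end-user measurements looks like. The probe below is a hypothetical Python sketch of the kind of timing a PeerReview-style agent might collect, not Porivo’s actual code; each participating peer would run probes from its own vantage point and report the results, tagged with its location and connection type, to a central collector.

```python
import time
import urllib.request

def measure_page(url):
    """Time one end-user fetch of a page, as a distributed testing agent
    might. (A hypothetical sketch, not Porivo's actual probe.)"""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as response:
        body = response.read()
    elapsed = time.monotonic() - start
    return {"url": url, "seconds": round(elapsed, 3), "bytes": len(body)}

# Example (requires network access): result = measure_page("http://example.com/")
```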

Suzanne Knott, Porivo’s manager of marketing, is cautiously optimistic in her assessment of P2P. “While P2P file-sharing, communications, and entertainment applications make the Internet more engaging and useful, they can also cause performance problems and dramatically alter and increase the need for breakthrough Internet infrastructure technologies to support them,” she says. “It’s interesting that the same technology that causes these problems can also serve as its cure–and this shows the real significance of the P2P model.”

Porivo is still determining how members or contributors to the system would be compensated, and Centrata’s plans are even less clear. Presumably, though, participants would be paid for providing use of their computers.

A relatively recent addition to this group, Juno Online Services www.juno.com, already has an impressive base of members, and has established a precedent of free Internet access–though it soon plans to restrict free access, as competitors such as NetZero have done.

Juno was one of the very first free ISPs when it was founded in 1996, and it has done well enough to draw more than 4 million active subscribers to its advertiser-supported service. Advertiser support has been drying up across many media outlets since the recent rash of dot-com bankruptcies, however, so Juno is trying a P2P idea. With its software already loaded on millions of computers, Juno hopes to sell the use of those machines’ spare time to clients with significant ongoing processing challenges.

The prospect of Juno’s P2P reconfiguration has raised many ethical concerns. One stems from the fact that Juno’s service agreement could require users to leave their PCs powered up 24 hours a day to maximize the unused CPU time leased out. In exchange for free Internet access, Juno users would pay for electricity, telephone connections (long-distance in some cases), and PC maintenance. Few home users are likely to accept that trade, with monthly fees for “classic” ISPs now as low as $10 and energy costs accelerating. These resource costs could also lead corporations, justifiably, to ban Juno from employees’ workstations.

Corporate workstation owners are also concerned about another problem in the Juno model: An unidentified third party would control the data processed (and possibly stored) on their PCs. Companies’ PCs could end up performing work for their competitors. Home users should have similar concerns: What if their computers were used for illegal or unethical purposes? Cracking encryption keys could be a tempting abuse of SETI-level processing power.

Other P2P projects will follow the path of distributed data storage, the Napster model, as opposed to the processing model that SETI@home, Porivo, Centrata, and (perhaps) Juno are taking. One of the most successful of these, to date, has been Alibre www.alibre.com, which provides the backbone for a distributed online catalog of computer-aided design components, CADalog.com www.cadalog.com. Few P2P applications are as developed as CADalog yet. Still in the development phase is Lightshare www.lightshare.com, which is expected to challenge eBay and other online auction sites by letting network members barter with each other directly.

Off the Net

Many of the same ideas have been suggested for use on private, internal LANs, but usually by different vendors. A couple of products are well-developed, but not yet widely deployed. Consilient www.consilient.com is a peer collaboration system that passes XML documents called sitelets between workstations, allowing users to set up a workflow system, as well as share access to documents. Consilient is not a pure peer-to-peer system (at least by Clarke’s definition), because it still requires centralized servers to manage communication between the clients.

Perhaps the most impressive business P2P tool currently available, though, is NextPage’s NXT 3 e-Content Platform, which distributes and manages storage evenly throughout a LAN. NXT is fairly expensive (NextPage builds custom solutions for customers), but it addresses a problem of substantial and often underestimated cost: network storage. Anyone who’s ever worked with servers knows it’s a constant struggle to free up room on them, and server performance often suffers as a result. The growing amount of disk space required for corporate network servers exacts a high cost in hardware, administrative effort, and support technology such as enterprise backup systems and storage-area networks (SANs).
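Part of what makes distributed storage plausible is that peers can agree on where a file lives without any central directory at all. The sketch below uses rendezvous (highest-random-weight) hashing, a standard placement technique; it is a toy Python illustration of the general idea, not NextPage’s actual mechanism.

```python
import hashlib

def replica_nodes(filename, nodes, copies=2):
    """Deterministically pick which workstations hold copies of a file.
    Every peer hashing the same name gets the same answer, so no central
    directory is needed. (A toy sketch, not NextPage's design.)"""
    def score(node):
        return hashlib.sha256((filename + "|" + node).encode()).hexdigest()
    return sorted(nodes, key=score, reverse=True)[:copies]

# Every peer on the LAN computes the same two hosts for the same file.
print(replica_nodes("specs.doc", ["pc-01", "pc-02", "pc-03", "pc-04"]))
```

A useful property of this scheme is that adding or removing a workstation disturbs only the files whose top-scoring nodes change, rather than reshuffling everything.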

Distributed storage has captured the attention of many of the architects of PC development. The University of California at Berkeley has a high-level research project, called OceanStore oceanstore.cs.berkeley.edu, to develop P2P storage methods. Microsoft has its own initiative, Farsite www.research.microsoft.com/sn/Farsite/, researching the same subject. And Intel is adding circuitry it calls NetBurst to its new chips, specifically designed to enhance peer-managed network communication.

The current efforts to make P2P a key part of LANs and the Internet are promising, but it’s too early to tell if this model will dominate data networks. Some of the same problems that stymied the older versions of P2P–administration challenges, compromised security, and pressure from server manufacturers–could challenge the latest network craze, as well. P2P may also collide with the emergence of thin clients. The idea of P2P, after all, is ultimately to take advantage of the wasted capacity of desktop computers, but the goal of thin-client computing is to skim off most of that excess power. The next year will offer interesting developments, and possibly an answer to the fate of P2P. Either way, network administrators should bone up on P2P once again and be ready for the coming P2P wave.
