Peer-to-Peer Networking: Return to the Roots

When Napster and its sister file-sharing applications were grabbing headlines in 2000 and 2001 because of the boom in audio-file swapping, the news overshadowed the fact that peer-to-peer networking had been around for nearly a decade and could trace its legacy back to the dawn of the Internet. As marketing hype and media buzz focused on peer-to-peer (P2P) as the next "hot" item on the IT front, they merely blurred the definition of what P2P really is. It has been used as an umbrella term for anything from instant messaging to grid computing. Peer-to-peer networking breaks the client-server relationship, allowing a computer to take on both roles and exchange data and resources with other networked computers and devices.

Once the glow came off Napster, skeptics stepped forward to challenge the usefulness of the technology. Lee Schlesinger (2002) says, for instance, "But I still don't see P2P providing a strategic advantage for most organizations any time soon. There are already client/server products that do what most P2P applications do, and that are more compatible with hierarchical corporate networks."

But in fact the inverse is happening: many products and services are coming to market, or are already there, that incorporate different aspects of P2P into their approaches. Peer-to-peer networking can be characterized by four primary scenarios, according to the Peer-to-Peer Working Group (2002), which was created with the backing of Intel:

  1. Collaboration: real-time and off-line communications in support of working groups and other teams. The emphasis is on having the freshest data available to participants. Work can take place behind a firewall or out on the Internet.
  2. Edge Services: Peer-to-peer computing can help businesses deliver services and capabilities more efficiently across diverse geographic boundaries. In essence, edge services move data closer to the point at which it is actually consumed, acting as a network caching mechanism. These services include content management and file sharing.
  3. Distributed computing and resources: Using a network of computers, peer-to-peer technology can use idle CPU MIPS and disk space, allowing businesses to distribute large computational jobs across multiple computers.
  4. Intelligent agents: Agents reside on peer computers and communicate dynamically various kinds of information back and forth. Agents may also initiate tasks on behalf of other peer systems. For instance, intelligent agents can be used to prioritize tasks on a network, change traffic flow, search for files locally or determine anomalous behavior (virus activity).

In its simplest form, peers communicate symmetrically and operate as both clients and servers. Each peer signals that it is "alive" to nearby peers, which in turn echo the original peer's affirmation. Once awake, the peer can search the contents of the shared directories on peer machines. A search request goes to all network members, starting with the closest peers. If one of the peers has the file that is sought, it transmits the file information back through the network to the requesting peer. The original peer collects the list of files matching the search request, selects the file and downloads it directly from the target computer. A peer network eliminates reliance on a central server and removes it as a central point of failure (Doherty 2002, 96).
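The flooding search described above can be sketched in a few lines. This is a simplified illustration, not any real network's wire protocol; the `Peer` class and its method names are invented for this example. Each peer relays the query to its neighbors, and a hit returns the name of the peer holding the file so the requester can download directly from it.

```python
# Minimal sketch of a pure P2P flooded search: each peer asks its
# neighbors, which relay the query outward; a "seen" set stops the
# query from looping forever. (Illustrative only, not Gnutella's
# actual protocol.)

class Peer:
    def __init__(self, name, files=()):
        self.name = name
        self.files = set(files)
        self.neighbors = []          # directly connected peers

    def search(self, filename, seen=None):
        """Flood the query outward; return the peers holding the file."""
        seen = seen if seen is not None else set()
        if self.name in seen:        # already visited: stop the flood here
            return []
        seen.add(self.name)
        hits = [self.name] if filename in self.files else []
        for peer in self.neighbors:  # relay to nearby peers first
            hits.extend(peer.search(filename, seen))
        return hits

# A small network: alice -- bob -- carol, where only carol has the file.
alice, bob, carol = Peer("alice"), Peer("bob"), Peer("carol", ["song.mp3"])
alice.neighbors, bob.neighbors = [bob], [alice, carol]

print(alice.search("song.mp3"))   # ['carol'] -- download then proceeds peer-to-peer
```

Note that no central server ever sees the query or the file: the answer travels back through the peers, and the transfer itself happens directly between the two endpoints.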

More powerful desktop computers these days make P2P networks feasible because they have spare computing and storage capacity. In addition, today's enterprises and networks support fault-tolerant, load-balancing systems that tout 99.999% uptime. In hybrid P2P systems, a central server may provide directory services and initiate communication, but the bulk of the work is still done by the endpoints.
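The hybrid model can be sketched the same way. In this hypothetical example (the class and addresses are invented for illustration), a central directory server only answers "who has this file?"; the file transfer itself still happens directly between the two endpoints, which is where the bulk of the work remains.

```python
# Sketch of a hybrid P2P system (Napster-style): the server keeps only
# an index of who has what; file data never passes through it.

class DirectoryServer:
    def __init__(self):
        self.index = {}                        # filename -> set of peer addresses

    def register(self, peer_addr, files):
        """A peer announces the files it is sharing."""
        for f in files:
            self.index.setdefault(f, set()).add(peer_addr)

    def lookup(self, filename):
        """Return the peers holding the file, so the requester can connect directly."""
        return sorted(self.index.get(filename, ()))

server = DirectoryServer()
server.register("10.0.0.5", ["report.doc"])
server.register("10.0.0.7", ["report.doc", "notes.txt"])

print(server.lookup("report.doc"))   # ['10.0.0.5', '10.0.0.7']
```

The directory server is a convenience, not a conduit: if it goes down, peers that already know each other's addresses can keep exchanging data.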

Intel has backed the Peer-to-Peer Working Group, which recently combined with the Global Grid Forum. Sun Microsystems has underwritten an initiative called Project JXTA that aims to produce a set of open, generalized peer-to-peer protocols that allow any connected device on the network (cell phone to PDA, PC to server) to communicate and collaborate. There are at least seven companies targeting products to the corporate sector, the most prominent being NextPage and Groove. The big rap on peer-to-peer is that it has security flaws. Viruses and worms have begun appearing on file-sharing networks (KaZaa and Morpheus), and malware writers are incorporating P2P techniques to propagate across the Internet. P2P technology now allows conventional Internet worms to update themselves (and perhaps make themselves invulnerable to antivirus software) by communicating with other infected systems to receive new code (Versomi 2002).

Groove

The key issue with peer-to-peer networking is how it is implemented to address specific user needs and overcome technological roadblocks. Groove is a desktop collaboration application that allows teams to share data across insecure networks by using strong encryption and authentication methods. In other words, it enables end-users to spontaneously share information within and across company borders while at the same time meeting IT requirements for security, data integrity, and availability.

Groove is the brainchild of Ray Ozzie, the creator of Lotus Notes, and his development team at Groove Networks. He lays out his thoughts regularly on his weblog, from which much of the information below is taken. He started working on the idea of Groove five years ago when he saw that Notes had limitations. With decentralized, horizontal team organization growing within companies and an increasing need to bring together teams that draw across organizational (and network) borders, he saw the advantages of a less centralized, less hierarchical approach to collaborative applications.

Building on an earlier development relationship with Microsoft, Ozzie convinced Microsoft that the initiative warranted a $51 million investment in the startup. The endorsement from Bill Gates shows that Groove has identified a niche that is not being addressed by more traditional applications. As well as being a shrewd business move to keep Microsoft from trying to clone Groove, the alliance allows Groove Networks to get a foot into corporate environments where Microsoft has a strong presence. Although Microsoft has software like Exchange, SharePoint, and web-integration tools for collaboration through e-mail or portal/forum approaches, it tries to provide a broad range of services, which tends to produce generalized, less focused tools. Groove has stayed sharply focused on team collaboration and leaves more flexibility in the hands of the users.

Groove (2002) is a hybrid network service. Team members can install the Groove Workplace on each of their computers and operate as a pure peer-to-peer network. To improve data synchronization among team members, a company can use a relay server that holds data until team members log on, but data still resides on the desktop computer, not the server. Finally, Groove also offers a management server that allows a network administrator to control the deployment and use of Groove within a corporate environment, including security policy, usage, license and management services. These two server additions address specific needs of corporations and governments, without restricting the independence of the end-user to collaborate and create freely.

Groove uses 192-bit encryption that is automatically applied to all messages, whether the user wants it or not. Groove messaging also has the advantage that it does not mix with spam and viruses, thus reducing a prevalent risk in e-mail communications. Groove has been well received in homeland security and defense areas. In this fast-paced environment since the 9/11 tragedy, law enforcement and intelligence agencies in multiple jurisdictions are finding the need to communicate securely and quickly across networks. Groove provides the ability to get up and running quickly.

Conclusion

The Internet's three fundamental assets are information, bandwidth and computing resources, all of which are underutilized, partly due to the traditional client-server computing model. Information may not be readily available among the terabytes of data stored online and on networks; fiber-optic backbones lie idle while some hot spots choke on traffic; and computation capacity has grown faster than it can be used. P2P networking technologies can greatly improve the utilization of Internet resources. Improving performance in information discovery, content delivery and information processing is just one side of the benefits. P2P could enhance the reliability and fault tolerance of global computing systems, for instance, through distributed e-mail systems that are not solely dependent on preconfigured mail servers and network file systems that support disconnected computing (Gong 2002, 37-8).

But more than in pure P2P applications, like Gnutella and Freenet, the future lies in combining technologies so that users have more robust, reliable and adaptable communications and collaboration. By breaking out of the client-server model that has predominated in business, network and application developers can strike the right balance. Some experts would argue that P2P is a return to the Internet's roots because TCP/IP itself does not think in terms of client/server, but endpoints. This concept has also matched up with the realization that the P in the acronym can also stand for People (Udell, 2002).