"Sweat your assets." That's the rather unlovely advice coming from IT consultants these days. Instead of simply buying-in more technology, so the thinking runs, companies should take stock of what they already have and ensure that they are using those IT assets to their full potential.
"This year, up to nine-tenths of the IT budget in many companies will be spent on finishing IT projects already under way, rather than starting new ones," explains Bruce Richardson, senior vice-president at AMR Research, an IT analysis company.
Survey the existing IT equipment in the average company and probably the most obvious asset is the PC sitting on hundreds or thousands of desks. All through the night, these machines stand idle. Large stretches of the working day see them scarcely ticking over. Could they be doing more?
Yes, they could. The idea of using the spare computer power of desktop machines is not new. Most famously, since 1999 the Search for Extraterrestrial Intelligence (Seti) has been encouraging internet users to donate some computer time to helping it crunch through enormous quantities of data from the stars in order to find evidence of communication. The project links PCs all over the world into an enormous network (see Alan Cane's viewpoint on facing page).
A similar project, at Oxford University, sifts through billions of possible drug molecules to find shapes that might provide successful cancer treatments. Could other companies apply the same principles to their problems?
Yes, they could. This technology is properly called distributed computing (because the computing power is distributed among many different machines), but has become commonly known as peer-to-peer (P2P) computing. According to some experts, it could present a new and more efficient way of using the latent computer power on office desktops.
It works like this: when a computer is standing idle, the spare "clock cycles" of its processor are put to work in the background by specially developed software. In the case of Seti and most of the other high-profile examples of the technology in use so far, this software performs a series of uniform calculations on a large body of data, which is transmitted to the computers over the internet.
When the calculations have been performed, the results are returned to the centre and fed into an enormous database. New data are then sent out to the desktop machines and so it goes on.
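In code, the client side of such a scheme can be surprisingly small. The sketch below is a toy illustration in Python, not any real project's software: the coordinator address, its endpoints and the idleness check are all hypothetical stand-ins.

```python
import json
import time
import urllib.request

COORDINATOR = "http://example.org/work"  # hypothetical central server


def machine_is_idle():
    # Placeholder: a real client would check CPU load or screensaver state
    return True


def process(unit):
    # Stand-in for the real number-crunching: the same uniform
    # calculation is applied to every chunk of data sent out
    return sum(unit["data"])


while True:
    if not machine_is_idle():
        time.sleep(60)  # back off while the user needs the machine
        continue
    # Fetch the next chunk of raw data from the centre
    with urllib.request.urlopen(COORDINATOR + "/next") as resp:
        unit = json.load(resp)
    # Send the finished result back to be fed into the central database
    payload = json.dumps({"id": unit["id"], "result": process(unit)}).encode()
    urllib.request.urlopen(COORDINATOR + "/submit", data=payload)
```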
Peer-to-peer means exactly what it says: all the computers on the network are peers. They are all equal, in contrast to the usual network in which a server sits above a group of clients, controlling some of their functions. In its strict sense, peer-to-peer networking differs subtly from distributed computing, as in a distributed network some computers are more equal than others. They have to be, in order to centrally collate and make sense of the information sent to them from the large number of machines on the network.
A variation on distributed computing is "grid computing": each machine on the network contributes its resources (memory, processing power, storage space) to a central pool, which users draw on as needed, in much the same way as power stations contribute to an electricity grid from which users draw energy as needed.
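The pooling idea might be sketched like this; the class and its methods are invented for the example rather than taken from any grid product.

```python
class ResourcePool:
    """Toy grid: machines pay capacity in, users draw it out as needed."""

    def __init__(self):
        self.cpu_hours = 0.0

    def contribute(self, cpu_hours):
        # A machine on the network offers its spare capacity to the pool
        self.cpu_hours += cpu_hours

    def draw(self, cpu_hours):
        # A user takes capacity from the pool, like energy from a grid
        if cpu_hours > self.cpu_hours:
            raise RuntimeError("pool exhausted")
        self.cpu_hours -= cpu_hours


pool = ResourcePool()
pool.contribute(8.0)   # an idle desktop overnight
pool.contribute(2.5)   # a lightly used workstation
pool.draw(6.0)         # a research job draws what it needs
```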
However, the popularity of internet file-swapping services, such as Napster, which form non-hierarchical networks of large numbers of computers, has made P2P a technology buzzword, now loosely used to describe any project involving a large-scale network made up of many desktop PCs working on a single problem.
The cost-cutting attractions of distributed computing are obvious. Intel, for instance, proudly announced it had saved more than $500,000 by developing a network to share resources for number-crunching on its chip development projects. Unfortunately, setting up such a network is not at all simple.
When applications run on a single machine or a network of machines dedicated to a few tasks, the management of that application is correspondingly straightforward. When a machine fails, or the software or operating system misses a beat, it is obvious.
Debate on P2P management issues
When the application is spread out on a huge network of machines that are dealing with that application as an afterthought, and whose primary purposes are far removed, the management can become a nightmare. Raw data must be sent out to many machines to be processed, and the data returned, checked and slotted into place. But individual computers may be switched on or off by their users, connections between them may fail, data may be lost in transmission.
For this reason, distributed computing tends to work best for applications where the same processes are applied over and over to a large amount of data and where the results are not time-sensitive (as the data is being sent to so many machines, it can be difficult to predict when it will be sent back).
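One common defence against unreliable machines is redundancy: the centre sends each work unit to several desktops and accepts an answer only when enough copies agree. The fragment below is an illustrative sketch; the peer objects and their send method are assumptions for the example, not a real product's API.

```python
import random
from collections import Counter

REDUNDANCY = 3  # copies of each unit; any machine may vanish mid-job


def dispatch(unit, peers):
    # Send the same work unit to several randomly chosen desktops;
    # the peers and their send() method are hypothetical for this sketch
    for peer in random.sample(peers, REDUNDANCY):
        peer.send(unit)


def accept(results):
    # results: whatever answers actually came back for one unit; some
    # copies may be lost when machines are switched off mid-calculation
    if not results:
        return None  # everything was lost in transit: re-queue the unit
    value, votes = Counter(results).most_common(1)[0]
    # Trust the answer only if a majority of the returned copies agree
    return value if votes > len(results) // 2 else None
```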
These management problems have inhibited P2P networks so far, comments Philip Guildford, principal consultant at Analysys, an IT research company: "No-one should underestimate the complexity of these issues. That's not to say it cannot be done, but it is difficult."
Alain Wiedmer, vice-president at Platform Computing, argues that much of the difficulty has been overcome. Platform's software enables companies to parcel out tasks and manage their P2P networks without interfering with the application that is being shared out. Platform is not alone: several other companies, including Entropia and United Devices, also have products that assist in setting up distributed networks.
Not all industries require such huge computing resources, and companies within sectors that do not need heavy research may find it difficult to imagine circumstances in which this technology would become useful. Mr Wiedmer notes: "We are only talking about companies that need to do big quantities of data crunching for research, or for processing as in the case of, say, utilities, with their big systems devoted to billing, which is very data-intensive. Other companies, such as retailers or small businesses, will not find this architecture so relevant."
Those other businesses, though, could still benefit from different forms of peer-to-peer networking.
Content distribution over a P2P network would be rather like having Napster inside your corporate IT system. Rather than store data centrally in complex filing systems on big servers, it should be possible to store work on all users' machines and lay all those machines open to searching.
Proponents argue that this would unleash the pent-up knowledge lurking on individual corporate desktops but not shared by the rest of the organisation, as well as saving on the costs of storage, duplication of documents and time spent searching through corporate servers.
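In its simplest form, such a search amounts to asking every reachable desktop in turn. The fragment below is a toy version of the idea: the peer objects and their local_search method are invented for illustration.

```python
def search_network(peers, query):
    # Ask each desktop on the network for matching documents, rather
    # than consulting a central index on a big server
    matches = []
    for peer in peers:
        try:
            matches.extend(peer.local_search(query))  # hypothetical method
        except ConnectionError:
            continue  # that machine is switched off; skip it
    return matches
```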
Collaborative working would benefit from a set-up such as this, according to Groove Networks, a company founded by Ray Ozzie, the creator of Lotus Notes. Groove specialises in enabling the creation of ad hoc online "meeting rooms" for colleagues to share information on a P2P network (see profile of Groove Networks, page four).
Again, managers should beware of the possible pitfalls in setting up such projects. How to protect the privacy of individuals and their work, while opening up documents for all or part of the organisation to view? How to ensure the security of the network if it is run over the internet?
Or what about the possibility of many different versions of the same piece of work proliferating over the network - how do you ensure that people are using the correct version?
Another possible home for the technology is in wireless networks. Transceivers in personal computers, laptops and handheld devices based on the Bluetooth or 802.11 standards could allow users to set up ad hoc networks for sharing data.
Mr Guildford notes: "This could be very useful for teams working on a client site, companies with flexible office space in a loose cluster, companies in temporary accommodation, near-virtual companies where the employees meet up to work together once in a while.
"There would be worries about security, but there is lots of work under way to improve security and many incentives for using such networks."
What exactly is peer-to-peer networking?
In most computer networks, one computer or a group of computers (known as servers) oversees and controls the actions of a number of less powerful computers (clients). Communications to and from each client computer on the network are routed centrally through the servers.
As the name suggests, peer-to-peer networks eschew that conventional hierarchy in favour of a community of equals, where each computer may communicate directly with any other, rather than having to go through central servers.
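The difference can be put in a few lines of illustrative Python; the route and receive calls are placeholders invented for the example, not a real networking API.

```python
# Client-server: every message is relayed through the central machine
def send_via_server(server, sender, recipient, message):
    server.route(sender, recipient, message)   # placeholder call


# Peer-to-peer: the two computers talk to each other directly
def send_direct(sender, recipient, message):
    recipient.receive(sender, message)         # placeholder call
```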
Despite its high-profile use on the internet, with services such as the Napster music-sharing site, peer-to-peer was not viewed as a network architecture for business until companies such as Intel, International Business Machines and Sun Microsystems began to endorse the technology in 2000.
Intel has been closely involved in a project with Oxford University, while IBM has merged peer-to-peer into its vision of "grid computing", whereby large networks of computers share resources such as processing power and memory among themselves, and Sun has unveiled its JXTA P2P project, which provides a framework for building P2P applications.