Intel's Philanthropic Peer-to-Peer Programme, good for science, or good for marketing?

Santa Clara, 07 April 2001. Intel has launched a peer-to-peer programme under which people can download a programme onto their PCs that performs scientific calculations for cancer research. Although this sounds very noble, it looks more like good marketing for Intel than real help for science. Intel calls its new programme "a philanthropic effort to help combat life-threatening diseases by linking millions of PCs to create the world's largest and most powerful computing resource". Joining the Intel Philanthropic Peer-to-Peer Programme are the American Cancer Society, the National Foundation for Cancer Research (NFCR), the University of Oxford, and United Devices Inc.


To participate in the research effort, PC owners first have to download a small programme from the Intel site. After running the downloaded file, the programme is installed on the user's computer and automatically begins computing. It runs whenever spare computing resources are available.

As a first step in finding new drugs and a potential cure for leukemia, the number-one cause of childhood death by disease, researchers have to evaluate the cancer-fighting potential of hundreds of millions of molecules. The NFCR scientists estimate that this task will require a minimum of 24 million hours of number crunching, an amount that, Intel claims, was previously unimaginable. This particular drug-optimisation programme evaluates four proteins. One of the four has been identified as critical to the growth of leukemia, and shutting it down may lead to a potential cure.

Intel also claims that the programme will attract millions of people, so that the resulting virtual computer will deliver 50 Tflop/s, or 10 times the speed of the biggest supercomputer in the world. So much for the claims; let us have a closer look. The programme does not have millions of users yet, and it is questionable whether it ever will. Today there are many such programmes one can participate in. Only the first of them, SETI@home, has attracted many millions of users. Internet providers like Juno are looking at possibilities to tap their customers' computing power, sell it, and use part of it for scientific research.

You can assign your free computer time to only one of those programmes. Furthermore, a large proportion of Internet users in Europe still pay for their connection time, which makes them unwilling to participate in such programmes. Even if Intel reaches the number of users it hopes for and gets its 50 Tflop/s machine, that machine will be only about five times faster than the current fastest supercomputer, which peaks at over 12 Tflop/s. The big advantage of a supercomputer is its interconnection network, which allows relatively fast inter-processor communication. Measured in application turn-around time, a 12 Tflop/s supercomputer is therefore much faster than a 50 Tflop/s peer-to-peer distributed system.

Interestingly enough, the second machine in the TOP500 is an Intel machine, the ASCI Red, with a 3.2 Tflop/s peak and close to 10,000 processors. Let us look at how this machine would do on the 24-million-hour problem claimed to be "unimaginable" on today's computers. Straightforwardly computed, it would take 100 days. That is a lot, but not "unimaginable". And if a problem needs 24 million hours on an Internet-distributed computer, it can probably be optimised to run much faster on a real supercomputer.
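The 100-day figure follows directly from the numbers quoted above; a few lines of Python make the arithmetic explicit (this idealised sketch assumes the work parallelises perfectly, one processor-hour per hour of number crunching):

```python
# Back-of-the-envelope check: how long do 24 million processor-hours
# take on a machine with roughly 10,000 processors?
total_hours = 24_000_000     # NFCR estimate of required number crunching
processors = 10_000          # approximate size of ASCI Red

wall_clock_hours = total_hours / processors   # 2,400 hours
wall_clock_days = wall_clock_hours / 24       # 100 days
print(f"{wall_clock_days:.0f} days")          # -> 100 days
```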

How many PCs on the Internet are needed to do something of the same order as the 10,000-processor supercomputer? Firstly, you need more machines, because a PC is not available 24 hours a day, for any of several reasons. Most obviously, the user may need his own machine: after all, he did not buy the computer just to participate in the Intel programme. The machine may be turned off. The network connection may be down, slow, or unreliable.

So if one in five or one in ten of the subscribed machines is really available, that is already pretty good. The interconnection between the PCs is very slow compared to the supercomputer's, and the memory per processor is probably smaller too. Let us assume we choose our applications carefully and suffer a performance degradation of, again, only a factor of ten. Hence, to be comparable in speed to the supercomputer, you need on the order of one million PCs, at least. And that only for a very small set of applications.
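The million-PC estimate simply multiplies the two factors above; a minimal Python sketch, using the assumed availability (one machine in ten) and assumed slowdown (a factor of ten) as inputs rather than measured values:

```python
# Rough estimate: how many subscribed PCs match a 10,000-processor
# supercomputer? Both factors below are assumptions from the text,
# not measurements.
supercomputer_processors = 10_000
availability_factor = 10   # only one in ten subscribed machines is actually available
slowdown_factor = 10       # per-application performance loss versus the supercomputer

pcs_needed = supercomputer_processors * availability_factor * slowdown_factor
print(f"{pcs_needed:,} PCs")   # -> 1,000,000 PCs
```

With the milder one-in-five availability the result drops to 500,000 PCs, still the same order of magnitude.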

So with ideas and programmes developed by others, with computing power provided by people all over the world, and with claims that are at least a little exaggerated, this looks like an initiative designed with Intel marketing as its main goal and helping science as a second one. One would rather have seen that order reversed.

Ad Emmen
