Monthly Archives: April 2009

Clive Cookson

This time last week no one outside a small group of public health experts had heard that the world faced an imminent flu pandemic. But virologists at the US Centers for Disease Control and Prevention (CDC) in Atlanta were already working feverishly to crack the genetic code of a virus isolated a few days earlier from two patients – in California and Texas – suffering from flu apparently linked to a mysterious outbreak in Mexico.

Over the weekend, as pandemic scare stories hit the media, the CDC researchers had completed the RNA sequences of the virus, using the latest tools of molecular biology. (Flu virus has a genome composed of RNA rather than the related DNA that makes up the genes of almost all other organisms.) The achievement is a real testimony to the powers of 21st-century science in an emergency.

By Monday, the CDC’s RNA sequences were available on the public GenBank database, for anyone to download. You don’t even need an access code or password. That is testimony to the openness of modern, publicly funded research.

For the uninitiated the viral genetic code is an endless and nonsensical series of four chemical letters – “atgaaggcaat” is a typical short stretch – but for experts the sequence tells a fascinating tale of past mutations that have shaped the virus. They can read a history of infections of birds, pigs and people.
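The letters themselves are trivial to handle in software. A toy sketch in Python, using only the short stretch quoted above rather than a real isolate (note that GenBank stores flu sequences in DNA notation, with t standing in for RNA's u):

```python
from collections import Counter

# The short example stretch quoted above, in GenBank's DNA notation.
stretch = "atgaaggcaat"

# Convert to the RNA alphabet the virus actually uses (t -> u),
# then tally the four chemical letters.
rna = stretch.replace("t", "u")
counts = Counter(rna)

print(rna)             # augaaggcaau
print(dict(counts))    # {'a': 5, 'u': 2, 'g': 3, 'c': 1}
```

Real analyses run this kind of string processing over the virus's full set of RNA segments downloaded from GenBank, comparing them letter by letter with older avian, swine and human strains.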

But there is no doubt that the predominant contribution comes from pigs, virologists say. Hence the World Health Organisation's unfortunate original decision to call the disease swine flu – reversed tonight (see my previous blog and comments).

A porcine origin does not mean that the current H1N1 virus is circulating among pigs today. Now that it is spreading among humans, people face an immensely greater risk of catching Mexican flu from other people than from pigs.

Scrutiny of the viral genome – and comparison with other flu viruses – is also beginning to give scientists clues about its likely behaviour and effects on people. These seem to be reassuring.

The Mexican strain currently lacks some of the molecular characteristics associated with the most virulent viruses – adding to the emerging epidemiological evidence that it causes mainly mild illness. Experts believe the apparently high mortality rate in Mexico is the result of vast under-reporting of less severe cases.

Of course flu is a notoriously fast-changing virus, and it may mutate into a much more dangerous form. Or it may turn out to be less lethal than normal seasonal flu, which kills 250,000-500,000 people a year worldwide.

Whatever happens, scientists will for the first time in history be able to follow the changing genetics of a virus during a pandemic, as it happens.

Clive Cookson

The flu strain that is spreading from Mexico and causing alarm about a possible pandemic has generally been called “swine flu” by health authorities, including the World Health Organisation.

But pig producers and animal health experts understandably dislike that term. Not only does it give pigs a bad name (and incidentally damage consumer demand for pork products) but also, they say, it is inaccurate.

In fact the H1N1 virus responsible for the outbreak has not been linked directly to pigs, in Mexico or anywhere else. The virus has not been isolated from any species other than humans, though virologists surmise that it may have originated in a pig.

Like birds and people, pigs can act as “mixing vessels” in which different viruses swap genes and produce a new strain. The Mexican virus appears to contain porcine, avian and human genetic components.

The Paris-based animal health organisation OIE proposes calling it “North American flu”, to reflect its geographical origins. After all, the last pandemic, in 1968, was caused by “Hong Kong flu” – and the great 1918-19 pandemic was “Spanish flu”.

For me, North American flu is too much of a mouthful. I’d prefer “Mexican flu”.

Clive Cookson

The exhibition hall here at the European Future Technologies Conference in Prague is awhirr with robots.

A robotic lamprey from Italy swishes elegantly around a small pool while a salamander walks around outside the water. A few metres away a humanoid baby called Now, created by a Slovenian-German collaboration, is taking its first steps – falling from time to time but gradually learning from experience how to keep its balance.

And displaying Gallic sophistication, French robots are demonstrating tasks that will be needed if machines are ever to serve as intelligent home companions – and servants – for people. One serves a drink while another, topped with a chef’s hat, prepares the ingredients for a ham and cheese omelette.

All these are experimental robots, requiring intense care and attention from their human creators to keep going. Their successors, which will need to be far more robust and durable, will not be ready for commercial application for 10 to 20 years.

Research into intelligent robots, with some of the mental flexibility and learning ability of natural organisms and even people, seems to be undergoing something of a renaissance. The field is beginning to recover from a period when it suffered from public disillusion – and therefore poor funding – in the wake of excessive claims for robotic intelligence during the 1980s.

The EC Future and Emerging Technologies Programme, with a budget of €100m a year and rising fast, is a prime source of new money for European robotic researchers. They still lag well behind the Japanese in hardware and “mechatronics” – the mechanical and electronic engineering of robots – but may be ahead when it comes to the software for artificial intelligence.

Japan has a long-standing cultural affection for humanoid robots, symbolised by the 1960s television series Astro Boy that inspired many Japanese robotic researchers now in middle age to work in the field.

The western world, in contrast, has a deep suspicion of robots, dating back to Karel Čapek’s science fiction play Rossumovi univerzální roboti (Rossum’s Universal Robots), premiered in Prague in 1921. R.U.R. – a great success in translation between the wars – introduced robots to the world. It is not a happy story: the robots rise up against their human creators and kill them all.

Europe is showing signs at last of overcoming this historical legacy and learning to love the prospect of intelligent robots.

Clive Cookson

In Prague for the first European Future Technologies Conference. I’m chairing the opening session and a panel discussion about “multidisciplinary transformative research”. 

The conference and associated exhibition are the first to showcase the achievements of Europe’s 20-year-old Future and Emerging Technologies programme. They include mind-reading computers and friendly companion robots.

I must admit I hadn’t heard of FET (oh dear, another Euro-acronym to learn) but it looks as though it has achieved a reasonable return from its €100m a year budget, which is set to double over the next six years. I’ll write more about some of these achievements later in the week after I’ve been round the exhibition.

Henry Markram, who heads Switzerland’s Blue Brain project at the Ecole Polytechnique Fédérale de Lausanne, gave a powerful opening keynote address. If all researchers communicated as well as Henry, we’d have no trouble enthusing young people about science and technology.

Blue Brain is the world’s most advanced attempt to “reverse-engineer” the brain by simulating all its functions on supercomputers. Henry demonstrated an amazing simulation of neurons at work in a rat’s brain and told us a full simulation of the human brain would be possible within 10 years. Among many other things, this could help scientists to understand psychiatric conditions that remain medical mysteries.

Then came the dignitaries’ official opening ceremony. The pair advertised in the conference programme – the Czech prime minister and European information commissioner – were otherwise engaged, though Commissioner Viviane Reding did make a video presentation. So their stand-ins, Czech education minister Ondřej Liška and EC chef de cabinet Rudolf Strohmeier, cut the ribbons (literally, with scissors). Why a future-oriented conference needed such a traditional opening is rather a mystery; sadly my suggestion of a ribbon-cutting robot came too late.

Then my panellists made several interesting points about multidisciplinary research. One was that scientists become more interested in working outside their original discipline as they grow older – age and status provide a security blanket for adventures that younger researchers feel they can’t risk.

Security fears can promote interdisciplinary thinking in a quite different way, observed Ivan Havel, director of the Centre for Theoretical Study at the Charles University’s Institute for Advanced Studies – and brother of former Czech president Václav Havel. During the Communist era, when dissident scientists from different disciplines were forced to meet in secret, their discussions were so fruitful that they have been maintained ever since.

Clive Cookson

Many of Britain’s science journalists will be feeling sad – and nostalgic – today after hearing that John Maddox died on Sunday at the age of 83. He was the most influential science editor of the 20th century and created a blueprint for the modern research journal.

In two spells as editor of Nature, from 1966 to 1973 and 1980 to 1995, Maddox converted a staid journal, for which the word “venerable” might have been invented, into a lively news-seeking and news-making publication without sacrificing its scientific authority.

Maddox had earlier made his mark in newspaper journalism, as science correspondent of The Manchester Guardian from 1955 to 1964, where he developed a campaigning style and an appetite for scoops that he carried over to Nature.

His appointment at the Guardian came when science correspondents were still a novelty. The late 1950s were an optimistic period for science and technology, with newspapers showing gung-ho enthusiasm for aerospace, astronomy, nuclear power and medical research. But Maddox managed to delve into the dark side of technology, including a ground-breaking investigation of the 1957 Windscale reactor fire.

He left the Guardian in 1964 for the Nuffield Foundation, where he spent two years leading an influential project to update the school science curriculum (I was to be a beneficiary of his work, studying the Nuffield Physical Sciences A-level course). Maddox returned to Nuffield in the 1970s, serving as the foundation’s director in between his two spells editing Nature.

When Maddox arrived at Nature, the journal was approaching the centenary of its foundation and retained a gentlemanly, old-fashioned aura. It still attracted some first-rate research papers (such as the one by Crick and Watson announcing the discovery of the DNA double helix) but its owners, the Macmillan family, wanted Maddox to shake up the publication in the face of growing competition. Science, its long-term American rival, was benefiting from the shift in scientific power from Europe to the US.

Maddox obliged by transforming the slow and somewhat arbitrary process by which Nature selected papers for publication. He set up an efficient peer review process for routine submissions – though he was quite happy for important papers to be whisked through on his say-so without formal review – and actively solicited contributions from the world’s top scientists. At the same time Maddox gave Nature its first news and features pages, and built a young team of editors and writers who would go on to fill many roles in science journalism elsewhere (including my FT colleagues Alan Cane and Nick Timmins).

In his second term as editor, Maddox was even more determined to raise his journal’s profile. The most controversial coup was first to publish a paper by the late Jacques Benveniste, a French immunologist, who purported to demonstrate the scientific basis of homeopathy – a form of alternative medicine scorned by most scientists – and then to make a highly publicised visit to Benveniste’s lab in the company of James Randi, a famous magician and fraud-buster, who revealed that the experiment was based on a scientific illusion.

Maddox had grown up near Swansea, the son of a furnaceman in an aluminium smelter, and his favourite retreat was a cottage deep in rural Wales. But there was little trace of a Welsh accent left by the time Maddox retired finally from Nature. The most distinctive feature of his voice then was the gravelly timbre given by a lifetime’s smoking and drinking. When Maddox wrote his flowery but elegant editorials – late in the evening with the nominal deadline already passed – cigarettes and red wine were the standard accompaniment.

Maddox was knighted in 1995 and made an honorary Fellow of the Royal Society – a far rarer distinction – in 2000. He is survived by his wife, the biographer Brenda Maddox, and four grown-up children (including Bronwen Maddox, former FT environment correspondent and now chief foreign commentator of The Times).

John Maddox © The Royal Society


Clive Cookson

Last month I wrote about scientific persistence paying off for Mark Pepys, professor of medicine at University College London, who has worked for more than 30 years on diseases caused by the abnormal build-up of amyloid protein.

He was celebrating then the signing of a deal with GlaxoSmithKline, the giant UK-based pharmaceutical group, to develop a treatment for amyloidosis, a rare fatal disease in which amyloid accumulates in organs throughout the body.

GSK will use a drug called CPHPC, which Prof Pepys developed during the 1990s, as the basis of the treatment. CPHPC works by removing SAP, a blood protein that sticks to amyloid and stops enzymes dissolving it away.

Now Prof Pepys is celebrating again. The Proceedings of the National Academy of Sciences, a leading US research journal, has published promising results from the first clinical trial of CPHPC to treat Alzheimer’s disease.

Prof Pepys and UCL colleagues gave CPHPC to five Alzheimer’s patients for three months. They found that it removed SAP from their brains.

The treatment caused no side-effects and the patients showed no clinical deterioration during the trial, though it did not last long enough for the researchers to assess clinical benefits.

Rebecca Wood, chief executive of the Alzheimer’s Research Trust which part-funded the study, said the results “are cause for cautious optimism. New treatments for Alzheimer’s disease are desperately needed, and it is possible that this small molecule could be a future candidate.”

Prof Pepys now plans a larger trial in which 80 to 100 patients will receive either CPHPC or a placebo pill over several months, though he still needs to raise funding for it.

“That should tell whether CPHPC brings a significant clinical benefit,” he says. “It will not need to do very much to be better than the Alzheimer’s drugs currently on the market.”

Prof Pepys started developing CPHPC with support from Roche. But the Swiss pharmaceutical group pulled out of the collaboration, saying it had other research priorities, and last December Roche passed all its rights in CPHPC to Pentraxin Therapeutics, a UCL spin-out company.

Following last month’s deal with GlaxoSmithKline on amyloidosis, Prof Pepys hopes the clinical results with Alzheimer’s might tempt GSK to take on CPHPC as a treatment for that disease too.

While Alzheimer’s is potentially a far larger market than amyloidosis, development costs would be much greater – and there is much more competition from other drug companies.

Although there is much medical controversy about the ultimate cause of Alzheimer’s disease, Prof Pepys believes that CPHPC could treat the underlying disorder by enabling enzymes to remove amyloid from the brain.

Clive Cookson

Everyone interested in renewable energy has a favourite, which he or she may feel is unjustifiably neglected in the competition between alternative sources. With solar, wind, wave, tidal, hydro, biomass, geothermal and many others vying for investment, I have long believed the heat beneath our feet provides a huge energy store that deserves more public attention.

So I was delighted to attend a briefing organised by the Royal Academy of Engineering in London, at which geologists and engineers called for more investment both in ground source heat pumps, which extract energy from near the surface for individual buildings, and in deeper geothermal projects, which involve drilling down to 1,000 metres.

“We are sitting on top of a gigantic free reservoir of natural heat in the ground – ubiquitous and potentially available to all,” says David Banks, a heat source expert. “All we need to access it is a hole in the ground (a trench or borehole) and a pump, to lift it from a low temperature, typically 10-14ºC in the UK, to a high temperature at which it can be distributed and used to provide space heating.”

Banks says heat pumps are beginning to take off in Britain, with the market doubling every year. But the total installed base in the UK, which he estimates at 6,000 units, is still far smaller than in Scandinavia and north America.

In Sweden 350,000 units are installed, providing more than 10 per cent of heating in homes and offices. The US accounts for about half the world’s total ground source heating capacity, with 80,000 units installed every year.

Heat pumps have struggled because the initial capital costs of installation are high – typically £10,000-£15,000 for a 6 kilowatt domestic system – and natural gas is still relatively cheap in Britain. But they could pay off within 10 or 15 years for households in rural areas who have no mains gas supplies and enough space to lay the 100 metres of pipes required to extract heat from the ground. For commercial and public buildings, the economic returns are better.
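The figures above boil down to a simple-payback calculation. A back-of-envelope sketch (the annual saving is an illustrative assumption, not a figure from the briefing):

```python
# Rough simple payback for a domestic ground source heat pump,
# using the ballpark capital cost quoted above. The annual fuel
# saving vs. mains gas is a hypothetical illustrative value.
capital_cost = 12_500    # £, midpoint of the £10,000-£15,000 range
annual_saving = 1_000    # £/year (assumed, for illustration only)

payback_years = capital_cost / annual_saving
print(f"Simple payback: {payback_years:.1f} years")  # 12.5 years
```

On those assumptions the system repays itself in roughly the 10-to-15-year window the briefing described; a bigger gas bill (as in off-grid rural homes) shortens the payback, which is why the economics look better there.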

While heat pumps extract low-level heat always available in the ground for use in buildings, some parts of the world have easy access to much more intense heat. This may be a blessing in energy terms but it often comes at a price because such places are vulnerable to volcanic activity and earthquakes. Iceland and the Geysers geothermal field in California are good examples.

Britain’s geology is not so favourable for geothermal energy but hot rocks are down there if you drill deep enough. The country’s only operational scheme has provided hot water to homes and offices in the centre of Southampton for more than 20 years, but David Manning, director of Newcastle University’s Institute for Research on Environment and Sustainability, says a recent project in Weardale, County Durham, shows the potential for geothermal energy in north-east England.

Prof Manning and colleagues supervised the drilling of a test borehole 1 km deep at Eastgate, which produced plentiful supplies of hot water at 40ºC – see picture below. It is expected to create a spa and heat buildings in a new “renewable energy village”.

The same geological formation of hot granite hundreds of metres deep extends up to the Tyne Valley. Geologists are now planning another borehole on the old Newcastle Brewery development site.

Clive Cookson

Embryonic stem cells get all the publicity in stem cell research, good and bad. Their supporters see them as the future of regenerative medicine, producing all manner of new human tissues to treat degenerative diseases. Opponents – mainly from religious groups – hate the fact that they originate with the destruction of an embryo.

No treatment based on human embryonic stem cells has yet been tested on patients, though the US Food and Drug Administration recently told Geron that it could begin a clinical trial of embryonic stem cells to treat spinal injury. Meanwhile, as the UK national stem cell conference in Oxford heard today, universities are making good progress using adult stem cells, derived from the patients themselves, to repair bone and cartilage.

At Southampton University Richard Oreffo is leading a programme to fill holes or gaps in bones caused by accident or disease with a “living composite” material, made of stem cells extracted from the patient’s bone marrow mixed with a biocompatible scaffold.

Four patients have so far received transplants of living composite, says Prof Oreffo. Early signs are encouraging: the material is integrating well with the patients’ own bone and stimulating natural regrowth.

Meanwhile Alicia El Haj of Keele University is working on a 10-year clinical trial at Oswestry Orthopaedic Hospital, using adult stem cells to repair cartilage damaged in accidents. The patients’ stem cells are multiplied outside the body, before being injected back into injured joints.

Although the Oswestry trial uses stem cells on their own, Prof El Haj is also leading a more futuristic research project in which stem cells are linked to microscopic magnetic nanoparticles. “We can then use a magnet to move the stem cells around the body and control what they do there,” she says.

Magnetic control could be far more effective than simply injecting stem cells into the patient. The nanoparticle system has already produced new tissue growth in laboratory mice and is about to be tested in goats, ahead of clinical trials.

While adult stem cells are more readily available than embryonic stem cells – and pose no ethical problems – they are much less versatile. However Prof El Haj said techniques developed for adult stem cells, such as magnetic control, could be adapted to embryonic stem cells or the recently discovered “induced pluripotent stem cells” (which are made by reprogramming adult cells so that they revert to an embryonic state).

At present stem cell trials use one-off procedures developed by individual research teams. “We need to move away from bespoke therapy into standard procedures that can be used by [doctors] anywhere,” says Prof El Haj.

Although many scientists and patient groups are impatient for stem cell research to deliver clinical benefits more quickly, Prof Oreffo says it is important not to push ahead too fast: “The last thing we want is a case that goes wrong, because that would set the field back tremendously.”

Clive Cookson

Scientists at Sheffield university have taken an important step towards using stem cells to restore hearing to deaf people.

Their research shows for the first time how embryonic stem cells can be converted into the specialist cells we rely on for hearing. These sensory hair cells and auditory neurons, as they are known, cannot be regenerated in adults using existing medical technology; once they are damaged, hearing loss is permanent.

The long-term aim is to treat deafness by transplanting new auditory cells, generated from stem cells, into people who have lost their own.

“We have found the recipe to persuade embryonic stem cells, which can become any cell in the body, to become auditory cells,” says Marcelo Rivolta, who has led the Sheffield project for the past five years. “Our lab studies have shown that these cells behave and function just like their counterparts in our developing ears.”

The research started by studying cells from the developing ears of aborted human foetuses (around 10 weeks old) and then applied the findings to embryonic stem cells (which originate in early embryos just a few days old). The next step will be to graft the specialist auditory cells into deaf strains of laboratory animals.

The research – funded by the charities Royal National Institute for Deaf People and Deafness Research UK – is published online by the journal Stem Cells and will be discussed at next week’s UK National Stem Cell conference in Oxford.

Ralph Holme, director of biomedical research at RNID, says: “Stem cell therapy for hearing loss is still some years away but this research is incredibly promising and opens up exciting possibilities by bringing us closer to restoring hearing in the future.”

A more immediate application will be for research into deafness. “We have now an experimental system to study genes and drugs in a human context,” says Dr Rivolta, who is originally from Argentina.

“In addition to the future potential for restoring hearing with stem cell therapy, the recent research success means that we may now have better ways to test the efficacy and toxicity of new drugs on auditory cells,” adds Vivienne Michael, chief executive of Deafness Research UK.

Clive Cookson

The UK life sciences industry is involved in an unseemly spat with one of its regulators.

Three trade bodies have accused the Human Tissue Authority of introducing “unrealistically excessive” fee increases that “show complete disregard of economic environment and run counter to government support of the life sciences sector”.

The HTA licence fee for a company using human tissue for clinical applications rises today by 45 per cent to £11,000 for its main site and by 280 per cent to £3,800 for each satellite site. The new fees were only announced on Friday, following a consultation exercise.
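For what it is worth, the percentage rises let you back-calculate the previous fees (my arithmetic, not figures quoted by the HTA):

```python
# Implied previous HTA licence fees, back-calculated from the
# announced rises. The old-fee figures are inferred, not quoted.
new_main, rise_main = 11_000, 0.45   # main site: +45% to £11,000
new_sat, rise_sat = 3_800, 2.80      # satellite: +280% to £3,800

old_main = new_main / (1 + rise_main)   # ≈ £7,586
old_sat = new_sat / (1 + rise_sat)      # £1,000

print(round(old_main), round(old_sat))
```

In other words, the satellite-site fee has nearly quadrupled from about £1,000, which explains why the trade bodies singled it out.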

In a joint statement the BioIndustry Association, Association of British Healthcare Industries and British In Vitro Diagnostics Association say: “Coming at a time when many companies are already suffering severe financial difficulties, this adds further and unexpected costs with extremely limited notice.”

According to the trade associations, the HTA charges much more than its counterparts in other European countries for regulating human tissues and cells intended for medical applications.

In response, Adrian McNeil, HTA chief executive, says: “We have done everything we can to keep licence fees to a minimum. These include applying the lightest possible touch when implementing complex legislation and introducing streamlined systems and processes.” The whole fee structure will be reviewed this year.

But the underlying problem is that UK Treasury guidelines require the HTA to recover regulatory costs from licence fees, without cross-subsidy between different sectors.

The associations are unlikely to change matters merely by complaining. A concerted refusal by every member company to pay the increased fees might force the government to rethink its policy – but such ungentlemanly action is not on the agenda.

The world of research

The science blog is no longer updated but it remains open as an archive.

Clive Cookson, the FT's science editor, picks out the research that everyone should know about, in fields from astronomy to zoology. He also discusses key policy issues, from R&D funding to science education. He'll cover the weird and wonderful, as well as the serious side of science.