Andrew George of Brunel University ponders whether existing research culture is severely lacking in the storytelling needed to encourage future breakthroughs.
I can still remember the horror of discovering that everything I had worked on was wrong. I was a PhD candidate just starting my second year, and my supervisor and I had developed a test for rheumatoid arthritis, which seemed a revelation. We wrote a paper for a prestigious journal but just before we sent it off, we decided to do one more experiment to check we were correct.
We weren’t. Everything I had done over the previous year was ruined, and I had to start an entirely new research topic. It was a tough but valuable lesson for a young scientist – you should always go further to test your ideas.
That was 35 years ago, and I wonder if someone starting out as a researcher today would be encouraged as I was to go the extra mile. Does the incessant drive to publish and measure outcomes mean that researchers are under pressure to cut corners, and have less time and freedom to pursue their ideas?
The Wellcome Trust – one of the world’s largest funders of health research – recently launched a review of research culture, to find out if research has become so hyper-competitive that it “cares exclusively about what is achieved and not about how it is achieved”.
What helped me develop as a researcher was reading stories about those who came before me. For scientific research to be successful in the long term, I think researchers need a strong set of values, including an unwavering commitment to the truth, and a drive to test any idea to destruction.
Though they may seem opposed to the ideals of the rigorous scientific method, the best way of instilling these values is, as ever, through the stories and myths that we tell ourselves.
The power of stories
In ancient times, people would sit around their fires at night and tell stories. Stories about their creation, stories of great deeds and feats, and stories that rehearsed how people interacted with each other and the world they lived in. One of the oldest of these still read today is Homer’s Iliad, from ancient Greece.
The story explores what it means to be a warrior and a leader, how people should accept fate and achieve fame, and the consequences of pride and anger. Young people listening to those stories learned what was expected of them, reinforcing the collective values and beliefs of society.
In the modern world, myths and stories still have an important role to play – even in scientific research. Scientists have stories about important people and great events in science, such as the discovery of penicillin, uncovering the structure of DNA, the development of vaccines and the battles that Galileo and early proponents of a sun-centred model of the solar system fought with the reactionary forces of the church.
Together, these stories help young scientists understand the collective benefits of research that go beyond personal advancement and success.
These scientific myths are based on reality, though sometimes strict historical accuracy has been sacrificed to make a particular point more effectively. In a similar manner, the stories of Homer would have been based on real events – such as the Trojan War – but they evolved in the telling. It’s unlikely the Trojan Horse really was a large-scale model of a horse that soldiers hid in.
The future of science
It’s important to recognise that how we do research has changed. This was brought home to me recently when I reread The Pursuit of Nature, the story of some of the great Cambridge physiologists of the mid-20th century.
I was lucky to be taught by one of the authors, Alan Hodgkin, who won the Nobel Prize for working out how nerve cells transmit electrical impulses. He started his work on nerves in the second year of his undergraduate studies and built his own equipment out of biscuit tins.
Nowadays to succeed you must win big grants and build up a research team. Often more than 20 authors will contribute to a research paper. Hodgkin only ever had a few people working in his team and was more likely to publish with one or two close colleagues.
This ‘industrialisation’ of science is right and necessary. It has accelerated the impact of research in society and allowed scientists to discover and develop new technologies. There is probably nothing left that can be discovered using equipment made from biscuit tins. But amid all this change, we haven’t adapted the way in which we instil the ethics and values of science and research into young researchers.
When I was an undergraduate and PhD candidate, my supervisor worked on the lab bench. We had coffee and tea together every day.
I learned from her, and colleagues, what it meant to be a scientist. Today, the interaction between supervisors and junior researchers tends to be more transactional, about the experiments and data. There is less time for the apprenticeship of research.
Of course, there is training in how to do research. Graduate schools and doctoral training centres have raised standards in the education of PhD candidates. But I doubt that many people develop their values and moral compass from PowerPoint presentations.
In my own life, the popular myths of great scientists fed a culture that cherished curiosity as a good in its own right. We need to develop these stories, curating them by selecting those that are appropriate and creating new ones that make useful points. As scientists, with a commitment to the truth, we should also ensure that they are accurate representations of reality, and that they reflect the collective endeavour of research rather than the supposed genius of a few white men.
All cultures need their myths, and each lab needs its lore.
Andrew George is an emeritus professor at Brunel University London.