One of the staples of crime drama is the ‘cold-case squad’. This allows program-makers to add period detail to the scenes set in the past, while the present-day scenes can show implausibly attractive forensic scientists hunting for clues in a creepy location such as a long-abandoned children’s home (an activity obviously best performed during the hours of darkness by two people who separate in mid-search for no apparent reason).
I have often wondered whether it is worth establishing a cold-case squad for technology and science, to investigate those lines of inquiry that went cold 50 years ago but would now repay further investigation; or inventions that suffered from a miscarriage of justice. I was recently talking to a reader about the Microwriter — a six-button typing device that allowed you to type with one hand at astonishing speed. It was invented in the 1970s and failed. Might it be worth bringing it back?
As with criminal investigations, scientific and technological lines of inquiry are prone to get sidetracked or hit dead ends. In both types of investigation there are many path-dependencies, and vital breakthroughs often happen accidentally. Yet few people spend much time investigating scientific cold cases, perhaps because we usually take the Whig view that progress is inevitable, that everything which can be discovered will be discovered in time. With this comes the belief that all failures failed for a good reason.
Undoubtedly, many of the world’s greatest inventions — the wheel, steam power, analgesics, the soft-close toilet seat — would have emerged eventually. But there are so many strange gaps and delays in the history of innovation (the Romans never invented the stirrup, for instance) that we cannot be confident other useful advances were not held up for decades, even centuries, by chance events of the kind that accelerated many discoveries. These unlucky non-events are largely invisible. Our deep faith in progress is rose-tinted by survivorship bias.
Many forensic scientists have remarked that suspicions of foul play often rest on such slender evidence to begin with that you wonder how many murders go undetected. Indeed, without the insight and tenacity of Detective Constable Hazel Savage, a small article might recently have appeared in the Gloucestershire Echo, ‘Funeral of popular local builder’, remembering Fred West as a big-hearted family man with a reputation for high-quality grouting.
How many great ideas have similarly failed to take root because some tiny detail was overlooked, or not taken seriously?
New ideas are especially fragile because they usually take hold in one, often fairly random, person who spots something tangential that nobody else thinks important. DC Andy Laptew, the first person to put Peter Sutcliffe in the frame as the Yorkshire Ripper, was another case in point. His suspicions started with a chance observation. He and his partner always told the same ice-breaking joke when interviewing a couple: ‘Now’s the chance to get rid of your husband.’ His visit to Sutcliffe was the first time nobody laughed or reacted. He then noticed Sutcliffe had a gap in his teeth matching dental evidence from the crimes. He went on to assemble more and more evidence against Sutcliffe, yet was ridiculed by his bosses. Tim Berners-Lee was only a little luckier: his first proposal for the world wide web was returned with the note: ‘Vague but interesting.’
Or consider this alternative history, written in a 2014 academic paper, ‘What if Fleming had not discovered penicillin?’ by Sulaiman Ali Alharbi, Milton Wainwright and others:
‘It is September, 1928; a 47-year-old man walks…up the steps to his place of work. He would rather not be there. Summer is not yet over and he has had to return to London because of an emergency, otherwise he would still be enjoying life at his cottage in Suffolk… The man sits at the bench; he is a scientist and this is his laboratory. Casually, he picks up a few old petri dishes on which he has been growing bacteria. He glances through them until he comes to one that looks unusual. A colony of mould has somehow found its way into the dish and is dissolving the bacteria around it. He shows the unusual plate to his assistant who shows only mild interest and then hands it back without comment. The scientist has one last casual look, decides the phenomenon is of no importance, and drops the dish into a bucket of disinfectant. Our scientist then picks up his bag and hurries off to catch the train back to Suffolk.’
The authors surmise that, had this entirely plausible story unfolded, we still might not have access to antibiotics now. Research efforts might have concentrated on sulfonamide drugs instead. Someone might have discovered the same antimicrobial effect 20 years later, but by then the focus would have been somewhere else.
The authors add an interesting fact: as early as 1930, Stuart Craddock, one of Fleming’s assistants who worked on penicillin during this period, applied to the UK Medical Research Council for a research grant. The request was refused.
You might think that things have improved since then. We have systems, peer review and all kinds of processes to improve decision-making. But maybe they haven’t. Katalin Karikó, the academic behind the mRNA breakthrough which led to the Moderna and Pfizer vaccines against COVID-19, nearly abandoned her research in 1995 after a slew of grant requests were rejected and she was demoted by her university.
The problem is this. In the earliest stages, it is immensely hard to distinguish really big ideas from slightly silly ones. Hence any process designed to eliminate silly ideas will also disproportionately kill off the emergence of the few very best ideas. And the process is entirely without self-correction: if a good idea is killed, no one will ever know, and in the rare event that a seemingly silly idea succeeds, it will be post-rationalized and written up as though it were perfectly sensible all along.