Science in the ol’ days: A millennial’s perspective

Einstein had it easy as a scientist. His most famous paper had no references and his work was seldom peer reviewed. In one instance in 1936, he withdrew a paper submitted to Physical Review on the grounds that he had not authorised it to be shown to a specialist before publication. In another instance, he asserted:

Other authors might have already elucidated part of what I am going to say. […] I felt that I should be permitted to forgo a survey of the literature, […] especially since there is good reason to hope this gap will be filled by other authors.

Einstein, of course, didn’t actually have it easy—being forced to flee his native Germany is the obvious counterexample. And he faced stiff competition in the scientific arena. I mean, have you ever been to a scientific conference at which half of the attendees had won or would go on to win a Nobel Prize?

A who’s who of physics in the early 20th century: 17 of the 29 attendees of the Fifth Solvay Conference in 1927 had won or would go on to win a Nobel Prize. Image: Wikimedia Commons

Everyday life has changed in countless ways since Einstein’s time. But has the practice of science changed? Do these Einstein anecdotes generalise? That’s what I’ll try to answer here. Except it will be more of an educated guess because, as the title notes, I’m a millennial. For context, that means that I had the distinct benefit of Google, StackOverflow, and a mass of open source software throughout my graduate days. I’ve never done real scientific research without my own computer and an internet connection.

Given my lack of experience with science in the ol’ days—an amorphous term I’m taking to encompass the 20th century with occasional earlier periods—I offer the following caveat: while some research went into this post, it shouldn’t be taken too seriously.

Authors could get away with more

Special treatment wasn’t the reason Einstein could get away without peer review. Other scientists avoided it as well. It was simply less common back in the day. Many journals only formalised peer review procedures in the last 50–60 years. Before that, the editor had a lot more discretion over what to publish. Not entirely coincidentally, 50–60 years ago was also when photocopiers were first mass produced. Before that, copies of a manuscript were more precious, so fewer reviewers were assigned. (Nowadays, the ability to make unlimited copies means peer review occasionally gets out of hand.)

The arrival of ubiquitous peer review didn’t stop authors sneaking in comments that seem brazen by today’s standards. Take this from 1973:

I am arguing primarily on the basis of ‘intuition’ (i.e. experience) gained during my approximately two decades of active participation in scientific research.

Other ways to raise eyebrows these days? Write a paper without citing anyone. Or fit the whole abstract into a single sentence, like this one from 1978:

Then again, at 222 characters, the whole thing would fit in a tweet. Maybe he was just 40 years ahead of everyone else?

Medical ethics were less of an obstacle

“The best days of medical research are yet to come,” stated Jonathan Moss in a 2005 convocation address at the University of Chicago. Sounds like typical fodder to inspire graduates, right? Except most of his remarks lamented what is no longer feasible in medical science. He suggested that discovering anaesthesia wouldn’t be possible today given modern regulatory burdens. In 1846, the dentist William Morton discovered that inhaling ether worked as an anaesthetic. In order, his experimental subjects were goldfish, pigeons, the family dog, himself, his two dental assistants, and then some dockworkers.

Speaking of drugs, let’s not forget the 1960s experiments in which engineers, mathematicians, and designers were dosed with LSD to test how it affected their problem-solving ability. The results, while somewhat subjective, seem rather compelling. And they seem consistent with the finding, from the 50s and 60s, that spiders on acid build better webs. The human trials were shut down in the mid 60s.

Returning to Moss’s convocation address, he also cited X-rays and heart catheterization as developments that would no longer be feasible. While he recognised the obvious positives of improvements in human and animal welfare, he concluded by noting that drug development is now vastly more expensive than it used to be: only 25 percent of the cost is attributable to the research itself, with most of the remainder going to legal and regulatory issues.

Moss’s examples are downright tame compared to those Bill Bryson details in his book The Body. Some of the goriest follow:

  • In the early 1700s, to measure blood pressure, Reverend Stephen Hales opened the carotid artery of a tied-down horse and measured how high the blood shot up an attached glass tube.
  • In 1822, seeing a rare opportunity for a literal inside look, Dr William Beaumont observed human digestion with the help of a man who survived a bullet through his stomach (the hole being big enough for food to fall out).
  • In 1953, in an attempt to minimise seizures in a patient, Dr William Scoville removed three inches’ worth of tissue from each hippocampus. (His access point to the brain, you ask? Created with a drill saw from the local hardware store.)
  • The last example circles us back to the University of Chicago, and an experiment in which 10 rats were subjected to total sleep deprivation until they died 11–32 days later. (In my notes, I’d recorded that as happening back in 1898. Turns out it was actually 1989. I know lab rats still get a raw deal, but surely keeping them awake for weeks straight wouldn’t fly these days?)

On scientific titles

On the Origin of Species, Darwin’s famous work, has a nice ring to it and gets to the point. Had Darwin’s publisher not intervened, however, the title might have begun with the wordier An abstract of an Essay on the Origin of Species.

Starting titles with unnecessary words used to be all the rage. (It’s still too common.) One word in particular stands out: “On”. On the capillary phenomena of jets, for example, or On the quantum correction for thermodynamic equilibrium.1

In my cursory research, I failed to find a consensus explanation for the prevalence of on. That said, three different StackExchange questions offered plenty of plausible hypotheses:

  • Including on avoids implying the paper is a comprehensive treatment of the topic, and is instead merely a discussion related to the topic.
  • It’s tradition: scholars were doing it in Latin (de taking the place of on), so why stop now?
  • Many famous works are entitled “On …” and therefore using this form in your own titles adds some grandeur by association.

Although it didn’t answer the question of why on titles were so common, one comment is particularly apt: “You could go full Victorian: ‘Some initial considerations towards a revised perspective on the …’”.

Don’t let the introductory phrases fool you, though. Some of the earliest scientific papers were rather vivid, as Timothy Lenoir summarises: Description of the appearance of three suns seen at the same time on the horizon, Observables on a monstrous head, and On a species of wild boar that has a hole in the middle of its back, which foams when it is pursued by hunters.

There was also clickbait long before anyone was clicking. Notice of an extraordinary fish, the title of an 1835 article, may arouse curiosity, but isn’t much use to anyone looking for reports on whale sharks in Manila Bay, as noted in the fittingly titled piece On the proper wording of the titles of scientific papers.

Budding scientists were more resourceful

Chemistry sets are what first piqued Gordon Moore’s interest in science. In the 1940s at age 16, he was producing dynamite and nitroglycerine in his garage. “You could buy anything,” as he put it.

Given who we’re talking about, it’s hard to overstate the importance of that experimentation at a young age: Moore played a pivotal role both in developing the transistor and in bringing it to the mass market (he’s a co-founder of Intel). His research in chemistry and physics made modern computers possible.

Robert Hall has a similar story, albeit less explosive. In the 1960s, the laser was being developed, refined, and miniaturised. One of Hall’s contributions was the idea of polishing the gallium arsenide crystal. Where did he get the relevant experience? Polishing glass as a child to make his own telescope.

Other budding scientists were resourceful in a less hands-on way. At age 11 or 12 (in the early 1930s), Richard Feynman got interested in radios. In his autobiography, he notes that radios back then were out in the open and easier to troubleshoot. He could therefore “fix radios by thinking”. If that doesn’t foreshadow a kid on his way to becoming a theoretical physicist, then I don’t know what does.

Bill Wadge, now a Professor Emeritus of computer science, also tried his hand at assembling radios, though with less success. Instead, what whetted his appetite for science was having a tangible representation of binary logic. Speaking of his childhood: “I could build switches with wood, nails and bare copper wire. I worked out various circuits on paper and was able to do binary addition with triple pole double throw switches.”
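For those of us who never built an adder out of hardware, here’s a minimal Python sketch of the idea Wadge describes: binary addition assembled from nothing but boolean operations. (The code is my own illustration of the underlying logic, not a transcription of his switch circuits.)

```python
# Illustrative only: the boolean logic behind binary addition,
# the same logic Wadge's wood-and-copper switches implemented.

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_binary(x_bits, y_bits):
    """Add two equal-length bit lists, least significant bit first."""
    carry = 0
    result = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)  # the final carry becomes the top bit
    return result

# 3 (011) + 5 (101), written least significant bit first:
print(add_binary([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. 8
```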

Papers didn’t need explanations

A typical 1980 paper is nearly three times wordier than a typical 1910–40 one. The 1984 paper from which that finding comes2 proposed a number of explanations as to why. Some might surprise you. For example, Japanese publications in the 50s had strict length limits because journals struggled to obtain enough wood pulp for printing. Others might surprise you in a different way. As the authors put it: “one- and two-page papers in astronomical journals have been nearly eliminated by a communal and editorial decision that observational data must be accompanied by interpretation to be publishable.”

We don’t often think of science nowadays as a means of collecting the strangest things we can find—with no way as yet of explaining them—but that is what science was.
– Samuel Arbesman in Overcomplicated

Scientific tools were less digital

What’s your preferred way to cut and paste? Ctrl + X, Ctrl + V? Cmd + X, Cmd + V? Or with a knife and glue?

What’s your preferred platform for calculations? Matlab? R? Python? Or pen and paper?

What’s your preferred dataset format? Excel spreadsheet? NetCDF? CSV? Or printed tables in the back of the manuscript?

You see where I’m going, so I’ll stop. Actually, I’ll stop because I’m out of my depth. Like I said, I’ve never done scientific research without a computer. I’ll leave it to others to describe their nostalgia, or lack thereof, for obsolete technology, whether that be floppy disks, carousel slide projectors, reprint request cards, dark rooms for photography, or the physical volume of the Science Citation Index.

Feel old yet? If not, note that in the not-too-distant future, I’ll be able to add the following question to my list above: What’s your favourite library? The opening answers will be software libraries, not physical places.

Looking forward

In 20 years’ time, we might look back and wonder:

  • Were we really publishing the vast majority of science in the static form of a PDF rather than something interactive?
  • Did we really judge productivity by publications rather than, say, software development and maintenance?
  • Was there a time when we didn’t turn to Google first to search for scientific datasets? (As of writing, Google’s Dataset Search is brand new.)
  • Why were we committed to interpretable statistical models in cases where black-box machine learning tools gave better answers?

Remember, the current practice of science only looks modern by comparison.

Footnotes

1. I’m guilty of entitling my first real scientific document with an On. My honours dissertation was On the rate of refreezing in a bore hole in an ice shelf. I came to my senses and removed both the on and most of the passive voice by the time it was published as an article, Estimates of the refreezing rate in an ice-shelf borehole.

2. The 1984 article I cited regarding the length of astronomical papers is worth a read if only for entertaining snippets like the following:
“The letter sample had to be fudged, since some did not exist in 1950.” (Well, at least they’re honest about fudging.)
“The data set apparently constitutes a sort of Rorschach blot test of how one thinks science is changing and how it ought to change.” (In the paper, five different experts come to five different conclusions as to the reason for the increasing lengths of scientific papers. The quoted text is evidently a euphemism for “we can’t agree”.)

Author: Ken Hughes

Post-doctoral research scientist in physical oceanography
