Catching up on the literature is a daunting aspect of graduate studies. As a physical oceanographer, I regularly cite work from 30 to 40 years ago. In that time, and all the way back to the turn of the 20th century, the scientists before me got to answer all the low-hanging-fruit problems and write the papers that will be cited thousands of times. They left behind the messy, complex, and esoteric questions for the current grad students. Surely, then, the 60s or 70s, or even earlier, would have been the best time to be a grad student?
Grad students decades ago (presumably) had fewer worries about whether a planned research question had already been answered, whether the question was actually important, and how their answer would move the (then not-so-large) literature forward. But, and it’s a big but, the methods at their disposal made finding the answer anything but straightforward.
I don’t envy anyone who had to consistently, say,
- Create a stack of punched cards in order to run calculations on a shared mainframe computer
- Use a slide rule together with pen and paper rather than a calculator
- Write long-winded code in a compiled language like C++ to make a simple plot
- Hand off their handwritten work for someone else to type
- Refer to dense mathematical handbooks
These tasks weren’t the point of doing science. They were merely necessary steps on the way to an answer or a manuscript. Sometimes not even that.
The only tool from the list above that I’ve ever used is the last. I was after the solution for the temporal evolution of temperature in a solid surrounding a cylinder held at a constant temperature. At my supervisor’s suggestion, I consulted *Conduction of Heat in Solids*, the bible for heat conduction problems (Carslaw and Jaeger, 1959).
The solution to the differential equation I wanted to solve turned out to be an ungainly integral. Admittedly, there was some satisfaction in getting to this point using such a purist approach. Except there’s a singularity at the lower limit of integration, which makes the “answer” rather unhelpful. Instead, there’s a straightforward way to solve the problem numerically in Matlab, so that’s what I did. This is just one anecdote, but it’s emblematic of the tools available decades ago versus the tools available now.
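To give a flavour of that numerical route: the problem above can be attacked directly as the radial heat equation outside a cylinder held at constant wall temperature. The following is a minimal sketch, not my actual Matlab code (and written in Python, the free alternative discussed later); the diffusivity, radii, grid size, and step count are all illustrative choices.

```python
import numpy as np

# Sketch: radial heat equation dT/dt = kappa * (d2T/dr2 + (1/r) * dT/dr)
# outside a cylinder of radius a held at constant temperature V, solved
# with a simple explicit finite-difference scheme. All values illustrative.
kappa = 1e-6              # thermal diffusivity (m^2/s)
a, R = 0.05, 1.0          # cylinder radius and far-field radius (m)
V = 1.0                   # wall temperature (nondimensional)
n = 400
r = np.linspace(a, R, n)
dr = r[1] - r[0]
dt = 0.4 * dr**2 / kappa  # below the explicit stability limit 0.5*dr^2/kappa

T = np.zeros(n)           # solid initially at T = 0 everywhere
T[0] = V                  # wall held at V for t > 0
for _ in range(20000):
    d2T = (T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2    # second derivative in r
    dT = (T[2:] - T[:-2]) / (2 * dr)                # first derivative in r
    T[1:-1] += dt * kappa * (d2T + dT / r[1:-1])    # explicit time step
    T[0], T[-1] = V, 0.0                            # boundary conditions
```

No Bessel functions and no singular integrals: the heat simply diffuses outward from the wall, and the resulting profile `T(r)` can be inspected or plotted directly.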
Reading between the lines of my list of old-fashioned tools, there’s a recurring theme of minimal, if any, room for error. I rely on trial and error daily to develop code, test vague and incomplete hypotheses, or even just write a paper. It’s quicker to iterate toward a solution by running code that I know won’t work and seeing what error I get than by checking all the inputs first. I don’t consider this laziness; it’s simply offloading labour to the computer in order to free up concentration for more important, less tedious tasks. Scientists sharing mainframe computers the size of a wall never had that opportunity.
The new tools available
The past 10 years have seen countless scientific tools and resources developed. As of this writing (April 2018):
- Stack Overflow launched 10 years ago
- Google launched 20 years ago
- Inkscape launched 14 years ago
- GitHub launched 10 years ago
- Wolfram Alpha launched 9 years ago
- Google Scholar launched 13 years ago
- Mass digitization of old papers occurred about 10–15 years ago
- Numpy launched 13 years ago
- Matplotlib launched 15 years ago
- Modern text editors launched 10 years ago (using Sublime as the example)
- Modern blogging tools launched 15–20 years ago
- WestGrid (Western Canada’s HPC collaboration) formed 16 years ago
- And internet speeds have increased who knows how much
This list might appear to contradict my conjecture that the current decade is the best. If all of these tools launched 10 or more years ago, then shouldn’t that make 2000–2010 the best decade? No—these tools take time to mature. Take Numpy and Matplotlib as examples. Without these libraries, we wouldn’t be able to use Python as a free, superior alternative to Matlab. But when they launched, their installation was troublesome. Six years ago, Conda came along and made it all much simpler. And thanks to Stack Overflow, it’s seldom difficult these days to figure out how to make Numpy and Matplotlib do exactly what you want.
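To see how far this has come since the days of writing long-winded compiled code to make a simple plot, here is the whole job in a few lines of free software (the data and filename are made up for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs anywhere
import matplotlib.pyplot as plt
import numpy as np

# A complete, labelled figure in a handful of lines
x = np.linspace(0, 2 * np.pi, 200)
fig, ax = plt.subplots()
ax.plot(x, np.sin(x), label="sin(x)")
ax.set_xlabel("x")
ax.set_ylabel("sin(x)")
ax.legend()
fig.savefig("simple_plot.png")
```

That is the entire program: no memory management, no graphics libraries to link against, no compile step.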
But what about the 90s?
Someone writing this post 20 years ago may have had a similar outlook. They might have listed recent advances including Matlab, Windows 95, word processors becoming WYSIWYG, or email becoming mainstream. These advances are, relatively speaking, arguably better than my list above of what we’ve seen in the 21st century. But the 90s also had their computing challenges. For example, dial-up internet would test anyone’s patience, and computer specs were laughably poor by today’s standards. Let’s be honest, though, this argument against the 90s is weak at best. Maybe that decade was the Goldilocks period, when knowledge was easily accessed but the scientific literature was still manageable and wide open. I really can’t say; I was just a kid.
And what about the next decade?
Better tools might appear in the next decade that make doing science even easier. But, and I don’t think I’m being pessimistic, it’s hard to imagine the tools we use getting significantly better. Incremental improvements, sure, but fundamental changes, probably not. Couple that with the exponential growth of the literature and our consequent decay in attention to it, and it becomes easy to see why there are arguments like John Horgan’s that science is hitting a wall. I certainly don’t believe we’ve hit a wall or reached the end. (I tried reading, but couldn’t finish, Horgan’s book *The End of Science*.) Instead, I think we’re only getting started. I’m just glad I got into research now, not 10 years from now, when there’ll be a much bigger handicap to getting caught up.