
13. Staying in the game

© 2018 Daniel Nettle, CC BY 4.0 https://doi.org/10.11647/OBP.0155.13

‘I’ve done this a long time’, she said.
‘I’ve seen long careers and careers cut short.
The difference is how you handle the darkness’.

–Michael Connelly, The Late Show

A student from another university interviewed me. An assignment for one of her classes was to interview a researcher from a field of her choice about their work and their career. She chose me. As one of her prepared questions, she asked me what my aspirations were for the next ten years. I replied that, looked at in the round, what I hoped for in ten years was to have remained alive, and ideally continued doing some research.

I think the student was a little taken aback. Perhaps she had been told how focussed and driven successful researchers have to be; how they have to have boundless confidence and a nine-point plan. So to hear that the best I hoped for was merely to somehow stay in the game (and that I was not completely confident I would succeed in doing so) was disarming to her. In some ways, though, my answer was a sensible one. Science, in common with other creative endeavours, is rather like animal life in a Darwinian world. There are many more ways to be dead than alive, and the vast majority of all lineages die out. To succeed—more exactly, to not yet have failed—is to still be in the game; to flourish is simply to not yet be extinct.

A document has done the rounds at my university that attempts to inform young researchers about the career structure of academia. At the left-hand side is a fat horizontal arrow, pointing rightwards, which is marked ‘PhD student’. Then to the right of that one, pointing in the same direction, a slimmer arrow is marked ‘Post-doc’. To the right of post-doc, and about half the girth, is the arrow marked ‘Independent research fellow’. Finally, at the extreme right is the slimmest arrow of all, marked ‘tenured university professor’. It is less than a quarter the width of the ‘PhD student’ arrow. And where each horizontal arrow meets the next, there is a vertical downwards arrow, reminiscent of the way sinks or earths are depicted in physical diagrams. ‘Exit academia’, this arrow is labelled.

You might think that if you make it to the right-hand side, to ‘tenured university professor’, then the shoals have been navigated and you are in the game forever. Not really. The research longevity even of professors is finite. There comes a point where things get becalmed. What was front page news when they were a PhD student only makes the middle pages now. They have friendly but sad annual meetings with the Dean, like the solid lay wardens of a settled religion whose congregations are declining, wondering if the fervour will ever quite come back. They are gently asked, or volunteer themselves, to take on senior administrative functions, or more teaching. There is nothing wrong with this, of course. The life of the body has to be lived. Besides, these functions are important ones. Still, they are not what most of us went into it for. We were seduced by the primary research process: the idea that you could find a question; hit on your own approach; perform and manufacture the work; and finally, see it there in print, with your name attached, a thread woven into the tapestry of human knowledge. A thread of memory.

Perhaps the student was seeking some pearls of wisdom about how to stay in the research game. I am not sure I managed to give her any; and my immediate thought was that I am the last person she should have been seeking to emulate. If she knew how narrowly I have hung on, I thought, she might have chosen someone else for her assignment. But, on second thought, maybe the people who have narrowly hung on are the most informative. After all, you can find any number of books about the practices of the most successful, the mega-stars—the case study of the ‘winner’ is an established genre. It has always struck me that, interesting as these books might be, in a way they pose the wrong question. It’s very hard to win the big prizes, and thus of some interest to know how the few people who do, do it. But the more pressing question is: how do you stay in the game without winning them? How can you live a worthwhile and satisfactory life if you are a competent businessman but not a Bill Gates; a competent actor who is not a Marlon Brando; or a useful scientist who never garners the accolades of a Stephen Hawking? That’s what takes real grit, humanity, wisdom and technique: to just be there, quietly, purposefully, usefully, afflicted with neither pomposity nor despair, whatever the weather. To flourish in the middling state. The character we respond to most in Ronald Harwood’s play The Dresser is not ‘Sir’, the famous lead actor, but Thornton, the long-term bit-part player who loves his craft and stays in the ensemble, taking small roles with dignity year after year. A good life: always working, always touring; but never London.

I suppose, then, I am well qualified to say something about staying in the game. I have done it for over twenty years. I have contributed to research on several topics in biology and the social sciences. I hold a full professorship in a decent research-intensive university. I get some very nice academic invitations. So it seems I have avoided the ‘exit academia’ arrows, possibly even with aplomb. But—and this is what qualifies me—I haven’t avoided them nearly as easily as you probably think. If it looks that way from the outside, that’s only because you lack the data I have. You don’t see the failures, the false starts, the wasted time, or the awkward conversations with the Dean. (All of these are ubiquitous, by the way, just not well written up in the literature.) And opposite the Scylla of the ‘exit academia’ arrows has always lurked the Charybdis of my own demoralization: walking away from the game even if the game does not eject me. I gave up completely once, for several years, then eventually clawed my way back in; a second time I partly gave up but left myself attached by a lifeline, a lifeline I duly climbed up within a year or so; and probably two or three more times over the years I reached the point of starting to make other plans. Periodic demoralization and depression are not rare amongst researchers. It’s not ‘not caring any more’, or ‘not being able to be bothered’, as depression is often and erroneously characterized. It is caring so much, being so bothered, that one cannot advance on any front. One drowns in one’s own disorganized and gradually souring passion.

There’s a lot to trigger demoralization and depression in the typical diet of the contemporary researcher: intrinsic uncertainty about the subject matter and one’s progress through it; rewards that are always deferred and whose arrival is highly unpredictable in time; structurally frequent rejection that is hard not to take personally; permanent opportunities for unfavourable social comparison; and, at least in British universities since they became so obsessed with research ‘metrics’, an officially deniable but still palpable sense of threat. Despite all this, though, I still believe that to spend one’s working life as a weaver on the collective tapestry of human ideas is a noble calling, and a privilege. You’ve just got to find a way of doing it that suits you, works well enough, and keeps your spirits up most of the time. So, for what it’s worth, I thought I would set down some of the lessons I have learned trying to stay in the game all these years. These are the lessons I wish I had told that student about, and that I wish she had asked me for.

Lesson 1. Every day has to count for something

I try to start each working day with a period of uninterrupted work. Work, for me, is: collecting data, analysing data, writing code, drafting a paper, writing ideas in a notebook, or just thinking. Things that do not qualify as work are: background reading, literature searches, answering correspondence, marking students’ assignments, peer-reviewing a paper, sorting out my website, correcting proofs, filling in forms, tidying datasheets, having meetings, and so on. These are work-related activities, which are necessary for work to be possible, but are not the work itself. It’s very important that this distinction be maintained. Don’t try to do a simultaneous mixture of the two (it’s obvious how that is going to end); and always do work first, work-related activities second. For example, I might decide to start the day with two hours of work, and then, at 11 o’clock say, allow myself to start the work-related activities required to keep the show on the road. Working requires emotional commitment. It needs to be planned the day before. The phone is off; the email is off; if you are likely to be disturbed, there is a ‘do not disturb’ sign on the door. And if like me your work time is first thing in the day, then you never peek at your email before starting. I like to be very quiet in the morning, not even getting into too animated a discussion, reserving my energy for doing some work; then, once work is done, I can be more relaxed and expansive.

I thought that starting the day with a period of work was just a system I had discovered, by trial and error, to work for me, but something very like it turns out to recur in descriptions of the creative life,1 and in self-help books on how to be more productive.2 And these accounts stress that it is imperative to do work every day. I don’t mean you shouldn’t take days off: I don’t usually work at the weekends, and I take about a month completely off every year. I mean that if today is a working day, it must contain a period of work; it cannot be completely filled up with work-related activities. Though there are some challenging exceptions (e.g. field work, travel), I try to maintain the every-day rule. If your day is filled with meetings, then fine; get up an hour earlier to at least get one hour in on that particular day. And when it comes to scheduling meetings, you will probably have some latitude; don’t schedule them for 9 or 10am. If people ask to see you, say you are not available at those times, but that you are happy to see them at lunchtime or afterwards. And it’s good, when you are thinking of whether to drop into a colleague’s office, to be considerate about the time of day: are they likely to be working, or just doing work-related activities? Could this be sorted out over lunch or in the afternoon?

Why is the every-day rule so important? Well, there are only about 200 working days a year, so one day is 0.5% of a year. Proper work is really hard. And we are lazy, weak creatures. We are not set up to forage in really hard ways when much easier ways of foraging are only a click of the browser button away; why would we be? If you allow that there is some set of circumstances X that permits starting the day without settling to proper work first, then you will manage to convince yourself that X obtains quite often: namely, whenever you are a bit tired or stressed; when the problem you are working on is getting difficult; or when your belief in your current project is a bit insecure. But not working on it today won’t solve any of these problems; indeed, it will make them worse. One day without useful work rapidly becomes two or three, and then a whole week; then before you know it, your working practice has descended into undifferentiated low-value grazing on work-related activities, without really getting anywhere. That’s why the difference between amateur writers and professionals is that amateur writers write when they feel inspired, whilst professionals write every day. See your designated work time out each day, even if it means staring at the wall for an hour.

You may well say, it’s all very well for you to advocate this idyllic lifestyle, since you work at a nice university that gives you low teaching and administration burdens. True enough, but I make two observations. The first is that I know plenty of people who have lower teaching and administration burdens even than me, and still don’t get much done. The second is that your deep work doesn’t need to, and probably can’t, take many hours out of each day. Even one good hour per day would cumulate quite rapidly over the course of weeks and months. And surely you can carve out one hour per day? In fact, I find few historical examples where real work goes on for more than a few hours per day, even absent any other demands. Murakami writes in the morning and spends his afternoons training for his marathons, and the great G. H. Hardy would work for 3-5 hours on his mathematics, then take himself off to Fenners to watch the cricket. For those of you who enjoy palaeo-bullshit, 3-5 hours a day was Marshall Sahlins’ suggestion for how much time humans spent working (i.e. foraging) in hunter-gatherer societies. The remainder was available for rest, social life, self-maintenance, and just being. This pattern, Sahlins teaches us, constitutes a form of affluence; not the affluence of the consumer society, but the affluence of doing a bit of the stuff that matters most deeply to you, and having simple wants beyond that.3 Once you have done your 3-5 hours, there is time for really talking unhurriedly to the people you work with; going to talks; having walks; understanding how to do something you don’t currently understand; or whatever.

Daily deep work keeps the black dog away, for there is nothing worse for mood than the sense that one is not progressing. And it can spiral in a bad way: the more you feel you are not progressing, the worse you feel; the worse you feel, the more your hours become non-deep junk; and the more exhausted you are by non-deep junk hours, the less you progress. As they say up here, many a mickle makes a muckle. This translates roughly as: large things are composed of many small things. This is not a commitment of Northern folk to a particular kind of reductionism. Rather, in this context, it means that the biggest gains to your overall productivity stem not from any macro-level great leap forward, but from small changes to your daily practice. If you make each and every day a bit more productive, then the months and years kind of take care of themselves. And if, as one colleague complained to me a while ago, you are putting in eighty-hour weeks and still not getting your important goals achieved, then the answer is not to put in more hours: it is to put in fewer.

Commitment to deep work entails choices. It means not travelling to more than one or two conferences or workshops a year; not taking on peripheral involvement in collaborations extraneous to your main purpose; not filling up your diary with ‘it might be useful…’ training courses or committees; and not applying for money you don’t really want or need. Saying no is hard; we worry about missing out on something, and about the social or reputational consequences of a refusal. In my experience, these are siren voices. It is better to be a polite ‘no’ than a ‘yes’ who turns out to be over-busy, late, and frustrated; better no application than one rushed together at the last minute. A phrase I find useful is ‘whilst I would love to, I do not have the capacity’. I enjoy its ambiguity, and it never leads to anything other than a sympathetic and understanding return message.

Lesson 2. Cultivate modest expectations

A friend of mine talked to me years ago about how to start a theatre company. Hire a small cheap room and invite three people along, she said, then spend your time making work that means something to you. If two of the three people come along, that’s fine. Don’t sweat too much about getting West End producers to attend. As a foolish young man, I thought this was rather negative. What is wrong with these arts people? What’s the point of making work if hardly anyone sees it? But now I am older I understand her wisdom. If you can manage not to care who comes, you can make the work with freedom and right mind. If you are worrying about whether the West End producers will show up (they probably won’t), then you can’t. And if your expectations are modest, you can not only meet, but sometimes exceed, them.

I was at a discussion meeting recently where a number of us from similar fields had been assembled. The organizer said at the outset that he thought together we could get a paper into the journal Science (arguably the world’s most prestigious journal) from the discussions of the meeting. I asked what the paper would be about, and he replied that he didn’t know yet but he hoped it would become clear over the course of the two days. This, it seems to me, is the very antithesis of my theatre friend’s wisdom; it is focussing on the accolade and not what the accolade is an accolade for. The consequence is not just that we don’t have a paper in Science. We don’t have a paper at all.

I have never hit the heights of papers in Nature and Science. For a long time, this and other failures rankled. I know that it cost me at least one job, and there is the slightly uncomfortable feeling of being thought of as never having quite made the grade. There are two possible philosophies here: keep on trying, because winners never give up; or find an inner sense of value in your work, rather than relying on glittering prizes. I respect and can see the logic of both philosophies, and people are very different in terms of what keeps them going. But I would be a card-carrying hippy if hippies weren’t so against the carrying of cards, so I think you can probably guess which one temperamentally attracts me.

Why? Well, the glittering prizes we academics strive for are positional goods kept deliberately scarce by bureaucratic or commercial interests, and allocated in ways whose relationship to long-term value is probably quite weak. For example, Nature is a for-profit enterprise that rejects nearly everything in order to defend its exclusive market position. If we all send everything there, the rejection rate goes up. If we all increase the quality of our science, it still nearly all gets rejected, by the very design of the institution. The idea that all good papers can be in Nature or Science is as ludicrous as the idea that all Olympic athletes can get gold medals, but without the strong link between actual ability and finishing position that obtains in the Olympics.

If you are in the habit of comparing yourself with peers or rivals against simple external yardsticks, your modal experience will be a feeling of failure (believe me, I’ve been there). For a start, the external comparators available to us all have right-skewed or ‘winner take all’ distributions. In such distributions, the median sits well below the mean. So most people look worse than average, and all but one person can find someone who is doing a lot better than them. Besides, there is an unhelpful asymmetry of information. You get new information on your own progress every day (have I published any new papers since yesterday? No.). For your peers and rivals, you check their website maybe once a year (six new papers since I last looked!). So of course it looks like they are progressing better than you. But they may not be: you are just sampling less frequently for them than you are for yourself. The solution? How about trying not to think about it?
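
To make the statistics of this paragraph concrete, here is a minimal, purely illustrative sketch in Python. The lognormal ‘career output’ scores and the two-papers-a-year rate are invented numbers, not real data; the point is simply that in a right-skewed distribution most people fall below the mean, and that sampling a rival once a year makes their progress look lumpier and more impressive than your own daily self-checks do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical right-skewed 'career output' scores (lognormal: winner-take-all-ish).
scores = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

print(f"mean = {scores.mean():.2f}, median = {np.median(scores):.2f}")
print(f"fraction of people below the mean: {(scores < scores.mean()).mean():.0%}")

# Asymmetric sampling: the same steady output (about two papers over ~200 working
# days) checked daily for yourself, but only once a year for a rival.
rate_per_day = 2 / 200
my_daily_checks = rng.poisson(rate_per_day, size=200)   # mostly zeros
rival_annual_check = rng.poisson(rate_per_day * 200)    # one lump, around 2

print(f"working days on which I saw a new paper of my own: {int((my_daily_checks > 0).sum())} of 200")
print(f"rival's new papers at my once-a-year check: {rival_annual_check}")
```

Run it and roughly two thirds of the simulated people sit below the mean, while the rival's annual lump of papers looks far healthier than the long run of zero-paper days you observe in yourself.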

There is a subtle issue here of the allocation of mental energy. My theatre friend’s modesty of aspiration came from a deep understanding that mental energy allocated to chasing the external trappings of success is not being allocated to the authenticity of the work. By trying to make a West End hit, you make something which is, at best, derivative of previous West End hits. Its capacity to be truly transformational is probably limited. Great art often begins on the fringe. Similarly, valuable future paradigms and innovative ideas start life in obscure places. Journal editors cannot yet see their potential, and the authors themselves are tentatively feeling their way into something new. So by focussing on capturing the established indicators of prestige, you distort the process away from answering the question that interests you in an authentic way, and into a kind of grubby strategizing. Or so I tell myself, admittedly through clenched teeth at times.

In truth, the best things in my career have been fringe efforts, done in ludic spirit with no funding, and published in journals that base publication decisions on ethical and analytical soundness, not some editor’s hunch about whether something is a West End hit or not. This has given me licence to do what I thought was interesting, even though the big journals or regular funders would never have looked at such work. Whether they stand the test of time, we won’t know until well after I am gone; but that would have been true too had they been published in Science or Nature. The outputs of which I am most proud are judged pretty much worthless in terms of the metrics, like journal impact factor, that universities obsess about. This hurts. However, I am consoled by the fact that there is a small band of people, spread across the world, who really get what I was trying to do, and think it is interesting. I know because they write to me; because they actually download and cite those papers a fair amount; and because as I get older I start to see tiny signs of my influence in their work. That’s the best I can hope for, and all I try to need.

Of course, your Dean might have a few words to say about your Stoic disavowal of impact factors, massive grant income, and whatnot. This is a difficult problem. It is important to understand that your Dean is not a bad person, or anti-intellectual; they are just relaying the pressures that are hitting them, on down the line. And they are right that our positions should not be sinecures: students and tax-payers pay our salaries, and are entitled to audit what they are getting. You should not go out of your way to avoid prestigious publications or grant income when the opportunity arises. It’s just that somehow those things must not be allowed to dictate your direction or self-worth; and you have to find a way of keeping going whether or not they come. I suppose I have had a happy knack of paying enough unto Caesar to keep the Roman army off my back, but not so much that I lost my independence of spirit. A life I admire is that of Spinoza, who preferred to work a fraction of his time as a lens grinder rather than accept patronage or a university chair. This meant he could stay in the game on the strength of his lenses (which by all accounts were very fine), and pursue his philosophy with complete freedom and honesty. I am no lens grinder, but I try to pay my way in the world through a very steady flow of openly shared, thoughtful, workman-like science, even if most of it is not deemed stellar; trying to be a public communicator; being a good-enough teacher; and contributing my fair share to the common weal of university life.

Lesson 3. Publish steadily

I’ve been able to stay in the game despite not hitting the big metrics because I have always managed to publish one or two workman-like empirical papers every year, pretty much without exception. I have often done other stuff too: popular books, a textbook, reviews, more speculative ideas pieces and so on. But I do not depend on these other things. Every year, whether or not these other things happen, there is a peer-reviewed primary paper or two, not just with my name on, but actually written by me, with empirical data or original computation reported in it. This has been important both for avoiding the ‘exit academia’ arrows, and for keeping depression away.

I think the mistake a lot of people make is focussing too much on getting the big shot, the single career-establishing paper in a top journal, and therefore not quietly building up a solid, progressive portfolio of sound work. Think of staying in the game as trying to keep your head above water. You can achieve this by giving a couple of small kicks with your feet per minute. Each of these only imparts a limited amount of energy, so if you follow this strategy, you can’t afford to miss a minute. If you do, you will start to sink, and your small kicks may not be enough to regain the surface. On the other hand, as long as you keep your small kicks regular enough, they will keep you smiling indefinitely. An alternative strategy is to come up with a super-duper big kick that will send you free of the surface for many hours. Good for you if you manage this, but on average, your attempts will fail (all mine have). So you spend a minute trying to devise your schmancy big kick, and during that minute, you haven’t produced any normal small kicks. That means you are a bit lower in the water, and so that big kick is going to have to be even bigger (hence, even less likely to succeed) when it comes. So you get even more focussed on making your big kick really big; this is hard, and absorbs all your attention, and another minute goes by in which your position in the water has declined very slightly. The danger is, of course, that in the end you reach the point where the kick you need to save yourself would be infinitely large.4
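
Footnote 4 hints at the risk-sensitive foraging logic behind this metaphor. Purely as an illustrative toy (the kick sizes, sink rate and probabilities below are invented for the sketch, not taken from Stephens’ model), here is a small Python simulation contrasting the two strategies: a reliable small kick every minute versus minutes spent on a big kick that usually fails, with the same expected lift per minute.

```python
import random

random.seed(1)

SINK_PER_MINUTE = 1.0   # buoyancy lost every minute
START = 5.0             # starting buoyancy; you drown if it reaches zero
MINUTES = 500

def survives(kick) -> bool:
    """Simulate one swimmer; kick() returns the lift gained in a given minute."""
    buoyancy = START
    for _ in range(MINUTES):
        buoyancy += kick() - SINK_PER_MINUTE
        if buoyancy <= 0:
            return False
    return True

# Strategy A: a steady small kick every minute, just enough on average.
small_kick = lambda: 1.05

# Strategy B: each minute, attempt a big kick that rarely comes off.
# Same expected lift per minute (0.05 * 21 = 1.05), but far higher variance.
big_kick = lambda: 21.0 if random.random() < 0.05 else 0.0

trials = 2000
print("steady small kicks, survival rate:",
      sum(survives(small_kick) for _ in range(trials)) / trials)
print("occasional big kicks, survival rate:",
      sum(survives(big_kick) for _ in range(trials)) / trials)
```

The steady strategy survives every run; the big-kick strategy, despite the same average return, often sinks before its first success arrives, which is the point of the paragraph above.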

I think my great strength is that I have always continued to produce something moderately useful, even when things weren’t going well, and even when the big, bold, transformative ideas I so hanker for have eluded me. This has kept my head comfortably above water—indeed, left time and energy to strive after other things too—whilst I watched cleverer people than me gradually disappear into the ‘in prep’ section of their CVs, never to return. Relatedly, although I have made many false forays on my journey, I have got something out of every foray I have made, be it a methods paper, a model, a review, or a minor empirical study. This is a good knack. So if you are worrying about staying in the game, rather than planning your next Science publication, ask yourself where your one or two solid papers each year are going to come from. Just as you should not go a single day without proper work, you should not go a single year without publishing anything, as one year rapidly becomes three.

Lesson 4: Get your hands dirty

Some people go into research because they enjoy the technical stuff: building and manipulating equipment, designing and carrying out experiments, being in the field, and so on. The problem these people have is that they under-invest in writing up everything they have done. They end up with mounds of unpublished data and cool techniques that have not really led to concrete outputs, and hence have not contributed to the field in the way they should. Other people conceive of the job of researcher as closer to the job of novelist. I don’t mean they make everything up. I mean that what excites them is the writing, the putting it all together into a text that manages to capture their varied manifold of ideas and observations satisfactorily. I am the writer type: what has always attracted me is authoring wide-ranging books, articles, syntheses. Laying out the big ideas. I am never happier than when I have a free morning with a laptop and a pot of tea.

Whereas the technical type only writes up as a last resort, to get the next round of funding or whatever, the problem with writer types like me is that we spend too much of our energy on the writing up. We view the gathering of data as no more than background research, the assembly of exemplary material for the writing we are doing. As a consequence we pick fruit that hangs too low: we end up using poor, easy methods like online surveys, doing hasty secondary analyses of existing data, or just giving up the pretence and writing purely verbal papers that make various assertions in a sometimes appealing but often rather approximate manner. The only regret I have about my chequered career is that I have spent a bit too much energy on writing up—reviews, discursive papers—and not quite enough challenging myself by getting my hands dirty with primary research.

It’s not that I don’t see the value of the verbal argument, the synthetic text. Quite the contrary. It’s that my verbal big ideas pieces, in the final analysis, have mostly not been quite good enough to be satisfying. Getting my hands dirty with difficult primary research helps me do them better. This is because science is a specific and concrete endeavour, and hence doing the specific and the concrete is a way of disciplining one’s grasp of it. If you work on animal behaviour, then hours spent observing your animals are never wasted. If you do social research, then hours in your field site are what keep you sharp. And analysing your own data, as well as warding off dementia, brings the possibility of seeing new patterns and hence growing as a theorist as well as a data analyst. The mind fed on its own devices can become flabby and tendentious: only through a practice of repetitive confrontation with the primary phenomena we are allegedly talking about is it honed, and its confirmation biases challenged. Your animals or your people have a way of doing something you didn’t expect: this is the source of a new idea or interest. Your data will be messy, and will do a better job of refining your ideas than the peer-reviewers or conference debates can ever do. And I suppose, beyond all this, doing the primary activities of your research area is simply a way of keeping busy, keeping away from too many low-quality hours spent in front of a computer. It is a way of executing Robert Burton’s famous anti-depressant maxim: Be not idle!5

Keeping your hands dirty also means learning how to do new things. And this is a good thing: the skills I picked up in graduate school could not possibly have sustained me this long. Learning new skills has always paid dividends of one kind or another; and stepping back from doing primary research myself has always been the point at which things have started to go less well.

§

I learned lessons 1-4 through making the best of an often-bad job. These are not necessarily good ways of being a researcher, I thought, so much as good ways of managing to remain a researcher despite being as neurotic, hyperactive and easily-discouraged an individual as I am. But, reading back over them, perhaps I am doing myself a disservice. Perhaps lessons 1-4 (or lesions 1-4, as my word processor seems to want to call them) are more generally useful.

Take lesson 1, for example. Given the enormous increases in the efficiency with which we can gather and analyse scientific information in the last couple of decades, the productivity of the academy ought to have increased many-fold. I am not sure it has; even if the volume of output per researcher has increased, I doubt the depth has. This phenomenon is much discussed in the business literature under the name of the internet or Solow paradox: ‘You can see the computer age everywhere except in the productivity statistics’.6 For most academics, what the internet age has brought is mostly an increase in the available ways of treading water in low-quality work-related activities, without getting round to much real work. Apparently the average business email is read 6 seconds after being delivered. Given reasonable assumptions about the duration of undivided attention that useful thought requires, this means that on the modal day in a modern office (and probably university too), the amount of quality work done is…erm, none at all.7 So if we all started our days with a few hours in which the internet was shut off and curfew was enforced, I think our outputs would increase dramatically in both quantity and quality. Interestingly, over the years I have read literally dozens of institutional plans for improvement in output. They are full of meaningless statements like: ‘We will focus on our core themes whilst also responding to strategic opportunities’ or ‘We will expand our teaching offer whilst increasing our research capacity’. In other words, there is no strategy at all. Never once have I read one that said: we won’t hold any meetings in the morning, so that staff can actually get some work done. I wonder why, since it is what might actually make some difference, and it would probably make us all nicer people to work with.

Now consider lessons 2 and 3. Straining after high impact factors and flashy publications has a serious distorting effect on scientific knowledge, and reduces the efficiency of science. It means people over-invest in under-powered, cute exploratory studies, and under-invest in well-powered confirmations. It leads to serious publication bias away from the null hypothesis, and consequent falsehood of much of what we find in the textbooks. It motivates researchers to oversell their story, and exercise degrees of freedom in what they report; and peer reviewers to focus on grandstanding and subjective value judgments, rather than providing technical verification and assistance to the authors in better understanding their data. It encourages secretiveness and competitiveness, rather than what science should be about, which is open sharing and collaboration. A bit more indifference to the glittering prizes, and more of a focus on creativity, integrity and openness, would be to our common good.8

And then we come to lesson 4. I recognize in myself that I have been overly tempted to perorate, at the expense of detailed empirical or computational work. Looking around, I can see a number of eminent people in my field who have made precisely the same mistake (they tend to be men, interestingly). They accept the quick and dirty from their lab or field site, or just give up on having a lab or field site at all, and carve a niche of sitting in their studies putting the discipline (or several disciplines) to rights in a stream of long-form verbal salads. Of course I see what motivates them to do this—here I am trying to do the same, after all. Look at Darwin, they say: it’s the big ideas pieces that change the world. Look at Darwin, I respond. Literally thousands of hours of experiments on barnacles, worm-casts, and the germination of seeds immersed in salt water for every big ideas piece he wrote. It’s the careful artisanal practice that makes the big ideas pieces really good when they come. The idea that you might only write big ideas pieces seems like an athlete choosing only to run track finals, and never training runs.

When I look at eminent colleagues who rose above the level of getting their hands dirty and became full-time commentators in the field, it seems to me that their ideas contributions started to get less valuable around the time their direct involvement in primary research reduced. Where formerly their thought was taut and rigorous, it became vaguer, flabbier, more programmatic, and more self-referential. They cherry-picked their examples. Their ability to see both sides of the problem decayed. Empirical research, I like to think, is like an adversarial collaboration with reality. The mind is like the immune system; to function properly it needs to be constantly challenged by data. So if like me you are prone to covet big ideas and the freedom to spend all day pontificating, it would probably not be bad to force yourself to spend, say, two thirds of your effort gathering and analysing primary data. It is not that you shouldn’t write your big ideas pieces. You should. It is that these will be improved by grappling between times with the real concrete problems of the working researcher. I like to think that Spinoza’s lens-grinding did more than buy him the freedom to pursue his philosophy. I like to think it made his philosophy better.

On which note, I have spent too much time on this essay. I’ve got a data set that needs analysing.


1 For example, Murakami, H. (2008). What I Talk About When I Talk About Running (New York: Knopf); Hardy, G. H. (1940). A Mathematician’s Apology (Cambridge: Cambridge University Press).

2 For example, Newport, C. (2016). Deep Work: Rules for Focused Success In A Distracted World (New York: Grand Central).

3 Sahlins, M. (1968). Notes on the original affluent society. In Man the Hunter (R. B. Lee and I. DeVore, eds., New York: Aldine Publishing Company, pp. 85–89); Sahlins, M. (2009). Hunter-gatherers: Insights from a golden affluent age. Pacific Ecologist Winter 2009: 3–8, downloadable from: https://pacificecologist.org/archive/18/pe18-hunter-gatherers.pdf

4 Devotees of foraging theory may recognise the spirit of David Stephens’ classic risk-sensitive foraging model here: Stephens, D. W. (1981). The logic of risk-sensitive foraging preferences. Animal Behaviour 29: 628–9, https://doi.org/10.1016/s0003-3472(81)80128-5. And there are also echoes of Dean Keith Simonton’s work showing that more successful creative people—whether in academia or the arts—are distinguished from less successful people mainly by producing more stuff overall, even though much of it is minor. See: Simonton, D. K. (1997). Genius and Creativity: Selected Papers (Greenwich, CT: Ablex Publishing).

5 Burton, R. (original publication 1621), The Anatomy of Melancholy, What It Is: With All the Kinds, Causes, Symptomes, Prognostickes, and Several Cures of It.

6 Solow, R. (1987). We’d better watch out. New York Times Book Review July 12, p. 36.

7 See Alter, A. (2017). Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked (London: Penguin).

8 For the critiques of current scientific practices on which this paragraph draws, see: Young, N. S., J. P. A. Ioannidis and O. Al-Ubaydli (2008). Why current publication practices may distort science. PLoS Medicine 5: 1418–22, https://doi.org/10.1371/journal.pmed.0050201; and Higginson, A. D. and M. R. Munafò (2016). Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLoS Biology 14: e2000995, https://doi.org/10.1371/journal.pbio.2000995