I’m moving to Los Angeles in a couple of months!  Stay tuned for more.


Science isn’t a pizza, so stop slicing it up.

Praj, author of the blog Do I Need Evolution, drives me nuts. Don’t get me wrong, he seems like a nice guy and a well-meaning one at that. Yet as one of the new wave of commentators on the science vs religion battles, he appears to hold a view that basic science is just an indulgence that we should be quiet about in favour of the real science that puts satellites in the air and cures diseases:

You see two images when scientists speak about “science” (something I think we should avoid, but that’s another story). One is science as a useful tool: it helps us cure diseases, win wars, grow the economy, feed the planet, and so on. The other is science as a world-view: it imparts a sense of wonder, conquers fear, and reveals beauty. These images are a spectrum rather than distinct categories. Most scientists have some of both, though applied researchers are usually closer to the science as a tool view and basic researchers tend to be on the other end.

The problem is that the overwhelming majority of non-scientists, and especially the religious, don’t care very much about science as a world-view. They live on the very far end of the spectrum where science has almost zero intrinsic value. To those people science only matters because it helps them do stuff they care about.

Science lobbies appreciate this fact, which is why they focus on the concrete, tangible benefits of research. They know it would be ridiculous to ask for billions of dollars because some people think particle physics is beautiful. Policy experts also appreciate this fact. The standard “explain your thesis to your grandmother” interview question for my DC fellowship is judged on how well you make your research relevant. I suspect many academic scientists don’t appreciate this fact. Or if they do, they don’t weigh it as much as they should. Academics are especially prone to hyperbole about the wonders of science.

Praj would like us to think of science as more or less practical; some science will give us economic benefits, and can be explained in that light, so that’s good. Other science is ‘world-view’ science that only exists to satisfy the whims of a subset of curious people. But I’m going to disagree here and say that Praj doesn’t understand science very well. Despite lip service to a ‘spectrum’, he would like to slice science up into boxes that can be addressed independently. That way, we can focus on the boxes that are practical, and ignore the ones that aren’t1. But we can’t do that with science. Science is a process and a body of knowledge that is interconnected and historically contingent.

As an example of interconnection, we can look to Darwin himself. Putting aside for the moment the historical antecedents to his work on evolution (including Lamarck and his own grandfather), Darwin had to integrate ideas from all corners of biology with the work of the economist Thomas Malthus to arrive at his insight regarding natural selection. In order for this theory to be sensible, it required a much older earth, ideas that came in part from the volumes of the geologist Charles Lyell that he read while on the Beagle. One of the first serious scientific challenges to evolution came from the physicist Lord Kelvin, who calculated both the age of the earth and the age of the sun before concluding that both were too young for evolution to be valid. This interconnection has only grown stronger over time. We carve science up into separate fields like biology, physics, chemistry, and psychology because we have to have a way to award degrees, and topics can indeed be thought of as clustering together naturally. Most ecologists don’t study quantum field theory, because with our current understanding of science, it’s hard to see how to use it effectively in their work. But that doesn’t mean that we can take those lines in a course catalog as representative of some real and sharp division. What we call biology and chemistry are deeply interrelated, as anyone who’s spent time in a molecular biology lab will tell you. Neuroscientists spend a lot of time on the biology and chemistry (and by extension, biophysics) of the brain and nervous system. One of my favourite evolutionary biologists is John Maynard Smith, who trained as an aeronautical engineer, and we all know that physicists are math fetishists (I kid, I kid). Just look at the new interdisciplinary fields that are cropping up with increasing frequency: biophysics, neurochemistry, behavioural and neuroeconomics, agrophysics, systems biology, computational sociology. And I could do this all day, because science is a heavily connected graph of fields that reflects an underlying continuum in our study of nature. Apparent divisions between scientific fields usually say more about our lack of understanding than about any real separation.

This brings me to my second point, the historical contingency. Praj and people like him would like to focus on ‘practical’ science that we can make relevant for the public. Things that bring economic benefits now. ‘Applied’ science, new technology, and so on. But people who champion this division between basic and applied research are making a simple mistake of perspective, one that even serious historians of science are prone to making. What is considered applied science now relies directly upon research that used to be considered basic and impractical2. Applied science is simply science for which the next step on the path is to an economic benefit that is clear and predictable, but ‘practical’ or ‘applied’ science doesn’t get to walk away and ignore this chain of connections and history. Putting a satellite into orbit relies on centuries of work in physics and mathematics that was once considered deeply impractical. It is the result of thousands of individual steps, some practical, some not, that have given us the ability to put things into orbit. Medical doctors rely on work in anatomy, biochemistry, and biology that has often been considered very impractical (Galen himself wasn’t allowed to work on humans because work on cadavers wasn’t permitted; he inferred from animals because of their anatomical similarities). Astronomy and cosmology are the prototypical basic and useless sciences now, but if we ever become a truly space-faring people, then our descendants will be very happy that we wasted our time on them for the simple pursuit of knowledge. And who knows? Astronomy and cosmology could suddenly become very useful before that; we can’t easily predict what will be applicable, but the history of science tells us that we can expect to be surprised. I’m sure that the Einstein of 1945 would have had some words to say on the topic to the Einstein of 1905.

Praj would like us to believe these things about science, because he wants to believe that Bill Nye is wrong when he says that creationism threatens our ability to understand the world and innovate in science and technology:

I’ve often wondered how people like Bill Nye can maintain this apocalyptic vision. As Saletan notes in the very next paragraph, there are actual, real-life engineers and scientists who reject evolution.

Praj himself relates that his parents were successful doctors and his dad doesn’t understand evolution, so evolution must not be relevant to medicine. He makes similar claims several times on his blog, so it’s worth finishing off this post by addressing them. Yes, there are successful doctors and engineers and even other scientists (though few if any biologists) who don’t understand evolution. That is, of course, not the same as saying that they actively reject it and believe in young-earth creationism. There’s a difference here. I don’t understand much about particle physics, but though I consider that a failing, it’s one I can live with. There isn’t enough time in the day for me to learn everything I would like to. And yes, doctors who haven’t learned about evolution aren’t necessarily bad doctors. I can even forgive those who ‘reject’ it because they’ve never been exposed to it properly. But doctors who actively reject evolution when taught it, and believe that the earth is 6000 years old? This requires that they actively read and reject the evidence from not just biology but physics, chemistry, geology, and so on. It requires that their critical thinking skills are so deficient that they cannot understand and assimilate any part of such a large and coherent body of evidence upon which there is broad and solid scientific consensus. How can such a person be a good doctor or engineer? Would you like your satellite designer to be a flat-earther? Would you be worried if your bridge engineer was proficient but convinced that physics and materials science work as they do because of the action of ambitious fairies?

When my doctor asked what I did for a living and then launched into a tirade because he was a young-earth creationist, I asked him how he dealt with giving advice to his patients on vaccination or antibiotics. He replied that he didn’t believe in vaccination and that he didn’t think that antibiotic resistance was a problem, because viruses and bacteria don’t evolve. I changed doctors that day. This is a man who may actually kill patients with advice like this, and it stems directly from his religious beliefs. Are all creationist doctors and engineers bad at their jobs? No, but I submit an empirical hypothesis that doctors, engineers, and scientists who are actively creationist (especially YEC) and reject scientific understanding to protect their beliefs are more likely, on average, to be bad at their job and to have crucial deficiencies in their thinking that could prove harmful to their work.

Science isn’t something that can be cleanly chopped up into convenient portions and picked over for economic benefit or religious palatability. It is a method, the best method we have, for discerning the truth about the universe and everything in it. It is a deeply interconnected and historically contingent search for that truth. What we choose to do with that truth afterwards is up to us, but when we ignore those connections and history and our inability to predict the future, we do so to our own detriment.

  1. Like evolution.
  2. Note that I’m not making the same claim about technology, which, as McClellan and Dorn persuasively argue, has had a habit of coming before the science that explains it.

Nobel prize or GTFO.

After reading this great and thought-provoking post by scicurious on ‘failure’ in academia, I came across this comment (which was itself a reply to another comment):

Jenna, with respect, this is a description of how to do incremental science – but it’s NOT how discoveries are made. Not big discoveries, anyhow. Big discoveries DO require inherent creativity and vision .. and we are systematically weeding those qualities out of biomedical science, thanks to the hyper-conservatism exercised by NIH panels.

Please, go read Thomas Kuhn. I’m begging you.

Allow me to respectfully present the opposite view: please don’t read Kuhn. Or at least, don’t read him with the wide-eyed adoration that he’s afforded. Kuhn was dangerously anti-scientific and I place at his feet a great number of the current structural problems with science. His disdainful view of plodding ‘normal science’ and focus on revolutionary ideas is – IMO – a direct precursor to what I might call the ‘Nobel Prize or GTFO’ disease that permeates science now. You either get the cover of Nature or Science, or you’re a waste of time as a scientist.

And what’s wrong with incremental science? For one thing, it’s the way that a lot of science is done. As James Franklin1 points out, fields like ornithology or oceanography work on the incremental accumulation of knowledge and aren’t prone to paradigm shifts. Are they not science? Have we learned nothing from the explosion of scientific fraud in recent years as people are forced to come up with ever-sexier results or lose their funding? Or the recent push for replication: is that just plodding normal science? Kuhn makes us feel good – if no one’s listening to our ideas then it’s not because they’re bad, it’s just because of the establishment, man! – but his contribution to philosophy aside, he’s no basis for a useful approach to the work of science.

  1. In What Science Knows: And How It Knows It, a good book on the subject.
Hug an advisor today.

It’s inevitable.  Put two or more grad students1 in a room, and sooner or later the talk will turn to advisors;  wait a little while longer and chances are good that someone will start complaining about their advisor.  This shouldn’t be surprising.  Bosses everywhere are fair targets for griping, and the power imbalance inherent to the student-advisor relationship certainly doesn’t help matters2.  Given that student and advisor are stuck in a lab together for several years, it’s a wonder that the average science department doesn’t have hallways littered with bodies.

But there’s a flip side to this, which I think might be under-appreciated:  advisors can also be awesome.  No relationship like this is going to be perfect, but a good advisor will teach you, guide you, and give you a leg up the academic ladder.  They’ll help you when you trip and faceplant, they’ll give you the advice you need (even if you don’t want to hear it), they’ll introduce you to the right people, they’ll make you a better scientist.

I’ve gotten very lucky along the way3.  Michael advised me on the undergraduate independent study that first inspired me to take the idea of a research career seriously;  he also humoured me when I bit off a project that was far too much for me to chew and helped me sort through the resulting mess.  Pete, my M.Sc. advisor, got me into studying biological questions and gave me a place to work when my Ph.D. went sideways.  Luc-Alain, my Ph.D. advisor, turned me into an independent scientist and put up with far more than I had any right to ask for.  Finally, my postdoc advisor Mark has to be one of the nicest (and smartest) people that I’ve ever met and has been supportive in all the ways that have made being a postdoc enjoyable.  I’ve been privileged to work with a succession of people who’ve taught me and made me better at this science thing, and today I’d like to take a moment to be thankful for them.

Many people have poor relationships with their advisors, and there are a lot of valid questions to be asked about the grad school experience.  I don’t want to diminish the problems of anyone who has had a poor relationship with their advisor (I’ve known my fair share of bad advisors, even if I’ve been fortunate enough to avoid having them myself).  But if you have a good boss, now might be the time to take a moment and reflect on the things that they’ve done for you.  Give them a hug, a handshake, a friendly email, or a fresh data set.  And one day, when you’re in their position:  remember what it felt like to be the student.

  1. Honours students, postdocs, RAs, …
  2.  And yes, people have done research on this.  Are you surprised?
  3. Seriously lucky.  I didn’t do any of the things you’re supposed to do when picking an advisor.

Some tips for an academic job talk over Skype…

I recently had the experience of applying for a postdoctoral position at A Very Important University and made the shortlist to be interviewed.  Now, let’s face it, that’s a pretty terrifying thing to start with;  it wasn’t made better by the fact that I was doing it over Skype to a location most of the way around the world.  Visions of technical glitches, bad sound quality, and an overall horrible experience for both interviewers and interviewee haunted me.  So, I spent several days polishing my talk, thinking up ways to make the Skype process smoother, and even reached out to Twitter for advice.  And boy, did I get it!  There were some great suggestions out there, a few of which really saved my bacon.  Thus, to save you the trouble of figuring all of this out for yourself, I’m going to share what I learned with you.

On the use of the word ‘rape’ for non-human animals…

I wrote earlier today about a post clarifying a misinterpretation of a study which was itself misinterpreted.  A short recap:  the original post was meant to correct a furore that had arisen around a paper on dolphin social networks;  the journalists covering the story ended up playing a version of the Telephone game that turned the phrase ‘bisexual philopatry’ into ‘Flipper the bisexual rapist’.  In writing the post, I was simply trying to untangle the mess, but somehow people mis-read the correction so badly that they believed I was painting dolphins as cuddly underwater teddy bears.  In fact, I was just pointing out that the original study had nothing to do with the sexual behaviour of dolphins, not that dolphins don’t display coercive sexual behaviours.  When I found out this morning that I was being linked to – again – in the same garbled way, I decided that it was time to put a lid on it.

Academentia and the iPad: an update from 2010.

Photo by Daniel Bogan via Compfight.

Prompted by a conversation on Twitter yesterday, I revisited my old ‘how I use my iPad in academia’ post and since it’s been about 2.5 years, it seemed time for an update.  So, some quick notes on how I’m using it now:

  • Reading.  I don’t use iAnnotate any more, since Goodreader added annotations and nice Dropbox syncing.  I keep a Dropbox folder with my current ‘to read’ pile of articles and books, which is slurped up to my iPad by Goodreader.  I read articles on the iPad, annotating and highlighting as I go, and then I sync them back to Dropbox.  Back on my Mac, I file the papers away in BibDesk (I haven’t seen a use case for ‘social’ reference managers like Mendeley yet – at least for me – and I’m not sure if I’m going to any time soon).  Other ebooks, including an increasing number of textbooks, are read via the Kindle app and iBooks.  I prefer the iBooks software, to be honest, but right now I read each book wherever I can find it.  Truth be told, I’ve gotten so used to ebooks now that when I can’t find something in digital format it noticeably irritates me.
  • Calendaring:  I sync my Google calendar to my iPad and iPhone, and read my calendar on the iPad with Agenda.  I prefer Agenda for the clean interface, though it’s a minor preference for me.
  • Organisation:  I wrote back in 2010 that Things was moving too slowly for my taste and that I was going to search for alternatives, but I never found one I was comfortable with.  I tried a lot of them:  Today, Remember the Milk, Appigo’s Todo, Wunderlist, and more.  All of them had some sort of problem that turned me off, be it bad syncing or subscription plans for useful services (hell, no) or something else that bugged me enough to make me switch back to Things.  I honestly don’t think that Cultured Code really deserves as much of my money as they’ve gotten, but I keep coming back to them for some reason.  This is a highly individual thing, though, and your mileage is going to vary.  A lot.
  • Social media, of course:  I still prefer the stock Twitter app on the iPad over alternatives so far, though if I do switch it will probably be to Twitterific.  I’ve written blog posts using Blogsy and I use the WordPress app to administer the blog.  I’ve made a few Skype calls with the iPad, which turned out all right (though I prefer wired connections for video calling), and the iPad is really the only way that I check Facebook any more.
  • News:  now that Google Reader is going the way of the dodo, I’ve switched to Feedly and I couldn’t be happier.  For saving stories to read later, I rely on Pocket.
  • Navigation: I’ve found that Apple Maps has gotten much better recently, so I’m no longer unhappy that Google Maps isn’t on the iPad (still not sure why that is, though).  I use maps more on my phone anyways.
  • Note-taking:  This is a category with a lot of change since 2010.  Nowadays I’ve switched largely to Notability for note-taking.  I find its handwriting setup easy to use when I’m jotting down notes in a meeting or a seminar, and it’s intuitive for scribbling on manuscripts and sending them back to colleagues.  I use a stylus for these tasks;  I’ve enjoyed the Pogo Connect, but my wife enjoyed it so much for drawing that she actually stole it from me.  So while I wait for the Adonit Jot Touch to ship (grrr, delayed), I’m using a $10 Dausen stylus that actually works quite well.  I’ve also used Noteshelf as a notetaker for its nice writing tools and early integration with bluetooth styli like the Pogo Connect;  when the Touch comes, I’m not sure exactly what I’ll end up using full time.  And when I’m looking to do more free-form scribbling, or I’m noodling with equations or just sketching something, I like Paper;  it’s simple but pretty and powerful enough to get the job done.  I’ve also become more and more reliant on Corkulous to make notes in.  Unfortunately, despite protestations to the contrary, Appigo shows no sign of giving a crap about further development of Corkulous, and I’m reaching the limits of what the app will handle in terms of notes.  Also unfortunately, there doesn’t seem to be a good replacement out there, so I’m considering making one myself.
  • Information collecting:  some people would put apps like Evernote into the notetaking category above, and the Evernote + Penultimate setup works quite well for some people;  I haven’t looked at it in a while, but I may revisit it.  Until I do, though, I’m using Springpad as a dumping ground for random bits of info that I need (travel plans, receipts from conferences, paper work I may need to reference, books I want to buy, etc).
  • Mathematics and programming:  when I feel like playing around with a bit of math or I need to plot a quick graph, I use apps like SpaceTime1, PocketCAS, and Quick Graph.  Programming on the iPad is still a bit of a non-starter, though that’s starting to change.  I’ve had fun playing with Codea, which embeds a Lua interpreter, and if you feel like learning Haskell, there’s iHaskell (you must have an internet connection, though).  I recently used Codea to whip up a quick simulation of genetic drift (Fisher-Wright model), and it worked great;  there’s a sketch of the model just after this list.  I’ve seen a few Python apps and the like, but I haven’t had any experience with them;  if you have, please leave a comment!
  • Drawing / diagramming / presentation:  Another category with big changes to it.  When I last wrote about academic iPad usage, there wasn’t much to speak of here.  In the intervening time, though, this space has exploded.  Now, I use apps like Procreate (or alternatives like Sketchbook Pro) to sketch and draw with the Pogo Connect,  iDraw to create vector diagrams for talks and posters, and Omnigraphsketcher to work up quick hypothetical graphs.  Most of this gets fed into desktop apps like Keynote or Pages (or other design programs);  the iPad versions of these apps are good as well, and I use Keynote regularly to present with, but I’m still hamstrung by the lack of font support in Keynote for iPad.  Another long-awaited and massively useful arrival is the LaTeX snippet tool;  on the desktop, I use LaTeXiT pretty regularly, and now apps like Mathbot are serving the same purpose for me on the iPad:  I can write a quick line of LaTeX and copy the typeset equation into another app like Corkulous.
  • Writing: big changes here, too, driven by changes in my desktop workflow.  With my recent shift to using Markdown as a major format for writing, I’m now free to use some of the great cloud-syncing editors for the iPad to start things off.  So, a lot of my papers, blog posts, etc. now start their lives in Byword, which is incidentally the first app to really turn me on to iCloud syncing.  When I have to interact with Microsoft formats – yuck – I still use Quickoffice.  LaTeX on the iPad has come a ways, with apps like Texpad, but I still find them too clunky for common use.  I’ve also gotten into collaborative writing of LaTeX through web apps like Spandex (and a new one that I’ve been meaning to try, Authorea), so I’m not really fussed about dedicated apps for LaTeX any more.
  • Misc: A few other apps I can’t live without include Dropbox, OPlayer HD for entertainment on the go, Calcbot for quick arithmetic, Convertbot to … well, convert stuff, Photogene / PS Express for quick photo edits (especially to screenshots I take to paste into other apps), and probably a dozen others that I use regularly but can’t remember right at this moment.
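
As promised above, here’s a closer look at the drift simulation.  My version was a quick Lua script in Codea, but the model itself is tiny, so here’s a minimal sketch of the same Fisher-Wright model in R instead (the function name and the defaults are illustrative choices of mine, not the actual Codea code):

    # Fisher-Wright drift at a single biallelic locus in a population of
    # N diploid individuals (2N gene copies).  Each generation, the 2N
    # copies are resampled binomially from the current allele frequency;
    # drift is nothing more than that sampling noise.
    fisher_wright <- function(N = 100, p0 = 0.5, generations = 200) {
      p <- numeric(generations)
      p[1] <- p0
      for (t in 2:generations) {
        p[t] <- rbinom(1, size = 2 * N, prob = p[t - 1]) / (2 * N)
      }
      p
    }

    # Ten replicate populations, plotted as allele-frequency trajectories.
    trajectories <- replicate(10, fisher_wright())
    matplot(trajectories, type = "l", lty = 1,
            xlab = "generation", ylab = "allele frequency")

Run it a few times with different values of N and you can watch drift push small populations quickly to fixation or loss while barely nudging large ones.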

Going back to my old post, it’s clear that my usage of the iPad has changed significantly since I last wrote about it.  Some of the frontline, day-to-day apps that I use have changed or consolidated (e.g. I use only Goodreader now instead of GR+iAnnotate), and entire new uses for the device have popped up, like drawing and writing in Markdown.  Increasingly, the iPad has become an indispensable part of my daily workflow, and though I could live without it, I certainly don’t want to!

What are your favourite apps and workflows for mobile devices (iOS or otherwise)?  If you have any thoughts, please leave a comment or let me know on Twitter.

  1. which is apparently now called MathStudio?

On the good and evil of scientific stories.

tl;dr: telling a good story is a vital tool in science communication, but it’s easy to go too far in pursuit of a simple narrative.

If you’ve read this blog, attended a talk that I’ve given, or sat in on one of our lab meetings, you would know that one of my pet issues in science is communication. Scicomm, as it often goes by now, means more than explaining science to the public, though that is of course a large part of it. It’s also about how we communicate our science to other scientists, either in our field or outside of it. Journal publications, conference talks, seminars, monographs, all of these things – and more – fall under science communication to me. And if you had found yourself as a fly on the wall when I was editing one of the Ph.D. students’ papers or critiquing conference slides, you would almost certainly hear me talk about story.

More precisely, you’d probably hear me say something like “what’s the story?” when I got through a rough draft of a manuscript, or after I watched a practice talk for an upcoming conference. When I say “story”, what I mean is the narrative and plot that tie together the work that you’ve done into a cohesive whole that the audience can follow and empathise with. In the first chapter of his book Storycraft, Jack Hart cites this definition of story from Jon Franklin:

A story consists of a sequence of actions that occur when a sympathetic character encounters a complicating situation that he confronts and solves.

Story, as Hart says, consists of a recounting of a chronology of events (narrative) and the selection and arrangement of material so that a larger meaning can emerge (plot). As he puts it:

For Eudora Welty “Plot is the ‘Why?’” Or, as the novelist E. M. Forster famously put it, the narrative is that “the king died and then the queen died.” The plot is that “the king died and the queen died of grief.”

I raise these issues because this is a problem that I’ve thought about at length when it comes to scientific communication. You might object that communicating science isn’t about a story, a narrative, or a plot, but I would strongly disagree. When you give a talk at a conference, you do exactly as Hart recounts: you construct a narrative and select material to form a plot (‘we identified some limit to our knowledge, we formulated some hypotheses, we did a test, we got some results, OMG science’), even if this looks nothing like what actually happened. You might be more familiar with this process in its rage-comic form. Don’t fool yourself, this is story crafting. In its simplest form, the scientist is the protagonist, the complicating situation is the unknown s/he is trying to banish as described in the introduction / methods, and the climax is wrapped up neatly in the results before the gentle falling action and dénouement in the discussion.

Story in formal scientific writing is often limited to the imposition of this narrative and plot structure, though stating it this way belies its importance; if you’ve ever reached the end of a journal paper and thought ‘what the hell was that paper about?’ (and we all have), chances are reasonably good that you’ve just experienced a failure of story. But when science is communicated to a wider audience, story begins to feature even more strongly. Whether the writing comes from scientists, science communicators, or journalists, it is easiest to see this when the masters of the craft are at work. David Quammen, in his book Spillover, structures his description of the hunt for Ebola and its reservoir around the story of the medical researchers who have tracked it through the jungles of Africa, winding in and out of their struggle to identify the source of the disease and the effects that it has on the people of Africa and elsewhere. It’s a detective story, which Quammen uses as a hook to lubricate the discussion of everything from molecular biology to mathematical epidemiology. But it’s the story that drives us through what would otherwise have been a textbook on epidemiology.

If I haven’t made it clear by now, I think that story’s important. Yet I also think that story has a dark side, one that we must be ever vigilant about as scientists, and it’s this: the push for a good story can obscure the truth. Science is messy, and full of complications and stumbles. There’s not always an answer, or a happy ending, and sometimes what we thought was right for a long time turns out to be incomplete, or even wrong. This fact is what makes writers like Quammen and science communicators like Carl Zimmer so valuable; they capture that messiness without letting it overwhelm the story, and in so doing make our science interesting to people. But if the push for a story goes too far, it can result in over-simplification and even outright, dangerous untruth.

I was reminded of this when I came across a post by one of my favourite writers on visual design, Garr Reynolds; Garr wrote the book Presentation Zen, and a series of other books like it, and I still recommend them to other scientists as a good way to get a handle on how to make your presentations suck less, visually. Recently, however, Garr wrote a post praising a video containing the work and narration of Paul Zak. The post, entitled “Neurochemistry, empathy & the power of story”, is itself curiously meta, as it discusses work by Zak on neurochemical responses to the ‘dramatic arc’; in short, Zak claims that oxytocin and cortisol are part of the neurochemical suite that responds directly to the structure of a story, and can even be used in a predictive fashion (here, to predict the amount of donations that will be given when viewing a tearjerker story of a father dealing with a young child dying of cancer versus the same father walking in the park with his son).

The irony of this, of course, is that Zak himself is an adept storyteller who has constructed a narrative around oxytocin as the ‘moral molecule’, reducing good and evil to the action of a single neurotransmitter. Here’s an excerpt from a Guardian article1 on Zak from last July:

What drives Zak’s hunger for human blood is his interest in the hormone oxytocin, about which he has become one of the world’s most prominent experts. Long known as a female reproductive hormone – it plays a central role in childbirth and breastfeeding – oxytocin emerges from Zak’s research as something much more all-embracing: the “moral molecule” behind all human virtue, trust, affection and love, “a social glue”, as he puts it, “that keeps society together”. The subtitle of his book, “the new science of what makes us good or evil”, gives a sense of the scale of his ambition, which involves nothing less than explaining whole swaths of philosophical and religious questions by reference to a single chemical in the bloodstream.

Here, we see the danger of story. In constructing a simple story with a compelling and digestible arc, Zak has swept the truth of this research under the rug, and the truth is that research on oxytocin is messy, contradictory, and provides few clear answers. As Ed Yong describes it, oxytocin can have distinctly contrasting effects depending on who receives it; some people may exhibit more social behaviour, while others in the same situation may exhibit more antisocial behaviour under the same dose of oxytocin. It can promote trust, or increase xenophobia. It may be that oxytocin is part of some motivator system: for example, people like James Goodson have worked to show that in birds like the zebra finch it2 is implicated in the ‘social behaviour network’ and may be instrumental in zebra finch flocking, though as in many other animals, this effect can be strongly sex-specific (usually to females).

All of this complication and mess is ignored in Zak’s story, which does a disservice to the reader who comes away with a simple view of the world that just doesn’t hold water. A friend of mine, a lawyer, asked me a while ago if what he’d heard about this ‘cuddle chemical’ was true, and was visibly disappointed to learn that it was much more complicated than that. The problem here is that we are disposed to like a good, simple story; it has more emotional impact, which in turn makes it easier to remember and explain to others. Certainly, nobody wants to spend as much time reading journal articles and learning about nonapeptide hormones like oxytocin as I did for my PhD exam in order to tell a story at a party. This is why we have people like Ed, and Carl Zimmer, and Maryn McKenna, and all of the other great science communicators, writers, and science / scientist bloggers: they do the hard work of curating the facts and telling the story without losing the truth. Contrast Zak’s writing with Ed’s takedown of the oxytocin mess. It’s just as good a story, but it treats the truth with respect, and the truth is that we’re just not there yet. We have tantalizing ideas and scraps of evidence on how oxytocin affects us, but we can’t draw definitive conclusions. As Ed discusses, the hype around oxytocin has even led to people using it in an attempt to treat autism, with unknown and possibly harmful effects.

This isn’t an isolated problem. The TED talks have become a serious problem in this regard, and though I’ve seen some great TED talks over the years, they’ve grown to the point where the push for good stories has overwhelmed the ability of science to provide them. I saw the most recent example on Boing Boing when Maggie Koerth-Baker pointed to a problem in the widely-circulating story spun by 19-year-old Boyan Slat on a plan to remove plastic from the oceans, namely, that it won’t work. Here again, we see the elements of story at work, this time surrounding Slat himself. A 19-year-old phenom who rises to glory on the back of an award-winning school research paper, a hands-on problem-solver producing solutions and starting a foundation to implement them. It’s a feel-good story with a likeable protagonist who is tackling a problem that scares us all; it’s a shame that the scheme probably won’t work, and may even do more harm than good if ever implemented. The issue at hand, though, is that the story told by and about Slat is compelling but oversimplified and potentially dangerous, just as the one told by Zak is3. As Maggie points out in her post:

Here’s a mantra to remember: TED Talks — interesting if true.

And the same is true with anything you read in the popular press about science. It’s interesting, if it’s true.

Now, I began this post by pointing out that I’m a big proponent of story in science, and I stand by that statement. Story is an important and, I would argue, necessary tool when we come to communicate the results of our work, for the same reasons that it can go badly wrong. A carefully crafted story draws the audience through the science, ties it together in a way that they can understand and remember, and adds punch to the work so that the audience cares enough to pay attention. Yet this process, while vital, needs to be kept in check by the demands of the search for the truth and the admission of messy detail and incomplete knowledge. The tension between story, which yearns to be complete, and science, where more research is always needed, must be respected and maintained lest you end up with bone-dry science or a compelling – but misleading – tale.

  1. or as Ed Yong puts it, ‘ad’
  2. under the name of mesotocin
  3. as an aside, I’d like to say that despite the problems inherent in Slat’s plan and how it ended up going viral, I hope that he keeps trying. He sounds like a smart guy, and failure is a great first step on the road to success.

I guess I’m just not a real man.

If you’re a man, and you really like [insert chosen thing here - I'll use Star Trek for this post], and you’re a fan and you talk to other people about it, and you spend time watching the TV shows whenever you can and you go to conventions and put effort into dressing up to have fun;  well, then, you’re a freak who should die alone.

Photo by Falashad, used under a CC license.

On the other hand, if you’re a man and you really like sports, and you’re a fan and you talk to other people about it, and you spend time watching matches on TV multiple times a week and you go to games wearing the team jersey and you get drunk and act like a jackass and maybe start some stuff on fire when you lose;  well, then, you’re a real man.

Photo by Matt Gibson (www.matt-gibson.org), used under a CC license.

Explaining the stupidity of this is left as an exercise to the reader1.

  1. Despite what it may look like, this isn’t about me.  I just heard about someone who’s a fan of Star Trek getting rejected by a woman at the ‘should I contact him?’ stage for solely that reason, and it struck me as stupidly unfair.

Memoir of an academic talk.

tl;dr … well, honestly, go read something else if you don’t like long form.  This is 3600 words of navel-gazing detail, and I’m not about to apologize for it.

A companion piece to my earlier post on the process of designing a poster, this post deals with the talk on the same material for a different conference (vastly different audiences, so I don’t mind overlapping).  As I said for the post on designing the poster, this is a snapshot, or series of snapshots, of my process for doing science and preparing talks.  It’s not the whole picture, and I’m deliberately  exposing the warts and bumps that go with doing science;  I don’t get to control the image you form of me as well as I otherwise might, but I feel that the resulting material is more honest and informative.

In any case, I hope you enjoy it.  Please feel free to leave comments or questions, and I’ll do my best to answer them.

 

Monday, July 23, 2012:  The ISBE 2012 conference is a couple of weeks away, so it’s time to start thinking about the talk.  The initial steps will be a little slow, but today I’ve created the presentation file as a symbolic step.  I haven’t yet conceived of the overall visual theme of the talk, so for now I’m adopting a simple black  on white approach. 

Thursday, July 26, 4:30 p.m. I’ve got about half an hour before I need to leave the lab to go meet my long-suffering wife for dinner.  Time to outline some content!  I’m working quickly, creating new slides and just typing the main ideas of the story I’m telling into them.

4:51 p.m. 20 minutes later, I’m done a really quick outline.

A couple of things to note.  First, considering that this is a 12-minute talk, you may be wondering if 22 slides is too much.  Yes, and no.  For most people, 22 slides is too many for this length of talk;  a good rule of thumb is – depending on the density of your slides – to allow at least a minute for any slide you’ll be saying more than ‘hello’ over.  This is a mistake that I see people make time and time again:  they make hugely dense slides with dozens of graphs, and then leave themselves about 15 seconds per slide.  This won’t work.  They either end up blasting through slide after slide of results, or they go way over time1.  Aim for simplicity, and remember that simplicity is hard.  Simplicity doesn’t mean dumbing down your message, it means presenting your message in as straightforward and audience-appropriate2 a fashion as possible.  On the other hand, I deliberately present more slides with fewer ideas on each one;  this is a conscious strategy aimed at controlling what the audience is seeing and thinking about on a more fine-grained level.  However, this is a more difficult approach, and you should be careful about adopting it.  Long story short, if you have more than about 1 slide for every 30 seconds to a minute, you should have a good reason why.

Also, the outline is hardly set in stone.  As when I did the poster, it’s an iterative process which will lead to me adding and subtracting material as I get into the content and the design.  I’ve already got some ideas that may add in a few slides, so I’ll probably need to subtract some elsewhere.

Monday, July 30, 4:36 p.m.  Squeezing in a few minutes to work on the slides before I head for home.  I don’t have a cohesive plan for the design of the slides yet, so I’m going to iterate the content a little and see what suggests itself.

Tuesday, July 31, 11:30 a.m.  Only got a few minutes in on the talk yesterday before I got distracted by an ‘emergency’ (read: time-suck).  I just realised this morning that I really need to create two versions of this talk, because I’m going to be giving it at a couple places I’m visiting in Europe after the conference.  This means that I need a 12-minute version for ISBE, and a 45-minute-ish version for the seminars I’ll be giving.  This isn’t as bad as it looks, because creating the 12-minute version requires cutting out a lot of material that I would otherwise put in;  while it makes for more work creating slides for the longer version, it’s more relaxing because I can afford to go into details that I would have to otherwise avoid in the shorter version.  This post, however, will focus on the 12-minute version which I will create first.

4:20 p.m. It’s been a bit of a slow day, but some of the pieces are starting to come together.  I’ve got a few of the visual ideas worked out, and though there is a massive amount of work left to do, at least I’ve got a direction.

You may notice a few things.  First, I’ve littered the slides with notes to myself explaining where I want to go with that slide, reminders about content to add or delete, and even notations on which notes might be suitable to cut from the final version.  Second, if you look closely, some of these images are decidedly low-res.  That’s because they’re “comps” of stock photos (from iStockPhoto), which are super-low-res versions that are watermarked so as to be unusable in a production document.  They are, however, useful for trying things out and deciding what image works best before you lay down money for the final image file.  This lets me play with the slide deck before committing (an example is the image of the dog and the bat;  I’ll only use one when I discuss rabies, but I’m trying them out to see which I like better), and it might even be possible to find free alternatives to the images I’ve used.  The final thing of note is that I haven’t addressed the typography of the presentation yet;  the font used in the slides so far is Keynote’s default Gill Sans, but my next step is to choose some appropriate fonts now that I have a bit of content in place.

2:36 a.m.  I’ve been working for the last three hours transcribing every common name and genus-level-or-above taxonomic name from the index of Odling-Smee et al.’s monograph on niche construction in an attempt to set the stage for why I’m giving this talk;  namely, that viruses are under-represented here.  To make this point visual, I’m turning it into a word cloud (you can see the placeholder I whipped up in the slides above).  I’ve reached the T’s and I have to stop now because otherwise I’ll be doing this all damn night.

Wednesday, 12:26 p.m.  Back to it, and I’ve finally finished the index.  Now to throw it into R (using the “wordcloud” package), pretty it up, and insert it into the talk!  (And yes, I *will* go way too far for a detail no-one will care about).
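
For the curious, the word-cloud step itself is only a few lines of R.  Here’s a rough sketch, assuming the transcribed index terms sit one per line in a plain text file (the filename below is a stand-in, not my actual file):

    # Build the word cloud with the CRAN packages "wordcloud" and
    # "RColorBrewer".  "nc_index.txt" is a placeholder for the file of
    # transcribed index terms, one common or taxonomic name per line.
    library(wordcloud)
    library(RColorBrewer)

    terms <- readLines("nc_index.txt")
    counts <- table(terms)   # frequency of each name in the index

    wordcloud(words = names(counts), freq = as.vector(counts),
              min.freq = 1, random.order = FALSE,
              colors = brewer.pal(8, "Dark2"))

The fonts and colours still need to be matched to the rest of the slides, which is exactly the next step.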

1:06 p.m.  Here’s the new placeholder that I’ve created in R.  It’s still a placeholder because I’m going to try to match the fonts and colors to the rest of the slides;  making those decisions is the next step.

 

4:25 p.m.  I’m ‘auditioning’ some font and colour scheme choices.  To do this, I’ve duplicated my presentation and the slides within it, and I’m applying various styles to see how they work.  I’m looking for a bold, attention-grabbing combination, because I want this to stand out from a sea of similar-looking talks;  since I’m not adopting any sort of high-concept approach for this talk (mostly due to a lack of time!), I’m focusing on using typography and colour in a more aggressive way than is usual.  With that said, I could really use my wife’s designer eye on this, because I’m having anxiety attacks over what combinations might work.  I like the use of Bebas Neue and a script font for the headers, but I’m having trouble with a body font (because neither of those choices works well as a body font).  I’m in a bit of a grey area because the presentation really only has a couple of blocks of text that need to be set, so I need to balance readability with mood.

Incidentally – and this is important – I’ve also been ducking into an unused conference room with a projector to try this out on the bigger screens.  Always try your talk slides out on a setup that is as close to the final venue as possible.  You want to make sure that the colour combination that looks great on screen actually works when you project it!

Thursday, 4:36 p.m.  I’ve been working on the slides throughout the day, in and amongst other things on my todo list.  Today I’ve been focusing on the results section, which has seen some progress.

I’ve made some subtle modifications, including breaking the green colour of the palette into a brighter green for text on black slides (like the title slide), and a softer green for backgrounds.  If you compare this snapshot to the previous one, you should be able to see what I mean.  Also, I’ve started redoing my figures to use the fonts that I selected for the talk.  It’s a small thing, and perhaps no-one would consciously notice, but I believe in minimising friction for the viewer;  different fonts and designs between parts of the talk can be jarring even if the audience can’t figure out why, and I want to avoid that as much as possible.  It may not be entirely doable (I still have to figure out a better way to present that tree, for instance, and I’m not sure if I’ll be able to find a way to change the font on that), but I’ll go as far as I can to homogenise the design.

4:52 p.m. I’ve ducked into the  conference room to check on how the slides are showing up on the screen.  I’m generally happy with it so far, but projecting it makes it clear which version of the word cloud I’m going to keep;  the script version is painful at large sizes.

11:04 p.m.  I’m continuing to work on the slides.  I’ve been going back and forth between the bat picture and the dog picture for rabies (another potential example of viral niche construction, methinks), but now it finally occurs to me that the dog picture just doesn’t read well to anyone but me.  So, it has to go.

11:55 p.m.  I’m working on a slide that suggests a speculative link between viral niche construction and sociality;  this is based off of work on a cat virus, so I’m using a picture of kittens to illustrate the point3.  My first version, though, illustrates a design issue:  if you use a picture that has eyeballs in it, the rest of the slide has to relate to the eyeline (somewhat similar to the concept of eyeline matching in film editing) or else the viewer gets uncomfortable.


As you can see, the kittens are looking down and I have text above them;  this creates a visual tension that has no reason for being there. Putting the text below the kittens, besides looking bad  because of the shading at the bottom of the photo, also fails because the kittens are all looking in different directions.  Once I’ve identified this problem, I have to find a new photo;  thankfully, the internet seems to be big on cats (who knew?).

12:09 a.m.  I’m wrapping up for the night.  I’ve made reasonable progress today:  aside from a set of slides in the middle that I’ve engaged my wife to do drawings for, the last thing that I need to do for this first, rough version is to redraw the phylogenetic tree and find a way to present it.


If you’re paying attention, you’ll notice that I’ve still got too many slides.  I’m going to be practicing this talk (including a lab practice talk next week), but it’s almost certain that I’m going to need to cut some material.  As with any other content editing, there’s going to come a point where I have to kill my darlings.  This doesn’t bother me as much as it normally would, because most – if not all – of what I cut will end up going into the longer seminar version of this talk, where I’ll be making the same case in greater depth.  You can see that I’ve already started doing this, as I’ve moved some slides after the acknowledgements at the end;  these will be included in the longer version unless I cut them entirely.

August 6, 2:26 p.m.  I’ve been fiddling with the slides over the last few days, just trying a few things out and moving things around.  I’ve decided on one of the cat photos, the middle one, as it’s the most engaging; my wife pointed out that this is because of the way they’re looking, including the one staring straight at you.  I’ve got her working on producing a diagram for me to explain the way baculovirus manipulates its hosts, which goes in the blank spot in the middle, and I’ve placed images in there to help get me over the hump.  Today, I need to fix the phylogenetic tree and place it in;  whether I use it in the short or long version, I’ll need it at some point.  And I want to get the short version done tonight if I can, because I plan on practicing it tomorrow before I present it to the lab on Thursday.  So here’s the current state of affairs:

August 7, 2:15 a.m.  Small refinements now.  Unfortunately, even in consultation with my talented wife I couldn’t come up with a good illustration for the slide I’ve been holding on the various genotypes;  thus, I’ve decided to break down and use (gasp) text.  I know, I know.  In the meantime, I’ve also managed to refine the tree diagram (it still needs work, but the pieces are there now).

1:02 p.m. I’m searching for images to illustrate the hypothetical genotypes (zombie, non-gooey;  non-zombie, but gooey).  I’m having trouble finding images that are both right for the idea and properly licensed (Creative Commons, or stock that I can purchase).

3:02 p.m. I’ve found images and replaced the phylogenetic trees. I’ve also replaced a slide that I apparently deleted at some point along the way without noticing;  you’ll notice that the second slide in the talk is missing if you compare the last two snapshots above.  Using OS X’s Versions, I was able to graphically browse to an old version from a couple of days ago, find the slide, and drag it and drop it directly into the current version of the talk.  It may not be git, but it’s still cool.  And it’s also a good lesson:  keep old versions!  Keep backups!

I think that the short version of the talk is in good enough shape now that I can practice it, so I’m going to go see if I can find a room with a projector to play in.  If you can, it’s best to practice talks under conditions that are as close to the real thing as possible;  that means standing up in front of a room, even if it’s empty, and playing your slides behind you as you address the room.  Muttering under your breath as you stare at the slides may seem like a good way to practice, but you’ll never find the timing problems and flow issues unless you force yourself to stand up and actually talk.

4:52 p.m.  I just finished practicing my talk for the first time.  As I expressed on Twitter:

Seriously, people.  Practice your talks before you give them.  Then, practice them again.  And then three more times.  What I’ve learned is that I need to do some rearranging, because the flow of ideas in the talk didn’t quite work;  I’m going to jettison a few slides and use them in the longer version, and I’m going to see if I can add a few elements to the text that I abruptly noticed were missing.

August 8, 12:24 a.m.  I’ve spent some time rearranging slides and writing down what I want to say on each slide.  I like to have my material memorized to the point where I can present it without notes, but I sometimes find that writing down key points of each slide when I’m practicing helps me to achieve that goal.  Here’s the current state of the short version, with changes incorporated.

I’m still struggling with some aspects of the design.  In particular, the genotype slide (slide 17) is bugging me;  I had to add the model diagram because it was too difficult to explain the genotypes by referring to the parameters alone.  Now that I think about it, though, I may try playing with text instead, spelling out the assumptions.  But that can wait until tomorrow, because I need some bloody sleep.

11:49 a.m. Back to the conference room to practice again!

12:42 p.m. I tried it three times, but I’m still coming in too long.  The talk is supposed to be 12 minutes with 3 minutes for questions, and I’m clocking in at 18-19 minutes.  It looks like I’ll need to pare some things down to put into the longer version.  It breaks my heart, but I think that I’ll have to put the word cloud into the longer version;  it’s a great image, but under time constraints it’s not pulling its weight.  When that happens, you need to kill your darlings.

1:07 p.m.  I’m cutting it to the bone, but I’ve got things down to 20 slides (simplicity is hard).  The room I was using is booked right now, so I’m going to have lunch and do some work until it’s free, and then I’ll practice again.

4:35 p.m.  I’ve practiced this thing backwards and forwards, but I can’t get the time down!  From 19m 28s to 14m 12s, I’m still two minutes over.  I may have to remove the phylogenetic results, though it kills me to do so.  I know that they’ll be in the longer version where I’ll have plenty of time to go over them, but it still pains me.

August 9, 12:49 a.m.  I’ve spent the last couple of hours finalising the design, including replacing all of the comp images with the full versions that I’ve purchased.  It’s pricy ($86 AUD for 50 credits on iStockPhoto), but worth it.  If you can’t afford to pay for good images, then find them under a Creative Commons license on Flickr, or take them yourself.  But always use high-resolution images!  And don’t steal them.

11:58 a.m.  Okay, further practicing yields no advances.  I’m going to have to cut the phylogenetic results in favor of asking people to talk to me if they’re interested.

12:45 p.m.  11 minutes, 58 seconds!  Finally, we’re ready.  Here’s the state of the talk before I give it to the lab this afternoon.  Don’t forget that I’ve got extra slides tacked on (after the slide with the big Thanks! on it).  I’ve also added a slide with photo credits;  again, acknowledge your sources and don’t steal other people’s work.

4:45 p.m.  Well, I gave it to the lab (and a distinguished visitor!), and things went pretty well.  It’s clear that the work I put into the design and practicing the talk has paid off, because I received multiple comments that it was a very polished talk.  There were some good questions, and a couple of good suggestions for minor improvements, but otherwise it’s done and dusted!

August 11, 12:39 a.m.  I leave for the conference tomorrow afternoon, and I’ve just thrown my talk files onto my USB drive – and I’ve got them in my Dropbox, on my iPad, and in my email.  You only have them in one place?  You’re begging for a disaster.  But, I digress.  At this point, it’s worth reviewing the lessons I learned while designing this talk.  First and foremost, as I wrote above, simplicity is hard, and you have to be prepared to kill your darlings.  I had more content than I could present, so I had to cut it down and make it as simple as possible.  Practice is king.  I practiced this talk no fewer than eight times to an empty room, and it paid off;  the people I finally gave it to were impressed at how fluent I was.  What they didn’t see was the hours I spent stumbling and swearing and fumbling my words.  If you suck in private, you’ll be great in public.  And finally, iterate, iterate, iterate!  To make good posters and good talks, you need to advance and revise, create and critique.  If you scan back through this post and look at nothing but the slide pictures I’ve included, I hope that you’ll get a feeling for this.

So, if you’re still reading after all of that, thanks for sticking with me!  I hope you learned a little something, and I welcome your thoughts.  But for now, I’m off to Sweden!

  1.  A minor rant:  if you go over on time on your talk at a scientific conference, you are being rude.  You’re holding up other presenters, you’re making it difficult for people to get between talks on time, and you’re generally making things worse for everyone.  I don’t really care about your excuses, because 95% of the time what they boil down to is ‘I didn’t care enough about my audience’s time to practice my talk and make sure that I could present it in the time allotted’.  I’ll cut students a little slack, but only because I’m going to whack their advisors over the head.
  2. What do I mean by ‘audience appropriate’?  I mean that you need to think hard about your audience and explain things they won’t be familiar with while avoiding long digressions on topics that are well-known to them.  Spending two minutes defining ‘genotype’ to an audience at a genetics conference will be a waste of your time, but it might not be if you’re presenting at a high-school science outreach event.
  3.  Yup, that’s right, kittens.  If that makes it into the final version for ISBE, I pity the poor fool who has to follow my picture of adorable kittens.

If you've been paying attention to my Twitter feed or blog (and seriously, why aren't you?  /narcissismoff), you may have noticed that I've been reading a bit about Darwin lately.  I just finished Desmond and Moore's biography, Darwin, which I found really enjoyable, and when they mentioned his autobiographical musings on his rejection of Christianity, I sought out a copy of that to read.  In amongst his reflections, I saw this quote about the way he worked and the dangers of what we would now call confirmation bias:

I had also, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once;  for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones.  Owing to this habit, very few objections were raised against my views which I had not at least noticed and attempted to answer.1

I don't know to what extent he managed to follow his own golden rule, but I think that the sentiment is quite important and useful to people in any field, scientific or not.  We should always strive to answer the strongest versions of the arguments against us, no matter how uncomfortable it makes us.  As scientists we tend to get this idea beaten into us by vengeful reviewers, after which we have to learn how to separate useful opposition and criticism from useless spite, but I think the reminder from Darwin's own hand is a useful one.  Besides, it's just more fun that way;  being 'right' all the time (whether you actually are or not!) is boring.

  1. Okay, so he loved commas.  Give the man a break, it was the 19th century.  Quote from p. 123 of The Autobiography of Charles Darwin, 1958, edited by Nora Barlow.  You can read it for free here.