Archives: Science

Science isn’t a pizza, so stop slicing it up.

Praj, author of the blog Do I Need Evolution, drives me nuts. Don’t get me wrong, he seems like a nice guy and a well-meaning one at that. Yet as one of the new wave of commentators on the science vs religion battles, he appears to hold the view that basic science is just an indulgence that we should keep quiet about in favour of the real science that puts satellites into orbit and cures diseases:

You see two images when scientists speak about “science” (something I think we should avoid, but that’s another story). One is science as a useful tool: it helps us cure diseases, win wars, grow the economy, feed the planet, and so on. The other is science as a world-view: it imparts a sense of wonder, conquers fear, and reveals beauty. These images are a spectrum rather than distinct categories. Most scientists have some of both, though applied researchers are usually closer to the science as a tool view and basic researchers tend to be on the other end.

The problem is that the overwhelming majority of non-scientists, and especially the religious, don’t care very much about science as a world-view. They live on the very far end of the spectrum where science has almost zero intrinsic value. To those people science only matters because it helps them do stuff they care about.

Science lobbies appreciate this fact, which is why they focus on the concrete, tangible benefits of research. They know it would be ridiculous to ask for billions of dollars because some people think particle physics is beautiful. Policy experts also appreciate this fact. The standard “explain your thesis to your grandmother” interview question for my DC fellowship is judged on how well you make your research relevant. I suspect many academic scientists don’t appreciate this fact. Or if they do, they don’t weigh it as much as they should. Academics are especially prone to hyperbole about the wonders of science.

Praj would like us to think of science as more or less practical; some science will give us economic benefits, and can be explained in that light, so that’s good. Other science is ‘world-view’ science that only exists to satisfy the whims of a subset of curious people. But I’m going to disagree here and say that Praj doesn’t understand science very well. Despite lip service to a ‘spectrum’, he would like to slice science up into boxes that can be addressed independently. That way, we can focus on the ones that are practical and ignore the ones that aren’t1. But we can’t do that with science. Science is a process and a body of knowledge that is interconnected and historically contingent.

As an example of interconnection, we can look to Darwin himself. Putting aside for the moment the historical antecedents to his work on evolution (including Lamarck and his own grandfather), Darwin had to integrate ideas from all corners of biology with the work of the economist Thomas Malthus to arrive at his insight regarding natural selection. For this theory to be sensible, it required a much older earth, an idea that came in part from the volumes of the geologist Charles Lyell that Darwin read while on the Beagle. And one of the first serious scientific challenges to evolution came from the physicist Lord Kelvin, who calculated both the age of the earth and the age of the sun before concluding that each was too young for evolution to be valid.

This interconnection has only grown stronger over time. We carve science up into separate fields like biology, physics, chemistry, and psychology because we need a way to award degrees, and because topics can indeed be thought of as clustering together naturally. Most ecologists don’t study quantum field theory, because with our current understanding of science, it’s hard to see how to use it effectively in their work. But that doesn’t mean that we can take those lines in a course catalog as representative of some real and sharp division. What we call biology and chemistry are deeply interrelated, as anyone who’s spent time in a molecular biology lab will tell you. Neuroscientists spend a lot of time on the biology and chemistry (and by extension, biophysics) of the brain and nervous system. One of my favourite evolutionary biologists is John Maynard Smith, who trained as an aeronautical engineer, and we all know that physicists are math fetishists (I kid, I kid). Just look at the new interdisciplinary fields that are cropping up with increasing frequency: biophysics, neurochemistry, behavioural and neuroeconomics, agrophysics, systems biology, computational sociology. I could do this all day, because science is a heavily connected graph of fields that reflects an underlying continuum in our study of nature. Apparent divisions between scientific fields usually say more about the limits of our understanding than they do about any real separation.

This brings me to my second point, the historical contingency. Praj and people like him would like to focus on ‘practical’ science that we can make relevant for the public. Things that bring economic benefits now. ‘Applied’ science, new technology, and so on. But people who champion this division between basic and applied research are making a simple mistake of perspective, one that even serious historians of science are prone to making. What is considered applied science now relies directly upon research that used to be considered basic and impractical2. Applied science is simply science for which the next step on the path leads to an economic benefit that is clear and predictable, but ‘practical’ or ‘applied’ science doesn’t get to walk away and ignore this chain of connections and history. Putting a satellite into orbit relies on centuries of work in physics and mathematics that was once considered deeply impractical; it is the result of thousands of individual steps, some practical, some not, that have given us the ability to put things into orbit. Medical doctors rely on work in anatomy, biochemistry, and biology that has often been considered very impractical (Galen himself wasn’t allowed to work on humans because work on cadavers wasn’t permitted; he inferred from animals because of their anatomical similarities). Astronomy and cosmology are the prototypical basic and useless sciences now, but if we ever become a truly space-faring people, then our descendants will be very happy that we wasted our time on them for the simple pursuit of knowledge. And who knows? Astronomy and cosmology could suddenly become very useful before that; we can’t easily predict what will be applicable, but the history of science tells us that we can expect to be surprised. I’m sure that the Einstein of 1945 would have had some words to say on the topic to the Einstein of 1905.

Praj would like us to believe these things about science, because he wants to believe that Bill Nye is wrong when he says that creationism threatens our ability to understand the world and innovate in science and technology:

I’ve often wondered how people like Bill Nye can maintain this apocalyptic vision. As Saletan notes in the very next paragraph, there are actual, real-life engineers and scientists who reject evolution.

Praj himself relates that his parents were successful doctors and his dad doesn’t understand evolution – so, the reasoning goes, evolution must not be relevant to medicine. He makes similar claims several times on his blog, so it’s worth finishing off this post by addressing them. Yes, there are successful doctors and engineers and even other scientists (though few if any biologists) who don’t understand evolution. That is, of course, not the same as saying that they actively reject it and believe in young-earth creationism. There’s a difference here. I don’t understand much about particle physics, and though I consider that a failing, it’s one I can live with. There isn’t enough time in the day for me to learn everything I would like to. And yes, doctors who haven’t learned about evolution aren’t necessarily bad doctors. I can even forgive those who ‘reject’ it because they’ve never been exposed to it properly. But doctors who actively reject evolution when taught it, and believe that the earth is 6000 years old? This requires that they actively read and reject the evidence from not just biology but physics, chemistry, geology, and so on. It requires that their critical thinking skills are so deficient that they cannot understand and assimilate anything of such a large and coherent body of evidence, one upon which there is broad and solid scientific consensus. How can such a person be a good doctor or engineer? Would you like your satellite designer to be a flat-earther? Would you be worried if your bridge engineer was proficient but convinced that physics and materials science work as they do because of the action of ambitious fairies?

When my doctor asked what I did for a living and then launched into a tirade because he was a young-earth creationist, I asked him how he dealt with giving advice to his patients on vaccination or antibiotics. He replied that he didn’t believe in vaccination and that he didn’t think that antibiotic resistance was a problem, because viruses and bacteria don’t evolve. I changed doctors that day. This is a man who may actually kill patients with advice like this, and it stems directly from his religious beliefs. Are all creationist doctors and engineers bad at their jobs? No, but I submit an empirical hypothesis that doctors, engineers, and scientists who are actively creationist (especially YEC) and reject scientific understanding to protect their beliefs are more likely, on average, to be bad at their job and to have crucial deficiencies in their thinking that could prove harmful to their work.

Science isn’t something that can be cleanly chopped up into convenient portions and picked over for economic benefit or religious palatability. It is a method, the best method we have, for discerning the truth about the universe and everything in it. It is a deeply interconnected and historically contingent search for that truth. What we choose to do with that truth afterwards is up to us, but when we ignore those connections, that history, and our inability to predict the future, we do so to our own detriment.

  1. Like evolution.
  2. Note that I’m not making the same claim about technology, which, as McClellan and Dorn persuasively argue, has had a habit of coming before the science that explains it.

Who is Andrew Fabich?

tl;dr: Andrew Fabich is a creationist 'microbiologist' at Liberty University who isn't a great scientist.

Who is Andrew Fabich? This question has haunted me since I watched the debate between Ken Ham and Bill Nye. One of Ham's favourite tactics in that debate was to name-drop 'creationist scientists', as though a parade of Ph.Ds would somehow disprove evolution in a blaze of authority. Most of his name-drops were typical creationists: engineers, medical doctors, and the like. But then came Andrew Fabich. In an attempt to discredit the awesome work done by Richard Lenski and his lab on the adaptation of E. coli to use citrate as a novel food source, Ham suddenly trotted out a microbiologist to take a swipe at Lenski et al.

You can watch the video here, or starting at the relevant section here, but I've transcribed it for you (and so has the Lenski blog here):


Ham: There are those that say 'hey, this is against the creationist'. For instance, Jerry Coyne from the University of Chicago says, 'Lenski's experiment is also yet another poke in the eye for anti-evolutionists,' he says 'The thing I like most is that it says you can get these complex traits evolving by a combination of unlikely events.' But is it a poke in the eye for anti-evolutionists? Is it really seeing complex traits evolving? What does it mean that some of these bacteria are able to grow on citrate? Let me introduce you to another biblical creationist who is a scientist.

Fabich: Hi, my name is Dr. Andrew Fabich. I got my Ph.D. from the University of Oklahoma in microbiology. I teach at Liberty University and I do research on E. coli in the intestine. I have published in secular journals from the American Society for Microbiology including Infection Immunity and Applied Environmental Microbiology as well as several others. My work has been cited, even in the past year in the journals Nature, Science, Translational Medicine, Public Library of Science, Public Library of Science Genetics, it is cited regularly in those journals and while I was taught nothing but evolution, I don't accept that position and I do my research from a creation perspective. When I look at the evidence that people cite of E. coli supposedly evolving over thirty years, over thirty thousand generations in the lab, and people say that is it now able to grow on citrate, I don't deny that it grows on citrate but it's not any kind of new information. It's .. the information's already there and it's just a switch that gets turned on and off, and that's what they reported, and there's nothing new.

Ham: See, students need to be told what's really going on here. Certainly there's change, but it's not change necessary for molecules to man.


I don't need to deal with Fabich's criticism of the E. coli work, for the simple reason that Lenski and his postdoc Zachary Blount have already crushed it over at their blog post. You can see it here, and I encourage you to do so. It's a great read for the biology alone, and it's pretty damning stuff for Fabich. As Zach says at one point:

Fabich went on to state that this “switch” is what we reported. That is emphatically not true. It beggars belief that anyone, much less a trained microbiologist, could actually read our 2012 paper, where we reported the genetic basis of Cit+, and come away thinking this.

So who is this guy? Who is this Ph.D. in microbiology that makes such obvious and simple errors, who presents himself as a creationist biologist and appears with Ham to misinterpret some great work?

Well, let's start with where he works. Fabich is an assistant professor at Liberty University, which already rings an alarm bell. Liberty University is a private Christian university located in Virginia, and its biology department openly teaches Young Earth Creationism (YEC). How about Fabich himself? He made a pretty big deal of his publication record during the debate, so I think that we should start by taking a look at it. Fabich has five publications listed on his profile, all dealing with E. coli, the most recent published in 2011 (a Google Scholar search shows the same thing, disregarding a couple of obvious false alarms). These articles have indeed been cited – 6, 56, 82, 16, and 12 times, in order of publication date – but these are all large-team papers, several with over a dozen authors and none with fewer than four. This is not suggestive of a creative and robust scientific output on Fabich's part. Finally, the 2011 paper shows his affiliation as being with Oklahoma, which suggests that Fabich hasn't published a single thing since moving to LU. As far as track records go, it wouldn't get you tenure at Harvard (if that mattered to you). Hell, I have more than double the number of publications that he does.

If that was all there was to it, then I probably wouldn't be writing this blog post. But then, I came across this video. It appears to be another attempt to parade a 'creationist scientist' in front of a camera, but it's interesting for its content. In it, Fabich manages to revise the history of scientific thought, bag on the scientific method, quote-mine (incorrectly) a paper on evolution from the 1960s, and from this conclude that evolution is false. A choice highlight from the middle of the video:


The science is not the issue. When you look in the scriptures, even Jesus acknowledges this when he says in Luke 17 that 'the kingdom of God comes not with observation'. Why would you impose that worldview on me? Even Jesus says that you can't do an experiment to prove God or who God is. So, what is the scientific method. Actually, one of the points that I want to make here, right up front, is that modern science had its foundations in the Bible, in Christian Europe. Okay, there are some exceptions, like some people out in India and Turkey, they're isolated and rare exceptions. But the scientific method is based on Biblical presuppositions. I'm not going to go into all of those, but the scientific method, you realise it, you start out with an observation and then you go and you make a hypothesis, collect your data and then there's oh interpretation. You can't get rid of your bias. All scientists are bias [sic]. I'm guilty and so are you.

So, the problem with science that I have is that it never ends. Well, it might not be a problem, it's how I keep my paycheck. Okay, I get it. So one good hypothesis leads to another and so it goes on and on and on and it never stops. If you're not careful, you get trapped into worshipping the internal scientific method rather than the eternal creator of the scientific method who gave it to us so that we could receive it.

What it comes down to here is, our theory has become one which cannot be refuted. You know who said that? It must be a creationist, some big creationist you all recognise and you've got your short list of who it said. Because we're just, we're uncompromising and we're not based on facts and data. You know who said that? It was Paul Ehrlich and L. C. Birch. They, the evolutionists, said 'our theory of evolution has become one which cannot be refuted by any possible observation'. Are you meaning to tell me that it's not based about facts? Whoaaa, so evolution's not based on fact.


There's so much wrong here that I barely know where to start.

  • The history of modern science doesn't begin with the Bible. It is difficult to extricate Christians and their institutions from the matter, but to say that modern science started with the Bible is laughable at best. For instance, such a statement manages to ignore the entire history of scientific thought in Ancient Greece; Aristotle's work formed a significant, even commanding, part of scientific thought until the Scientific Revolution of the 16th through 18th centuries. It also ignores the important role that Byzantine and Islamic influences played, and grossly trivialises the achievements of civilisations in places like India and China. These are all recorded and established facts that Fabich blithely rolls over.
  • The scientific method is a large part of what gives science its power. And far from being a problem, the recursive nature of scientific progress is one of its greatest strengths. The comedian Dara O'Briain said it well when he said 'Science knows it doesn't know everything; otherwise, it'd stop. But just because science doesn't know everything doesn't mean you can fill in the gaps with whatever fairy tale most appeals to you.'
  • Yes, scientists are biased. We're all human. In fact, if Fabich cared to Google it, fields like the philosophy of science and the sociology of science exist to tackle exactly this question. But in general, the power of science is that it is self-correcting (though this is not without challenges, and needs constant work). And Fabich basically admits to paying lip service to the scientific method to keep a paycheck. I'll let you decide how that reflects on him.
  • And of course, what would a creationist be without a cherry-picked quote? The quote from Ehrlich and Birch comes from a paper published in 1967 and – despite what Fabich has implied by leaving out the following sentences – is not some sort of anti-evolution screed. In fact, the quote in his video goes on to say: 'The cure seems to us not to be a discarding of the modern synthesis of evolutionary biology, but more scepticism about many of its tenets. In population biology, more work is needed in elucidating the general properties of populations, both those made up of one species of organism and those made up of two or more species without reference to dogmas or guesses about how they may have evolved.' So, in reality, the quote that Fabich has chosen is actually a call for more empirical work to fill in the gaps in the data. And as anyone who's picked up a book in the last 50 years will have noticed, they got what they were asking for. (For a longer fisking of this same quote and others, check out Peter Hutcheson's piece from nearly thirty years ago. Way to stay current, Fabich.)

So, who is Andrew Fabich? Well, the evidence suggests that he's a poor excuse for a scientist. It suggests that he's not an active member of the scientific community, and that he's interested not in helping the progress of science but in tearing it down to satisfy his worldview. And it suggests that we can safely ignore him.

But Fabich is only the symptom of a larger problem, one that Ken Ham exploited ruthlessly in his side of the debate. The problem is credentialism, or the over-reliance on credentials such as academic degrees. Ham was simply employing a time-honoured technique: parade out a bunch of 'doctors' and 'scientists' who are creationists in an attempt to get a pass simply because they have Ph.Ds. The problem with this is two-fold:

  1. When verified properly, credentials can be a useful tool in situations such as making hiring decisions (a Ph.D. minimum for a research position is probably a defensible, if not foolproof, criterion to use). But academic credentials are no proof of research savvy or even basic competence; anyone with a browser can surf their way to a Ph.D. at a diploma mill. And the problem with Ham listing scientists who happen to be creationists is that it misses the fact that the vast majority of scientists do not share their beliefs. In fact, Project Steve (of which I am a proud member!) was created to mock this very phenomenon of listing creationist scientists.
  2. Ham is essentially making an argument from authority, which is a logical fallacy. The fact that I have a Ph.D. in Biology doesn't mean that I'm automatically right about anything biology-related, even in my own area. What it signifies is that I've spent a long time studying and thinking about biology and related topics, and that my thoughts on the matter are probably more well-informed than the average person's. But if a precocious seven-year-old child wanders up to me and hands me a verifiable fossil of a Precambrian rabbit, then as a field we would have some serious re-thinking to do1. It doesn't matter that she's still learning to reliably write her own name, or that I have a Ph.D. The evidence is the evidence; we don't make the case for evolution based on our degrees, we make it based on our observations of the world around us.

You can see this at work with Fabich and the debate in general. Fabich shows up in the video, snows the audience under with his credentials, and then declares – based solely on his now-established authority – that Lenski et al. don't know what they're talking about. If you watch the video, you'll notice that Bill Nye doesn't do any of that. What does he do instead? He presents evidence. He holds up physical objects, he shows records of observations of trees and ice cores, he discusses what we see in the Grand Canyon.

So why do Ham and crew do this? Because, unfortunately, it works. This is one of the great challenges of science communication: people don't have the time or inclination to become experts, so they rely on others to do it for them (a mental version of the division of labour). How is the average person to know who has a "real" Ph.D. and who doesn't, or which expert is trustworthy and reflects the broad scientific consensus? Creationists, climate-change denialists, anti-vaxxers: they all rely on the same method of inducing doubt. They agitate for 'balance' and 'teaching the other side', because they know that doing so legitimises the debate. In fact, it's one of the reasons that I don't support Nye's decision to debate Ham. I feel that he did a great job in the situation, but I still think that it was a mistake2.

Until we can find a better solution to this problem, though, we're stuck with how I started this post: we need to root out people like Fabich and bring them into the harsh light of good science. Now if you'll excuse me, I have to go do some science of my own.

  1. We might also want to investigate the awarding of a major scientific prize to a pre-teen, but that's another issue.
  2. Though this is arguably an empirical question. I could be wrong. Perhaps, on balance, he did more good than harm.

Nobel prize or GTFO.

After reading this great and thought-provoking post by scicurious on ‘failure’ in academia, I came across this comment (which was itself a reply to another comment):

Jenna, with respect, this is a description of how to do incremental science – but it’s NOT how discoveries are made. Not big discoveries, anyhow. Big discoveries DO require inherent creativity and vision .. and we are systematically weeding those qualities out of biomedical science, thanks to the hyper-conservatism exercised by NIH panels.

Please, go read Thomas Kuhn. I’m begging you.

Allow me to respectfully present the opposite view: please don’t read Kuhn. Or at least, don’t read him with the wide-eyed adoration that he’s so often afforded. Kuhn was dangerously anti-scientific, and I place a great number of the current structural problems with science at his feet. His disdainful view of plodding ‘normal science’ and his focus on revolutionary ideas are – IMO – a direct precursor to what I might call the ‘Nobel Prize or GTFO’ disease that permeates science now. You either get the cover of Nature or Science, or you’re a waste of time as a scientist.

And what’s wrong with incremental science? For one thing, it’s the way that a lot of science is done. As James Franklin1 points out, fields like ornithology or oceanography work on the incremental accumulation of knowledge and aren’t prone to paradigm shifts. Are they not science? Have we learned nothing from the explosion of scientific fraud in recent years, as people are forced to come up with ever-sexier results or lose their funding? Or the recent push for replication – is that just plodding ‘normal science’? Kuhn makes us feel good – if no one’s listening to our ideas, then it’s not because they’re bad, it’s just because of the establishment, man! – but his contribution to philosophy aside, he’s no basis for a useful approach to the work of science.

  1. In What Science Knows: And How It Knows It, a good book on the subject.

Hug an advisor today.

It’s inevitable.  Put two or more grad students1 in a room, and sooner or later the talk will turn to advisors;  wait a little while longer and chances are good that someone will start complaining about their advisor.  This shouldn’t be surprising.  Bosses all over are fair targets for griping, and the power imbalance inherent to the student-advisor relationship certainly doesn’t help matters2.  After being stuck in a lab together for several years, it’s a wonder that the average science department doesn’t have hallways littered with bodies.

But there’s a flip side to this, which I think might be under-appreciated:  advisors can also be awesome.  No relationship like this is going to be perfect, but a good advisor will teach you, guide you, and give you a leg up the academic ladder.  They’ll help you when you trip and faceplant, they’ll give you the advice you need (even if you don’t want to hear it), they’ll introduce you to the right people, and they’ll make you a better scientist.


I’ve gotten very lucky along the way3.  Michael advised me on the undergraduate independent study that first inspired me to take the idea of a research career seriously;  he also humoured me when I bit off a project that was far too much for me to chew and helped me sort through the resulting mess.  Pete, my M.Sc. advisor, got me into studying biological questions and gave me a place to work when my Ph.D. went sideways.  Luc-Alain, my Ph.D. advisor, turned me into an independent scientist and put up with far more than I had any right to ask for.  Finally, my postdoc advisor Mark has to be one of the nicest (and smartest) people that I’ve ever met and has been supportive in all the ways that have made being a postdoc enjoyable.  I’ve been privileged to work with a succession of people who’ve taught me and made me better at this science thing, and today I’d like to take a moment to be thankful for them.

Many people have poor relationships with their advisors, and there are a lot of valid questions to be asked about the grad school experience.  I don’t want to diminish the problems of anyone who has had a poor relationship with their advisor (I’ve known my fair share of bad advisors, even if I’ve been fortunate enough to avoid having them myself).  But if you have a good boss, now might be the time to take a moment and reflect on the things that they’ve done for you.  Give them a hug, a handshake, a friendly email, or a fresh data set.  And one day, when you’re in their position:  remember what it felt like to be the student.

  1. Honours students, postdocs, RAs, …
  2.  And yes, people have done research on this.  Are you surprised?
  3. Seriously lucky.  I didn’t do any of the things you’re supposed to do when picking an advisor.

Some tips for an academic job talk over Skype…

I recently had the experience of applying for a postdoctoral position at A Very Important University and made the shortlist to be interviewed.  Now, let’s face it, that’s a pretty terrifying thing to start with;  it wasn’t made better by the fact that I was doing it over Skype to a location most of the way around the world.  Visions of technical glitches, bad sound quality, and an overall horrible experience for both the interviewers and the interviewed haunted me.  So, I spent several days polishing my talk, thinking up ways to make the Skype process smoother, and even reached out to Twitter for advice.  And boy, did I get it!  There were some great suggestions out there, a few of which really saved my bacon.  Thus, to save you the trouble of figuring all of this out for yourself, I’m going to share what I learned with you.

Your voice here: social media for academics, objections edition.

“Twitter, huh?  That’s just a bunch of people talking about what they had for lunch, right?”  *headdesk*

Hands up if you’ve heard that one.  Or any of the other clichéd objections to the use of social media by academics:

  • “I could be writing papers instead of fooling around on Twitter!”
  • “I talk to too many people already.  Why would I go looking for more?”
  • “It’s all just noise.”
  • “I don’t want to have to deal with uninformed commenters.”
  • And many, many more.

Academics and social media: the talk that you helped me write!

When I asked about academics’ use of social media, I got a gratifying number of responses that helped inform the content of the talk I’ll be giving next week at the GSA 2013 conference here at UNSW.  I even used some direct quotes from people who responded, which was a great help in giving context to the benefits of social media for scientists.  Since I’ve finished the talk now, I thought that it would be appropriate to share what was essentially a crowd-sourced talk back to the community.  So, take a look, and if you have any thoughts or suggestions, please feel free to leave a comment.  One thing to note:  this is a longer version of the talk that I’ll actually be giving;  my practice talks went over the 12-minute limit, so I was forced to cut some slides (which I anticipated).

View this on SlideShare, or page through below:

Academentia and the iPad: an update from 2010.

[Image: ‘iPaddr’ by Daniel Bogan, via Compfight]

Prompted by a conversation on Twitter yesterday, I revisited my old ‘how I use my iPad in academia’ post, and since it’s been about 2.5 years, it seemed time for an update.  So, some quick notes on how I’m using it now:

  • Reading.  I don’t use iAnnotate any more, since Goodreader added annotations and nice Dropbox syncing.  I keep a Dropbox folder with my current ‘to read’ pile of articles and books, which is slurped up to my iPad by Goodreader.  I read articles on the iPad, annotating and highlighting as I go, and then I sync them back to Dropbox.  Back on my Mac, I file the papers away in BibDesk (I haven’t seen a use case for ‘social’ reference managers like Mendeley yet – at least for me – and I’m not sure if I’m going to any time soon).  Other ebooks, including an increasing number of textbooks, are read via the Kindle app and iBooks.  I prefer the iBooks software, to be honest, but for now I read each book in whichever app I can find it.  Truth be told, I’ve gotten so used to ebooks now that when I can’t find something in digital format it noticeably irritates me.
  • Calendaring:  I sync my Google calendar to my iPad and iPhone, and read my calendar on the iPad with Agenda.  I prefer Agenda for the clean interface, though it’s a minor preference for me.
  • Organisation:  I wrote back in 2010 that Things was moving too slowly for my taste and that I was going to search for alternatives, but I never found one I was comfortable with.  I tried a lot of them:  Today, Remember the Milk, Appigo’s Todo, Wunderlist, and more.  All of them had some sort of problem that turned me off, be it bad syncing or subscription plans for useful services (hell, no) or something else that bugged me enough to make me switch back to Things.  I honestly don’t think that Cultured Code really deserves as much of my money as they’ve gotten, but I keep coming back to them for some reason.  This is a highly individual thing, though, and your mileage is going to vary.  A lot.
  • Social media, of course:  I still prefer the stock Twitter app on the iPad over alternatives so far, though if I do switch it will probably be to Twitterific.  I’ve written blog posts using Blogsy and I use the WordPress app to administer the blog.  I’ve made a few Skype calls with the iPad, which turned out all right (though I prefer wired connections for video calling), and the iPad is really the only way that I check Facebook any more.
  • News:  now that Google Reader is going the way of the dodo, I’ve switched to Feedly and I couldn’t be happier.  For saving stories to read later, I rely on Pocket.
  • Navigation: I’ve found that Apple Maps has gotten much better recently, so I’m no longer unhappy that Google Maps isn’t on the iPad (still not sure why that is, though).  I use maps more on my phone anyways.
  • Note-taking:  This is a category with a lot of change since 2010.  Nowadays I’ve switched largely to Notability for note-taking.  I find its handwriting setup easy to use when I’m jotting down notes in a meeting or a seminar, and it’s intuitive for scribbling on manuscripts and sending them back to colleagues.  I use a stylus for these tasks;  I’ve enjoyed the Pogo Connect, but my wife enjoyed it so much for drawing that she actually stole it from me.  So while I wait for the Adonit Jot Touch to ship (grrr, delayed), I’m using a $10 Dausen stylus that actually works quite well.  I’ve also used Noteshelf as a notetaker for its nice writing tools and early integration with bluetooth styli like the Pogo Connect;  when the Touch comes, I’m not sure exactly what I’ll end up using full time.  And when I’m looking to do more free-form scribbling, or I’m noodling with equations or just sketching something, I like Paper;  it’s simple but pretty, and powerful enough to get the job done.  I’ve also become more and more reliant on Corkulous to make notes in.  Unfortunately, despite protestations to the contrary, Appigo shows no sign of giving a crap about further development of Corkulous, and I’m reaching the limits of what the app will handle in terms of notes.  Also unfortunately, there doesn’t seem to be a good replacement out there, so I’m considering making one myself.
  • Information collecting:  some people would put apps like Evernote into the notetaking category above, and the Evernote + Penultimate setup works quite well for some people;  I haven’t looked at it in a while, but I may revisit it.  Until I do, though, I’m using Springpad as a dumping ground for random bits of info that I need (travel plans, receipts from conferences, paperwork I may need to reference, books I want to buy, etc.).
  • Mathematics and programming:  when I feel like playing around with a bit of math or I need to plot a quick graph, I use apps like SpaceTime1, PocketCAS, and Quick Graph.  Programming on the iPad is still mostly a non-starter, though that’s starting to change.  I’ve had fun playing with Codea, which embeds a Lua interpreter, and if you feel like learning Haskell, there’s iHaskell (you must have an internet connection, though).  I recently used Codea to whip up a quick simulation of genetic drift (a Fisher-Wright model; there’s a minimal sketch of the idea just after this list), and it worked great.  I’ve seen a few Python apps and the like, but I haven’t had any experience with them;  if you have, please leave a comment!
  • Drawing / diagramming / presentation:  Another category with big changes to it.  When I last wrote about academic iPad usage, there wasn’t much to speak of here.  In the intervening time, though, this space has exploded.  Now, I use apps like Procreate (or others like Sketchbook Pro) to sketch and draw with the Pogo Connect, iDraw to create vector diagrams for talks and posters, and Omnigraphsketcher to work up quick hypothetical graphs.  Most of this gets fed into desktop apps like Keynote or Pages (or other design programs);  the iPad versions of these apps are good as well, and I use Keynote regularly to present with, but I’m still hamstrung by the lack of font support in Keynote for iPad.  Another long-awaited and massively useful arrival is LaTeX snippet tools;  on the desktop, I use LaTeXiT pretty regularly, and now apps like Mathbot are serving the same purpose for me on the iPad:  I can write a quick line of LaTeX and copy the typeset equation into another app like Corkulous.
  • Writing:  big changes here, too, driven by changes in my desktop workflow.  With my recent shift to using Markdown as a major format for writing, I’m now free to use some of the great cloud-syncing editors for the iPad to start things off.  So, a lot of my papers, blog posts, etc. now start their lives in Byword, which is incidentally the first app to really turn me on to iCloud syncing.  When I have to interact with Microsoft formats – yuck – I still use Quickoffice.  LaTeX on the iPad has come a ways, with apps like Texpad, but I still find them too clunky for common use.  I’ve also gotten into collaborative writing of LaTeX through web apps like Spandex (and a new one that I’ve been meaning to try, Authorea), so I’m not really fussed about dedicated apps for LaTeX any more.
  • Misc: A few other apps I can’t live without include Dropbox, OPlayer HD for entertainment on the go, Calcbot for quick arithmetic, Convertbot to … well, convert stuff, Photogene / PS Express for quick photo edits (especially to screenshots I take to paste into other apps), and probably a dozen others that I use regularly but can’t remember right at this moment.
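
For the curious, here’s roughly what a drift simulation like that involves.  This is a minimal, hypothetical sketch of the Fisher-Wright model in plain Python – not my actual Codea/Lua code, and with the graphics stripped out.  Each generation, every one of the n individuals picks a parent at random from the previous generation, so the new count of an allele is a binomial draw at its current frequency, and the process repeats until the allele either fixes or is lost:

    import random

    def fisher_wright(n, p0, max_gens=10000):
        """Neutral drift of a single biallelic locus in a haploid
        population of constant size n.  Returns the allele-frequency
        trajectory, ending at fixation, loss, or the generation cap."""
        count = round(p0 * n)   # copies of the focal allele
        freqs = [count / n]
        for _ in range(max_gens):
            if count == 0 or count == n:
                break           # allele lost or fixed: drift is finished
            # Each offspring picks a parent at random, so the new count
            # is a binomial(n, p) draw at the current frequency p.
            p = count / n
            count = sum(random.random() < p for _ in range(n))
            freqs.append(count / n)
        return freqs

    # e.g. 100 individuals, starting allele frequency 0.5
    trajectory = fisher_wright(100, 0.5)
    print(len(trajectory) - 1, 'generations; final frequency:', trajectory[-1])

Run it a handful of times and the point of drift jumps out:  with a population this small, the allele usually fixes or vanishes within a couple of hundred generations through chance alone.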

Going back to my old post, it’s clear that my usage of the iPad has changed significantly since I last wrote about it.  Some of the frontline, day-to-day apps that I use have changed or been consolidated (e.g. I use only Goodreader now instead of Goodreader plus iAnnotate), and entire new uses for the device have popped up, like drawing and writing in Markdown.  Increasingly, the iPad has become an indispensable part of my daily workflow, and though I could live without it, I certainly don’t want to!

What are your favourite apps and workflows for mobile devices (iOS or otherwise)?  If you have any thoughts, please leave a comment or let me know on Twitter.

  1. which is apparently now called MathStudio?

On the good and evil of scientific stories.

tl;dr: telling a good story is a vital tool in science communication, but it’s easy to go too far for a simple narrative.

If you’ve read this blog, attended a talk that I’ve given, or sat in on one of our lab meetings, you would know that one of my pet issues in science is communication. Scicomm, as it often goes by now, means more than explaining science to the public, though that is of course a large part of it. It’s also about how we communicate our science to other scientists, whether in our field or outside of it. Journal publications, conference talks, seminars, monographs, all of these things – and more – fall under science communication to me. And if you had found yourself as a fly on the wall while I was editing one of the Ph.D. students’ papers or critiquing conference slides, you would almost certainly hear me talk about story.

More precisely, you’d probably hear me say something like “what’s the story?” after I’ve gone through a rough draft of a manuscript, or after I’ve watched a practice talk for an upcoming conference. When I say “story”, what I mean is the narrative and plot that tie together the work that you’ve done into a cohesive whole that the audience can follow and empathise with. In the first chapter of his book Storycraft, Jack Hart cites this definition of story from Jon Franklin:

A story consists of a sequence of actions that occur when a sympathetic character encounters a complicating situation that he confronts and solves.

Story, as Hart says, consists of a recounting of a chronology of events (narrative), and the selection and arrangement of material so that a larger meaning can emerge (plot). As he puts it:

For Eudora Welty “Plot is the ‘Why?’” Or, as the novelist E. M. Forster famously put it, the narrative is that “the king died and then the queen died.” The plot is that “the king died and the queen died of grief.”

I raise these issues because this is a problem that I’ve thought about at length when it comes to scientific communication. You might object that communicating science isn’t about a story, a narrative, or a plot, but I would strongly disagree. When you give a talk at a conference, you do exactly as Hart recounts: you construct a narrative and select material to form a plot (‘we identified some limit to our knowledge, we formulated some hypotheses, we did a test, we got some results, OMG science’), even if this looks nothing like what actually happened. You might be more familiar with this process in its rage form. Don’t fool yourself: this is story crafting. In its simplest form, the scientist is the protagonist, the complicating situation is the unknown s/he is trying to banish as described in the introduction / methods, and the climax is wrapped up neatly in the results before the gentle falling action and dénouement in the discussion.

Story in formal scientific writing is often limited to the imposition of this narrative and plot structure, though stating it this way belies its importance; if you’ve ever reached the end of a journal paper and thought ‘what the hell was that paper about?’ (and we all have), chances are reasonably good that you’ve just experienced a failure of story. But when science is communicated to a wider audience, story begins to feature even more strongly. Whether a piece is written by scientists, science communicators, or journalists, it is easiest to see this at work when the masters of the craft are in action. David Quammen, in his book Spillover, structures his description of the hunt for Ebola and its reservoir around the story of the medical researchers who have tracked it through the jungles of Africa, winding in and out of their struggle to identify the source of the disease and the effects that it has on the people of Africa and elsewhere. It’s a detective story, which Quammen uses as a hook to lubricate the discussion of everything from molecular biology to mathematical epidemiology. But it’s the story that drives us through what would otherwise have been a textbook on epidemiology.

If I haven’t made it clear by now, I think that story’s important. Yet I also think that story has a dark side, one that we must be ever vigilant about as scientists, and it’s this: the push for a good story can obscure the truth. Science is messy, and full of complications and stumbles. There’s not always an answer, or a happy ending, and sometimes what we thought was right for a long time turned out to be incomplete, or even wrong. This fact is what makes writers like Quammen and science communicators like Carl Zimmer so valuable; they capture that messiness without letting it overwhelm the story, and in so doing make our science interesting to people. But if the push for a story goes too far, it can result in over-simplification and even simple and dangerous untruth.

I was reminded of this when I came across a post by one of my favourite writers on visual design, Garr Reynolds; Garr wrote the book Presentation Zen, and a series of other books like it, and I still recommend them to other scientists as a good way to get a handle on how to make your presentations suck less, visually. Recently, however, Garr wrote a post praising a video containing the work and narration of Paul Zak. The post, entitled “Neurochemistry, empathy & the power of story”, is itself curiously meta, as it discusses work by Zak on neurochemical responses to the ‘dramatic arc’; in short, Zak claims that oxytocin and cortisol are part of the neurochemical suite that responds directly to the structure of a story, and can even be used in a predictive fashion (here, to predict the amount of donations that will be given after viewing a tearjerker story of a father dealing with a young child dying of cancer, versus the same father walking in the park with his son).

The irony of this, of course, is that Zak himself is an adept storyteller who has constructed a narrative around oxytocin as the ‘moral molecule’, reducing good and evil to the action of a single neurotransmitter. Here’s an excerpt from a Guardian article1 on Zak from last July:

What drives Zak’s hunger for human blood is his interest in the hormone oxytocin, about which he has become one of the world’s most prominent experts. Long known as a female reproductive hormone – it plays a central role in childbirth and breastfeeding – oxytocin emerges from Zak’s research as something much more all-embracing: the “moral molecule” behind all human virtue, trust, affection and love, “a social glue”, as he puts it, “that keeps society together”. The subtitle of his book, “the new science of what makes us good or evil”, gives a sense of the scale of his ambition, which involves nothing less than explaining whole swaths of philosophical and religious questions by reference to a single chemical in the bloodstream.

Here, we see the danger of story. In constructing a simple story with a compelling and digestible arc, Zak has swept the truth of this research under the rug, and the truth is that research on oxytocin is messy, contradictory, and provides few clear answers. As Ed Yong describes it, oxytocin can have distinctly contrasting effects depending on who receives it; given the same dose in the same situation, some people may exhibit more social behaviour while others exhibit more antisocial behaviour. It can promote trust, or increase xenophobia. It may be that oxytocin is part of some motivational system: for example, people like James Goodson have worked to show that in birds like the zebra finch it2 is implicated in the ‘social behaviour network’ and may be instrumental in zebra finch flocking, though as in many other animals, this effect can be strongly sex-specific (usually to females).

All of this complication and mess is ignored in Zak’s story, which does a disservice to the reader, who comes away with a simple view of the world that just doesn’t hold water. A friend of mine, a lawyer, asked me a while ago if what he’d heard about this ‘cuddle chemical’ was true, and was visibly disappointed to learn that it was much more complicated than that. The problem here is that we are disposed to like a good, simple story; it has more emotional impact, which in turn makes it easier to remember and explain to others. Certainly, nobody wants to spend as much time reading journal articles and learning about nonapeptide hormones like oxytocin as I did for my PhD exam just to tell a story at a party. This is why we have people like Ed, and Carl Zimmer, and Maryn McKenna, and all of the other great science communicators, writers, and science / scientist bloggers: they do the hard work of curating the facts and telling the story without losing the truth. Contrast Zak’s writing with Ed’s takedown of the oxytocin mess. It’s just as good a story, but it treats the truth with respect, and the truth is that we’re just not there yet. We have tantalizing ideas and scraps of evidence on how oxytocin affects us, but we can’t draw definitive conclusions. As Ed discusses, the hype around oxytocin has even led to people using it in an attempt to treat autism, with unknown and possibly harmful effects.

This isn’t an isolated problem. The TED talks have become a serious problem in this regard, and though I’ve seen some great TED talks over the years, they’ve grown to the point where the push for good stories has overwhelmed the ability of science to provide them. I saw the most recent example on Boing Boing, when Maggie Koerth-Baker pointed to a problem in the widely-circulating story spun by 19-year-old Boyan Slat on a plan to remove plastic from the oceans: namely, that it won’t work. Here again, we see the elements of story at work, this time surrounding Slat himself. A 19-year-old phenom who rises to glory on the back of an award-winning school research paper, a hands-on problem-solver producing solutions and starting a foundation to implement them. It’s a feel-good story with a likeable protagonist who is tackling a problem that scares us all; it’s a shame that the scheme probably won’t work, and may even do more harm than good if ever implemented. The issue at hand, though, is that the story told by and about Slat is compelling but oversimplified and potentially dangerous, just as the one told by Zak is3. As Maggie points out in her post:

Here’s a mantra to remember: TED Talks — interesting if true.

And the same is true with anything you read in the popular press about science. It’s interesting, if it’s true.

Now, I began this post by pointing out that I’m a big proponent of story in science, and I stand by that statement. Story is an important – and, I would argue, necessary – tool when we come to communicate the results of our work, for the same reasons that it can go badly wrong. A carefully crafted story draws the audience through the science, ties it together in a way that they can understand and remember, and adds punch to the work so that the audience cares enough to pay attention. Yet this process, while vital, needs to be kept in check by the demands of the search for the truth and the admission of messy detail and incomplete knowledge. The tension between story, which yearns to be complete, and science, where more research is always needed, must be respected and maintained, lest you end up with bone-dry science or a compelling – but misleading – tale.

  1. or as Ed Yong puts it, ‘ad’
  2. under the name of mesotocin
  3. as an aside, I’d like to say that despite the problems inherent in Slat’s plan and how it ended up going viral, I hope that he keeps trying. He sounds like a smart guy, and failure is a great first step on the road to success.