Searching for a promising scholar

  • Are you deeply interested in insect ecology?
  • Are you excited about exploring the intersection of plant defenses and insect overwintering survival using both traditional field work and cutting-edge molecular tools?
  • Would you like to do your graduate degree research at a small, research-intensive university in a great community surrounded by forests in all directions?
  • Are you looking for an opportunity to develop a network of collaborators from a variety of other institutions during your graduate studies?

Our lab is looking for a graduate student (M.Sc. or Ph.D. level) or a postdoc, and a promising scholar of natural history, to join us in studying how pine host defenses affect mountain pine beetle larval overwintering success in its normal hosts (e.g. lodgepole pine) and a novel host (jack pine) across its expanding range. What are the tree chemical defenses that larval mountain pine beetles experience in their early development? How long after a tree has been killed do those defenses remain in the tree tissues? How do those defenses affect larval growth, development, and physiological preparation for surviving the extreme cold of a northern winter? What are some of the specific genes that are instrumental in helping the larvae obtain enough nutrition from the toxic environment of their host tree to allow them to survive the winter? What are the effects of climate and a new host species in a new geographical range on larval survival? How might these affect the spread of this invasive insect?

Work on your project will take place wherever the beetles and the host trees live. That will mean extensive field work in the summers, including trips to other parts of British Columbia, into Alberta, and perhaps beyond. In addition, there will be plenty of lab work throughout the year using techniques ranging from analytical chemistry to RNAi.

This work is funded by a major grant and is an extension of five years of previous successful and highly collaborative work. That means a number of things…

First, it means that there is secure funding for several years of your research. This includes funds for materials, travel, conferences, and publication fees. It also includes funds for student stipends. However, I would strongly encourage applicants to look for their own funding as well. More on that in a moment.

Second, it means that there is a preexisting network of institutions and researchers with whom you will be able to work during the course of your degree. The “network” grant ensures that we continue to maintain close collaborative relationships with other scientists at the University of British Columbia, the University of Alberta, the Natural Resources Canada Canadian Forest Service, and the University of Minnesota. A strong collaborative network, such as the one that we have developed over several years, benefits you as a graduate student both by providing research opportunities during your degree and by potentially opening doors to career opportunities beyond your graduate studies.

Third, it means that you will be working on a solid foundation of results, data, and ideas. You will have the opportunity to push hard against the envelope of our current knowledge. You can find a fairly up-to-date list of papers that have come out of our collaborative research so far at this link. I would specifically recommend that you carefully read those papers that I linked a few paragraphs above – as well as this, this, this, and one more to appear here shortly – prior to considering whether this position might be for you.

Graduate studies are not easy by any stretch. Ask anyone who has done them, or fellow students who are in the midst of their work. But they can be the most richly challenging and rewarding time of your life. While doctoral degrees are awarded to successful candidates for their ability to develop and defend new ideas, explore hypotheses, and communicate findings to various audiences in a robust manner, those things are only an outworking of something deeper. A course of graduate studies is, more often than not, a journey of maturation as a scientist and a scholar. So I am looking for someone to join our research program who can demonstrate that they are ready to embark on that road. Specifically, I am looking for someone:

  • who has shown that they are capable of committed work over an extended period of time,
  • who can work equally well in the lab or in the field,
  • who has shown that they are capable of scholarly output (e.g., papers, presentations at conferences, etc.) even early in their scientific career,
  • who is able to develop novel hypotheses and pursue them with passion,
  • who has a sincere and scholarly interest in insect ecology, and
  • who wants to explore the very edges of what we know about the natural world.

While this project is well-funded, I will be looking for applicants who either have their own funding in hand, or who show the potential to pursue and receive their own funding. As noted above, our grant will allow for a suitable and livable graduate stipend. But finding your own funding is an important part of the graduate degree process: it looks great on your CV, and it provides you with one more layer of security during your time as a student. It also serves to free up some money that can then be used to support more experimentation, to hire valuable research assistants to help with your project, and to allow more trips to present your findings at conferences. UNBC awards entrance scholarships to excellent students, maintains a number of other awards, has a tuition rebate for Ph.D. students, and provides a large number of TA-ships that supplement your income while giving you teaching experience.

If you come to work in our lab, you will find a pleasant group of people excited about their research. You will become a part of a close-knit group of researchers who are interested in many of the things that you are working on. You will also find UNBC to be a vibrant community with lots of great things going on. The surrounding city, Prince George, is a great place to live, with many cultural opportunities in town and fantastic outdoor activities all around. And you can’t beat the reasonable rents or the five-minute commutes – or commute by bike in the summer and on skis in the winter!

If you are interested in this position, please email me at huber@unbc.ca for further details or to ask the questions that you probably have.

Thank you for reading this, and I look forward to hearing from you.

I’m not Grandpa Simpson (although I may sound that way)

Pretty much every morning I check the Google News page, and I generally scroll pretty quickly to the science headlines. Today one of the big headlines was about a fascinating new PLOS ONE study that shows quite conclusively that insects from several orders detect and respond to changes in barometric pressure. Such behavioral reactions in insects make sense as pressure changes usually indicate important changes in the weather that could jeopardize an insect’s reproductive success.

My antennae (no pun intended) immediately popped up because my hazy memory seemed to recall something like this being studied in bark beetles quite some time back. A quick search brought up this 1978 paper. I then went back to the PLOS ONE paper and, thankfully, found that the authors had cited the older paper in their final reference. The 1978 paper itself cites several other studies, dating as far back as 1955, that hint at this kind of phenomenon. My bet is that this general phenomenon was observed prior to 1955, and further digging would take us quite a distance into the past.

The new result is extremely cool, of course. Hopefully it goes some way toward reviving an old idea for some new and fruitful study. I don’t fault the study authors for the general tone of the 24-hour news cycle hype that seems to suggest that this is a brand new idea. The media are like that, and once a story gets into the hands of a journalist it can take on a life of its own no matter how careful the interviewed scientist was to state the full case. I suspect that most reporters rarely read reference sections closely.

That said, this little episode started me thinking a bit – with help from a few of my Twitter friends – about how we do science and how much, or how little, we pay attention to the past. Interestingly, Chris Buddle at McGill University wrote a very prescient blog post just today that provides a perfect example of what I’m talking about. Specifically, there are certain aspects of the way that science is currently done that cause us to rush forward into the future without paying enough attention to the past.

I can think of three reasons for this, and perhaps you can think of others:

1. The tyranny of high-speed novelty – This comes in at least two flavors, but they’re both mixed into the same underlying ice cream. First, science news, like other news, comes at us quickly and from all directions. University communications offices and journal PR departments are eager to capitalize on this for what amounts to free advertising. Although scientists who read these interesting accounts know that science, in general, moves at a modest pace at best, there is likely a subconscious nudge that says “hey, you need to move faster, everyone else is.”

Second, university tenure and promotion committees and granting agencies require ongoing productivity. This makes sense, of course. But the main measure of productivity is the peer-reviewed paper. This means that there are likely many papers that escape into the wild from the lab or field before their results are fully mature. Contrast this to, for instance, Charles Darwin, who spent years studying barnacles to the point where he wrote in a letter:

I am at work on the second vol. of the Cirripedia, of which creatures I am wonderfully tired: I hate a Barnacle as no man ever did before, not even a Sailor in a slow-sailing ship. My first vol. is out: the only part worth looking at is on the sexes of Ibla & Scalpellum; I hope by next summer to have done with my tedious work.

This type of long-term commitment to a study still happens today, but often in special circumstances. Besides the fact that getting funding for Darwin-style barnacle research these days would be woefully difficult, proposing to conduct such a long study with no predetermined timeline or outcome would sink any proposal.

So there are career pressures on one side that push scientists into grant-cycle-length (or shorter) studies, and there is the constant barrage of news stories and CSI-like shows on the other side that give the public the impression that science moves at a furious pace. Both together add up to at least some degree of myopia across the board.

2. Referencing software – In this case I’m talking about products like EndNote and RefWorks, among others, that are very useful but also potentially damaging to the development of a good vision of the past. To this day I still do all of my reference work in papers that I write or edit without these tools. I have tried various software solutions in the past, but I have found that they tend to distance me from the literature and dull my ability to remember what has been done before.

When I physically type in a reference, or even copy-and-paste a citation from a previous paper and revise it to fit the journal’s standard, it forces me to think about that paper and the foundation on which it was built. This often leads me down a reference rabbit trail that can, on occasion, help me to contemplate the topic in a deeper manner. If, on the other hand, I simply input “(Smith et al. 1997)” and the software does the rest, my brain just carries on with what it’s doing and doesn’t necessarily make any new connections. The writing process should inherently be a learning process, and by letting software do parts of the work for us, I fear that one part of that process is going by the wayside.

3. Online access – At this point I’m going to start to sound a bit like this guy, so please just roll your eyes for a moment and then hear me out. First, note that I think that the fact that much of the scientific literature in the world is now online is a great thing. Even better is the fact that a lot of it is either open access or is heading that way.

Now here’s the part where I will start to sound old. When I started my Ph.D. in 1995, the internet as we know it today had just barely gotten off the ground. Prior to that I had been using things like GOPHER, ELM, and PINE… many of you probably have no clue what I’m even talking about here. The long and short of it was that virtually all scholarly outputs were on paper in the library. When I was researching a subject, I would go to the library, use a mainframe-based index (or even the extensive tomes of the paper version of Biological Abstracts), and then get a rolling cart that I’d push around the library. My cart and I would head through the stacks, picking up volumes along the way. Then I’d go to the central photocopying area to copy the articles that I wanted to read. Later, back at my desk or lab bench, I’d read the articles and circle any references that I needed in order to delve back further into the literature. Then I’d make my way back to the library and restart the process. Chasing references was a process that took time and allowed for thought.

Today I fire up my web browser, point it at Google Scholar, do a few quick searches, and then I’m off to the races. If, while I’m reading a paper, I see a reference that interests me, there’s usually a hyperlink there to take me right to that paper. Within minutes my virtual desktop can be full of PDFs, enough to keep me busy reading for weeks. Reference chasing now takes no time at all.

The problem with the old process was that it was painfully slow and labor-intensive. The problem with the new process is that those silicon brains are so fast that they don’t allow time for our human brains to stop and really think. In 1998, when I was standing by the photocopier, I was also mulling over the papers that were dropping into the copier tray. The process forced the time to think on me because I really couldn’t go anywhere else while the copier was doing its job or while I was wandering through the stacks. In 2013 the process should provide me with more time to think because it is a lot faster. But that extra time does not necessarily get filled with contemplation unless I make it so. And there are many pressures – and temptations – that all of us face that can easily turn that technologically gained time into lost time in no time.

It should also be noted that while there used to be a “demographic” gap in the papers that were online, most journals have now archived almost their entire collection (e.g., a shameless plug for a fine journal here). This is to our benefit, if we take advantage of it.

 _____

So what can be done about this? There is not much that we can immediately do about media, administration, granting agency, or public expectations and perception because these are all an ingrained part of the current culture. Cultural shifts take time.

The challenge to scientists, then, is to work to change that culture, one researcher and one act at a time. I am not blaming referencing software or online journals. Far from it. Both are vital parts of the process in our era, and both bring benefits that were hardly imagined a couple of decades ago. But with this technology comes a responsibility to ensure that we are doing things like undertaking long-term studies; reading deeply into the literature; spending time contemplating instead of getting caught up in a Red Queen scenario; making sure that both our students and ourselves understand and explore the deep foundations of current breakthroughs; and doing our level best to get it across to the media that our results are only possible because of work that has been done by others.

We owe it to our students, to the public, to our scientific “ancestors,” to our current colleagues, and to ourselves.

Open access… Canada?

Today marked a major milestone for open science. Specifically, the Obama administration announced a directive requiring all US federal agencies that receive over $100 million in research and development funds to work on creating a plan to ensure open access to all research outputs within a reasonable time frame.

To quote from the Obama administration memorandum:

“To achieve the Administration’s commitment to increase access to federally funded published research and digital scientific data, Federal agencies investing in research and development must have clear and coordinated policies for increasing such access.”

You can read more about it here, and here.

A number of other countries, including Canada, have mandatory open access policies for some of their taxpayer-funded research, but for the most part the policies apply to health-related research. And in many cases you can also find research stemming directly from federal scientists freely available on the web.

In some cases (e.g. the UK, Australia, and a few others) open access is mandated for all federally funded research. And now that the US has taken this step to full openness, I think that it’s fair to say that there is a lot of pressure on countries that haven’t done the same to get moving down that track.

I’m looking at you, Canada!

Like many other countries on that list, Canada has some mandatory open access policies, but they mainly pertain to health sciences. There have been rumblings of more openness from the Canadian government, as noted by one of my Twitter contacts:

…but the steps taken by the UK, Australia, and now the US are good indicators that Canada’s steps so far have been baby steps at best. It’s time for that to change.

Why should we, as Canadians, call for a mandatory open access policy for all federally funded research? Here, in brief, are a few reasons that come to mind, and I know that there are more:

  • Fairness. Taxpayers paid for the research. Why should they also have to pay to access the results of the research?
  • Open access accelerates the pace of discovery. Although I’m at a small university, the UNBC library is well-stocked with many journals that the folks in my research program and I use. But we occasionally come across articles that we need that are unavailable. The choice then is to keep looking for the information elsewhere, pay up at the paywall, or go through the interlibrary loan process. Our librarians are superb at getting access to individual journal articles that we need, but not everyone is lucky enough to be affiliated with a good library at a good institution. There are many scientists who do not have access to these kinds of services, and they either have to pay or hope to find the information elsewhere. And most members of the general public have absolutely no access to such services at all. Open access removes those barriers and allows research to move ahead more efficiently.
  • Open access makes research more relevant and reduces the temptation to “hoard” data. Open access allows other researchers and the general public to look at research outputs in all sorts of unpredictable ways. Full accessibility lets the full diversity of interests see and think about the work and, hopefully, take it to new and unpredictable places. In addition, while my little corner of the scientific endeavor (forest entomology, for the most part) is generally not beset by researchers afraid of being “scooped,” this tendency is present to some extent in all fields, and to a large extent in certain fields. Hoarding data in the hope of reaping the research glory results in competitive, rather than collaborative, use of research dollars. Replicated efforts in several competing labs may drive research to move faster, but they also suck up declining research dollars in identical endeavors. Open access, and particularly the tendency toward open data that comes along with it, erodes these tendencies and promotes collaboration instead. The rise of biological preprint servers such as PeerJ PrePrints and the biological portion of Arxiv also facilitates the erosion of meaningless competition.
  • Open access makes research institutions more relevant. In an era when universities are struggling with funding and, in some cases, public perception, the ability to freely disseminate the useful products of research to the public provides incentive for taxpayers to pressure governments for better funding of postsecondary education. If research results are behind paywalls, they remain mainly unknown to the public and, thus, irrelevant. If the results are irrelevant, so are the institutions in which they were produced.
  • Open access allows the public to see firsthand the evidence-based results that should be driving public policy. Ideally, all governments should consult honestly with scientists about medical, environmental, social, and other issues as they create policy. Realistically, most governments do this only as much as is optimal for their own political agenda. By removing all restrictions to access to research outputs – combined with a growing tendency for scientists to explain their research results to the public – governments will also have to be more transparent in their consultations with researchers. Perhaps we can move to a time when research drives policy rather than seeing policy attempt to drive research.

It is, indeed, fantastic to see the US take this big step. And, as noted above, the US is not the first country to do this. It’s now time for the Canadian public to ask our government to start taking this issue more seriously as well.

PeerJ, today!

Along with being Darwin’s birthday, 12 February 2013 marks the official launch of the first articles on PeerJ.

In case you haven’t heard about it already, PeerJ is a brand new open access journal, with a twist. Or, actually, a few twists.

For instance, instead of a pay-per-article fee, PeerJ has all authors buy a lifetime membership in the journal. There are several levels of membership, depending on how much publishing you think that you might do on a yearly basis. And there are no yearly renewal fees. Instead, you maintain your membership by taking part in journal activities. For instance, if you review one article a year, your membership will stay active. This fee/membership model allows for an ongoing revenue stream (when members publish with new co-authors who are not yet members), and also stimulates ongoing and growing involvement in the journal by a diverse group of scientists.

Another welcome innovation that some other open access journals are also embracing is the insistence that authors co-publish their data with their paper in a repository such as figshare. This concept is not new to many disciplines. Genomics researchers have been publishing data along with their papers for years using repositories such as those provided by NCBI. But with the growth of the internet, there is no reason that all data associated with a paper can’t be publicly and permanently available in a citable format. By making data public in this way it is easy to anticipate that others will be able to use and build on the data in new and exciting ways.

PeerJ also commits to publishing any work that is rigorous, no matter how “cool” or “sexy” it is… or is not. To quote: “PeerJ evaluates articles based only on an objective determination of scientific and methodological soundness, not on subjective determinations of ‘impact,’ ‘novelty’ or ‘interest’.”

And one last twist that I’ll mention (please see this launch-day blog post from PeerJ for more information): authors can choose to publish the full peer review documentation alongside their accepted article. Besides giving some great insight into the review process, this also allows readers to study other expert opinions on the work and come to their own conclusions.

PeerJ has an impressive advisory board that includes five Nobel laureates. It also has a huge and diverse board of academic editors, of which I’m a member (no Nobel Prize for me yet, however). I also have the honor of having been the handling academic editor on one of the first thirty articles in PeerJ.

And, one last note. PeerJ PrePrints is also going to come online in a few weeks. If you are familiar with physics and mathematics, you doubtless have heard of preprint servers such as Arxiv. Researchers in those fields have been publishing their preprint (nearly final draft) papers online for years. This is a constructive practice, as it allows the larger community to see and comment on results as they come out. This both strengthens the eventual manuscript for final publication and allows the research community to use the results immediately instead of waiting for the final publication. Of course, it also helps the researcher to establish priority for the work.

Historically, many journals in biological fields have had issues with the use of preprint servers as they have considered such early deposition of a manuscript as “prior publication.” This, too, is changing and I expect that the growing use of PeerJ PrePrints, and others like it, will make the change final.

I am under no illusions that the shift to a more open publishing and data sharing paradigm will be completely smooth sailing. As with anything new, there are going to be challenges and opposition from some corners to doing things in a new way. But the internet has changed the way that we do everything else in our society, often for the better. There is no reason that academic publishing and the dissemination of research outputs should remain in the era of the printing press. PeerJ, and other publishers, are working diligently to guide our larger research community through this process of continual innovation.

Exciting times!

—–

Update: Some great coverage here, here, and here.

Twitter JAM

I just returned home last night after spending a few days in Edmonton at the Joint Annual Meeting of the Entomological Society of Canada and the Entomological Society of Alberta. It was a well-organized meeting with lots of great talks and posters. And, of course, lots of time to reconnect with colleagues from other universities.

A number of entomologists at the meeting, including myself, have Twitter accounts, so we “live tweeted” some of the sessions that we attended. The conference hashtag was #ESCJAM2012, in case you want to take a look at the Twitter record of the event.

From my perspective, live conference tweeting was generally a positive experience, although I say that with a few caveats. Here are my brief thoughts on the Twitter JAM:

1. I enjoyed being able to read about what was going on in other concurrent sessions. My fairly packed schedule this year did not give me much leeway to move from session to session. With so many concurrent sessions, I would have ended up missing interesting talks regardless. So it was good to have at least a taste of what was going on elsewhere. Some of the conference tweets encouraged me to talk to others about research presentations that I didn’t get to attend.

2. I can imagine how this practice is useful for professional and citizen scientists who are not able to attend a meeting. I know that if I were not at the #ESCJAM2012, I would have been following along from my office desk. I plan to virtually attend conferences like this in the future.

3. I noticed that live tweeting can be distracting in a number of ways. First, I often worried that I was distracting my neighbors when I pulled out my iPad to compose and send a tweet during a talk. Although I tried to sequester myself near the back edges of rooms (not great for face-to-face networking), I would sometimes get glances when my iPad lit up. Second, the act of composing and sending a tweet distracted me for a few moments from what was going on up front. There were a few times when I knew that I had missed an important point. And third, I know that a few of my followers found the stream of insect tweets to be a bit of a hassle. None of these are insurmountable, but all are issues that we need to be aware of.

4. Some tweets are better than others, including tweets at a scientific conference. Was every one of my tweets useful? I doubt it. Did every one of my tweets fairly represent the talk that I was listening to? Is that even possible in 140 characters? Obviously not. As Marshall McLuhan famously intoned, the medium is the message. Ultimately, is Twitter the best medium for science?

5. To expand on point #4, the best tweets were the ones that contained added value. A great example of this was a “toy” built by David Shorthouse that “caught” tweets with the #ESCJAM2012 hashtag and a species name and then pulled up a bunch of related references.

This is but one example of how Twitter can, in fact, punch above its 140 character weight.

In a much less technical fashion, in one or two instances I dug up new or classic papers related to a presentation and provided the URL(s) in a tweet.

Of course, that whole process took even longer than a regular tweet because I had my nose buried in Google Scholar; so we’re back at point #3. Some form of automation, perhaps similar to that also envisioned by David, could do what I did more effectively without me actually having to poke away at my iPad while only partly paying attention to someone who spent a lot of time putting together a good presentation.
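To make that a bit more concrete, here is a minimal, purely illustrative sketch of how a hashtag-plus-species-name filter like David’s toy might work. To be clear, this is not David’s actual code: the tiny species list, the function name, and the Google Scholar query below are all my own stand-ins for whatever taxonomic name index and reference source the real tool used.

```python
# Illustrative sketch only: scan conference tweets for known species names and
# turn each hit into a literature-search link. Not the real tool's code.
from urllib.parse import quote_plus

HASHTAG = "#escjam2012"

# Stand-in for a proper taxonomic name index (e.g., one built from a checklist).
KNOWN_SPECIES = {
    "Dendroctonus ponderosae",
    "Pinus contorta",
    "Pinus banksiana",
}

def reference_links(tweets):
    """Return (species, search URL) pairs for hashtagged tweets that mention a known species."""
    hits = []
    for tweet in tweets:
        if HASHTAG not in tweet.lower():
            continue  # ignore tweets that aren't part of the conference stream
        for species in KNOWN_SPECIES:
            if species.lower() in tweet.lower():
                url = "https://scholar.google.com/scholar?q=" + quote_plus(f'"{species}"')
                hits.append((species, url))
    return hits

if __name__ == "__main__":
    sample = [
        "Great talk on cold tolerance in Dendroctonus ponderosae larvae #ESCJAM2012",
        "Coffee break! #ESCJAM2012",
    ]
    for species, url in reference_links(sample):
        print(species, "->", url)
```

Run on the two sample tweets, this prints a literature-search link for Dendroctonus ponderosae and skips the coffee-break tweet; a real version would presumably swap the hard-coded name list for a full taxonomic index and pull actual references rather than a search URL.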

6. Science is becoming more and more open, and that is a good thing. Journal articles and conferences were originally intended to increase the flow of information, ideas, and data. For many, many years both have done just that. But the web-connected world means that those vehicles don’t always do that as well as they used to in their fully traditional form; nor do they do it as well as they could considering the available technology. Just as paywalls at journal sites act to slow the flow of information compared to innovative open access options, conference travel and fees represent a paywall as well. We now have the technology to tear down those conference walls so that all of our colleagues and the general public can benefit from and build on our ideas. Twitter might be part of the paywall wrecking crew, at least in the near term.

7. What if each session at a conference had a designated tweeter (DT)? Sessions already have a moderator and a projectionist, and I can imagine adding a DT to that mix. Each DT in each concurrent session would tweet into one unified conference account (e.g. @ESCJAM2012). Each session would have its own separate hashtag (e.g. #ESCforestry, #ESCbiodiversity, #ESCevolution, #ESCecology). The choice of DT for a session would be based on their interest and expertise in order to make the tweets as relevant as possible. In other words, thought would go into the choice of a session DT; the DT wouldn’t necessarily be the first available volunteer. Others in the sessions would be encouraged to participate as well, but general participants would not feel like the tweeting burden was on them. General participants could maintain good focus – why even meet in person if your nose is in your device half of the time? – and could tweet from time to time if they felt there was a reason or had the expertise to add value to the online conversation. But whatever the general participants decided to do, the session would be broadcast in an effective manner by an engaged and expert DT.

Do you have other thoughts on this practice? Where do you see this going in the future? Is live tweeting simply a road stop on the way to standardized full broadcasts of conferences? What, if anything, does tweeting bring to the table that is missing from face-to-face interaction or that couldn’t be realized through other non-electronic means? What hesitations do you have about this practice? How has live tweeting been a benefit to you or to others who you know?

Live tweeting, or something like it, seems to be the direction that we’re heading. It’s time for some frank discussion about the best ways to make scientific conferences more open to all. So tweet away!