Tag Archives: Academia

“I feel thin, sort of stretched, like butter scraped over too much bread” – Why scientists need to learn to say “no” to themselves


Earlier this week I had to write an email to some potential collaborators that I really wasn't looking forward to sending.  I'd been doing some hard thinking since Christmas and had decided not to go ahead with a grant submission for a project that was my idea, one that I had initiated.  I was now pulling back from it and feeling as though I was letting people down.

The fundamental reason is lack of time: I am really over-stretched at the moment.  Just before the Christmas break I received word that two grants I'm involved with, one funded by NERC, the other by the Australian Research Council (ARC), were both successful.  This is on top of four existing projects, funded by NERC, BBSRC, the Heritage Lottery Fund and Butterfly Conservation, plus the non-funded work I'm doing.  One of my tasks this week was to add a Current Projects and Collaborations page to this blog, so I can keep track of what I'm doing as much as anything!  Although I'm a minor partner in many of these projects, it's still a lot of work to keep on top of everything, on top of teaching, being departmental lead for the Research Excellence Framework, and so on.  I'm also trying to complete a book which I've promised to deliver to the publisher soon.  And blogging, of course…

There’s a line in the Lord of Rings in which Bilbo tells Gandalf that “I feel thin, sort of stretched, like butter scraped over too much bread.” Intellectually that’s how I’m feeling at the moment.

It’s my own fault, I say “yes” to things too readily, something which a lot of academics do and which is being widely discussed on Twitter and in other blogs.  Most of this discussion focuses on saying “no” to other people, to manuscript and grant reviews, to offers of collaborations, and so forth.

But I think it’s just as important that we learn to say “no” to ourselves.  We need to realise that, however great an idea that we’ve had is or however enthusiastic we are about a project or a paper or a book or organising a conference, if we don’t have the time and energy to follow through and do it properly, we are selling ourselves and our collaborators short.

Of course this is easy to say but not so easy to put into practice.  There are a lot of external pressures on academics to write more grant proposals and papers, to do more work on the impact of their research, to take on tasks within and without their institutions, and thus spread themselves too thin.  Being a scientist and teacher in a university is a great job and I feel very fortunate to be doing what I do.  But in the long term we’re doing no one any favours, not least our employers and our families, if we burn out early.

9 Comments

Filed under Biodiversity, University of Northampton

Which h-index should I use? UPDATED


UPDATE:  Thanks to my kind commenters (below) who pointed out that one can edit one's Google Scholar profile, taking out papers that don't belong, merging variants, etc.  It had been a while since I'd looked at Google Scholar and perhaps I knew this in the past but had forgotten.  However I had an issue with it linking to my Google account and so had to delete the old profile and set up a new one.  That seems to have worked OK: I have got rid of the publications that weren't mine, and my h-index looks to be fairly accurate at 38.  I have adjusted the text below to reflect this.

————————————————————————-

Despite some (well-founded) criticism as to its usefulness, the h-index seems to be here to stay.  In a couple of posts I've articulated some of its advantages and disadvantages – see for example What's the point of the h-index? and How does a scientist's h-index change over time? – and it's clear that more and more funding agencies are using it to evaluate the track record of applicants.  Just this afternoon I finished the second of a couple of grant reviews in which the applicant was asked to state their h-index.  What they were not asked was which h-index they should state, i.e. the source of the value, though I think that this is important information.  Why?  Because the value varies so much depending on where it comes from.  I'll give you an example – here are my own h-index values taken from a few different sources:

Google Scholar: h = 38

ResearchGate: h = 36

ResearchGate (excluding self citations): h = 34

Web of Science (all databases): h = 34

Web of Science (Core Collection): h = 29

Scopus: h = 29

There’s a 10 point difference (almost 25%) between the largest and the smallest values.  So which one should I cite in grant applications, on my CV, etc.  Well the largest one, obviously!  Right?  Well maybe, but not necessarily.  In fact none of these values are completely accurate, though some are more accurate than others.

Web of Science includes papers and book chapters that don't belong to me, and removing them would easily shave a couple of points off that value.  Some of these mis-attributions are chapters from a volume that I co-edited.  Some are papers that I edited for PLoS ONE and which have been assigned to my record.  Others belong to the two or three other researchers named "J. Ollerton" who are out there.  Google Scholar had some entries which were just bizarre, such as "The social life of musical instruments" by Eliot Bates, which Google Scholar seemed to think I wrote and had credited me with its 102 citations.  However, as you can see from the update above, I've now corrected this.

Web of Science and Scopus don't pick up as many citations in books or reports as Google Scholar does, which is a deficiency in my opinion.  Being cited in a peer-reviewed journal is often thought of as the gold standard of citation, but frankly I'm very happy to be cited in government and NGO reports, policy documents, etc., which themselves may often be peer reviewed, just by a different type of peer.

Poised in the middle of this range, ResearchGate may be the most accurate, but it lacks transparency: as far as I can see there isn't a way to look at all of your citation data per paper in one go; you have to look at each publication individually (and who has time for that, frankly?)

As far as calculating an accurate h-index is concerned I don’t think we will ever come to an agreement as to what should be considered a publication or a citation.  But systems like Google Scholar and Web of Science should at least try to be accurate when assigning publications to an individual’s record.

So which h-index should you use?  In the interests of accuracy and honesty I think it’s best to state a range and/or add a proviso that you have corrected the value for mis-attribution of publications.  In my case I’d say something like:

“Depending on source my h-index lies between 29 (Scopus) and 38 (Google Scholar)”.

If the h-index is to have any value at all (and there are those who argue that it doesn’t and shouldn’t) then it requires us as scholars to at least try to make it as accurate as we can.  Because frankly I don’t think it’s going to go away any time soon.

 

 

12 Comments

Filed under Biodiversity

Do reference management systems encourage sloppy referencing practices?

Over at the Dynamic Ecology blog there’s an interesting discussion going on about “how to keep up with the literature” that’s relevant to all fields, not just ecology.  Spoiler alert: it’s impossible to “keep up” if “keep up” means “read everything”.  But do check it out as there’s lots of good advice in that post.

One of the topics that's arisen in the comments is the use of reference management systems such as Endnote, Refworks, Zotero, Mendeley, etc.  Everyone has their own preferences as to which to use, and there seem to be advantages and disadvantages to all of them.  However a minority of us (so it seems) don't use any kind of reference management system, which strikes those who do as very odd.  Personally, I tried Endnote a long time ago; it was OK, but then I lost the database when an old computer bit the dust.

I’m not sure how much more efficient/effective I would be as a publishing academic if I was to get back into using a reference management system. One of the supposed advantages of these systems, that they will format references to the specific requirement of a particular journal, seems to me to be a double-edged sword.  I actually find re-formatting references quite relaxing and I think (though I may be wrong) that it develops attention-to-detail and accuracy skills that are useful in other contexts.

Also I suspect, but have no proof, that reference management software is responsible for perpetuating errors in the reference lists of papers that then result in mis-citations on Web of Knowledge, etc.  My suspicion is that this has got worse over time as people rely more and more on reference management software rather than their brains.  These citation errors can have an impact on an individual’s h-index, as I mentioned in a post last year.

By coincidence yesterday I spotted a hilarious example of just this kind of mis-citation that I think can be blamed on a reference management system. This paper of mine:

Ollerton, J., Cranmer, L. (2002) xxxxxxx Oikos xxxxxx

was rendered in the reference list of another paper as:

Ollerton, J., Cranmer, L., Northampton, U.C., Campus, P. (2002) xxxxxxx Oikos xxxxxx

The last two “authors” are actually from the institutional address – University College Northampton, Park Campus! [UCN is the old name for University of Northampton].

Now in theory that shouldn’t happen if an author’s reference management software is doing its job properly, and information has been correctly inputted, but it does happen: errors are not uncommon.  In addition (it seems to me) authors often don’t check their reference lists after they have been produced by the reference management software. That’s sloppy scholarship, but I can understand why it happens: people are busy and why bother if the software is (in theory) getting it right every time?  It also shouldn’t happen at the editorial production end of things, because references are usually cross-checked for accuracy, but again it does, even for top-end journals (in this case from the Royal Society’s stable!)

Again it’s anecdotal but I’m also noticing that reference lists in PhD theses that I examine are getting sloppier, with species names not in italics, various combinations of Capitalised Names of Articles, unabbreviated and abbrev. journal names, etc. etc.

Does any of this really matter?  Isn't it just pedantry on my part?  Whilst the answer to the latter is undoubtedly yes, I think it does matter, because attention to detail at this very basic level gives the reader more confidence that attention has been paid at higher levels, such as citing accurate statistics from primary sources to back up statements rather than relying on secondary sources, as Andrew Gelman discussed in an old blog post on referencing errors.

But maybe I’m a lone voice here, I’d be interested in your thoughts.

21 Comments

Filed under University of Northampton

How many non-peer-reviewed publications should a scientist produce?

Peer-reviewed writing moves science forwards; non-peer-reviewed writing moves science sideways.  

That’s my publication philosophy in one sentence.  In other words, when scientists write research papers and book chapters that are peer-reviewed, the underlying rationale is that we are adding to the sum total of human knowledge, providing insights into a topic, and moving a field forwards. When we write non-peer-reviewed articles we are generally writing about science for a broader audience, with little original content (though perhaps with some original ideas).  This moves concepts out of a narrow subject area and into the purview of wider society, which can be other scientists in different fields, or government agencies or policy makers, or the general public.

There can be exceptions to the rule, such as the IPBES pollinators and pollination report that I’ve been discussing this year. The report was widely peer-reviewed but is intended for a much broader audience than just scientists.  Conversely, non-peer-reviewed critiques and responses to published papers can clarify specific issues or challenge findings, which will certainly move science forward (or backwards into muddier waters, depending on how you view it).  However, in general, the principle stated above holds true.

This raises the (admittedly clunky) question I’ve posed in the title of this post: just how much non-peer-reviewed publication should a scientist who is an active researcher actually do?  How much time should they spend writing for that wider audience?

It’s a question that I’ve given some thought to over the 30 years1 that I’ve been writing and publishing articles and papers.  But a couple of posts on other blogs during the past week have crystalised these thoughts and inspired this post.  The first was Meghan Duffy’s piece on Formatting a CV for a faculty job application over at the Dynamic Ecology blog. There was some discussion about how to present different types of publications in the publication list, and notions of “sorting the wheat from the chaff” in that list, which seemed to refer to peer-reviewed versus non-peer-reviewed publications.

One of the problems that I and others see is that the distinction is not so clear-cut and it's possible to publish non-peer-reviewed articles in peer-reviewed journals.  For example, the "commentary" and "news and views" type pieces in Nature, Science, Current Biology, and other journals are generally not peer reviewed.  But I'd certainly not consider these to be "chaff".  To reiterate my comment on Meghan's post, all scientific communication is important.  As I've discussed in a few places on my blog (see here for example), and as plenty of others have also talked about, scientists must write across a range of published formats if they are going to communicate their ideas effectively to a wider audience than just the scientists who are specifically interested in their topic.

Peer-reviewed publication is seen as the gold standard of science communication and it is clearly important (though historically it's a relatively recent invention, and scientific publications were not peer reviewed for most of the history of science).  So why, you may be asking, would scientists want to write for that wider audience?  One reason is the "Impact Agenda" on which, in Britain at least, there's been a huge focus from the Research Excellence Framework (REF) and the Research Councils.  Grant-awarding bodies and university recruitment panels will want to see that scientists are actively promoting their work beyond academia.  That can be done in different ways (including blogging!) but articles in "popular" magazines certainly count.  I should stress, though, that this wider, societal impact (as opposed to academic impact, e.g. measures such as the h-index) is not about publishing popular articles, or blogging, or tweeting.  Those activities can be part of the strategy towards impact but are not in themselves impactful – the REF would describe them as "Reach"².

The second recent blog post that relates to the question of peer-reviewed versus non-peer-reviewed publications is Steve Heard's piece at Scientistseessquirrel on why he thinks it's still important to consider journal titles when deciding what to read.  He makes some important points about how the place of publication says a lot about the type of paper one can expect, based on the journal's title alone.  But the focus of Steve's post is purely on peer-reviewed journals, and (as I said above) it's possible to publish non-peer-reviewed articles in those.  I think it's also worth noting that there are many opportunities for scientists to publish articles in non-peer-reviewed journals, and these can have real value.  Deciding whether or not to do so, however, is a very personal decision.

Of the 96 publications on my publication list, 65 are peer-reviewed and 31 are not, which is a 68% rate of publishing peer-reviewed papers and book chapters.  Some of the peer-reviewed papers are fairly lightweight and made no real (academic) impact following publication, and (conversely) some of the non-peer-reviewed articles have had much more influence.  The non-peer-reviewed element includes those commentary-type pieces for Nature and Science that I mentioned, as well as book reviews, articles in specialist popular magazines such as New Scientist, Asklepios and The Plantsman, pieces for local and industry newsletters, and a couple of contributions to the literary journal Dark Mountain that combine essay with poetry.  This is probably a more diverse mix than most scientists produce, but I'm proud of all of them and stand by them.

So back to my original question: is 68% a low rate of peer-reviewed publication?  Or reasonable?  I’m sure there are scientists out there with a 100% rate, who only ever publish peer-reviewed outputs.  Why is that?  Do they really attach no importance to non-peer-reviewed publications? I have no specific answer to the question in the title, but I’d be really interested in the comments of other scientists (and non-scientists) on this question.


¹ I had to double check that, because it seems inconceivable, but yes, it's 30 years this year. Gulp.

² Impact is how society changes as a result of the research undertaken.  So, for ecologists, it could be how their research has been translated into active, on-the-ground changes (e.g. to management of nature reserves, or rare or exploited species), or how it's been picked up by national and international policy documents and then influenced policies on specific issues (invasive species, pollinator conservation, etc.)

15 Comments

Filed under History of science, Poetry

Research is Writing is Research is Writing is Research

“this is an interesting approach because it collapses the distinction between doing ‘research’ and writing ‘it’ up”

For years I’ve tried to impress this idea upon my PhD students and postdocs, that writing IS part of the research, and that “writing up” research is, at best, an inaccurate way of describing the process, even in the sciences. It’s had mixed success because it’s a difficult message to get across until they experience it for themselves and appreciate the importance of writing as they go along, even if much of what they write doesn’t end up in the thesis or research paper.

The quote comes from a recent post on Stuart Elden’s Progressive Geographies blog, and he in turn highlights a post by Raul Pacheco-Vega called “What counts as academic writing?”  Both are well worth reading, though Pacheco-Vega’s discipline of writing for two hours every day certainly won’t suit everyone (myself included).

 

5 Comments

Filed under Biodiversity

Advice for senior scientists and the importance of first-author publications

The internet is awash with bloggers and dedicated sites giving advice to early-career scientists and graduate research students (what I’ll collectively refer to as ECRs).  Much of it is very good (see for example The Thesis Whisperer, any number of posts over at Dynamic Ecology and Small Pond Science, and the University of Northampton’s own Research Support Hub), though sometimes it’s contradictory and comes down to matters of taste and opinion (see for example the differing comments on a post of mine about giving effective conference presentations).

There are also any number of books, including Peter Medawar’s Advice to a Young Scientist and James Watson’s Avoid Boring People (hopefully to be followed up with a sequel entitled Avoid Alienating People With Crass Statements)*.

But there is very little guidance and advice out there for more senior scientists who are mid- to late-career.  I did a quick search and found only one article that mentioned this topic, specifically about mid-career mentoring, and that was from 2012.

Why is this?  Is it because (as I suspect) more senior scientists are assumed to have their careers sorted out: they "know the ropes", they are well networked and well published, and have only a bright, sunny future in academia to look forward to?  Clearly this is nonsense; as that article I linked to stated:

Do complicated career issues evaporate after tenure** and/or do we all magically know how to deal with everything that academe throws at us? No, and no

So I’d be interested in hearing any bits of advice or guidance, or links to useful resources, and would encourage new posts by other bloggers, related specifically to more senior scientists in academia.  To get the ball rolling, my contribution would be: make sure you keep publishing as a first-author (and preferably single-author) throughout your career.

In academia it’s easy to get lost as to what it actually is to be a scientist (idea generator/data collector/analyser/writer) in amongst all of the other requirements and pressures of the job at a senior level (grant writing/committee memberships/teaching/administration and paperwork/manuscript and grant reviewing/editorial duties/ECR supervision and line management/external meetings and advisory groups/etc.)

As a senior scientist it’s possible to publish good papers frequently as last author (indicating seniority as head of the research group and/or ECR supervisor), and as mid author in amongst tens or hundreds of other scientists with whom you are collaborating on some level.  In these papers other people are conducting the bulk of the “science”, and that’s fine, I publish in both of these ways myself.  But the question then arises, that if this is all that a senior scientist is currently doing, have they lost something of themselves as scientists?  Have they become something more akin to a science-manager than a “real” scientist (whatever that actually means)?

Personally, I try to publish at least one first-author output (not necessarily a peer-reviewed paper, could be a commentary or a popular article) each year, and have succeeded in most years.  I believe (though I may be fooling myself) that it keeps me in touch with what it is to be a scientist and why I became one in the first place.  For reasons I can’t fully articulate it feels important to me to be involved in research and writing in which I do the bulk of the data collection, analysis, and/or writing myself, and to see an output through the editorial process from manuscript preparation to submission, dealing with reviewers’ comments, and to final publication.

Is this a reasonable goal/expectation for a senior scientist?  It’s important for me but I can well understand that other scientists will have other priorities, different things that they focus on.

Coincidentally, as I was finishing off writing this post, Dr Kath Baldock drew my attention to this short piece by Kaushal et al. entitled Avoiding an Ecological Midlife Crisis that's just been published in the January issue of the Bulletin of the Ecological Society of America.  Although specifically focused on professional ecologists, their advice to "nurture the original connection to nature" will surely resonate with scientists from all fields if we replace "nature" with other, discipline-specific words and phrases.

 

*This is all very positive and as it should be: ECRs need advice and guidance as to how to navigate their profession, and that needs to come from multiple sources because sometimes (often?) their own institution doesn't give adequate guidance.  However I do have some misgivings about more senior scientists advising their more junior colleagues based on their own experiences: the world of academia is a fast-moving place and what applied to a previous generation may not necessarily apply to the current one.

**It’s an American article: British universities don’t even know how to spell “tenure”.

9 Comments

Filed under University of Northampton

Building a blog readership takes time revisited; and seven good reasons for academic blogging

Almost 12 months ago I wrote a post entitled “Building a blog readership takes time” and summarised how the audience for my own blog had increased slowly at first and then seemed to rapidly take off after about 18 months.  The post received a lot of interest and more comments/pingbacks than usual, including a comparison with the first year of posting by the Ülo Niinemets’ Lab blog.  So I thought I’d update the figure to look at what has happened in the intervening 11 months; here it is:

Blog stats - January 2016

As you can see, the upward course of monthly views has continued, increasing from an average of 1000–2000 in autumn/winter 2014 to 3000–4000 at the moment.  However the variance has also increased, and views have become less predictable over this timescale; for example, views for December 2015 were actually lower than for the same month in the previous year.  The >7000 views for August 2015 is clearly an outlier, an anomaly caused by a deliberately provocative post entitled "Who is feeding the honey bee bullshit machine?"  It will be interesting to see if this variability continues and I'll report back in another year (!)

Meanwhile, over at the Times Higher, Prof. Pat Thomson from the School of Education at the University of Nottingham has written a piece on "Seven reasons why blogging can make you a better academic writer".  The seven reasons are:

Blogging can help you to establish writing as a routine*

Blogging allows you to experiment with your writing “voice”

Blogging helps you to get to the point

Blogging points you to your reader*

Blogging requires you to be concise*

Blogging allows you to experiment with forms of writing

Blogging helps you to become a more confident writer

Those I’ve marked with an asterisk* are the ones that chime most with my experience, but this is clearly very personal and it’s worth reading the whole piece for yourself.  Happy Blogging in 2016!

 

1 Comment

Filed under Uncategorized

The Altmetric Bookmarklet – an instant measure of the reach of academic publications [UPDATED]

Academics seem to be obsessed with metrics of all kinds at the moment, and I'm certainly not immune to it, as my recent post on the h-index demonstrated.  So I was intrigued by a new (at least to me) browser plug-in that gives you instant altmetrics, such as the number of times a paper has been mentioned on Twitter, Facebook or news outlets, or cited in blogs, policy documents, Wikipedia, etc.  It's called the Altmetric Bookmarklet and can be downloaded (or rather dragged from the screen to the bookmark bar of your browser) from here.

I’ve given it a spin and it seems to do what it says it can do, within narrow publisher and time limits (2011 onward for Twitter, for instance).  It’s very, very simple.  Just find a paper that you are interested in, on the publisher’s official website; here’s a recent one by my colleagues Duncan McCollin and Robin Crockett – click on the Altmetric Bookmarklet (circled):

Altmetric 1

That gives you a drop-down of the current summary altmetrics for the paper, which tells us it's been tweeted by 14 people and mentioned on one Facebook page:

Altmetric 2

(As an experiment I’m going to see if it picks up this blog post once it’s live and will update below*).

If you select “Click for more details” you go to a new page that gives you…. more details:

Altmetric 3

And by selecting the different tabs you can see, for instance, exactly who has tweeted the paper:

Altmetric 4

It also gives you an altmetrics score for the paper (in this case 10) but it’s unclear to me how that’s calculated.  Does anyone know?

That’s all there is to it.  Is it possible to waste a lot of time playing around with this?  Yes.  Will it prove to be useful?  Only time will tell.  But it’s an interesting way of tracking the reach (and potential future impact) of your publications.

*UPDATE:  The Altmetric Bookmarklet had picked up the mention of the paper on this blog in less than 24 hours.

11 Comments

Filed under Uncategorized

The uneasy academic and the importance of dipping outside your discipline: reflections on The Urban University conference

Uneasy Academic

It’s important for academics to occasionally move out of their disciplinary comfort zones and to interact with academics and practitioners from beyond their own silos, experiencing approaches that are alien and hearing voices that are not repeating the normative values of their own subject area.  Time spent in this way can be both stimulating and mundane, enlightening and boring, exciting and frustrating.  Above all, unpredictable.  At an ecological conference I know what I will experience; drop me into one devoted to the arts or social sciences, and anything can happen.  It’s an uneasy experience.

With that in mind I spent the end of last week attending a conference at which I was the lone scientist speaker, and indeed one of the very few people with a science background in the audience, as far as I could tell.  The Urban University was sub-titled "Universities as place makers and agents of civic success in medium sized towns and cities" and was largely aimed at urban planners, architects, policy makers, and social geographers.  Not muddy-boots ecologists.  However I'd offered the organisers (the University of Northampton's Collaborative Centre for the Built Environment) a 30-minute talk about the monitoring work we've been doing on the bird assemblage at Northampton's new Waterside Campus, which I discussed in an earlier post.  The abstract for my talk is below, co-authored with my colleagues Janet Jackson and Duncan McCollin, plus two of our undergraduate students, Jo Underwood and Charlie Baker.

I had hoped that providing a very different perspective on the role of an urban campus, one focussed on the biodiversity it can potentially support and the ecosystem services that stem from it, might be of interest to this broad-based audience.  In the back of my mind I also thought it might be fun to reverse roles and, for 30 minutes, make them the uneasy ones.  It’s always hard to judge but I got the impression afterwards that the talk was well received and it elicited some discussion and questions.

Overall it was a stimulating couple of days and (I think) I’ve learned a lot, or at least learned more about the approaches and priorities of academics and practitioners beyond my immediate field. The talks ranged from the rather abstract to the very practical, from theoretical discussions to local activism. Particular highlights for me were:

John Goddard‘s overview of the relationship between the university and the city, and the fact that many academics don’t feel a personal link, or responsibility, to the urban centre in which they work.

Allan Cochrane discussing the unintended consequences of a university’s economic and social power, including gentrification and studentification of local residential areas.

Robin Hambleton on universities as a corrective to “placeless power”, i.e. multinational firms that can facilitate enormous social and economic change in an area despite having no geographic connection to the place.  Of course the internationalisation agenda of most UK universities means that they may themselves be in danger of wielding placeless power overseas.

Michael Edwards recounting how UCL academics and students have engaged in local activism in North London, for example fighting destructive planning applications, and sometimes positioned on the opposing side to the university itself.

Wendy Cukier on the experience of her Canadian university’s role as a “changemaker”, and the value of the Ashoka U Changemaker Campus programme, to which the University of Northampton is committed.

Cathy Smith on the medieval origins of the original University of Northampton, which was dissolved in 1265.  By happy coincidence 2015 is both the 750th anniversary of that dissolution and the 10th anniversary of the current University of Northampton’s full upgrade to university status in 2005.

The conference strongly impressed upon me the fact that academics sometimes take their institutions for granted in the sense that they don’t reflect on, or even challenge, the role of higher education within their geographical location. There may even be a danger of this becoming more pronounced as, in the rush to internationalise and chase overseas student fees, we in fact forget the physical and historical roots of our institutions.

Above all the two days I spent trying to navigate these unfamiliar waters reinforced my belief that it can be very dangerous for academics to isolate themselves within their disciplines, no matter how comforting and familiar that may be.  If the only voices that you are hearing (audibly and on the page) are the ones that are telling you stories that you already know and understand (even if you don’t agree with them) then it can be very easy to drift into a kind of disciplinary complacency in which you take the (self) importance and role of your own subject area for granted, without any external perspective on how it might be perceived by those beyond your academic boundaries.

Taking the occasional disciplinary leap could involve as little as going to a seminar in another department, or widening your reading to include areas beyond your subject.  Attending and presenting at a two day conference involves a greater commitment of time and energy, but it’s worth the effort.  It’s an approach to academia that I’ve tried to follow over the past 25 years and I’d recommend it as a way of broadening perspectives.  Sometimes it’s good to feel uneasy.

Many thanks to the organisers of The Urban University conference, particularly Sabine Coady Schaebitz and Bob Colenutt, for their hard work in putting together such a great couple of days.  Here are the details of my talk:

Biodiversity monitoring on urban university campuses

Jeff Ollerton, Joanne Underwood, Janet Jackson, Charles Baker & Duncan McCollin

Biodiversity, the variety of species and habitats to be found in a defined area, is a critical component of the natural world, and the ecosystem services that it provides support modern society in economically tangible ways.  Urban campuses have long been acknowledged as supporting significant biodiversity, as evidenced by the many universities that have written biodiversity action plans.  However there has been relatively little quantitative research published on the biodiversity of British urban campuses, and on how that diversity changes over time, particularly with respect to large-scale infrastructure development.  Academics and students in the Department of Environmental and Geographical Sciences have been collecting data on the biodiversity of Park and Avenue Campuses for more than 20 years, including plants, invertebrates, mammals, and birds.  This talk focuses on bird diversity because birds are an indicator group for assessing ecosystems, and are arguably the best understood group of species in the UK.  We present data on the birds that have been recorded on these campuses from 1993 to 2015, assessed in terms of their UK conservation status.  We then discuss the potential impact of the new Waterside Campus on the existing bird assemblage of the site, and present preliminary data showing how bird diversity has changed since building work began.  We end by discussing whether it is possible to maintain or even enhance bird diversity and abundance at the new campus.  The location of Waterside Campus, within the Nene Valley Nature Improvement Area and in close proximity to internationally important wetland bird sites, means that the University of Northampton has a civic duty to maintain the biodiversity of its campuses.

Note: in the end I actually didn't include the data from Park and Avenue campuses; there wasn't time to fit everything in!

6 Comments

Filed under Biodiversity, Birds, University of Northampton

Some basic tips for PhD students for giving effective conference presentations

One of the advantages of working in a relatively small, non-research-intensive institution such as the University of Northampton is that events such as yesterday's Annual Postgraduate Research Conference expose staff and research students to a diversity of topics and approaches, beyond the narrow silos of our own disciplines.  I spent the day listening to talks as varied as 21st-century Gothic fiction; sediment sources in an English river; automating traffic control systems; and the role of younger grandmothers in raising their grandchildren.

It was a great day, very stimulating.  However it struck me that whilst the research was sound, in some cases (certainly not all) the presentation of the research could have been improved.  I'm writing this in the spirit of a recent discussion over at Dynamic Ecology about the "advice" posts that they present and why they are so popular.  The consensus in that discussion seems to be that supervisors (or advisors, as they are called in North America) often neglect to give some basic advice to their graduate students, assuming that they already know those "basics", and focus instead on very detailed, technical advice.

Conference presentation skills are one such set of basics which, although covered by many universities in their training programmes (Northampton included), need to be reinforced by supervisors and by practice.  In my opinion, the following advice applies to all disciplines, not just the sciences:

1.  Always use a visual aid, such as PowerPoint, even if you think you don’t need it. Even a single slide with your name and the title of your presentation provides a context and focus for your talk.  Better still would be some information about the structure of your talk  on a second slide – “free-form” talks rarely work unless you’re an exceptional presenter.

2.  On PowerPoint, etc. use an absolute minimum text size of 18pt, preferably larger.  This includes graphs, tables, figure legends, screen shots, etc.  Anything smaller is not going to be visible beyond the third row of the audience unless you’re projecting onto a giant screen.  And do you know for certain that the venue has a big screen?

3.  Related to that, use the full width and height of your slides, don’t cram text and figures into the middle.

4.  Keep text to a minimum – just bullet points to give you a prompt as to what you’re going to say.  These bullet points don’t need to make complete sense to the audience as long as they give you that prompt: you’re talking, not reading.

5.  Ask someone (a friend, your supervisor) to double-check all your spelling, grammar, spacing, formatting, etc.  It’s well known that we read what we think is there, not what is actually there.

6.  A good rule of thumb for most presentations is to use more-or-less one slide per minute.  So if your allotted time is 10 minutes, use about 10 slides; if it's 30 minutes, use about 30 (not including introduction and acknowledgements slides).  Do not try to fit 15 minutes' worth of material into a 10-minute talk: it will irritate your audience and your conclusions at the end will be garbled and unclear as you rush to finish.  You can ALWAYS effectively summarise your research into the time allotted – don't complain that you can't.

7.  PowerPoint etc. provide all sorts of fancy backgrounds for slides, wonderful fonts, including shading and 3D, tricksy text animations, etc.  Don’t use any of them.  Keep your backgrounds a plain neutral colour, your font a basic sans serif (Trebuchet is my personal favourite), and don’t use animations unless they are really adding to what you’re saying, or building suspense by introducing elements one at a time.

8.  Tell a story.  Start with a broad introduction, take the audience on a journey through your work in logical order, and sum up at the end.  Then let your audience know that you’ve finished by saying something like: “Thanks for your attention, I’d be happy to answer any questions”.  Don’t just stop talking and stare at the audience.

9.  Stay focused and relevant to what you’re presenting.

10.  Enjoy yourself.  It’s hard the first few times but it gets easier.

Finally, I should add that I've seen mid- and late-career researchers make these kinds of errors too; it's not just PhD students!  Feel free to share your own tips, or to disagree with any of these.

39 Comments

Filed under University of Northampton