Showing posts with label science as process. Show all posts

Friday, August 04, 2017

Meiosis kills! (Now in print, and video)


Scientists are rarely dispassionate about their research. Why spend years trying to figure out the fine details of something you have no interest in? Before my wife and I lost two pregnancies, I had thought abstractly about the question of why developmental failure is so common across plants and animals, but it wasn't personal. I was interested in the fact that dying before reproductive age means an individual does not get to pass on whatever traits caused it to die. In other words, natural selection should quickly remove any heritable trait that commonly causes developmental failure. At the same time, pretty much any organism loses some of its offspring, implying some broad-based mechanism WAS commonly causing developmental failure. I even went so far as to publish a review article focusing on what this mechanism might be.

But once developmental failure became personal, I wasn't just interested, I needed to know.  

The result of that impulse was just published by Proceedings B, one of my favorite scientific journals. 



In addition, I worked with Sarah Friedrich, the extremely talented Graphics Specialist in my department, to make this video explaining the science in public-friendly ways:

Wednesday, July 26, 2017

I, lichen.

This fungal individual makes lots of complex structures, on which microbes live. So do you.
A lichen is an ecosystem. It consists of a multicellular fungus that provides the gross structure of the lichen, and a community of microbes that live in and on that structure, including photosynthesizers.

A human is an ecosystem. It consists of a multicellular animal that provides the gross structure of the human, and a community of microbes that live in and on that structure. A human, unlike a lichen, generally cannot photosynthesize. 

We usually see humans as individuals, but lichens as ecosystems. In the last few years some scientists have advocated deep thought about humans as ecosystems. There has been very little deep thought about lichens as individuals.

Friday, May 19, 2017

Me as a peer reviewer

I find it painful to read a very critical peer review, even when it is clearly intended to be constructive. This is somehow especially true when I am the reviewer. Writing rejection-worthy critical reviews, which I seem to do fairly often (always accompanied by specific suggestions for improvement), is part of my job, but not one I relish.

Thursday, February 02, 2017

Friendly Advice for Your First NSF Pre-proposal

It is going to be okay. Very okay.

Let's start with the important information that until last month I, like anyone who needs to read this post, had never applied for, let alone received, US National Science Foundation research funding. Being out of the US for most of my time since earning my Ph.D., along with other extenuating circumstances, kept me from applying to this extremely important source of funds. As with my posts on applying for NIH and ERC funding (which I didn't get), I'm writing this not as an expert, but because most people who write advice on applying for NSF funding are to varying degrees experts, and have been doing it for so long that they have no idea what us newbies might not know. I've never been on an NSF panel, I've never been to NSF, and all my attempts to talk to NSF employees have been unsuccessful. I am, like you, an outsider. So I learned a lot along the way, much of it later than I should have, and I'd like to share some key points. Before we begin, let's take a moment to contemplate this bodacious caterpillar I found in the University of Wisconsin Arboretum last summer:

Wow!

Alright, now that you remember that you love science, and brim with important and exciting questions about the world, let's delve.

1. NSF, we can agree, does not have the funds to support more than a small fraction of the grant applications they receive. Other huge governmental funders of science, like NIH, are in a similar situation. (I am assuming here, perhaps in vain, that the US government under El Presidentisimo continues to invest in funding basic research at some meaningful level.) This is a problem not only in terms of almost-everybody-doesn't-get-funded, but in terms of the phenomenal amount of time that scientists squander writing unsuccessful grant applications. In many cases, months of each year are spent applying for funding that is not available. On the flip side, NSF has to harness huge amounts of scientists' time to serve on committees, wading through the reams of applications finding reasons to reject as many of them as possible.

NSF has attempted a partial solution to this problem: the preliminary application. Basically, one starts by writing a short (four pages of text, plus many ancillary documents, including a one-page Project Summary) version that has to be approved by a committee before one is invited to submit a full application. Most applicants (about 75%, in the program I'm applying to) only have to write the short version before being rejected, and the committees mostly have to read piles of short applications, with relatively few longer ones coming after that.

2. Writing the preliminary application was honestly not that bad. Several reasons: No budget is required, and many of the ancillary documents that NSF needs before they can fund anyone don't get submitted until your preliminary proposal is approved. More importantly, I have good collaborators who are well practiced in writing these things. There is a huge amount to know about NSF-specific 'grantsmanship.' When the committee gets to our application, after having already scanned scores of others, they will be both eager to find something really interesting that keeps them awake, and eager to find some reason that the thing can be rejected, so that they can get it over with. They will be looking for key phrases that everyone should have, and possible pitfalls indicated by things only people who have served on these committees (or maybe only that particular committee) know about. In short, it would be tremendously surprising if someone who just had phenomenal scientific ideas but no insider guidance as to the evaluation process ever got funding. I am lucky to have had that guidance; while it means I did a lot of rewriting to try to conform to a culture I've never encountered, it also, hopefully, means we have some chance of being invited to submit a full proposal. If so, that's what I'll be doing this summer, again with about a 20-25% chance of success.

3. As with any funding application, reading past successful applications to the same program is important. Notice not only the language used, the level of methodological detail given, and the structure of the proposal, but also the scope and scale of the proposed science.

4. Write, and revise, the one-page Project Summary, and make sure everyone in the project agrees on it, before bothering with the longer Project Description. I made the mistake of drafting the Project Summary, receiving only minimal feedback on it from one of my collaborators, then writing the rest of the grant. By the time I got more extensive feedback from this collaborator, I had only a few days to reconsider the scope of the work being proposed and extensively rewrite. The proposal ended up much better for it, but I could have gotten a lot more sleep if I'd pushed for more feedback after writing just the summary.

5. The whole thing really isn't that much writing. Given that one knows what one is doing, and has good communication with collaborators, a good draft can be banged out in a couple of solid days.

6. In most cases, the one-page Summary has to be uploaded as unformatted text, and it takes up more space the way NSF automatically formats it than it would if you or I formatted it according to their rules. I, and a few people I've talked to, ended up hacking down the one-page summary very shortly before the deadline when we figured this out.

7. In order to apply, you need an "NSF ID." Looking on the NSF web page, you will find lots of information on retrieving your NSF ID, what to do if you have two NSF IDs, whether NSF ID is the same thing as various other identifiers NSF has used in the past, and so on. You will not find any information about how to get an NSF ID if you don't already have one. If you call NSF to ask how to get one, they will be so confused by you that they won't be able to help you, as if you had called to ask how to breathe in before speaking. You just do it, and you must already know how. So I will tell you the secret: somewhere in your university or other approved research organization, there is some individual with the official power to communicate with NSF to get you your very own NSF ID. You will have to find out who that person is and request the NSF ID several days before the application deadline. My collaborator at another university didn't get a reply to his request until a few hours before we submitted, and we were actively discussing what would happen if we had to leave him off the PI list. Don't let that happen to you.

8. Which reminds me of another important point. I was advised to contact the NSF Program Officer for my application when I had a nearly final Project Summary worked up. As mentioned above, that didn't happen until less than a week before the grant deadline, at which point the Program Officers were all swamped by others trying to meet the same deadline. I emailed the Program Officer, and got back a very short email, but never got to talk with her. Finishing your Project Summary early will allow you to get input from your Program Officer.

9. Be smart, work very hard, and have extremely good luck.

Friday, January 13, 2017

Adventitious knowledge

Joseph Grinnell, eminent ecologist and zoologist of the early 20th century, and founding Director of the Museum of Vertebrate Zoology, where I was a graduate student, wrote some hilarious stuff. Google Scholar lists 376 publications in his name, most of which are actually by him, including foundational papers in niche ecology, biogeography, museum science, and so forth. He took so many thousands of pages of detailed, elegant, highly legible, informative, rigorous, lyrical, systematic field notes that if I told you how many, I'd have to look it up again, and it is getting late here. And he is rightly revered as a founder and role model in the MVZ. But perhaps my favorite thing of all about Joseph Grinnell is his nearly forgotten, mildly disturbing, profoundly droll paper, "A Striking Case of Adventitious Coloration." I have no memory of how I first encountered this paper, other than that it was in my former life as an ornithologist at the MVZ. I have never come across another paper that cited it, and Google Scholar lists none. It is, at core, a mystery.

The whole story is available here. The first two sentences:
On February 8, 1920, I spent the afternoon with my family at a point in Moraga Valley, Contra Costa County, California, some five miles, airline, northeast of Berkeley. My son Willard undertook to exercise the shotgun for the purpose of securing some specimens of local birds such as happened to be needed at the Museum.
So ornately informal, so precisely vague, so informatively not-to-the-point, I am in love with this opening. No reputable journal would publish it these days, and that is a shame. He's storytelling, and quite well. The story strides on: Willard blasts a mated pair of Oak Titmice, both of whom have bright yellow breasts. Why bright yellow? Oak Titmice are grey, never yellow.  Grinnell rushes the five airline miles back to Berkeley, marches into the botany department with his dead birds, and tells them to figure out what kind of yellow pollen these birds have got on them. Not pollen, say the botanists. Grinnell marches into a mycology lab and tells them to figure out what kind of yellow spores his birds have on their breasts. Maybe slime molds, hard to tell, say the mycologists. Grinnell concludes that possibly bird feathers could be an important means of dispersal in slime molds! He finishes by pointedly mentioning that if anyone is interested, these two birds "and their loads of spores, constitute Nos. 40,391 and 40,392 in the bird collection of the Museum of Vertebrate Zoology." He publishes the whole thing in The Auk, and that is the end of that, for almost a century. A century, by the way, is how long Grinnell said some of the MVZ's specimens might have to wait before they would be put to some as yet un-imagined use. It has been 97 years.

Which brings me to that wonderful word in Grinnell's title, "adventitious." It means something on the order of "acquired by chance," which is how those titmice presumably got their yellow, and how I came to the knowledge that in a drawer in Berkeley two spore-laden titmice waited. What did I intend to do with this knowledge? Keep it in a drawer, until, perhaps after one hundred years, it proved valuable.

Now, I unexpectedly find myself with colleagues interested in spore dispersal ecology, and somebody mentioned spore dispersal via bird feathers. Which started me rummaging around in my dusty rusty musty gusty fusty drawers of ornithology, hunting for memory of birds with spores on them. All I remembered clearly was Willard undertaking to exercise the shotgun, but eventually (this morning) I found first memory, then paper, and shared both. And by afternoon my new mycology colleagues had requested access to Nos. 40,391 and 40,392 from my old ornithology colleagues at the MVZ, so that we can collect some of the long faded yellow dust. A little bit of molecular genetics wizardry (our lab is set up for sequencing DNA from dried fungal museum specimens) and we may finally be able to discover what made Grinnell's birds so yellow. If it is a new species of anything, we must surely name it after Willard.

Thursday, November 17, 2016

How not to respond to unhelpful peer reviewers

For as long as I've been a scientist, and longer, there has been extensive discussion of the many ways that peer review is broken. Peer review is how, in theory, science gets evaluated and hopefully improved before publication, and it is therefore hard to dispense with, despite being widely seen as inefficient, biased, and corrupt. It goes like this: the author submits a manuscript to a journal, the journal sends it out to independent experts for feedback, and these experts (the scientific peers of the author) decide whether they are expert and independent enough to give appropriate feedback, carefully read the manuscript, think about it, identify its flaws, make constructive, detailed suggestions, and finally recommend to the journal whether it should be published as is, fixed and then reevaluated, or just rejected. That, at least ideally, is how it is supposed to work.

There are a great many ways in which that ideal can fail. I draw a great deal of schadenfreude from reading Retraction Watch, which is effectively a blog about cases where peer review failed in one of many ways, something was published, and mistakes or misdeeds were later found out. I, like most scientists, know a few people whose work may show up all over Retraction Watch some day.

Which brings me to the fact that I am currently figuring out how to respond to a review that has failed with regard to independence, expertise, detail, fact, specificity, and constructiveness. I would have suggested to the journal that this person could not be an independent reviewer, except that it never occurred to me that anyone would consider him to know anything about the topic of the paper. After we explained the long history of this interaction to the journal, we were assured that our re-submission would be sent out to different reviewers. Even so, in resubmitting, I have to respond to all the reviewer's comments, even those that are wildly counterfactual, have nothing to do with the current manuscript, or are just complaints about the reviewer's own work not being cited more extensively. And it has to be done politely and factually. So one must never include responses like these:
  • This highlights a fundamental difference in approach to science. Reviewer's comments, and publications, suggest that scientific papers should be fishing expeditions in which everything that can be gotten out of a data set is analyzed and those results that test as significant are published breathlessly. We started with one original, a priori question, gathered all of the available data to address it, and got a clear result, which we state concisely. While some authors would stretch the results section out to numerous exploratory paragraphs, expounding upon questions tailored to fit the results of the numerous analyses, that would surely be a disservice to science.
  • It is not clear what this means. Perhaps the reviewer did not find our Methods section. It is in there between the Introduction and the Results.
  • It does not seem that the Reviewer has any idea what kind of data we are using, despite the six paragraphs on the topic.
  • Furthermore, a reading of the manuscript would have revealed that no matrix models are employed. Reviewer's comments would seem to be hastily copied and pasted from review of an unrelated paper.
  • The Reviewer's publications are not relevant or useful here. Perhaps they were relevant to the paper for which most of this review was written?
  • This is counterfactual and the Reviewer has excellent reason to know that.
  • These quotes of the journal's rules are from an entirely different journal that the Reviewer often reviews for.
  • Not only can we find no mention of this statistical rule anywhere, we note that Reviewer's own papers don't follow it. We asked an expert in these methods about this 'rule.' She called it, "hilariously made up." 
I need some empanadas.



Wednesday, November 16, 2016

Ghosts of papers that may some day be


The world is full of science that only half exists: Experiments done but not written up, manuscripts waiting for revision, results too unimpressive to prioritize for publication. Where fetuses are gestated for months but born in hours, data sets often take longer to put out into the world than they took to create. Until it is published, academic research is only a nascent fluffy squishy wispy gelatinous downy larval effervescent ephemeral eufloccinaucinihilipilificatable translucent apparition, neither seen nor heard nor here nor there. Once published, research gains visibility, permanence, and perhaps even value.

While most scientists have things they would like to get around to publishing, I feel like I've accumulated a particularly long list of research projects I need to push out. This summer and fall I've actually had some time to dedicate to that. I've made a goodly dent, but the list is still long, and new tasks and projects emerge like mosquitoes from an abandoned hot tub.

I've published four good papers this year, another is ready to go as soon as my coauthor has time to look at it, and a sixth just needs a few final touches, and should be submitted in a week or two. Both of those 'full term' papers will, hopefully, come out next year. I think that's pretty good considering I spent most of the last year on intensive teaching, had a months-long battle with epidemic keratoconjunctivitis, have moved my family four times in the last year and a half, and have three children five and under. There are days I wonder why I am so tired, and then there are days I remember why I am so tired. And on those days, I don't feel the least bit bad about keeping all those manuscripts, and coauthors, waiting.

Monday, March 16, 2015

Progress


For years, I've been worrying about my chronic backlog of papers I should have written a long time ago and just haven't had time for. I'm very happy to report that my to-write list is getting a lot shorter. Three of the papers on that list will come out in the first half of this year. Number four is currently out for review, numbers five and six need to be revised and resubmitted, and a seventh is written and currently with colleagues awaiting their comments. The eighth has figures made and large chunks of text in their second or third drafts. If all goes well, all eight should be at least submitted by the end of the year, and I'm guessing that six will have come out. Of course there are several more that I need to get to, and new projects being planned, but it feels good to be clearing the backlog a bit. Especially nice is that after spending too long on methods papers, incidental discoveries and other tangents, the manuscripts I am working on now actually address the central points that motivated the research in the first place.

1. Levitis DA. (2015) Evolutionary Demography: a synthesis of two population sciences. In: International Encyclopedia of Social and Behavioral Sciences, 2nd Edition. ed.: J.D. Wright. (Coming out in May)
I am an evolutionary demographer, and while encyclopedia articles are not my bread and butter, this is very much on topic.

2. Olsen TB, Christensen FEG, Lundgreen K, Dunn PH, Levitis DA. (2015) Coelomic transport and clearance of foreign bodies by sea stars (Asterias rubens). Biological Bulletin. (Coming out in April)
This started as a student project to develop methods for studying the evolutionary demography of starfish, but when it became clear the animals wouldn't stay tagged, my students decided to investigate why. Their result was cool enough that we're publishing it.

3. Oravecz Z, Levitis DA, Faust K, Batchelder WH. (2015) Studying the existence and attributes of consensus on psychological concepts by a cognitive psychological model. American Journal of Psychology 128: 61-75.
My most cited paper (on the biological meaning of the word behavior) is one I started as a graduate student, even before it became clear I would be an evolutionary demographer. It got a nice write-up in the New York Times. Many of those citing it are in philosophy or psychology. A couple of years ago I was contacted by some psychologists who wanted to work with me to reanalyze those data. I never expected to publish in a psychology journal.

4. Zimmerman K, Levitis D, Addicott E, Pringle A. (2014) Maximizing the mean nearest neighbor distance of a trait to choose among potential crosses and design a fully crossed mating experiment.
This methods paper, currently out for review but with an earlier version already archived online and therefore available (journals are increasingly okay with this), grew out of a collaboration that is part of my ontogenescence project. In trying to answer my evolutionary question, my collaborator invented a new method for designing mating experiments, and we wrote it up. 
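For intuition, the criterion in the title can be sketched directly: from a pool of candidate parents, pick the subset whose trait values have the largest mean nearest-neighbor distance, so the chosen crosses span the trait as evenly as possible. Here is a minimal brute-force sketch in Python; this is my own illustration of the criterion, not the paper's actual algorithm, and the function names are made up:

```python
from itertools import combinations

def mean_nn_distance(traits):
    """Mean distance from each trait value to its nearest neighbor in the set."""
    total = 0.0
    for i, t in enumerate(traits):
        total += min(abs(t - u) for j, u in enumerate(traits) if j != i)
    return total / len(traits)

def best_subset(traits, k):
    """Exhaustively choose the k individuals whose trait values are most
    evenly spread, i.e. that maximize the mean nearest-neighbor distance.
    Fine for small candidate pools; larger pools would need a heuristic."""
    return max(combinations(traits, k), key=mean_nn_distance)

# A pool of candidate parents with one trait measured:
print(best_subset([0, 1, 2, 10], 3))  # picks the most spread-out trio
```

The point of the criterion is that tightly clustered parents give near-redundant crosses, while an evenly spread subset covers the trait range with no two parents wasted on almost-identical values.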

5. On raven populations in the Eastern US. One reviewer loved it just as it was, the other made numerous (and useful) comments on how to improve the analysis. Being worked on by my colleagues who are primarily responsible for the analysis.

6. Part of the same ontogenescence collaboration as #4, this was just rejected by a high-impact journal on the basis that they rejected it (no reason or feedback given, as is common with such journals) and will be submitted to another in April.

7. Another ontogenescence paper, this time in a marine ecology context. Our plan is to submit in May. Between now and then the main order of business is to get feedback from colleagues and use it to improve the text.

8. Same project as #s 4 and 6.

9. On post-reproductive lifespan, building on the papers and methods that came out of my dissertation. We have cool results proving an interesting point, but it still needs a fair bit of work.

They probably won't be submitted in exactly this order, as a lot of it depends on factors beyond my control, but this is more or less the order I'm prioritizing them in. Beyond that it is hard to predict. Some older things I still do really need to write up, some fruitful student projects on post-reproductive lifespan that are looking good, some vague ideas. 

One thing I've decided is that, at least for the moment, no papers that are outside the main foci of my research program (evolution of pre-reproductive mortality and post-reproductive survival) are going to make the list. Numbers 1-3 & 5 above don't directly address either of these topics, and 4 is tangential. That is a bad habit, and one I'm going to break.

Wednesday, October 22, 2014

Asimov on Creativity

Isaac Asimov's essays have been favorites of mine since I was a teenager, and while I can't claim to have read them all (he was the most prolific writer in the history of the world, if one excludes 'writers' who have computers write for them) I have read a lot. So I was excited to hear that a previously unpublished essay of his, On Creativity, had finally been published. And like many of his essays, this is spot on.

To summarize his conclusions, intellectual creativity (creation of startlingly new scientific ideas in particular, but not only that) tends to occur when previously unconnected ideas are examined together by a person in a conducive situation. And, he argues, a key feature of that conducive environment is the freedom to be playful, to unabashedly look foolish, to pursue ideas that don't seem likely to go anywhere with people whose expertise has no obvious connection to one's own. He implies, and it is at least as true now as when he wrote it in 1959, that the structure and strictures of science-as-a-business (including in academia) tend to discourage this. Connecting previously unconnected ideas is less likely when everyone is a specialist in her own field, not only unaware of the big ideas in other areas of science, but obligated by the strictures of specialist journals, specialist departments, etc. to not wander too far afield. In the world of reputation building and publish-or-perish, things like playfulness, acceptance of foolishness, and exploration of uncertain goals are potentially fatal. Funding applications not only require that you know exactly where you will end up, but also that you already have a significant portion of the data needed to get there.

At previous jobs, and in previous stages of my life, I often felt (and was told) that my intellectual creativity was my greatest strength. As things now stand, I have surprisingly little space for creativity, and when I do come out with something really original, I get something along the lines of, "Huh. That's different. What about this other thing that we all know about?" So the question I must ask myself is, how (and where) can I find a place where my creativity is an asset, not only for me, but for science and the world?

Saturday, September 13, 2014

Student scientists study sea stars, produce praiseworthy publication

A good university science education should give students the opportunity to engage in scientific research. This is widely agreed upon, and most of the biology position announcements I consider state that the successful applicant's research should present opportunities for student participation. The general model is that the professor has the research program, and a question that needs answering, and a plan for answering it, and the student gets to see how research happens by carrying out, or at best refining, that plan. I'm all in favor of this, but I'd like to take it a step further.

My first publication, way back in 2003, was with one of my professors at Bennington College, based on work that she planned, and I, as an undergraduate, helped carry out. I made alterations to the experimental protocols, did a lot of lab work with minimal supervision, and chose to work on this project rather than others that were available, but I can take no credit for any of the ideas in the publication. In retrospect, the one thing I would add to my own undergraduate education, if I were my own professor, would be working through the entire process of generating a primary paper, from initial observations and idea generation to publication.

Implausible you say? Impracticable? If generating publishable science is so easy, why doesn't every professional scientist publish a paper a week? Well, I've just finished doing it with two of my students, and I'll tell you about it.

'Finished' is vague. We've submitted the paper, and I think it good, but we have to wait to hear if the reviewers agree. I'm not going to give you too much detail on what we found because you'll have to read the paper (or a future post) when it comes out.

It happened like this: At SDU, where I currently work, all natural science students in their second semester have to complete a group project. A group of students (four in my case) are assigned a faculty mentor who gives them a question to answer and guides them in answering it. In my case, the question was, "Can we use PIT-tags (like a vet puts in your cat) to mark starfish for a long-term demographic study?" We brought some starfish into the lab, talked about animal care and experimental design, showed them how to inject the tags and pretty much let them do their own thing.

They did great, but the tags just kept coming out. After a few weeks, all the tags were out. They answered my question with confidence: No, PIT-tags cannot be used to mark starfish long term. But the thing is, they didn't stop there. With no pay, no additional course credit, no requests for recommendation letters or such, two of the four students just decided to keep going. We met occasionally and I offered encouragement and comments, but little more.

They presented their results to the Evolutionary Demography Society, and long after the course was over they kept doing more experiments to figure out how the starfish were ejecting the tags. Notice that this is their own question. I asked, "Do the tags stay?" and my students answered this then asked, "How do they get rid of the tags?" And when we had an open house at the laboratory, they presented what they had learned to the public. Just by chance, one of the visitors they talked to had access to an ultrasound machine. This let them repeatedly image exactly where within the starfish the foreign body was moving. A year after they started, they convinced me that they had discovered, and had the data to back up, a previously unknown mechanism by which starfish can eliminate foreign objects from within their body cavities. "Okay," I said, "write it up for publication, and tell me now by what date you will have a finished draft." They missed their self-assigned deadline. They needed more help with data analysis than they expected. They put in all the wrong references in all the wrong places, and the flow of the article was terrible. English is not their first language. But not so long after they said they would, they sent me a draft that had most everything I needed to make it good. With the co-authorship of a couple of marine biologists (did I mention that I know next to nothing about starfish and have no other starfish research ongoing?) and with the continued input of these two students, we made a respectable manuscript out of it.

What lessons do I draw from this? Motivated undergraduates, with just enough guidance, can basically have their own successful research programs. The paper we produced still took a bit of my time to write up, and isn't the most important paper in the world, but they discovered something completely new (answering a question that someone who knew the literature would never think to ask), and they learned. They learned a lot. Refining questions. Starfish anatomy and function. Experimental design and practice. Ultrasound imaging. Cox regression in R. Scientific English. Literature searching and use. Collaboration. Communicating science to peers and the public. Preparing and submitting a manuscript for publication. Now they will get to see how peer review really works, or doesn't. These students, just starting their third year as undergraduates, have a fuller experience of what goes into making a scientific publication than I did when I started my third year as a doctoral student. Chew on that for a minute.

It is important here to think about these students' motivation. Judging by their grades, they are not academic stars. Neither of them has described a lifelong fascination with starfish. They did this, so far as I can tell, because it was their first chance to truly be scientists rather than just science students.

I told them early on that:
A) they would have a strong say in the direction of the research, and
B) if they produced something publishable, I would help them submit it for publication.

These are not promises to be made lightly. Publishing things, especially things outside one's own central line of research, is time consuming. Giving first year undergraduates even this limited version of academic freedom in their research is, understandably, not common practice. But it seems to me to be damn good educational practice, and I plan to continue offering this type of opportunity to students when possible. Students will do much better, and more, work when they are exercising agency and following their own curiosity. Even if they don't choose careers in science, they know how science happens from start to finish, and that is surely something science students should be given the chance to learn.

Monday, August 04, 2014

To-write-list


I keep a running list of papers to write on my whiteboard.

This list currently has 15 entries.

Two of these have been accepted for publication this year but not yet taken off my list because it is nice to see them there.
 
Another was submitted, rejected, and is awaiting rewriting.

Two have nearly complete manuscripts written and have been sent to colleagues for comments.

Five more are actively being worked on by me or co-authors and will hopefully be finished by the end of the year.

Four others are collaborations that are currently not moving. One of those is a project that died for specific reasons (the data couldn't answer the question, for one) and I just haven't entirely decided what to do about it. The others are things that neither I nor my collaborators are currently prioritizing, but that I hope we will get back to.

Finally, I have one project on my list that I want to write, but I don't even know exactly what data I need or the approach I want to take. I just know it is an important paper to write.

There are roughly a million other ideas I could put on my list, but I'm having a moratorium. I have decided I am not allowed to start on anything else until my list gets a lot shorter. As I have mentioned, I am applying for jobs, and I'd like to have most of this backlog cleared up before I go.

Thursday, May 08, 2014

One way to avoid writing negative reviews is to decline to act as reviewer

"Please tell us why you have declined to review this article"

I am afraid I am unable to decipher the meaning of even the title of this paper, and the abstract implies a communicative style that could charitably be called idiomatic. I know all the words being used (and am familiar with the probable subject matter) but couldn't tell you what most of them mean in this context. Were I to review a paper written in this manner, I would spend an excessive amount of time on these linguistic issues, and might or might not ever get around to figuring out the embedded science.

Wednesday, March 19, 2014

Apozygotic agamospermic apomictic agamospory

I'm making a table. Not the tisch, bord, tavolo, mensa kind. I'm making a table of comparisons of offspring viability between sexually and asexually produced offspring. This is polychallenging. Part of it is that the literature is scattered, so it takes a lot of hunting around, but that is a usual and interesting sort of challenge. Part of it is that I want to include lots of different kinds of organisms, and find basically comparable comparisons for each, and the measure one might use for offspring viability in a lizard is necessarily different from that used for an insect, plant or mold, but this arises from the real diversity of biological process, and so is also interesting. The part that I am finding frustrating and difficult is the choking miasma of obfuscatory terminology. Some terms, like amictic, are used to mean different things by different authors. Clear concepts (e.g., what portion of seeds open and something live comes out) are referred to by a dozen different terms. Frequently a single author or group of authors will have a term that does not seem to be defined anywhere and isn't used by anyone else. Apozygotic, for example, seems to be used only by eastern European sugar beet scientists to mean agamospermic, which is a term botanists use to describe reproduction via diplospory, apospory or nucellar embryony, which are all (I think) non-automictic kinds of agamospory, which is close to what a zoologist would call apomictic parthenogenesis, which basically means that offspring are coming out of eggs (or seeds or spores) produced without any genetic recombination or changes in chromosome number along the way. There are various places in the literature or on the web where well-intentioned authors have tried to straighten all of this out and discard the duplicate or ambiguous terms, but of course they come to different conclusions and are frequently ignored.

Saturday, March 15, 2014

Knee deep in the fetid pools of evolution


I am, as I may have told you before, an evolutionary biologist at heart. And one of the things I love about evolution is how messy, random and complicated it is. Evolutionary outcomes aren't just survival of the fittest, but also reproduction of the luckiest and replication of the not overly deleterious. Natural selection often doesn't get its way and the optimal trait often doesn't exist or can't quite win out. Evolution is a box of dirty, tooth-marked, mismatched Legos with no instructions, all in clumps from previous projects, and that is how I like my Legos.

So I always enjoy talking with colleagues who really think about evolution in depth, not as a nice neat optimization process (which it isn't) or a collection of family trees (which it can be, but this misses the forest) but rather as the beautiful mucky anarchic tangled mess of genes, lineages, mutations and highly fallible biology that it is. Sure, there is a lot of phylogeny in there, and a bunch of natural selection, which to an extent can optimize things, but it is like optimizing the design of a boat when all you have to work with is coconut husks, maple syrup and a swarm of fire ants. It isn't so much optimization with constraints as a bowl of constraints with optimized sprinkles on top. To really capture the beauty of it you have to do away with the basically creationist notion that organisms are perfect for their niches and the tidy view that biology follows the rules we write in textbooks. Organisms only breed with members of their own species, except when they don't, and clones are genetically identical to each other, unless you look closely. Only changes to DNA are heritable, except those non-DNA heritable traits. Rules, broadly defined, do not apply to fungi. The dissertation that was defended from me this week showed that in real wild populations, genetic drift is sufficient to speed aging and shorten lifespan. These populations aren't short lived because there is something optimal about it, but rather because drift isn't letting selection have its own way. I'm oversimplifying, and you'll have to wait for the details to come out, but it is a wonderful example of evolution in its slip-shod Rube Goldberg glory.

I'd like to spend more time with colleagues who think deeply about the gorgeous multi-layered sub-optimality of evolution. That's where the fun is.

Tuesday, February 11, 2014

Coming home now




I have been asked to review another paper. It is for a very good journal. I have published with them and thought their editorial office did a good job. It is on a topic I am very interested in and know the literature on quite well. It is a professional responsibility.

I have no time. I have a million projects and papers I am behind on. I have a bunch of papers I could get out relatively quickly if only I had the time, and I really need to get papers out. When I was a kid and my father came home late every night, I swore I would never "accept a bunch of extra work responsibilities just because my colleagues relied on me." My daughter is so damn cute and changing so fast. Every time I see her she says, "I need you Daddy, I need you!" (actually more like, "I ne-Jew Daddy, I ne-Jew!!!"). How can I ignore an adorable two year-old who not only needs me, but tells me so? I can't.

I said yes to the review, because that is what I am supposed to say. I will do it, and do it right. Sigh. But not tonight.

Monday, February 11, 2013

Better measures of tiny bits of living jelly

My student and I had a paper accepted last week. I like this paper a lot. First some official details, then a short story about how the paper came to be.


Forthcoming in "Marine and Freshwater Research" 

The consistent, non-destructive measurement of small proteiform aquatic animals with application to the size and growth of hydra 

Daniel Levitis and Josephine Goldstein

Abstract: Hydra (Cnidaria), the basal metazoan most often studied in cellular, molecular and developmental biology, is difficult to measure because it is small, proteiform and aquatic. To facilitate broader organismal and ecological study of Hydra, we evaluate three methods whereby a polyp's body column can be measured by means of photomicroscopy. The volume, cylindrical surface area and surface area corrected for changes in body shape are all highly repeatable methods (r=0.97) when shape varies little. However, shape changes alter volume and cylindrical surface area. Repeated corrected surface area measures of the same individuals in markedly different positions yield standard deviations that are less than 5% of the mean measured area. This easy, non-lethal means of individual size measurement explicitly accounts for the flexible morphology of a polyp's hydrostatic skeleton. It therefore allows for the elucidation of how growth and size vary over time, age and food intake. We find that hydra change size dramatically day to day, and that while food level influences adult size, it has little effect on the early growth of recently detached buds. Finally, we discuss ecological and biological applications of this method. 

Part of why I like this paper so much is that my student did most of the hard parts, and the reviewers didn't ask for a lot of changes, and the editor (Dr. Russell Death) moved things along quickly and efficiently, so it is the first paper I've published that didn't feel like pulling my own teeth. It is a solid methods paper. It says, "here is a method for doing something that many scientists may want to do, that we didn't know a good way to do before."

In particular, it is a simple way to accurately and easily measure a tiny transparent aquatic animal with no hard parts or consistent shape without harming it.  Take a picture of it and with a bit of simple math calculate its surface area. Simple, elegant, inexpensive, non-invasive, biologically meaningful, everything I hoped it would be.

But the method presented in the paper is a substantially different, and better, method than the one we started out with. In fact, we were almost ready to submit the paper when we changed the method drastically.

I started out just wanting to measure a hydra polyp because it was necessary for another research project. Nobody had a method that worked halfway decently without killing the hydra. A hydra's body, although it shortens and lengthens, bends and twists, is almost always roughly a deformed cylinder. So I said, "Hey Josi" (that's my student), "why don't we develop a method to measure a hydra by photographing it and estimating its volume as though it were a cylinder?" She agreed, and off we went, taking photos of hydra during our whole research project, for almost two years. We finished gathering our data, and found that indeed we could measure hydra this way. It didn't work great, but it kinda worked. You could use it to tell the difference between a really huge hydra and one that was just kind of normal. Voila: an unimpressive but probably publishable methods paper.
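That first-pass approach is nothing more than cylinder geometry. Here is a minimal sketch of the idea, in Python rather than the R we actually used; the function name, units, and example numbers are my own illustrations, and this is the naive approximation described above, not the corrected formula from the published paper:

```python
import math

def cylinder_estimates(length_mm, width_mm):
    """Approximate a hydra's body column as a cylinder, using the
    length and width (diameter) measured from a calibrated photo.
    Returns (volume in mm^3, lateral surface area in mm^2)."""
    r = width_mm / 2.0
    volume = math.pi * r ** 2 * length_mm          # V = pi * r^2 * L
    lateral_area = 2.0 * math.pi * r * length_mm   # A = 2 * pi * r * L
    return volume, lateral_area

# Hypothetical polyp: 4 mm long, 0.5 mm wide.
v, a = cylinder_estimates(length_mm=4.0, width_mm=0.5)
```

The trouble, as we found, is that both numbers change when the animal changes shape, even though the animal itself hasn't grown or shrunk; the point of the published method is a surface-area measure that stays nearly constant across those shape changes.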

We wrote it up and were pretty close to submitting when I decided to see what hydra-focused papers had come out recently, and found that someone else had just published almost the exact same method in a nice paper with an interesting biological point. Our formulation of that method didn't work any better than hers did, and we had no point beyond "here is how to measure." We couldn't publish the same method again, even if we came up with it independently. After some cursing and self-recriminations for being so slow, I decided to see if it was possible to salvage anything of the methods paper. Josi took another set of photos and I used them to repeatedly adjust my formula, with limited biological reasoning, until something worked better than the formula just published. In fact, one formula I came up with by eyeballing my graphs worked much better than the method just published. So much better that the calculated value hardly changed at all even when the shape of the hydra changed drastically.

Now you are probably saying to yourself that this is blatant cheating. Trying different formulas (perhaps 100 of them) until something gives you the result you want is a pretty sure way to get the result you want, if you are persistent enough. But two things combined to make this not just okay, but beautiful. First, the formula I stumbled upon made obvious biological sense, even before I knew it worked. This formula represents a hydra as a roughly cylindrical bag whose skin stretches as it elongates itself, or folds and ripples as it contracts. In other words, it describes a hydra accurately and reveals something I didn't previously know about the way a hydra moves. Secondly, and as importantly, when I applied the formula to the main data set, which I didn't use to develop the formula, it still gave a highly consistent measurement for any individual, even as the shape of the individual changed. The method in fact works for completely different populations of hydra. You could take two hydra that looked the same size under the microscope and conclusively decide that one was bigger than the other. You could tell how much a young hydra grew each day. You could really measure the buggers, eliminating most of the noise inherent to previous methods.

Still more lovely, the method Josi and I had just developed could make use of the photos and measurements we had already taken, so redoing all our calculations and figures required Josi to write only a few extra lines of code (Thank you Josi, thank you R). We did a little rewriting to compare our new method favorably to our old one, attributed the old method to the person who had just published it, made our biological argument to explain why the method worked, and we had a drastically improved paper ready to submit. There is a lesson in this somewhere about the scientific method.

Friday, November 23, 2012

The Evolutionary Demography Society is born

We are pleased to announce the formation of the

Evolutionary Demography Society (EvoDemoS)

and to invite interested researchers to join. While many societies include life-history evolution or evolutionary demography within the range of topics they consider, no active society focuses on these topics across taxa and disciplines. EvoDemoS is intended to fill this gap.

EvoDemoS is an interdisciplinary scientific society dedicated to the study of the interactions of ecology and evolutionary biology with demography, including but not limited to patterns of mortality, reproduction and migration over age, stage and state and the evolutionary processes that produce those patterns. All taxa and methodologies are of interest. Our primary goal is to facilitate communication between researchers, and as such we are pleased to offer free membership for 2013 to any interested researcher. We invite members from students to established experts. We will organize yearly meetings to provide a specific forum for evolutionary demography. Our first meeting will be in Odense, Denmark in October of 2013, and will be open only to society members. Membership can be gained by emailing your name, preferred email address, affiliation and a sentence describing your research interests to:
evodemo-list@demogr.mpg.de

Questions and comments can be addressed to this same address.

Please feel free to distribute this announcement broadly.

Sincerely,
The Board of the Evolutionary Demography Society


President
James W. Vaupel, Max Planck Institute for Demographic Research and University of Southern Denmark

Vice President
Shripad Tuljapurkar (Tulja), Stanford University

Secretary/Treasurer
Daniel A. Levitis, Max Planck Institute for Demographic Research and University of Southern Denmark

Board Members
Anne M. Bronikowski, Iowa State University
James R. Carey, University of California, Davis
Hal Caswell, Woods Hole Oceanographic Institution
Charlotte Jessica E. Metcalf, University of Oxford
Tim Coulson, Imperial College London
Timothy Gage, State University of New York at Albany
Jean-Michel Gaillard, Université de Lyon and Centre national de la recherche scientifique
Thomas B. Kirkwood, Newcastle University
Daniel H. Nussey, University of Edinburgh
Fanie Pelletier, L'Université de Sherbrooke
Deborah Roach, University of Virginia
Rudi G.J. Westendorp, Leiden University

Monday, October 15, 2012

In which I use some strong words on the topic of a botched editing job

So this big long review article my collaborators and I sent to a very good scientific journal was sent for one last copy-edit before it was published. This is normal; most journals have a copy editor look things over right before typesetting. This particular paper had already been professionally copy edited, but hey, there is always room for improvement. So we get back the typeset version from the production editor (who works for the corporate publisher of the journal and is in charge of turning the accepted paper into a formatted, typeset publication), and an hour later, we get an email from the scientific editor (who is a scientist in charge of deciding what gets published and makes decisions about scientific content) that the copy editor (who works for the production editor and is only supposed to correct grammar, punctuation and such) is an "aggressive copy editor" and we should check the paper carefully. So we sit down to read through the paper and figure out what this means when HOLY SHIT! I notice that the very first sentence of our paper directly contradicts the rest of its content. And then YOU ARE FUCKING KIDDING ME?! I notice that my coauthor's name is spelled wrong. Then LET LOOSE THE DOGS OF WAR! I see that the very central sentence of the paper, the one that defines the really important concept that we are introducing and talking about, no longer means anything at all. And then, ARE YOU TRYING TO KILL ME?! I see that a paper written by a very senior colleague who I've only slightly corresponded with is now attributed to me, as though I wrote it. To top it off ARE YOU SOME KIND OF IDIOT?! there are now all sorts of punctuation errors, formatting errors, random characters inserted in the middles of words. So I think maybe that's the worst of it, and then HOW HAVE YOU NOT BEEN FIRED YET?!
I figure out that the copy editor went through and sorta tried to rewrite the paper, adding a sentence here, taking out a clause there, HOW IS THIS POSSIBLE inserting parenthetical phrases with no closing parenthesis. (I DON'T UNDERSTAND! Oh, and MAKE IT STOP! the names of the demographic phenomena have been changed to something the copy editor thought sounded nicer.
So we figure maybe the copy editor was high on crack, and we write to the production editor, whose job it is to make sure the copy editing was done right, and we politely explain the problem, say that we would like to make sure we don't miss any errors that may have been introduced, and ask whether we could please have his list of the changes that were made, and he SATAN! SATAN! writes back a one-sentence email telling us we just need to follow the instructions he already sent. Considering I spent more than a year in total working on this paper, I think I am handling it rather well. DOOM! DOOM!

I have now written a somewhat less polite email to the production editor demanding that the pre-vandalized version be used, as we can't possibly find and mark all of the hundreds of places where the paper was damaged. My hope is to end up working with a different production editor, one who is not so BLATANTLY STUPID unconcerned about the quality of the product.

Wednesday, August 29, 2012

When animals aren't 'animals.'

I know a guy who has fishing licenses in about ten states. He doesn't fish, but he does study salamanders, and according to the fishing regulations in many states, salamanders are fish and you need a fishing license to mess with them. Salamanders are of course not fish, unless you are a hard-core cladist who thinks that all vertebrates are fish. State fishing officials are not generally hard-core cladists, just people who write and enforce regulations and don’t really care if salamanders aren’t fish.
A similar situation arises when it comes to laws governing ethical animal research. If a scientist wants to passively observe a bunch of animals in the wild, she needs to go through all kinds of ethics boards and paperwork to make sure she is complying with these laws. If another scientist wants to slowly dissolve a bunch of live insects in acid, he just needs to buy some acid, because legally, invertebrate animals aren’t ‘animals.’ Animal ethics laws generally don’t apply to them. I say generally because there are particular exceptions. Switzerland and Norway consider lobsters and their relatives to be animals, so you can’t just drop them in boiling water (at least not in a scientific context); you have to kill them humanely. England extends animal protection laws to the Common Octopus, but apparently not to other less common octopuses, so it pays to be common.

For a researcher like myself, who studies invertebrates in the lab, this is a very convenient absurdity. It means that when I want to feed live brine-shrimp to my hydra, I don’t have to ask any committees to review whether the feeding is humane to the brine-shrimp or the hydra. I don’t need to get official approval for the size of container I keep barnacles in.
I approve of laws, regulations, forms and committees that require the ethical treatment of animals in research, and I try hard to follow the principles they are intended to enforce. I am also very glad I don’t personally have to deal with the red tape.

Tuesday, April 03, 2012

Accepted, but...

We have set ourselves a difficult task. Some months ago, my friends and I submitted a review article, clarifying some misconceptions that are common within the scientific literature, to a very good anthropology journal. Today we heard back from them. The editor explained that it took longer than usual because he had sent it out to "several" reviewers, and then "several more" and was waiting to get comments from all of them.

The good news is that all the reviewers seemed to like it, and the editor knows which issue of the journal he wants to put it in, which we take as the paper being accepted. The bad news (or at least time-consuming news) is that all of the several and several reviewers made long lists of things they would like to see changed, added, clarified or reorganized. I have not yet finished reading all these comments, but I don't see a lot of repetition, meaning that we have hundreds of distinct comments and criticisms to deal with. In the months we were waiting, we also showed the draft to a couple of colleagues, who gave us still different but also very useful comments. The good news is that the editor has given us permission to go well beyond the usual page limit for this journal in order to deal with all the reviewer comments. The bad news is that we now can't use space limitations as an excuse for not dealing with relevant points or citing relevant literature. So we have a great deal of rewriting to do.

Part of the problem with writing an article pointing out places where other people's thinking or language has been unclear is that one has to live up to very high standards for clarity in one's thinking and language. We've already extensively rewritten this paper a few times, and each time it has gotten clearer. Nevertheless, a large portion of the reviewers' comments are right on, and further clarification is needed. By the time this thing comes out it will either be brilliant or a total muddle. I'm not clear which.