Monday, November 22, 2010

More fails... (aka: no VR grant awarded)

I failed to get a VR grant in 2010. The arguments are interesting:

My competence
    The applicant had published 15 papers in mostly low impact journals.

True, the top journals in my field (chemometrics, cheminformatics) do not have very high impact factors, because the field is less eager to add 100+ citations to each journal paper, nor is the field known to be popular enough for Nature, Science, etc.

    Two of these are highly cited.

Indeed. I recently blogged about that. Mind you, 46 citations is not highly cited, even though it exceeds the impact factor of Nature and Science.

    This is quite impressive for such a young scientist (PhD in 2008).

But, of course, that does not matter. It's surely not about impression, right?

    His PhD work (in Netherlands) and his postdoctoral work (in Uppsala) is actually all on the same project ...

This is where the reviewers show some disrespect, I believe. Apparently, they did not consider it one of their responsibilities to actually check what I have done. My PhD work was partly in the UK (Cambridge), and I have done postdoctoral work in the Netherlands and Germany too.

On the same project? Well, that depends on how you look at it. Surely, cheminformatics, QSAR, statistics, etc., are all the same. Same for crystallography, NMR, etc. One big pile of science. Again, I feel the reviewers interpreted their reviewing responsibility very narrowly.

    ... and with the same collaborators.

Wow... that's impressive, right? And here I always thought international collaboration was a positive thing. But apparently not if you have long-term, successful collaborations. WTF??

    The role of the applicant in relationship to Prof X and the other developers is not clear.

OK, I should have made it clearer how the other scientists are involved.

    However the backgrounds is definitely adequate for the suggested project but the applicant lack in independency.

(Carefully transcribed.)

This is an interesting point, and it nicely outlines how the current academic system works. As a post-doc, you are forced to hop from one funded project to another, hoping to get funding. Until you do, you are working on other PIs' projects with predefined topics.

Project quality
    The main focus of the project is software development of Bioeclips in collaboration with X and others.

No; if you read the proposal, the project is about statistical method development, and Bioclipse (not "Bioeclips") is used as a platform to make it look like Excel so that the average scientist understands it. That distinction is difficult, even for scholars.

    The application is mainly about managing errors in observations and processing, annotation and propagation of these.

Indeed! Well copied from the proposal's abstract.

    Expected outcome is identification of processing errors, and potentials for improvement in the data handling.

The reviewers got it almost right. I did not write up clearly enough that the main improvement is finding the source of the error, which we all know is the (biological) experiment and the average scholar's inadequacy at data handling (think Excel).

    Although the development might be of real importance the application does not show a significant scientific component, neither from a computational not from a life science perspective.

This quite puzzles me, as we had written very strongly all over this proposal: metabolomics, metabolomics, metabolomics! With applications including metabolite identification, with experimental partners.

Have they actually read the proposal?

Project quality

    The background and competence of the applicant should ensure success.

So, why not fund me? Read on...

    The applicant requires compliance of users and buy-in from scientific community.

I guess my work is not cited enough to show that it is actually used. Anyone using the CDK here?

    Although some indication that this will happen is provided...

I guess this refers to the international collaborations I listed in the proposal.

    ... this is not ensured

Therefore, rejected. Scores: bra ("good", 2/5) and låg ("low", 2/5).