Opinion – European subsidies for weapons research mainly serve the military industry

New opinion piece by Slow Science in Mo*

European subsidies for weapons research mainly serve the military industry

Today, 27 June, concerned scientists and peace organisations launch the European initiative Researchers for Peace. More than 700 researchers from 19 EU countries are calling on their colleagues to speak out against a European military research programme, which is on the agenda of the European heads of government this Thursday and Friday.

With the Slow Science network we advocate open, democratic and sustainable scientific research. The current European plans run counter to these values.

With the launch of the “Preparatory Action on Defence Research” in 2016, the European Union took the first steps towards a European military research programme. The European Commission has plans for a European Defence Fund under which billions of euros are to go to research and development of new weapons in the coming years. A recent report by Vredesactie shows that the arms lobby had a decisive influence on the process that led to this decision. As might be expected, the plans are therefore tailored to the arms industry.

The European Commission presents the European Defence Fund as the answer to the need for a European security and defence policy. But the EU skips an important step: deciding what kind of policy it actually wants. That is a political question. Only once it has been answered can one ask the further question of how this policy is best put into practice. This requires knowledge, to which scientific research can contribute. The kind of knowledge needed is not only military in nature, but also economic and sociological (and, depending on the answer to the first question, ecological as well).

It is as if the car industry were asked to draw up a mobility policy for Europe.

What we see now, however, is a complete lack of political vision. Everyone seems to agree that there should be a European security policy, but nobody asks the further question of what this policy should actually entail. Moreover, the scientific and technological research in the pipeline is not the kind that yields knowledge which helps to implement a policy better or to design one in the first place. It starts instead from the assumption that military research carried out by private industry is the only possible answer. It is as if the car industry were asked to draw up a mobility policy for Europe. Naturally, the “ideal” policy will then consist of investing in road construction and pumping money into research and development carried out by the industry itself.

The interests of the military industry

The details of the funding, moreover, show how the current plans serve only the interests of the military industry. The EU finances the research at 100%. Industry itself therefore does not have to make any financial contribution, yet receives the full intellectual property rights to the technology that is developed. Research always involves risk: expected results sometimes fail to materialise, and experiments can fail.

Society pays the costs of the risky research, while any profits are pocketed entirely by private companies.

The justification often given for granting intellectual property rights is that they encourage companies and researchers to take on this risk: a successful, patented discovery can more than compensate for possible failures. What we see in the European plan, however, is a socialisation of risk and a privatisation of profit: society pays the costs of the risky research, while any profits are pocketed entirely by private companies.

As Slow Science researchers, we therefore call on the EU to put the interests of its citizens first, not those of the military industry. To stop letting its policy be dictated by the military lobby, and instead to take political responsibility itself and allow democratic scrutiny. And to fund research that contributes to a sustainable and secure future for the EU, rather than merely filling the bank accounts of the military industry.

https://www.mo.be/opinie/europese-subsidies-voor-wapenonderzoek-bedienen-vooral-militaire-industrie

 

Slow interview #2 – An interview with Paolo Cherubini (part 2)

Author: Sofia Pagliarin

Why journal editorials are disappearing, and why we should care

Paolo Cherubini is a senior scientist at the Swiss Federal Research Institute WSL. He was previously interviewed by Sofia Pagliarin, a collaborator of Slow Science, and shared some of his critical thoughts about the Impact Factor.

He is Editor-in-Chief of Dendrochronologia, an international scholarly journal publishing tree-ring science. Recently, he published an editorial about the pace of change in scientific publishing and the “extinction” of editorials.

 

Sofia: Paolo, first of all, thank you for being with us again.

Paolo: Thanks to you. Already 10 years ago I was thinking about the need to create a network about “slow science”, so I’m glad that it emerged and that I can contribute to it.

Sofia: It’s our pleasure! Anyway, let’s talk business. So, are editorials, as introductions to a journal issue, disappearing?

Paolo: In the old days, when journals were published on paper, editorials were an important part of a journal because they gave the opinion of the editor(s) on a topic of possible interest to the journal’s readership. So you went to the library and skimmed through the editorials in paper journals, and through the published articles, too. Now, literature search is different. It’s digital, and scientists search for and find exactly what they’re looking for. The readership of a journal hardly exists anymore, nobody takes a hard copy of a journal in their hands, and nobody reads editorials anymore. The way of reading is also different. Everything is changing!

Sofia: …Changing?

Paolo: Well, the fact that editorials are disappearing also frees up editors’ time, so you no longer lose time writing an editorial that nobody will ever read. What I am a bit worried about is the decrease of serendipity in academic research. But first, let me make one thing clear: I am one of those scientists who, despite having a high environmental awareness, still likes to print out papers, take notes and highlight passages on them. Some of my colleagues make fun of me, but I think that reading papers digitally is quite another thing from reading them on paper. It’s not only a different feeling, it’s a different way to “study”, to get the content of the paper into your mind, and possibly to use it in your own research.

That said, when you looked at the editorial of a journal in your field in the past, you also checked the table of contents. You certainly found material related to your own research, but you could also browse through published articles on other topics, for instance about the flight trajectories of a butterfly of a certain species. Although this might now seem a waste of time, it was actually enriching: it stimulated cross-fertilization in research, “side-thinking” and the ability to make connections across topics. Now scientists are so specialised also because they have fewer opportunities to know what others are doing.

Sofia: Do you mean that nowadays researchers are behaving “badly”?

Paolo: No, not at all. But today’s researchers, especially the younger ones, look at the Impact Factor and the publication time when they choose a journal in which to publish their research. So it’s a “fast science”, in which editorials, as well as papers offering personal opinions, commentaries and ideas, cannot survive. On the other hand, I think that “fast science” can easily lead to not-so-well-done peer review.

It takes time and care to run a good peer-review process, while today researchers can opt to pay open-access journals to get their research online. This is not good for scientific research. Furthermore, it is obvious that a molecular researcher will have many more citations than a scientist working on a Himalayan beetle.

First, the size of the academic community working on a certain topic affects the Impact Factor and the number of citations. There are maybe 1,000 dendrochronologists in the entire world, so our publications will never reach a very high number of citations. But is our research less relevant only because of this?

Secondly, and more importantly, how do we judge the quality of the research? We may argue that research on the Himalayan beetle has its own relevance. The Impact Factor is a quantification of utility, not a proof of quality, and even less of how relevant the research performed is. Or rather, other measures, complementary or alternative to the Impact Factor, should be developed that can account for the topic and the particular characteristics of the academic domain. Similarly, the Impact Factor should not be used – or at least not solely – in hiring decisions: hiring a new person for a research group is not only about choosing “Impact Factor stars” – people with many top publications – but also requires the sensitivity to weigh other aspects, for instance whether this person has made other contributions (media, software, events, and so on) and how her/his personality would fit the research group overall. But these aspects are of course not as easily quantifiable as the number of citations.

Sofia: Do you think that there is an “antidote”?

Paolo: As I said, the Impact Factor should be considered one of several possible measures of scientific research, and possibly not applicable to all scientific disciplines. Other measures should be developed which are domain-specific and which, at best, can also include qualitative assessments of the research. So I urge scientists to work on this and to propose other measures than Garfield’s Impact Factor, which was created to evaluate different journals based on their utility.

Sofia: Thank you for your time and for sharing your experience, Paolo.

Paolo: My pleasure!

 

Disclaimer:  The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of Slow Science.

 

Slow interview #1 – An interview with Paolo Cherubini

Author: Sofia Pagliarin

Impact Factor: how a useful tool turned into fever

Paolo Cherubini is a senior scientist at the Swiss Federal Research Institute WSL studying forest ecological processes using tree rings, i.e., dendroecology. Dendroecology provides information that can help us understand and predict how our forests and environment have changed over centuries, from carbon dioxide concentrations to warm or cold, dry or wet weather conditions.

I met Paolo Cherubini when I was working as a post-doc at the Swiss Federal Research Institute WSL. During an informal event, I discovered that in 2008 Paolo had published a letter titled “Impact Factor Fever” in Science (Vol. 322), in which he strongly criticised the abuse and misuse of bibliometrics in evaluating academic life and performance. He also told me several stories about the Impact Factor and his opinion of it.

As a contributor to the Slow Science network, I then got the idea of putting those early conversations into written form. We scheduled a Skype interview in which we talked about his thoughts on the Impact Factor, its increasing importance in academia, and his experience with it as an author, reviewer and editor.

The following text is a condensed re-elaboration of these interviews, structured in two posts and reviewed together with Paolo before its online publication on the Slow Science blog.

Paolo Cherubini

Sofia: Paolo, first of all, what is the Impact Factor?

Paolo: The Impact Factor is a measure of how frequently the articles published in a journal are cited. The Impact Factor of a journal for a certain year (e.g., 2015) is calculated by taking the citations received in that year by the citeable items, e.g., articles, reviews, personal commentaries, that the journal published in the two previous years (e.g., 2013-2014), and dividing them by the number of citeable items published in those two years. So, if a journal had 100 citations to 50 published items, its Impact Factor will be 2. The Impact Factor is a great tool, now run by Thomson Reuters, that was developed by Eugene Garfield, the owner and founder of the Institute for Scientific Information (ISI), at a time when staff were still typewriting and counting by hand the number of times the publications in a certain journal were cited by other journals.
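For readers who want to see the arithmetic spelled out, here is a minimal sketch of the two-year calculation Paolo describes. It is purely illustrative: the function and variable names are our own and are not part of any official bibliometric tool.

```python
def impact_factor(citations_received, citable_items_published):
    """Two-year Journal Impact Factor for a given year Y.

    citations_received: citations received in year Y by items the journal
        published in years Y-1 and Y-2.
    citable_items_published: number of citeable items (articles, reviews,
        commentaries, ...) the journal published in years Y-1 and Y-2.
    """
    return citations_received / citable_items_published

# Paolo's example: 100 citations to 50 citeable items gives an Impact Factor of 2.
print(impact_factor(100, 50))  # 2.0
```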

Sofia: In your letter to Science, published in 2008, you wrote that “The exacerbated pressure to publish we all suffer from is induced by an exaggerated reverence for the impact factor”. How did this all happen?

Paolo: Well, the story is long, but basically the Impact Factor came out of a real need to navigate the different journals and rank them according to their reputation and utility. Although it is a measure that changes over time, it can give a good idea of how important a certain journal is for scientific publication in a certain domain, based on how often its articles are cited.

However, this practical need to compare international scholarly journals has “backfired” and evolved into a way of evaluating scientific performance, institutes and personal careers, and into a “citation rush”. So today not only are journals fighting to increase their Impact Factor, but authors also crave the quantification of their publication records. Personally, I am not against the Impact Factor; as editor-in-chief of a small journal, I care about it for my journal, but I also know that it is a measure that does not tell everything about a journal.

Sofia: But why is there such a “reverence”?

Paolo: Because the quantification of scientific results through the Impact Factor is extremely effective for decision-making: it is easy to use. From a measure useful for ranking journals, the Impact Factor is now used to decide which departments have to die and which ones can survive, or are deemed “of excellence” and get funded. It is a convenient measure when one has to demonstrate that “objective” decisions have been taken about who should be hired, funded or rewarded for high scientific productivity.

The Impact Factor helps maintain a certain structure and functioning of academia that is beneficial to many parties; that is why I call it reverence. Journals gain from it as true “brands” in a scientific sector, and scientists and universities gain because they all compete for funding, and the Impact Factor serves as an objective quantitative measure to figure out who has been a good or a bad scientist. In a way, “it’s a simple game for simple minds”.

Sofia: Can you provide an example?

Paolo: Yes, sure. The Impact Factor serves specific academic structures and arrangements. For instance, the supervisor always has to be included in the publications of her/his post-docs and doctoral students. This is not only to increase her/his productivity record, but also because without such a productivity measure the supervisor will not be able to get funding and to sustain this hierarchical structure of post-docs and PhD students, who otherwise would not have work. Don’t get me wrong: this is not a problem per se, but of course it affects how academic research is done, the number of students in a certain department or research group, and how they are mentored and supervised.

Another example is the impact this has had on the social sciences, which have been forced to adapt to a parameter created in and for the natural sciences. The social sciences have started to compete with the natural sciences for funding using the same measures and weapons, even though in the social sciences publishing a book probably makes more sense than publishing articles in ISI journals, which makes the use of the Impact Factor there perhaps less appropriate.

Sofia: So do you think we should get rid of it?

Paolo: Let’s say that the Impact Factor was invented for a certain purpose, namely ranking international scientific journals. Over time it turned into something that is almost a dictatorship over scientific research and production. It has been misused, and it is that misuse we should get rid of.

However, it is also true that the Impact Factor was a good thing in those university systems where competitiveness and productivity were not rewarded. For instance, in Italy, where I originally come from, the traditional university system was based on professors making a career by publishing one book every 10 or 20 years, or only a handful of articles written in the local language in non-ISI journals. This corrupts the system and makes it extremely inefficient and not innovative at all, which is especially frustrating for students and for the careers of new researchers. It was a system that worked self-referentially for decades, and recently, thanks to the Impact Factor culture, the university system has been changing. [See here for more background information on the effect of adopting metrics on self-citation practices in an Italian context]

Once I was giving a talk in a former Eastern Bloc country, and when I said that I was critical of the Impact Factor, because it is not necessarily a good measure of good research, some colleagues got angry, because they were trying to change the academic system there in order to put more pressure on the “dinosaurs” who did not publish, and to reward young researchers and professors who really did make an effort to compete on the international research scene.

So, it’s good to have a measure such as the Impact Factor that can tell us which journals are the best and who is publishing the most. But the Janus face of this is that, for instance in China, professors get paid thousands of euros more if they get published in one of the top-end journals on the Impact Factor list. This isn’t good; it becomes a problem when academic research is oriented solely towards getting a publication done and cited, because the quality of academic research will suffer.

Sofia: Thank you for your time, Paolo. Anything to add?

Paolo: Yes. As a dendroecologist and forest scientist, I don’t think these fields are so far from the social sciences. Actually, ecological systems are very much connected to the social sciences… So I believe we are all in the same boat, and the Impact Factor is currently misused in both the natural and the social sciences. Thank you for this opportunity to share these thoughts, which I have been discussing with so many other colleagues over the past two decades.

Sofia: Thanks to you, Paolo!

 


 

“Slow Interviews” is a column published on the blog of the Slow Science network. The “Slow Interviews” articles/posts are conceived, written and reviewed by Sofia Pagliarin, one of the collaborators of the Slow Science network. The publication and content of each interview, in one or multiple posts, are discussed, reviewed and mutually agreed through a cooperative dialogue between Sofia Pagliarin, the interviewee(s) and other members of the network.

These interviews aim to enrich the topics and debates that are central to the network, and are conceived to be in direct dialogue with the course on the critical analysis of academia and academic production organised annually by the Slow Science network: https://slowscience.be/our-doctoral-school/. They add different points of view to our understanding and are not intended to replace the topics and debates dealt with during the course. Interviewees are academics or informants who have experience and/or knowledge of a particular topic, and who are not necessarily related to either the Slow Science Network or the course.

Disclaimer: The views and opinions expressed in the “Slow Interviews” articles and blog posts are those of the authors and respondents and do not necessarily reflect an official policy or position of the Slow Science network.

 

Debate role of universities – Monday April 23rd at deBuren, Brussels


Dear all,

On Monday April 23rd, deBuren, in collaboration with KU Leuven, UA, UGent and VUB, is organizing a debate on the democratic legacy of 1968 and the role of universities then and now. The debate is organized as part of the doctoral school course “What does it mean to be a researcher in 21st century academia?”, but it is open to a broader audience. For everyone interested in slow science and/or the societal role of universities, this event is not to be missed!

You can find all relevant information via the website of deBuren: https://www.deburen.eu/programma/4603/de-democratische-erfenis-van-68-waar-staat-de-universiteit-vandaag

 

Practical information:

Monday April 23rd, 19h30-21h
deBuren
Leopoldstraat 6, Brussels

Tickets (5€/3€) are available through the website. The event is free for those who follow the workshop.

 

We hope to meet you there!

 


UCU strike in the UK and how to help

At the moment, the University and College Union (UCU) is organizing a strike against the new pension plans. A majority of UK academic institutions are taking part. For more information and the latest updates about the strike:

https://www.ucu.org.uk/strikeforuss

This issue is also relevant to the broader slow science movement, as it is part of the marketization of universities. To give only one, but very pertinent, example: under the new plans, final pensions would depend on how the stock market performs, not on contributions.

As one of the strikers puts it:

“The real problem behind pensions is this: Universities have borrowed billions in bonds to spend on fancy new buildings – they are becoming property developers. These bonds are then traded on the financial markets, so the more they are ‘de-risked’ the more they are worth. De-risking the pension to the extent that it’s modelled on all universities going bust simultaneously isn’t a realistic expectation, but it is a theoretical risk that, when transferred to individual academics, increases the bond’s value. We are losing our pension security to make more money for bankers, in other words.”

 

HOW CAN YOU HELP?

  • You can tweet your messages of support to @UCU (the national union) or to the local initiatives of different universities.

 

  • Donate to UCU’s fighting fund: there are lots of precarious early-career academics and single parents on strike who will need financial support for the wages they are losing. https://www.ucu.org.uk/fightingfund

 

  • The University of Southampton, in the midst of the strike, has received an email urging it to ‘prioritize the partner organisations of Southampton University as part of the Internationalisation Strategy’, of which KU Leuven is apparently the number two partner. We therefore want to use this internationalisation strategy to show our solidarity with our colleagues. We would like to collect pictures of you holding a card or piece of paper saying (something like) “I support my colleagues in the UCU strike @ Southampton University [name, university]”. This will only take 5 minutes of your time, but it would be really great if we could collect a large number of pictures. They will be shown on the website of the Southampton strikers. Scholars from KU Leuven are of course especially encouraged, but broad support from all colleagues is highly appreciated. You can send your picture to Valerie, and the pictures will be sent collectively to the UCU strikers @ Southampton.

 

  • Boycott the institutions on this list, as they engage in punitive behaviour during and between strike days.

 

  • All universities could benefit from your support, so please reach out to any of your colleagues who are on strike at the moment and ask how you can help them. We can post the actions on our website or help circulate them through other means.