Is Ghent University stepping out of the rat race? An interview with Jan Dumolyn about the goals, strengths, and potential pitfalls of the new personnel policy

By Valerie De Craene, Anton Froeyman, Freek Van Deynze

In 2018, Ghent University and the Socialist Trade Union announced a new personnel policy, describing it as “stepping out of the rat race” and “no longer wishing to participate in the ranking of people”. The press release was picked up by many scholars and media outlets, including internationally. Inside Higher Ed even interviewed Rik Van de Walle, rector of Ghent University.

Slow Science in Belgium read the new policy, followed the debates, and ended up with numerous questions. What is the new policy about? Will it be the start of a broader and fundamental change in the way universities are organised? Will it lead to different types of knowledge being valued and validated by universities and funding agencies, while preventing precarity in our universities? Or is this nothing more than window dressing with minor changes, and then only for the happy few? And what about the broader context in which the policy is implemented? Luckily we found Jan Dumolyn (senior lecturer at Ghent University and, as a member of the Socialist Trade Union, one of the people responsible for the negotiation and implementation of the new policy) willing to answer all our questions about the goals, strengths, and potential pitfalls of the new direction Ghent University seems to be taking.


Slow Science: Very briefly and concretely, what are the main characteristics of this new policy vis-à-vis the previous system?

Jan Dumolyn: The former system was based entirely on so-called quantifiable measures. Not only publications (with an elaborate ranking system) were measured, but grant applications and service as well, sometimes to the point of ridicule. The terms for promotion from one level to the next stay the same: five years as a tenure-track lecturer (docent, maître de conférences, assistant professor); then ten years as a senior lecturer (hoofddocent, associate professor); then eight years as a professor (hoogleraar, professeur); after that, one finally becomes a full professor (gewoon hoogleraar). Now, the quantitative approach has been replaced by a more qualitative method. A commission of five persons will be created for each staff member. This commission will consist of the head of department, the president of the ‘education commission’ of the programme in which one teaches, a close colleague who knows the field, an HR specialist, and a member of the faculty board. In the first meeting, the staff member presents a plan for what she will do during the next couple of years; there is a feedback meeting two years later, and finally an evaluation after five years. The idea is that every staff member who does her job correctly and adequately (i.e. research, teaching, and service) will be promoted. If this is not the case, the promotion is delayed. There is the possibility to appeal when one does not agree with a negative decision.


Slow Science: How long has this policy been in the making? And why was it introduced now? Are there any examples or templates that served as inspiration? (The website of the Leiden Manifesto, for example, suggests so.)

Jan Dumolyn: As trade unionists, we have been protesting against the former system from the moment it turned out that it would be based on a purely quantitative logic (something we never agreed to in the first place, and which was imposed upon us by the faculties). That is already ten years ago now. I had not heard about the Leiden Manifesto before you asked me this question. In fact, we can be proud to say that with ACOD-UGent (the socialist trade union at the university) we were saying this much earlier. When I first protested against this metric fetishism ten years ago, I was mocked by almost every powerful person in the Flemish academic world. Gradually, opinions changed, and now those in power mostly agree with us.


Slow Science: In the new policy, there is an important role for the so-called HR-committee. Is it possible that people’s research will be assessed by someone who is unfamiliar with their field?

Jan Dumolyn: I don’t see that as a problem; on the contrary. The HR-commission should contain specialists who understand the staff member’s field, but also other people who might take a fresh look or who are not involved in specific feuds or controversies within the field. The composition of the HR-commission is designed around checks and balances.


Slow Science: One of its members is a person from DPO (Personnel Department). Who is this person, i.e. what is their profile/expertise/background? What is her/his role?

Jan Dumolyn: Her role will be that of a trained HR specialist, a kind of career coach. These people will mostly be trained as psychologists, I suppose. Their task is to provide general suggestions for career management.


Slow Science: Will one person be performing this role for the entire university, or will there be several people? Do you expect this person to, officially or unofficially, also serve as an “antenna” giving the central university level an idea of hiring and promotion practices?

Jan Dumolyn: Our university is recruiting them at the moment. There will be a few of them. And yes, I suppose they will serve as antennas in that way.


Slow Science: Part of the new policy is the requirement for academics to submit a so-called ‘inpassingstekst’ or integration text: a text in which they show how they fit into the department. What about external people who do not know the department that well? Won’t they be disadvantaged by the new policy?

Jan Dumolyn: I don’t see why; they should be coached in this by the colleagues of their department. It’s normal that, as a newcomer, you don’t know everything where teaching and service are concerned.


Slow Science: In the new policy, there is a bigger role for the academic in choosing to prioritize research, teaching, and/or service. How will this be evaluated? Are there any minimum criteria? What if, for example, academics choose not to publish any articles but rather aim to make a documentary or write a book? Would this be considered enough? Does all research need to be innovative, or is popularizing (existing) research also considered valuable output? What about the arts? What if someone aims to write a new opera? Is this considered research output? Service? If so, will faculties allow it? And what if they don’t?

Jan Dumolyn: In principle, all this should be allowed as long as one fundamentally remains a scientist. A lecturer or professor cannot just say ‘I don’t want to write scientific publications anymore’, but shifting the focus to books or to science communication is of course a very valuable option. A researcher should always continue to engage in innovative research; we cannot just be popularizers or artists, as these are different professions. So let’s not take things too far. Most of us are paid with taxpayers’ money.

The quantitative approach has been replaced by a more qualitative method. The idea is that every staff member who does her job correctly and adequately (i.e. research, teaching, and service) will be promoted.

Slow Science: Related to the previous question: do faculties want to focus on and value teaching and service more? If so, what needs to change in order to make this happen? Think of academics who are often present in mainstream media (e.g. Koen Aerts of Kinderen van de collaboratie, or Carl Devos): aside from playing an important role in bringing scientific insights to a larger audience, they provide only an indirect benefit to their departments, in the form of publicity and possibly higher student numbers. What role does this type of service play? Will this change anything in the current relationship between research, teaching, and service?

Jan Dumolyn: I think academics of this type are exceptional when you start counting them. In practice, universities have long been happy with well-informed and eloquent researchers who appear in the mass media. In all fairness, I do not believe in academic positions without research, or positions mostly oriented towards service. There can be no academic teaching or service that is not based upon original research.


Slow Science: Do you think the FWO (the Research Foundation – Flanders) could also value this type of service?

Jan Dumolyn: I don’t think it should. Fundamental research is the main objective of the FWO, and it should remain so. Other types of funding could be created next to it. Fundamental research is always potentially under threat, so we should not give too much weight to ‘impact’ either. There should be a balance. Otherwise private companies will tighten their grip on the universities. You and I are from the humanities, but let’s not be naïve: we don’t really count in the bigger picture when it comes to money and funding. When we say ‘service to society’, in practice this will mean ‘service to big corporations’.


Slow Science: (How) can individual professors and/or the university prevent this new policy from still leading, eventually, to a focus on quantitative output metrics? The discourse might have changed, but how can this be applied in practice, especially since the incentives that lead to the use of quantitative indicators (such as the internal allocation model, the financing decree, and the BOF distribution key) are still in place?

Jan Dumolyn: Yes, this is the next step that should be taken now; things are already going in the right direction, but slowly. Some rectors support moving away from quantitative criteria for dividing the money between the universities in Flanders. The KU Leuven, however, is trying to block a fundamental change, as it profits most from the current situation and has long had the most outspokenly neoliberal discourse. It is up to the colleagues in Leuven to resist their university leadership more actively in this.

Rethinking the career of PhD-students and the ‘scholarship’ system, which is de facto a labour relation, is on our agenda for the coming year.

Slow Science: To what extent is the new policy part of a larger process of change? At this moment, the (external) incentives have not changed: the Flemish financing system as well as international criteria for funding remain the same. What about the incentives that undermine the new UGent policy? Will this affect competition with other universities that do not implement a similar policy?

Jan Dumolyn: Yes, but we are only the trade union of one university; we cannot do everything at the same time, and we have to start somewhere. I would urge academics to join a labour union and work within these structures instead of only writing endless blogs complaining about ‘academia’ in purely individualist terms, or signing the occasional petition. Only through collective organization can one gain strength. And that is in a union. Academics should learn that they are workers too and are not ‘above such things’ as a labour union.

Evaluators could use their position to settle scores with the person being evaluated. We would have wanted a veto right by the candidate against members of the HR-commission as well, when she would think they are biased.

Slow Science: What about the careers of PhD students and Post-Docs? Is there something in the pipeline for them as well? And what will be the effect on PhD students and Post-Docs as long as there is no specific policy implemented for them?

Jan Dumolyn: Rethinking the career of PhD students and the ‘scholarship’ system, which is de facto a labour relation, is on our agenda for the coming year. We have also recently reformed the statute of the ‘scientific staff’, i.e. mostly postdocs financed by external funding. As to the problem of postdocs in general, the main solution is to create many more lecturer positions, so that more of them have a chance.


Slow Science: What do you, as labour unions, foresee as potential problems within the new system?

Jan Dumolyn: The downside of not using a purely quantitative logic is that there is more room for purely subjective judgments. Evaluators could use their position to settle scores with the person being evaluated. However, this risk is limited by the composition of the committee, which is supposed to be balanced. And there are also possibilities to appeal.


Slow Science: Are you as ACOD happy with the new policy? Are there issues you wanted to see included too? What proposed elements didn’t make the cut?

Jan Dumolyn: We pretty much succeeded in obtaining the most important aspects of what we wanted. We would also have wanted the candidate to have a veto right against members of the HR-commission whom she believes to be biased. We lost on that one in the negotiations. Now there is the possibility to protest and write a letter to the dean and the faculty council, but they can then decide whether or not to maintain the nomination.


Jan Dumolyn (1974) is a senior lecturer in medieval history at Ghent University, a shop steward of the ACOD (Socialist Trade Union) and a member of the Staff Negotiation Committee.


Read more

UGent press release: (Nl)
UGent press release: (En)
ACOD (Socialist Trade Union) Q&A on new policy: (Nl)

Interview with Rik Van de Walle in Inside Higher Ed (En):


Opinion – European subsidies for weapons research mainly serve the military industry

New opinion piece by Slow Science in Mo*

Today, 27 June, concerned scientists and peace organisations are launching the European initiative researchers for peace. More than 700 researchers from 19 EU countries are calling on their colleagues to speak out against a European military research programme, which is on the agenda of the European heads of government on Thursday and Friday.

With the Slow Science network, we advocate open, democratic, and sustainable scientific research. The current European plans run directly counter to these values.

With the launch of the “Preparatory Action on Defence Research” in 2016, the European Union took the first steps towards a European military research programme. The European Commission has plans for a European Defence Fund that would channel billions of euros into the research and development of new weapons in the coming years. A recent report by Vredesactie shows that the arms lobby had a decisive influence on the process that led to this decision. As was to be expected, the plans are therefore tailor-made for the arms industry.

The European Commission presents the European Defence Fund as the answer to the need for a European security and defence policy. But the EU skips an important step: determining what kind of policy it actually envisions. That is a political question. Only once it has been answered can one ask the further question of how this policy is best put into practice. This requires knowledge, to which scientific research can contribute. The kind of knowledge needed is not only military in nature, but also economic, sociological, and (depending on the answer to the first question) ecological.

It is as if the car industry were asked to design a mobility policy for Europe.

What we see now, however, is a complete lack of political vision. Everyone seems to agree that there should be a European security policy, but nobody asks the further question of what this policy should entail. Moreover, the scientific-technological research in the pipeline is not the kind that yields knowledge contributing to the better implementation of a policy, or that helps in designing one. Rather, it starts from the assumption that military research carried out by private industry is the only possible answer. It is as if the car industry were asked to design a mobility policy for Europe. Naturally, the “ideal” policy would then consist of investing in road construction and pumping money into research & development carried out by the industry itself.

The interests of the military industry

Moreover, the details of the funding show how the current plans serve only the interests of the military industry. The EU finances the research at 100%. The industry itself thus does not have to make any financial contribution, yet receives the full intellectual property rights to the technology developed. Research always involves risk: expected results sometimes fail to materialise, and experiments can fail.

Society pays the costs of the risky research; any profits are pocketed entirely by private companies.

The justification often given for granting intellectual property rights is that this encourages companies and researchers to take on this risk. A successful, patented discovery can more than compensate for possible failures. What we see in the European plan, however, is a socialisation of risk and a privatisation of profit: society pays the costs of the risky research, while any profits are pocketed entirely by private companies.

As Slow Science researchers, we therefore call on the EU to put the interests of its citizens first, not those of the military industry. To not let its policy be determined by the military lobby, but to take political responsibility itself and allow democratic control. To fund research that contributes to a sustainable and safe future for the EU, rather than merely filling the bank accounts of the military industry.


Slow interview #2 – An interview with Paolo Cherubini (part 2)

Author: Sofia Pagliarin

Why journal editorials are disappearing, and why we should care

Paolo Cherubini is a senior scientist at the Swiss Federal Research Institute WSL. He was previously interviewed by Sofia Pagliarin, a collaborator of Slow Science, and shared some of his critical thoughts about the Impact Factor.

He is Editor-in-Chief of Dendrochronologia, an international scholarly journal publishing tree-ring science. Recently, he published an editorial about the pace of change in scientific publishing and the “extinction” of editorials.


Sofia: Paolo, first of all, thank you for being with us again.

Paolo: Thanks to you. Already 10 years ago I was thinking about the need to create a network about “slow science”, so I’m glad that it emerged and that I can contribute to it.

Sofia: It’s our pleasure! Anyway, let’s talk business. So, are editorials introducing a journal issue disappearing?

Paolo: In the old days, when journals were published on paper, editorials were an important part of a journal, because they gave the opinion of the editor(s) on a topic of possible interest to the readership. So you went to the library and skimmed through the editorials in paper journals, and through the published articles too. Now, literature search is different. It’s digital, and scientists search for and find exactly what they’re looking for. A journal’s readership almost doesn’t exist anymore; nobody takes a hardcopy of a journal in their hands, and nobody reads editorials anymore. The way of reading is also different. Everything is changing!

Sofia: …Changing?

Paolo: Well, the fact that editorials are disappearing also frees up editors’ time, so you no longer lose time writing an editorial that nobody will ever read. What I am a bit worried about is the decrease of serendipity in academic research. But first I want to state a caveat: I am one of those scientists who, despite having a high environmental awareness, still likes to print out papers, take notes, and highlight things on them. Some of my colleagues make fun of me, but I think that reading papers digitally is quite another thing than reading them on paper. It’s not only a different feeling; it’s a different way to “study”, to get the content of the paper into your mind, and to use it in your own research.

That said, when you looked at the editorial of a journal in your domain in the past, you also checked the table of contents. You certainly found material related to your own research, but you could also look through published articles on other topics, for instance about the flight trajectories of a butterfly of a certain species. Although this might now seem a waste of time, it was actually enriching, and it stimulated cross-fertilization in research, “side-thinking”, and the ability to make connections across topics. Scientists are now so specialised partly because they have fewer opportunities to know what others are doing.

Sofia: Do you mean that nowadays researchers are behaving “badly”?

Paolo: No, not at all. But today’s researchers, especially the younger ones, look at the Impact Factor and publication time when choosing a journal in which to publish their research. So it’s a “fast science”, in which editorials, as well as papers offering personal opinions, commentaries, and ideas, cannot survive. On the other hand, I think “fast science” can easily induce not-so-well-done peer review.

It takes time and care to run a good peer-review process, while today researchers can opt to pay open-access journals to get their research online. This is not good for scientific research. Furthermore, it is obvious that a molecular researcher will have many more citations than a scientist working on a Himalayan beetle.

First, the size of the academic community for a certain topic affects the Impact Factor and number of citations. We are maybe 1000 dendrochronologists in the entire world, so our publications will never reach a very high number of citations. But is our research less relevant only because of this?

Secondly, and more importantly, how do we judge the quality of the research? We may argue that research on the Himalayan beetle has its own relevance. The Impact Factor is a quantification of utility, not a proof of quality, and even less of how relevant the performed research is. Or better: other measures, complementary or alternative to the Impact Factor, should be developed that can account for the topic and the characteristics of the academic domain. Similarly, the Impact Factor should not be used, or at least not solely, to decide whom to hire: hiring a new person in a research group is not only about choosing “Impact Factor stars”, i.e. people with many top publications. It is also necessary to have the sensitivity to weigh other aspects, for instance whether this person has made other contributions (media, software, events, and so on), and her/his personality and how it would fit the research group overall. But these aspects are of course not as easily quantifiable as the number of citations.

Sofia: Do you think that there is an “antidote”?

Paolo: As I said, the Impact Factor should be considered one of the possible measures of scientific research, and possibly not applicable to all scientific disciplines. Other measures should be developed which are domain-specific and which, at best, can also include qualitative assessments of the research. So I urge scientists to work on this and to propose measures other than Garfield’s Impact Factor, which was created to evaluate different journals based on their utility.

Sofia: Thank you for your time and for sharing your experience Paolo.

Paolo: My pleasure!


Disclaimer:  The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of Slow Science.


Slow interview #1 – An interview with Paolo Cherubini

Author: Sofia Pagliarin

Impact Factor: how a useful tool turned into fever

Paolo Cherubini is a senior scientist at the Swiss Federal Research Institute WSL studying forest ecological processes using tree rings, i.e., dendroecology. Dendroecology provides information that can help us understand and predict how our forests and environment have changed over centuries, from carbon dioxide concentrations to warm or cold, dry or wet weather conditions.

I met Paolo Cherubini when I was working as a post-doc at the Swiss Federal Research Institute WSL. During an informal event, I discovered that in 2008 Paolo had published a letter titled “Impact Factor Fever” in Science (Vol. 322), in which he strongly criticised the abuse and misuse of bibliometrics in evaluating academic life and performance. He also told me various stories about the Impact Factor and his opinion of it.

As a contributor to the Slow Science network, I got the idea to put those early conversations into written form. We scheduled a Skype interview in which we talked about his thoughts on the Impact Factor, its increasing importance in academia, and his experience with it as an author, reviewer, and editor.

The following text is a condensed reworking of these interviews, structured in two posts and reviewed together with Paolo before its online publication on the Slow Science blog.

Paolo Cherubini

Sofia: Paolo, first of all, what is the Impact Factor?

Paolo: The Impact Factor is a measure of how frequently the articles published in a journal are cited. The Impact Factor of a journal for a certain year (e.g., 2015) is calculated by dividing the number of citations received in that year by the journal’s citeable items, e.g., articles, reviews, personal commentaries, published in the two previous years (e.g., 2013–2014), by the total number of citeable items published in those two years. So, if a journal had 100 citations to 50 published items, its Impact Factor will be 2. The Impact Factor is a great tool that was developed by Eugene Garfield, the owner and founder of the Institute for Scientific Information (ISI), at a time when staff were still typewriting and comparing the number of times the publications in a certain journal were cited by other journals; it is now run by Thomson Reuters.
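Paolo's arithmetic can be sketched in a few lines of Python. This is only an illustration of the definition above; the function name and the example numbers are ours, not part of any official calculation tool.

```python
def impact_factor(citations_this_year: int, items_prev_two_years: int) -> float:
    """Impact Factor for year Y: citations received in Y to items
    published in years Y-1 and Y-2, divided by the number of
    citeable items published in those two years."""
    if items_prev_two_years == 0:
        raise ValueError("journal published no citeable items")
    return citations_this_year / items_prev_two_years

# Paolo's example: 100 citations to 50 citeable items
print(impact_factor(100, 50))  # 2.0
```

The division is all there is to it, which is part of Paolo's point later in the interview: a single ratio is easy to compute and compare, and therefore easy to over-use.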

Sofia: In your letter to Science, published in 2008, you wrote that “The exacerbated pressure to publish we all suffer from is induced by an exaggerated reverence for the impact factor”. How did this all happen?

Paolo: Well, the story is long but basically the Impact Factor comes out of a true necessity to navigate through the different journals and rank them according to their reputation and utility. Although it is a measure that changes over time, it can give a good idea of how important a certain journal is for scientific publications in a certain domain, based on the ratio of citations of the journal articles.

However, this practical need for comparing international scholarly journals has “backfired” and evolved into a way of evaluating scientific performance, institutes, and personal careers, and into a “citation rush”. So today not only are journals fighting to increase their Impact Factor; authors, too, crave the quantification of their research publication records. Personally, I am not against the Impact Factor; as editor-in-chief of a small journal, I care about it for my journal, but I also know that it is a measure that does not tell everything about a journal.

Sofia: But why is there such a “reverence”?

Paolo: Because the quantification of scientific results through the Impact Factor is extremely effective for decision-making: it is easy to use. From a measure useful for ranking journals, the Impact Factor is now used to decide which departments have to die and which ones can survive, or which are “of excellence” and should be funded. It is a measure that is convenient if one has to demonstrate that “objective” decisions have been taken about who should be hired, funded, or rewarded for high scientific productivity.

The Impact Factor is functional for maintaining a certain structure and functioning of academia, which is beneficial to many parties; that’s why I call it reverence. The journals gain from it as true “brands” in a scientific sector, and the scientists and universities gain because they all compete for funding, with the Impact Factor serving as an objective quantitative measure to figure out who has been a good or a bad scientist. In a way, “it’s a simple game for simple minds”.

Sofia: Can you provide an example?

Paolo: Yes, sure. The Impact Factor serves specific academic structures and arrangements. For instance, the supervisor always has to be included in the publications of her/his post-docs and doctoral students. This is not only to increase her/his productivity record, but also because without such a productivity measure the supervisor will not be able to get funding and sustain this hierarchical structure of post-docs and PhD students, who otherwise would not be working. Don’t get me wrong: this is not a problem per se, but of course it affects how academic research is done, the number of students in a certain department or research group, and their mentoring and supervision.

Another example is the impact this has had on the social sciences, which have been forced to adapt to a parameter created in and for the natural sciences. The social sciences have started to compete with the natural sciences for funding using the same measures and weapons, despite the fact that in the social sciences publishing a book probably makes more sense than publishing articles in ISI journals, which makes the use of the Impact Factor there perhaps less appropriate.

Sofia: So do you think we should get rid of it?

Paolo: Let’s say that the Impact Factor was invented for a certain purpose, that is, ranking international scientific journals. Over time it turned into something that is almost a dictatorship in scientific research and production. It has been misused, and we should get rid of its being used in such a wrong way.

However, it is also true that the Impact Factor was a good thing in those university systems where competitiveness and productivity were not rewarded. For instance, in Italy, where I originally come from, the traditional university system was based on professors making a career by publishing one book every 10 or 20 years, or only a handful of articles written in the local language in non-ISI journals. This corrupts the system and makes it extremely inefficient and not innovative at all, which is especially frustrating for students and for the careers of new researchers. It was a system that worked self-referentially for decades, and recently, thanks to the Impact Factor culture, the university system has been changing. [See here for more background information on the effect of adopting metrics on self-citation practices in an Italian context.]

Once I was giving a talk in a formerly Eastern European country, and when I said that I was critical of the Impact Factor, because it is not necessarily a good measure of good research, some colleagues got angry: they were trying to change the academic system there in order to put more pressure on the “dinosaurs” who did not publish, and to reward the young researchers and professors who really did make an effort to compete on the international research scene.

So, it’s good to have a measure such as the Impact Factor that can tell us which journals are best and who is publishing the most. But the Janus face of this is that, for instance in China, professors get paid thousands of euros more if they are published in one of the top-end journals on the Impact Factor list. This isn’t good; it can become a problem if academic research is solely oriented towards getting a publication done and cited: the quality of academic research will suffer.

Sofia: Thank you for your time, Paolo. Anything to add?

Paolo: Yes. As a dendroecologist and forest scientist, I don’t think these fields are so far from the social sciences. Actually, ecological systems are very much connected to the social sciences… So I believe we are all in the same boat, and the Impact Factor is currently misused in both the natural and the social sciences. Thank you for this opportunity to share these thoughts, which I have been discussing with so many other colleagues over the past two decades.

Sofia: Thanks to you Paolo!



“Slow Interviews” is a column published on the blog of the Slow Science network. The “Slow Interviews” articles/posts are conceived, written and reviewed by Sofia Pagliarin, as one of the collaborators of the Slow Science network. The publication and the content of each interview, in one or multiple posts, are discussed, reviewed and reciprocally agreed through a cooperative dialogue and effort taking place between Sofia Pagliarin, the interviewee(s) and some other members of the network.

These interviews aim to enrich the topics and debates that are central to the network, and are conceived to be in direct dialogue with the course on the critical analysis of academia and academic production organised annually by the Slow Science network: they add to our understanding by adding different points of view, and are not intended to replace the topics and debates dealt with during the course. Interviewees are academics or informants who have experience and/or knowledge of a particular topic, and who are not necessarily related to either the Slow Science network or the course.

Disclaimer: The views and opinions expressed in the “Slow Interviews” articles and blog posts are those of the authors and respondents and do not necessarily reflect an official policy or position of the Slow Science network.


Debate role of universities – Monday April 23rd at deBuren, Brussels


Dear all,

On Monday April 23rd, deBuren, in collaboration with KU Leuven, UA, UGent, and VUB, is organizing a debate on the democratic legacy of 1968 and the role of universities then and now. The debate is organized as part of the doctoral school course “What does it mean to be a researcher in 21st century academia?”, but it is open to a broader audience. For everyone interested in slow science and/or the societal role of universities, this event is not to be missed!

You can find all relevant information via the website of deBuren:


Practical information:

Monday April 23rd, 19h30-21h
Leopoldstraat 6, Brussels

Tickets (5€/3€) are available through the website. The event is free for those who follow the workshop.


We hope to meet you there!


