“We no longer wish to participate in the ranking of people”: Ghent University wants to become a place where talent feels valued and nurtured


“A university is above all a place where everything can be questioned.”

My last two blog posts raised some critical questions about the viability and legitimacy of the scientific ‘enterprise’ in neo-liberal times. The Publish AND Perish blog post drew many responses from colleagues within academia, but also from the publishing ‘industry’, including from the CEO of MDPI, Paul Vazquez. Coincidentally, a few weeks later, Ghent University in Belgium released a statement in which it declared to go – what I would call – ‘off the grid’ of commodification, marketization and economic globalization by turning towards autonomy, (local) relevance, and responsibility towards people and, hopefully, planet as well, by creating spaces for transdisciplinarity, boundary-crossing and collaborative action (perhaps I am filtering the statement through my own lens – apologies if so). Below are some excerpts from the statement, which can also be found here: Ghent University’s New Pathway

Here is the message from Ghent’s Vice Chancellor, Rik Van de Walle:

‘We are transforming our university into a place where talent once again feels valued and nurtured’

(17-12-2018)

Our university should once again belong to the academics, rather than the bureaucracy, writes the rector of Ghent University, Rik Van de Walle.

Ghent University is deliberately choosing to step out of the rat race between individuals, departments and universities. We no longer wish to participate in the ranking of people.

It is a common complaint among academic staff that the mountain of paperwork, the cumbersome procedures and the administrative burden have grown to proportions that are barely controllable. Furthermore, the academic staff is increasingly put under pressure to count publications, citations and doctorates, on the basis of which funds are being allocated. The intense competition for funding often prevails over any possible collaboration across the boundaries of research groups, faculties and – why not – universities. With a new evaluation policy, Ghent University wants to address these concerns and at the same time breathe new life into its career guidance policy. Thus, the university can again become a place where talent feels valued and nurtured.
With the new career and evaluation model for professorial staff, Ghent University is opening new horizons for Flanders. The main idea is that the academy will once again belong to the academics rather than the bureaucracy. No more procedures and processes with always the same templates, metrics and criteria which lump everyone together.
We opt for a radically new model: those who perform well will be promoted, with a minimum of accountability and administrative effort and a maximum of freedom and responsibility. The quality of the individual human capital is given priority: talent must be nurtured and feel valued.
This marks the end of the personalized objectives, the annual job descriptions and the high number of evaluation documents and activity reports. Instead, the new approach is based on collaboration, collegiality and teamwork. All staff members will make commitments about how they can contribute to the objectives of the department, the education programmes, the faculty and the university.
The evaluations will be greatly simplified and from now on only take place every five years instead of every two or four years. This should create an ‘evaluation break’. 

 

At the same time, we want to pay more attention to well-being at work: the evaluations of the supervisors will explicitly take into account the way in which they manage and coach their staff. The model must provide a response to the complaint of many young professors that quantitative parameters are predominant in the evaluation process. The well-known and overwhelming ‘publication pressure’ is the most prominent exponent of this. Ghent University is deliberately choosing to step out of the rat race between individuals, departments and universities. We no longer wish to participate in the ranking of people.

Through this model, we are expressly taking up our responsibility. In the political debate on the funding of universities and research applications, a constant argument is that we want to move away from purely competitive thinking that leaves too little room for disruptive ideas. The reply of the policy makers is of course that we must first do this within the university itself. This is a clear step in that direction, and it also shows our efforts to put our own house in order.
With this cultural shift, Ghent University is taking the lead in Flanders, and we are proud of it. It is an initiative that is clearly in accordance with our motto: ‘Dare to Think’. Even more so, we dare to do it as well.
A university is above all a place where everything can be questioned.
Where opinions, procedures and habits are challenged. Where there is no place for rigidity.

 

I am absolutely convinced that in a few years’ time we will see that this new approach has benefited the overall quality of our university and its people.

Rik Van de Walle, rector.


Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com

The above post has been re-blogged now that the San Francisco Declaration has been created and endorsed by Science.

The San Francisco Declaration is looking to challenge some of the strategic games being played in the world of academia to increase productivity, get tenure and climb the rankings. These games could easily lead to both a decline of scientific quality and an erosion of trust in science altogether. Below is an excerpt from the editorial in Science (see also: http://www.aaas.org/news/releases/2013/0516_impact-factors.shtml#fb). The San Francisco Declaration can be found here: The San Francisco Declaration.

Science Endorses New Limits on Journal Impact Factors

A measure developed to assess the quality of scientific journals has distorted how research is evaluated, and should not be used to judge an individual’s work, Science Editor-in-Chief Bruce Alberts writes in the 17 May issue of the journal.

The editorial coincides with the release of the San Francisco Declaration of Research Assessment (DORA), which grew out of a gathering of scientists at the December 2012 meeting of the American Society for Cell Biology (ASCB). More than 150 scientists and 75 scientific organizations including Science’s publisher AAAS have endorsed DORA, which recommends specific changes to the way scientific journal rankings are used in hiring scientists, funding research and publishing papers.

One of the most popular ranking measures, called Journal Impact Factor or JIF, ranks research journals based on the average number of times their papers are cited by other papers. (The higher a journal’s JIF score, the more often its research papers are cited by others.) JIF was devised to rank journals, but is now often used to evaluate an individual’s research, by looking at whether she or he has published in high-scoring journals.
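As a rough illustration of how the measure works (a simplified sketch: the actual JIF counts citations received in one year to items a journal published in the two preceding years, and the function name here is mine, not part of any standard library):

```python
def journal_impact_factor(citations_received, citable_items):
    """Simplified JIF: citations received in a given year to papers the
    journal published in the two preceding years, divided by the number
    of citable items it published in those two years."""
    return citations_received / citable_items

# A journal whose last two years of papers drew 200 citations this
# year, across 80 citable items, would score 200 / 80 = 2.5.
print(journal_impact_factor(200, 80))  # 2.5
```

Because the score is an average over the whole journal, it says nothing about how often any individual paper in that journal is cited, which is exactly the misuse the editorial objects to.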

This misuse of the JIF score encourages far too much “me-too science,” Alberts writes. “Any evaluation system in which the mere number of a researcher’s publications increases his or her score creates a strong disincentive to pursue risky and potentially groundbreaking work, because it takes years to create a new approach in a new experimental context, during which no publications should be expected.”

Alberts notes that an unhealthy obsession with journal ranking scores may also make journals reluctant to publish papers in fields that are less cited, such as the social sciences, compared to papers from highly-cited fields such as biomedicine.

The DORA guidelines offer 18 specific recommendations for discontinuing the use of JIF in scientists’ hiring, tenure, and promotion, along with ways to assess research on its own merits apart from its place of publication.

Transformative learning

Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com

“What’s your h-factor?” is a question that is increasingly asked at gatherings of scientists or during interviews for academic positions. Scientific careers depend on h-factors these days. What am I talking about?

The h-index is an index that attempts to measure both the productivity and impact of the published work of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a group of scientists, such as a department or university or country.

A scientist has index h if h of his or her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each…
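The definition above can be turned into a short calculation (a minimal sketch; the function name is mine):

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations
    each (and the remaining papers have no more than h citations)."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers have at
# least 4 citations each, but there are not five papers with at
# least 5 citations, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Sorting citation counts in descending order makes the definition easy to check: h is simply the last rank at which the count still meets or exceeds the rank.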
