How can we fairly evaluate research?

14th September 2022

Authors: Emerald’s Impact Advisory Board. 

Research metrics such as journal impact factors, citation counts and h-indices are used to evaluate research impact and quality. They play a role in the assessment and management of research, but the story they tell is limited. Used in isolation or in the wrong context, metrics can unfairly disadvantage some researchers.

Many across the sector are calling for more responsible use of metrics, broader evaluation measures and reforms to research assessment practices in general. There have been numerous efforts to change the system, notably the San Francisco Declaration on Research Assessment (DORA), The Leiden Manifesto, The Metric Tide, and most recently the Agreement on Reforming Research Assessment. Many research institutes have released statements on the responsible use of metrics, and funding bodies, publishers and organisations have made commitments to work together for change.  

Commitment to change

Emerald is among those championing alternatives to traditional academic metrics and rewards; this is one of six commitments we have made as part of our Real Impact Manifesto and the Are you in? movement. Part of our role is to encourage the research community to share experiences, ideas and best practices for change. We believe that by working together we can create a fairer and more inclusive system that rewards and supports high-quality, impactful research for a better world.

Our manifesto was developed in partnership with our Impact Advisory Board and reflects their views on the barriers to change. Building on this work, we had a discussion with the Board to explore the current challenges and opportunities around alternatives to traditional academic metrics and rewards.


Shifting sands

Ken Knight is a Research Impact Manager at Murdoch Children's Research Institute in Australia.

Many of the challenges for broader measures of impact centre around capacity for change, says Ken. In his view, institutions may commit to change and sign initiatives like DORA, but they don’t necessarily have the capacity to develop a strategy they can deliver on to address the challenges.

Ken sees academic culture as another blocker to change, particularly for researchers in the health and medical context who often have a deep commitment to traditional metrics and performance measures. "Very senior researchers and clinicians are comfortable with the metrics they’ve used their entire career," he says. "And they have very well-established narratives around those metrics and ways of thinking about performance."

"But I also think the systems that capture those metrics and indicators are the ones that are most developed and the ones we already have access to. So, while there are external drivers for change, there aren’t enough, and certainly none that are compelling or coordinated."

Another major barrier to change lies in the precarity of the academic environment, says Ken. In his view, researchers are less willing to challenge the status quo because of job insecurity and increased competition for funding. "We already exist in a hyper-competitive and brutal research funding environment, and I think people feel an appropriate level of anxiety and unease around metrics and evaluation," Ken says. "I think the elephant in the room is about ‘Is this going to make me look bad?’ and ‘Am I going to jeopardise whatever tenuous funding I have, if people are going to be evaluating the value of my research in new ways’."

The outlook for change isn’t all gloomy though, notes Ken, pointing to growing interest in initiatives like the Impact CV, which was created by Australian researcher Katherine Andrews. Institutions are also beginning to explore new ways to benchmark and evaluate performance, says Ken. "I would like to see research engagement, co-production of research, partnerships, translation and implementation activities, and processes and planning factored into academic promotion criteria," he adds. "I think that's something that really needs to be addressed at the university level."

Metrics and academic careers

Louis Coiffait-Gunn is Director of Policy and Public Affairs at The Publishers Association.

One of the main challenges that Louis sees with current research evaluation systems is the use of citations and journal impact factors (JIFs) in hiring and promotion. He would like academia to try anonymous recruitment, as other sectors are starting to do, and to consult a broader range of metrics, including Altmetric and Overton.

Louis believes publishers can play a key role in helping facilitate relationships between researchers and policymakers, businesses and community groups. He also thinks publishers and communication experts could work with academics to create new formats for communicating their research, to reach new audiences. "So much research is still locked away in PDFs and very technical language," says Louis. "I'd love to see teams of different communication experts take ‘raw’ research material and work with academics from the start on all kinds of multimedia outputs, for different audiences and purposes."

More than a number

Lara Skelly is the Open Research Manager for Data and Methods at Loughborough University, UK.

Issues around metrics often relate to how they are used in performance evaluation, says Lara, who argues for change within HR practices. "Academia is supposed to be at the forefront of these things, so why are our HR departments still working with outdated, outmoded HR practices, and applying them to people who are creative, who are bigger than a number?" she asks. "I wonder if the push isn't with academics or even in management, maybe it's just as simple as an HR practice."

Another challenge for Lara is around academia’s mentorship model. She believes that because supervisors pass on their traditions and culture to their students, old ways become entrenched. "I think the mentorship model has great benefits, I'm not discounting them, but it poses a problem if we are going to change," she adds.

In terms of communicating science, Lara sees a need for a broader range of formats, including ways to present experience as a research output.

Looking at positive steps towards new ways to measure impact, Lara has been impressed with EFMD's Business School Impact System, an impact assessment tool for business schools. "I like the way that they are currently thinking about impact, although I wonder if it goes far enough. They are constantly updating their views and providing guidance to business schools. It’s something I keep an eye on," she says.

Impact through lived experience

Sharon Zivkovic is Adjunct Research Associate at the University of South Australia, Founder and CEO of Community Capacity Builders, and Co-founder and Chief Innovation Officer at Wicked Lab.

New approaches to social enterprise and autism research could help shape thinking around impact, says Sharon. Lived experience is one approach that is attracting attention in autism research, and it is already being applied in peer review for some journals. "There are a lot of movements starting to happen who are a bit anti the researchers, particularly in the autism community," explains Sharon.

"The Global Autistic Task Force on Autism Research have been writing into journals as a right of reply and the lived experience researcher kind of movements are happening around mental health and autism. This is where people have been disenfranchised by researchers and are starting to organise, which is quite exciting to have their point of view put across in academic journals as well."

Turning to ways that researchers can advance the impact of their research, Sharon believes more should be done to influence decisions at the policy level. Policy briefs are a useful tool, says Sharon, because they communicate research quickly and don’t rely on research papers being published. Sharon frequently writes policy briefs and has recently submitted one on the creation of autistic-led social enterprises to feed into the development of a national employment framework for autism in Australia. "To get policy change, it's got to happen quick, you can't wait to publish a paper or anything like that," she adds.

For researchers who need support in communicating their research, Sharon is a fan of initiatives such as Fresh Science in Australia (managed by Science in Public), which runs a national competition and training programme that helps researchers communicate their findings. "Fresh Science gives awards to early career researchers from different universities in Australia, and on the final night you've got to get up on stage and talk about your research in the time it takes for a sparkler to go out," she says.

 

Jerry Davis is the Gilbert and Ruth Whitaker Professor of Business Administration at the Ross School of Business and Professor of Sociology at the University of Michigan, USA.

Jerry is involved in activities aimed at rewarding research that can make a difference to real-world issues. He is among the thought leaders at the Responsible Research in Business and Management (RRBM) network, which supports business and management research that addresses such issues. Their goal is to advance research that can inform policy and practice for a better world.

One of the ways RRBM recognises and rewards research is through its annual Responsible Research in Management Awards, which are judged by the Fellows of the Academy of Management. Award winners are invited to present a webinar on their research.

Launched more than five years ago, the awards are well respected and celebrated. "The awards themselves are modest in cash but have real cachet and visibility in the world – people are pretty proud of them, and this helps raise their credibility," says Jerry.

Honour roll for articles

RRBM has recently launched an honour roll as another way to advance responsible research. Publications that meet the criteria for responsible research are included in the RRBM Honor Roll, and the RRBM Honor Roll emblem can be displayed on CVs and used by schools, journals and publishers to track the likely societal impact of their research.

"The intention of the honour roll is to have a site that shows all the articles published in recognised top journals that from our judge’s perspective meet the criteria for responsible research.

"The process is labour intensive and can’t easily be automated. We recruited scores of reviewers to look over the published articles and judge if the research merits this recognition.

"We’re asking only the most senior respected scholars to be part of this initiative and they are doing this as volunteers because they believe in helping research make the world a better place."

Is potential impact good enough?

Jerry and his colleagues at RRBM are working to develop credible ways to assess the societal impact of research.

Suggestions they’ve received from subject experts include using Altmetrics to track impact and tracking intermediate research impact from publications to Wikipedia. One area that is attracting attention is using artificial intelligence or content analysis tools to code articles based on their relevance to the Sustainable Development Goals. Rather than track real-world impact, these tools look at outputs to identify research that has the potential to make an impact.

"There are competing models out there, such as software tools that take a set of prominent journals, look at the last three years of their articles, run them through their coding algorithm, and say here's what our classification says, here are the articles that are most relevant to health and wellbeing or income inequality," says Jerry.

"So that's more about relevance than impact, but it's further down the pipe than just counting articles and giving you one point for each article or counting citations."

Are you in?

Let’s work together to help the sector break ties with old measures of impact and enjoy a fairer, more equitable environment for research to thrive.

Find out more & pledge your support