2018

Summary of the 2nd Strategy of Impact Conference

In November 2018, 200 research professionals travelled from all corners of the globe to discuss strategies for research impact assessment, data sharing, and collaboration.

Sean Newell, CEO of Researchfish, opened the Conference, welcoming an audience of research funders large and small, charities, research organisations, researchers, consultants and ecosystem partners. Before introducing the three morning keynote speakers, Sean set the theme for the day by summing up where we are with impact:

  • Impact is a series of insights, questions, responsible metrics PLUS opinion and context.
  • Evidence is critical to demonstrate and articulate impact.
  • Impact goes well beyond publications.
  • All researchers are ‘agents of impact.’
  • Research, funding mechanisms and impact constantly evolve.

The first keynote speaker was Professor Graeme Reid, Chair of Science & Research Policy, University College London, who spoke on Impact: Past, Present and Future. Starting in 1675, when Charles II founded the Royal Greenwich Observatory, and moving through New Scientist recruitment adverts of the 1970s to the present day, Professor Reid highlighted the need for clarity of impact and discussed future opportunities.

Rebecca Endean, Strategy Director of UK Research and Innovation, followed, explaining how research impact assessment is vital to the role of UKRI and how UKRI created the Data Hub, which brings together outcomes data for UKRI’s core grants and studentships with HESA data, with data collected via Researchfish playing a key role.

The final keynote speaker was Dr Kathryn Graham of Alberta Innovates, who discussed an international data sharing collaboration between three funders – Alberta Innovates, NIHR and the Novo Nordisk Foundation – to maximise the value of, and identify insights from, data collected via the Researchfish platform.

A panel discussion with the morning speakers, accompanied by healthy debate from the audience, rounded off the morning.

The afternoon was split into four sessions as follows:

Tools & Methods

Sean Newell (Chair), CEO, Researchfish

Following platform updates and demonstrations of new functionality, a strategic update outlined Researchfish’s move beyond data collection to data visualisation and analysis, posing three important questions for discussion:

  • What questions does data sharing between funders and research organisations enable us to answer? What are the opportunities and barriers for data sharing?
  • What pertinent questions do we need to ask to explore the data? Is the data available? What other data needs to be collected or identified?
  • What strategic/predictive questions should we ask of the data? How can data analysis inform and support decision-making and planning by funders and research organisations?

Three ecosystem partners then outlined how they are working with researchers to share, publish and report their scientific research.

  • Open Research Publishing Platforms to Support Open, Rapid and Holistic Reporting of Research Outputs – Rebecca Lawrence, Managing Director, F1000 — View slides
  • Making Online Attention Visible with Altmetric – Jean Liu, Head of Product, Altmetric — View slides
  • Helping Researchers Share their Uniqueness – Josh Brown, Director of Partnerships, ORCID — View slides

Impact Frameworks

Beverley Sherbon (Chair), Impact & Evaluation Adviser, Researchfish

  • Workshop – Meeting Information Needs: Mapping Information for Reporting and Impact Assessment – Beverley Sherbon; Sarah Thomas, Senior Research Manager, NIHR — View slides. During the workshop some of the current challenges for reporting and impact assessment were discussed:
    • A structured approach to considering research progress and the resulting impact has significant merits, but needs resource/commitment to put in place.
    • Institutional/organisational silos: impact is not linear and involves many linked actors and activities, but the information available for evaluation doesn’t reflect this.
    • There is sometimes still a need to build confidence in the data available for evidencing frameworks and evaluation studies; e.g. information providers need a better understanding of what the data is used for, and clarification of what stakeholders are asking for.
    • Consideration should be given to involving beneficiaries of research (e.g. patients, members of the public) in all stages of research impact assessment, as there are clear benefits in this; however, it takes resource and presents new challenges in ways of working, such as establishing a common language/understanding.
    • Information will always be collected through different channels and by different stakeholders; these are often similar but distinct pieces of information, and more aligned definitions would enable more sharing.
  • Engaging Stakeholders to Design a Practical Impact Evaluation Framework: the Experience of Macmillan Cancer Support – Matthew Terry, Director, Cloud Chamber — View slides
  • DARE to be Different? Using Diversity as a Heuristic to Study Collaborative Research Interactions – Michael M. Hopkins, Director of Research, Science Policy Research Unit and Frédérique Bone, Research Fellow, Science Policy Research Unit — View slides

Data, Interoperability and Customer Use Cases

Gavin Reddick (Chair), Chief Analyst, Researchfish

The importance of reducing the burden on academics was discussed from both researcher and institutional perspectives, along with the challenge that very few publications include both an acknowledgement of the funder and a valid grant reference. As non-publication output reporting increases, the Researchfish interoperability pilot is expanding to cover spin-outs, further funding, and patents.

  • Integration with Third Party Databases – Gavin Reddick; Ian McArdle, Head of Research Systems and Information, Imperial College London. Slides to come.
  • Where the Rubber Meets the Road: some Practical Thoughts about Analysing Evaluation Data, focusing on Collaborations and Publications – Nick Smith, Outputs, Outcomes and Impact Evaluation Manager, Research Strategy and Evaluation, Cancer Research UK — View slides
  • Assessing the Impact of Research: Lessons Learned from the ESPA programme – Valeria Izzi, University of Edinburgh — View slides
  • Data Research Impact: An Immersed View – Victoria Moody, UK Data Service Deputy Director and Director of Impact — View slides

Best Practice and Collaboration

Suzanne Rix (Chair), Research Manager, AMRC

Themes from this session included understanding data, communication and engagement with researchers, and collaboration, good practice and sharing with other organisations:

  • Using and Sharing Data – Lessons from the AMRC – Suzanne Rix, Research Manager, Association of Medical Research Charities — View slides
  • Reporting of Policy-related Outputs in Researchfish by ESRC Grant-holders – Allan Williams, Senior Evaluation Analyst, Economic and Social Research Council — View slides

A final panel, followed by an evening drinks reception, rounded off an interesting and stimulating day.