Ben Ryan leads for the EPSRC (the Engineering and Physical Sciences Research Council) on their use of Researchfish and any other systems that help gather the evidence needed to demonstrate the impact of the investments the EPSRC have made.
EPSRC is the main UK government agency for funding research and training in engineering and the physical sciences, investing more than £800 million a year in a broad range of subjects – from mathematics to materials science, and from information technology to structural engineering.
“We have just completed our third submission period with Researchfish,” explained Ben. “We use Researchfish to gather from researchers information about the outcomes and impacts of the research we have funded.”
“Any funder of research needs to be able to demonstrate to the people who are providing the money that it is being well spent.”
EPSRC tracks the outcomes of research investments for more than just audit trail accountability. “We are a public funder so we are accountable to parliament which has to decide whether to support science and research or not. There are a lot of competing demands on the public purse, and unless we can make a compelling case to government that the money is actually delivering benefits to the economy as a whole they could be justified in saying they have better uses for the money.”
The challenge is that the benefits of research usually happen over a long timescale and may not be recognised during the period of the original investment. “You fund something that in turn leads to more research that in turn leads to more research and that can make a big impact. There are some classic stories in our domain, for example when lasers were first described and demonstrated as physical ideas people wondered what possible use they could have. Time speaks for itself.”
“Gathering data on research outcomes helps us demonstrate the value of the investment we have made and that it’s worth continuing to invest.”
Before the EPSRC used an open-ended outputs/outcomes/impacts collection system, they had a ‘final reports’ system: researchers would submit a final report at the end of a funded project describing the outcome(s) of their research. This was in two parts – a form to be filled in, which collected, in some small measure, information that could be aggregated; and a narrative report of up to six pages.
“We realised that to be able to demonstrate impact effectively we really needed to collect information over a longer period than a ‘final report’ was giving us – it had to be submitted within three months of the end of project funding, yet many impacts of research aren’t recognised – and may not even have occurred – so close to the end of a project,” explained Ben.
“So we decided to stop collecting the lengthy narrative element, which was taking researchers a lot of time to write but not providing us with the long-view of impact that we need, and to move to an online system of collecting much more structured information which would include some free text when necessary and be updatable over a longer period. Initially the seven UK Research Councils worked collectively to jointly specify such a system; to cut a long story short, following a competitive tender and in the interests of harmonisation and interoperability, we have since 2014 all adopted Researchfish as our single outcomes data collection system.”
“This process moved us away from forms that provided a one-time snapshot to a process that allows us to keep going back to the researchers for a period after the research has ended, and allows them to update the outcomes as and when they can.”
“Having a single reporting process makes it easier for us as seven councils, using a basket of agreed common outcome indicators as a starting point, to report in a joined-up way to our sponsoring department in government. It allows overall aggregations to be made. There is also potentially the large advantage of being able to aggregate outcomes with other funders, allowing a ‘bigger picture’ to be seen in a way that would be far harder if people are using different systems to collect their outcomes.”
“Researchfish gives us information in a structured form. The fact that Researchfish has done so much work on harvesting information from different sources has made things much easier not only for us but also and very importantly for the researchers who have to report to us. For example, the way Researchfish has integrated with ORCID is excellent. Moving information from one system to the other without having to re-enter it is key.”
“During the period that we were collecting information in final reports we simply asked researchers to give us the references to their top five publications and to give us a simple count of the rest. This meant that, although we could add up the numbers, most of the time we had no idea which publications were being talked about. Even now, when bibliometric databases can in theory provide information on who is funding published research, their information is often patchy because researchers aren’t always good at including – and publishers aren’t always good at collecting – proper funding acknowledgments in their publications. By allowing researchers to directly attribute their publications to our grants, Researchfish gives us a much more robust dataset of uniquely identified publications that have arisen because of our funding. This means we can talk with much more confidence about what the funded research has produced.”
“Once we’ve got that link clearly made we can undertake/commission bibliometric analyses that look into the citation impact of the research we are funding, and how it compares with others. It’s because of the ability to unambiguously attribute publications to funding that we can do that, as well as other sorts of analysis – for example, are some funding approaches better at generating some kinds of impacts? Do grants that are longer or larger result in the same or higher impact? If you support the best people for longer do you get ‘better quality’ outcomes? Because Researchfish links all the grants to all the data we can drill into the dataset and address questions like that in a robust way.”
“Are there any areas of investment that are more productive of some kinds of outcomes than others, and if there are how do we want to respond to that? The outcome reports that we have published demonstrate the capacity that we now have to aggregate in terms of what our funding is delivering for the nation.”
Ben concluded: “To any research funder who isn’t collecting information about the outcomes of their funded research, I would ask ‘why not?’ If you are supporting research – spending resource with the expectation of some sort of outcome, and presumably benefit – and you’re not collecting the evidence you need to evaluate whether you’re achieving what you hope to, then obviously it’s going to be difficult to justify past, let alone future, funding decisions. Using Researchfish, or a system very like it, makes it much easier to get a better perspective on the results of your investment.”