Posts tagged: REF

Impact Case Study Action Plans

As part of the REF readiness exercise in preparation for our REF submission in 2020, the Impact, Engagement and Environment Coordinator, in conjunction with the School Impact Coordinators, is holding a series of ‘Impact Case Study Action Plan’ meetings with identified case study leads.

Initial meetings have already started to take place and will continue throughout September and October. These will be followed up by mid-point review and year-end review meetings.

The aim of these meetings is to set a number of SMART objectives to guide our case study leads through the development of their case studies across the next two years in preparation for the final REF submission.

 

 

Key areas of focus include:

  • Creation of an ‘impact’ folder within Figshare in which to deposit all impact evidence collected to date, and to maintain it on an ongoing basis
  • Identification of the research outputs to be included as ‘underpinning research’ for the impacts claimed in the case study
  • Completion of a stakeholder analysis to identify who has benefitted so far from the research, who to contact for testimonials, who to target to generate future impact, and so on
  • Redrafting of existing impact case study material to reflect the development of objectives and to shape the final submission

 

Further details on what constitutes an impact, and how to measure the ‘reach and significance’ of that impact, have been released by the REF team in their draft panel guidance.

These details can be found here: https://www.ref.ac.uk/publications/

This list is not exhaustive, but provides a useful overview of the many different ways that research can generate an impact of some kind. Some food for thought!

 

Peer review of impact case studies

It is anticipated that both formal and informal internal peer reviews of impact case studies will be held in late 2018/early 2019, with another external peer review planned for Summer 2019.

Developing impact case studies in line with the ‘Impact Case Study Action Plan’ should help our case study leads to submit further drafts for peer review, so that meaningful feedback can be provided and fed into future iterations of the action plan.

 

Why not take this opportunity to check out this and other impact resources available on the REF staff intranet: https://www.salford.ac.uk/ref


Research Impact Resources

The University of Salford’s REF intranet site (www.salford.ac.uk/ref) provides a wealth of resources for researchers to tap into, wherever they may be on their impact journey – from those who are new to impact or simply impact-aware through to experienced impact practitioners.

A few of the links to external resources available on the REF intranet site are as follows:

  1. Fast Track Impact (Prof Mark Reed)

Fast Track Impact resources can be accessed online at the following link:

http://www.fasttrackimpact.com/resources

 

This wealth of information includes:

  • Fast Track Research Impact Guides – a series of “how to” guides for researchers.
  • Fast Track Research Impact Podcasts – a series of twelve ½-hour podcasts around the subject of impact, which can be listened to here: http://www.fasttrackimpact.com/podcast
  • Fast Track Impact: What makes a 4* impact case study? Available as a PDF (Fast Track Impact What Makes a 4 Star Impact Case Study.pdf) and online at: https://www.fasttrackimpact.com/single-post/2018/06/04/What-made-a-4-impact-case-study-in-REF2014

 

  2. The Leadership Foundation Impact Toolkit

The Leadership Foundation have developed a Research Leaders Impact Toolkit designed to “offer a suite of research-based tools that can be used by higher education institutions to:

  • Develop a formal research impact strategy
  • Devise strategies for leading, managing and practising impact
  • Align impact work with engagement, knowledge exchange, outreach and quality improvement
  • Inform teaching and learning
  • Improve processes and infrastructure
  • Build capacity, skills and knowledge”

To access the toolkit, you will need to log in to or create your “MyLF” account.

  3. Research Councils

Most Research Councils have specific guidance and advice about how to complete the ‘Pathways to Impact’ section of grant applications.

Further information on Research Councils can be found here: https://www.ukri.org/funding/funding-opportunities/

Funders ask for explicit answers to the following questions:

  • Who will benefit from this research?
  • How will they benefit from this research?
  • What will be done to ensure that they have the opportunity to benefit from this research?

Funders emphasise the importance of allocating sufficient costs/resources to activities described in impact plans.

 

  4. Taylor and Francis Editors

This is a useful website offering advice and guidance on how UK authors can be compliant with HEFCE’s open access policy:

http://editorresources.taylorandfrancisgroup.com/research-and-the-ref/

http://editorresources.taylorandfrancisgroup.com/category/citations-and-impact/

 

  5. UK Parliament

Research impact at the UK Parliament

This website provides you with everything you need to know to engage with Parliament as a researcher:

http://www.parliament.uk/research-impact?mc_cid=23e455bd5d&mc_eid=1648a9ffa9

It offers advice on:

  • What interests Parliament?
  • Why should you engage with Parliament?
  • How Parliament uses research
  • Ways to engage with Parliament

It also provides a number of ‘How To’ guides:

http://www.parliament.uk/get-involved/education-programmes/universities-programme/research-impact-at-the-uk-parliament/how-to-guides/

 

  6. Vertigo Ventures

Vertigo Ventures was founded to measure impact and works closely with clients to deliver high quality impact reporting, which provides clear and actionable insight into how individuals and organisations can maximise the reach and significance of their impact.

Vertigo Ventures provides a range of proprietary services, such as training workshops, consultancy services and the VV-Impact Tracker, all of which utilise an innovative online reporting framework known as VV-Impact Metrics.

The VV Hub can be accessed at: http://www.vertigoventures.com/impacthub/

Sign up for free webinars, newsletters and blogs relating to impact.

 

  7. Research To Action

Research to Action is a global guide to research impact. It offers reading lists and opportunities to blog, as well as a number of ‘how to’ guides around key communication and engagement activities to help widen the dissemination of your research.

The website is found at: www.researchtoaction.org

A useful Rethinking Research Partnerships toolkit and discussion guide is found on the website and can be downloaded here: Research to Action Toolkit and Discussion Guide.pdf

 

  8. National Co-ordinating Centre for Public Engagement (NCCPE)

NCCPE’s definition of public engagement in the context of REF:

‘Public engagement’ (in the context of the REF) describes an approach to involving the public in meaningful roles in the development, uptake and/or application of research. The act of engaging the public with research does not count as impact. Impact is what happens when people interact with the research, take it up, react or respond to it. Public engagement doesn’t just happen when the research is complete. It can (and often does) take place before and during the research – for instance, helping to shape its focus and direction and its relevance to potential users.

The website is found at: https://www.publicengagement.ac.uk/

 

 

Why not take this opportunity to check out the impact resources available on the REF staff intranet: https://www.salford.ac.uk/ref

 


Peer Review of Impact Case Studies

According to Fast Track Impact’s calculations (see: http://www.fasttrackimpact.com/single-post/2017/02/01/How-much-was-an-impact-case-study-worth-in-the-UK-Research-Excellence-Framework for further details), the best impact submissions to REF2014, i.e. those achieving a 4* narrative case study, were worth some £324,000 (£46,300 per year between 2015/16 and 2021/22). By contrast, a 4* research output was typically valued at between £5,000 and £25,000. Generally speaking, impact case studies are thought to be worth around five times more than outputs at higher full-time equivalents (FTEs).
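As a rough sanity check of the headline figure, assuming the total is simply the annual amount summed over the seven academic years from 2015/16 to 2021/22:

    7 \times \pounds 46{,}300 = \pounds 324{,}100 \approx \pounds 324{,}000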

As such, the huge potential value this may bring to institutions should not be underestimated, particularly given the increased weighting of impact from 20% to 25% for the next REF exercise in 2021. Consequently, institutions employ a number of strategies and resources to ensure the best possible outcomes of their REF impact submissions. For example, there are reports of significant sums being spent by some universities in the REF2014 exercise on copy editors or science writers in order to create compelling narratives that would stand up to the scrutiny of the REF panel members.

A robust internal and external peer review process is one means of tracking progress over time in order to enhance and improve narratives and impact evidence ahead of the final REF submission in 2020.

 

Upcoming peer review events

The University of Salford is undertaking its first external peer review of draft impact case studies this summer as part of its REF Readiness exercise. This will give the University a snapshot of where things stand and where improvements still need to be made in the two years leading up to the REF submission. The feedback from the external peer review will inform the planned internal peer review due to take place in early 2019.

 

Dates for the diary include:

Monday, 18 June 2018 – Friday, 29 June 2018: External peer review of 10 x impact case studies across UoAs. This will include review and annotation of draft case studies, an overview report, notes on potential grades and advice on how to enhance impact.

Monday, 25 June 2018 – Friday, 20 July 2018: University of Salford Festival of Research. A month-long programme of events celebrating and promoting the University’s valuable research. This will include a REF-focused impact case study writing workshop, an impact ‘writing retreat’ and one-to-one mentoring on impact narratives.

Wednesday, 27 June 2018: Fast Track Impact case study writing workshop with Prof Mark Reed. Mark will focus specifically on the REF and what makes a good impact case study, how to improve your writing around impact, as well as evidence collection tips. This workshop will also include detailed external peer review of 4 draft impact case studies, with recommendations of how these can be enhanced and improved.

 

To book: https://myadvantage.salford.ac.uk/students/events/Detail/597642/staff-development-fast-track-t

 

Why not take this opportunity to look at the upcoming peer review meetings and events information on our REF Intranet site at: https://teamsite.salford.ac.uk/sites/sc02/REF2021/SitePages/Training.aspx


Impact Training and Events

As the REF draws ever closer, thoughts are now turning to impact and how to ensure that the University’s research is demonstrating impact beyond academia and making a real difference in the wider world. This raises a number of questions about what constitutes impact and impact evidence, where this should be stored, when it should be collected and how it can be enhanced.

In order to help researchers to gain a better understanding of research impact and what it means to them, a training programme designed specifically around impact is being rolled out across the University in the coming months.

 

Upcoming internal training and events

Future training will be tailored to meet individual needs in terms of impact. For example, you might be looking for a taster session to learn what research impact is all about, or maybe you are an early career researcher bidding for funding for the first time. Perhaps you are a mid-career or senior researcher who needs some advice on collection of impact evidence. Whatever your requirements, there is something to suit every level and discipline.

 

Events of note include:

 

Monday, 16 April 2018: Impact writing workshop with Chris Simms

Chris from the Royal Literary Fund will be visiting the University again to hold a session around writing for impact, creating a narrative and telling a story.

To book: https://myadvantage.salford.ac.uk/students/events/Detail/597635/staff-development-new-to-impac

 

Thursday, 3 May 2018: Fast Track Impact workshop with Prof Mark Reed

Mark returns for the first of two workshops, this one focusing on generating and evaluating impact, as well as how to maximise your social media presence for enhanced impact.

To book: https://myadvantage.salford.ac.uk/students/events/Detail/597641/staff-development-fast-track-t

 

Monday, 25 June 2018 – Friday, 20 July 2018: University of Salford Festival of Research

A month-long programme of events celebrating and promoting the University’s valuable research. This will include the popular PGR event ‘SPARC’ (Salford Postgraduate Annual Research Conference) on 4 and 5 July 2018, as well as an impact ‘writing retreat’ on 3 July 2018 for budding impact case study writers.

 

Wednesday, 27 June 2018: Fast Track Impact case study writing workshop with Prof Mark Reed

Mark will focus specifically on the REF and what makes a good impact case study, how to improve your writing around impact, as well as evidence collection tips.

To book: https://myadvantage.salford.ac.uk/students/events/Detail/597642/staff-development-fast-track-t

 

From September 2018 a suite of workshops specifically around impact will be embedded into the staff development programme (SECRET) – further information will be available shortly.

Why not take this opportunity to look at the upcoming training, meetings and events information on our REF Intranet site at: https://teamsite.salford.ac.uk/sites/sc02/REF2021/SitePages/Training.aspx

 

External training

Alternatively, why not sign up for the free 5-week impact online training course run by Fast Track Impact?

Each session comprises a 6-minute video and a short reading. After each session, you will be given tasks to complete within your own research before the next session:

  • Introduction: Five ways to fast track your impact
  • Week 1: Envision your impact
  • Week 2: Plan for impact
  • Week 3: Cut back anything hindering or distracting you from your impact 
  • Week 4: Get specific about the impacts you will seek and the people who can help you achieve impact this month
  • Week 5: Achieve your first step towards impact and monitor your success

 

Further details can be found here: http://www.fasttrackimpact.com/for-researchers


How to Write a 4* Journal Article

Professor Mark Reed, Professor of Socio-Technical Innovation at Newcastle University

In December, Prof Mark Reed, Professor of Socio-Technical Innovation at Newcastle University and the man behind Fast Track Impact, tweeted some thoughts on how to write a 4* paper for the REF and wrote a blog about it. This post is published here with the author’s permission.

How do you write a 4* paper for the Research Excellence Framework (REF)? It is a question I’ve asked myself with some urgency since the Stern Review shredded my REF submission by not allowing me to bring my papers with me this year to my new position at Newcastle University.

Obviously the answer is going to differ depending on your discipline, but I think there are a few simple things that everyone can do to maximize their chances of getting a top graded research output.

I’m going to start with the assumption that you’ve actually done original, significant and rigorous work – if you haven’t then there is no point in reading any further. However, as I am increasingly asked to pre-review papers for colleagues across a range of disciplines, I am seeing examples of people who write up work as a 2* or 3* paper that has the potential to get a better score. I should point out that I believe that there is an important role for 1* and 2* papers, and that I regularly write these on purpose to address a problem of national significance and frame it for the specific, narrow audience that is likely to be able to benefit most from my work. However, whether I like it or not, as a Professor in a research-intensive University, there is an expectation that I will be submitted as a 4* researcher, which means I need a few 4* papers as well.

You can see some more detailed thoughts on what I think makes 4* for different types of paper in this Tweet:

https://twitter.com/profmarkreed/status/801348612345253888/photo/1

As you’ll see from the discussion under that tweet though, my more detailed thoughts probably only apply to Units of Assessment across panels A-C, and probably aren’t relevant to the arts and humanities.

Having said this, I think there are a number of things we can all do to maximize the chances of our work being viewed favourably by REF panelists.

  1. Write to the criteria: when I was learning to drive, my instructor told me that in the test I should make sure I moved my head when I was looking in the rear view mirror, to make sure the examiner noticed I was using my mirrors. We’re all used to writing to the criteria of funding calls, and in fact we are all perfectly used to writing papers to the criteria of our target journals. In the last REF, research outputs were judged against three criteria: originality, significance and rigour. Whatever the interpretation of these criteria in your discipline, have you made it explicit to REF panelists reading your work exactly what is original, and why it is so original? Have you explained and effectively justified the significance of your work? And have you included evidence that your methods, analysis and interpretation are rigorous, even if you have to use supplementary material to include extra detail about your methods and data to get around journal word limits?
  2. Get REF feedback before you submit your work for publication: find out who is going to be reviewing research outputs for REF internally within your Unit of Assessment at your institution and ask them to review your work before you submit it. They may be able to make recommendations about how you might improve the paper in light of the REF criteria. Sometimes a little bit of extra work on the framing of your research in relation to wider contexts and issues can help articulate the significance of your work, and with additional reading and thinking, you may be able to position your work more effectively in relation to previous work to demonstrate its originality more clearly. Adding a few extra details to your methods and results may reassure readers and reviewers that your approach is indeed rigorous. This is not just about doing world-leading research; it is about demonstrating to the world that your work is indeed world-leading. For me, these criteria are nothing new and are worth paying attention to, whether or not we are interested in REF. Meeting these three criteria will increase the chances that you get through peer-review and will increase the likelihood that your work gets cited.
  3. Analyse and discuss good practice in your own area: the only way to really “get your eye in” for REF is to actually look at examples of good and poor practice in your own area. Below, I’ve described how you can design an exercise to do this with your colleagues. You can do it yourself and learn a lot, but from my own experience, you learn a lot more by doing this as a discussion exercise with colleagues who work in your area. If you can, take notes from your discussion and try and distill some of the key lessons, so you can learn collectively as a group and more effectively review and support each other’s work.

How to organize a discussion to work out what makes a 4* paper in your area:

  • Identify top scoring institutions for your Unit of Assessment (UOA): download the REF2014 results, filter for your UOA (columns E or F), then filter so it only shows you the outputs (column J), and then filter for 4* (column L), showing only the institutions from your UOA that had the highest percentage of 4* outputs. Now for those institutions, look across the table (columns L-P) to see which has the highest proportion of outputs at either 3* or 4*. For example, an institution may have 80% of its outputs graded at 4* and 15% graded at 3*, meaning that 95% of its outputs were graded at 3-4*. (A short scripted sketch of these filtering steps follows this list.)
  • Download a selection of papers from the top scoring institutions: go to your UOA on the REF website, find and click on the institutions you’ve identified in step 1, under “view submission data”, click on “research outputs”, copy and paste output titles into Google Scholar (or your search engine of choice) and download the articles. You may want to select outputs randomly, or you may want to go through more selectively, identifying outputs that are close to the areas your group specialize in
  • Repeat for low scoring institutions so you can compare and contrast high and low scoring outputs
  • Discuss examples: print copies of the high and low scoring outputs, labeled clearly, and in your next UOA meeting, let everyone choose a high and a low-scoring example. Give them 10-15 minutes to quickly read the outputs (focusing on title, abstract, introduction, figures and conclusions so you’re not there all day) and then ask the group (or small groups if there are many of you) to discuss the key factors that they think distinguish between high and low scoring outputs. Get your group(s) to distill the key principles that they think are most useful and disseminate these more widely to the group, so that anyone who wasn’t present can benefit.
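For those who would rather script the filtering than work through the spreadsheet by hand, the sketch below shows one way to do it in Python using pandas. It is a minimal sketch only: the file name, the example Unit of Assessment and the column headers are assumptions that should be checked against the actual REF2014 results download.

    # A rough, scripted version of the filtering steps described above.
    # Assumptions (check against the real REF2014 download): the results are
    # saved as "REF2014_Results.csv", the column headers match those used
    # below, and the star-rating columns are numeric percentages.
    import pandas as pd

    results = pd.read_csv("REF2014_Results.csv")

    # Keep only the rows for your Unit of Assessment and the "Outputs" profile.
    uoa = results[(results["Unit of assessment name"] == "Sociology")   # example UoA
                  & (results["Profile"] == "Outputs")].copy()

    # Combined share of outputs graded 3* or 4*, per institution.
    uoa["3-4*"] = uoa["3*"] + uoa["4*"]

    # Institutions ranked by their percentage of 4* outputs, then by 3-4*.
    ranked = uoa.sort_values(["4*", "3-4*"], ascending=False)
    print(ranked[["Institution name", "4*", "3*", "3-4*"]].head(10))   # high scoring
    print(ranked[["Institution name", "4*", "3*", "3-4*"]].tail(10))   # low scoring

The head and tail of the ranked table give candidate high- and low-scoring institutions to carry into the download and discussion steps above.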

It would be great if I could tell you that these are my “three easy ways to get a 4* paper” but doing work that is genuinely original, significant and rigorous is far from easy. If you have done work that is of the highest quality though, I hope that the ideas I’ve suggested here will help you get the credit you deserve for the great research you’ve done.


The Impact Environment: REFlections on the Stern Review (Part 1)


By Dr Chris Hewson, Impact Coordinator

Upon its release last Thursday, the Twittersphere became the locus for a series of overlapping debates on Lord Nicholas Stern’s Review of the REF (see: #SternReview) [i]. This was heartening, chiming with my previous post ‘We need to talk about research impact (again)’ on the need for ‘robust discussions’; a refrain I will seek to expand upon in future pieces [ii]. The report presents a considered and balanced perspective, seeking to develop the well-regarded aspects of REF2014, whilst addressing the three blights of disciplinary siloing, resource burden, and permissible yet unprincipled ‘gaming’. In what follows, I consider the wider canvas upon which the report paints, interspersing this with observations on the structure of the proposed REF2021 ‘impact environment’. In a follow-up post, I’ll build on these points, considering how the report seeks to reconfigure impact case study submission, and how this may have knock-on effects with respect to how Higher Education Institutions (HEIs) manage, promote, and report knowledge exchange.

Full submissions and portability

The beating heart of the public debate was Recommendation 3 “Outputs should not be portable [rather, tied to] the institution where the output was demonstrably generated” (para. 73); thus bringing outputs into line with existing rules on HEIs claiming impact. A clear worry was rapidly identified, that this would remove one obvious source of leverage still available to early career researchers in a tight and often volatile job market. This concern was tied to a series of parallel discussions around Recommendation 1 “All research active staff should be returned” (para. 65) and the HR manoeuvrings this could unleash in deciding who counts as ‘research active’ (and indeed what counts as ‘research activity’). This has the potential to usher in new modes of marginalisation, and as Richard Watermeyer suggests this “profound problem… could end up diminishing what universities recognise as the role and contribution of the researcher: primarily, the successful procurement of research funds and prominence.” Nevertheless, it is likely that the devil will be in the detail, with the work of high performing academics – who may be able to submit as many as six outputs – conceivably mitigating the risk of including a ‘long tail’ of less prolific researchers [iii]. Much will rest on how individual HEIs establish and interpret ‘who should be doing what’ within given research units, however configured. I will not dwell on these points, except to note that they have been thoughtfully covered by others, including  Athene Donald, Martin Eve, Adam Golberg and Liz Morrish [iv].

Enhancing impact

Mark Reed has provided a sympathetic summary of Stern’s impact-focused recommendations, arguing that the report successfully addresses a perceived narrowing of impact that occurred in the post-REF2014 period. To those working in an impact support role this ossification was always largely a matter of perception. Nevertheless, as a contribution towards a concerted ‘push-back’, Stern is of considerable assistance. As Reed notes “it was not HEFCE who constrained the definition of impact; it was the way that the academy interpreted HEFCE’s very broad definition in strategic and instrumentalist terms.” It is probable that the way the REF was/is managed within HEIs adds to this hive of semantic circumspection. One would therefore hope that whilst Stern maintains that “all panels should have the same broad approach to impact” (para. 82), in the second iteration of the REF the individual sub-panels – the “number and shape” of which Stern argues “was about right” (para. 63) – are allowed freer rein to define what impact is ‘for them’. The indications, which I will cover in my next post, are that a reassertion of the value of impacts on cultural life, via public engagement, and through pedagogy will bolster this fresh optimism. One is also left with an irony, in that more prescriptive guidance could ‘lock in’ a broader overall interpretation of how research impact can and should be presented.

Unifying templates

Perhaps the area of most considerable interest is Stern’s recognition of the oft-touted suggestion to merge the environment and impact templates. In one move this increases the value of impact without changing the 65%-20%-15% structure of the REF (impact case studies now worth 20% of an HEI submission, rather than 16%, with the combined templates remaining at 15%). This sits alongside proposals to introduce institutional as well as Unit of Assessment (UoA) Environment templates, removing a previously unavoidable layer of repetition (and institutional boilerplate), and establishing a means of assessing “steps taken to promote interdisciplinary and other joint working internally and externally and to support engagement and impact, beyond that which is just the aggregate of individual units of assessment” (para. 88). The proposals to allow the (tick-box) identification of interdisciplinary outputs, as well as to document the role of ‘interdisciplinary champions’ (para. 100) – whilst probably not assuaging critics such as Derek Sayer – are also clear moves in the right direction. Besides, this latter suggestion will likely provide a welcome springboard for public engagement champions, and those in similar, often semi-official HEI roles.

The writing was on the wall for the impact template the moment the HEFCE-commissioned RAND review detailed respondents who “spoke of the ‘fairy tale’-like nature of the impact templates, which they felt to be ‘a cosmetic exercise’… Whilst the information the template provided was good to have, there was no way of verifying claims that were made without having a site visit, and there was no confidence that the impact template reflected reality” [v]. As an impact manager, one might also view this tweak as a barely concealed attempt to make the REF ‘UKRI ready’, actively fulfilling the need expressed by RCUK to move knowledge exchange out of the periphery, “embedding throughout the research base a culture in which excellent research departments consistently engage with business, the public sector and civil society organisations, and are committed to carrying new ideas through to beneficial outcomes”. On a practical level, outlining a UoA’s research and knowledge exchange trajectory within a unified document is also manifestly more straightforward to execute. As Stern notes: “impact and environment should be seen in a more integrated way and at a more institutional level… becom[ing] more strategic and forward looking whilst retaining a strong evidence base in past performance” (para. 126). Come the forthcoming consultation, a key point of contestation will likely be around the credit split between institutional and UoA components, the report only hinting at the discussions to come: “a share of QR funding should be awarded to the institution based on its Institutional Environment statement and the institutional-level impact case studies which it submits. This innovation will require careful testing and we recommend that the funding bodies explore options for piloting the institutional level assessment to test this proposal” (para. 91).

Looking back, looking forward

An allied issue will be how, without increasing the burden [vi], the new template structure can adequately assess “the future research and knowledge exchange strategy of the HEI, as well as the individual Units of Assessment, and the extent to which both have delivered on the strategies set out in the previous REF” (para. 88). In seeking to collapse institutional boundaries, via a two tier submission, opportunities open up for a more rounded expression of an HEI’s medium to long-term aims and objectives. However, at the same time this affords an opportunity to senior managers, who may seek to actively dismantle and distance themselves from existing institutional plans, as the refreshed REF itself becomes a strategic determinant over the preceding four year period. In a surprising – but not unwelcome – sojourn into detail, the report goes as far as proposing fourteen ‘headers’ for the Institutional and UoA Environment statements, including brief mention of ‘progress’ and ‘strategic plans’ previously outlined (para. 94). This could be beefed up, asking HEIs to reflect in more direct and tangible terms on existing – and one might add publicly available – ‘research strategy’ and ‘(impact) strategy and plans’, as contained within REF2014 submissions [vii].

Data sharing

How data is shared across the research system feeds into the strategic benchmarking implied above. Stern recognises this as a deep-seated challenge, advising that “attention will have to be paid to the quality and comparability of databases… an issue which applies for the sector as a whole and the new UKRI” (para. 126). In the current system the trail from source funding, via published research, towards consolidation within a REF impact case is somewhat haphazard. A recent report from the BBSRC exemplifies this predicament, citing “a disparity between the institutional distribution of BBSRC research funding and the distribution of [REF] case study acknowledgments… a result of local expectations and practice, driven by researchers and research managers drafting case studies” [viii]. Put simply, for the purposes of accountability different parts of the research system require different forms of output (evidence), each founded upon different forms, or configurations, of input (resource). In untangling this, and generating a solution that works for all parties including central government, the report is vague, noting only that HESA will be consulted with respect to the numbers of outputs required per UoA (para. 70) [ix]. However, the general thrust of the report clearly fits with a stated desire that UKRI and its board should use REF as a strategic catalyst, a debate re-hashed from the years preceding REF2014, where harried HEI strategy departments wondered aloud what exactly David Willetts was going to ‘do’ with c.7,000 impact case studies (answer: not a lot). As Ant Bagshaw, writing for Wonkhe, maintains, this premise “is rather cheeky both in its tasking of UKRI to be more ‘imaginative’ – the body doesn’t exist yet… also it’s a clear request for cash [from]… researchers with the authorial paws on the document… In the Brexit context… that request is all the more important.” Whatever the solution, for good or ill more effective data-driven benchmarking could be utilised as a means to ‘efficiently’ exclude some HEIs from funding schemes under the UKRI umbrella (as per the DTC model). This is a point one can be sure the established HEI associations will foreground within written submissions to the forthcoming consultation.

A conclusion, and a beginning

The report was notable in that, akin to an obverse King Canute, Stern was able to drive back the metric tide purportedly backed within some governmental circles. In line with the HEFCE sponsored report, led by James Wilsdon, Stern calls for a responsible approach: “Panels should set out explicitly how they have used bibliometric data in their working methods” (para. 76). It might be for consultation respondents to press on how panels should set out, in advance, how they will use research metrics. As Wilsdon notes, in an excellent summary of the report’s key judgements, Stern’s approach “maintaining the primacy of peer review, using carefully-selected metrics in the environment section of the REF, and improving data infrastructure and interoperability – is completely in line with the findings in The Metric Tide. And a new Forum for Responsible Metrics, involving all the major research funders, will take forward the detailed work needed to get this system up and running for REF 2021.” Any pre-review of the use of metrics would be a key task for the forum, not least as the  door has been left ajar for some limited sampling if “subject panels are able to make the case, explicitly supported with reference to robust evidence, that bibliometric data could be used to reduce the workload” (para. 71).

Overall, one is left with a sense that REF2021 will see a greater percentage of research, researchers, and research outcomes submitted within discrete and focussed – rather than strategically engineered – returns; all supported beneath an interdisciplinary superstructure [x]. It is indicated that “by the end of the year a formal consultation should be issued so that the community can offer their views on the proposed process and the future REF formula. The decisions arising from this consultation should be published in the summer of 2017” (para. 117). One imagines that after a brief flurry of commentary, a significant amount of covert legwork will take place, as stakeholders position themselves for the horse-trading that will follow.

Notes

[i] To give the report its full title “Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.”

[ii] Alas, my call for a blanket ban on the word ‘impactful’ fell on deaf ears – albeit the word makes only one solitary appearance, in the call “to make the UK research base even more productive and impactful” (para. 59).

[iii] Whether researchers can be submitted with zero outputs remains a point for discussion. It is projected that UoAs will be required to submit “two outputs on average per submitted full-time equivalent (FTE) individual” (para. 67).

[iv] With every indication that the ‘Stern’ puns will not be going away at any time soon.

[v] Manville et al (March 2015) Evaluating the Assessment Process for the Impact Element of Research Excellence Framework 2014, RAND Europe (p.38)

[vi] The word ‘burden’ appears 29 times, possibly the most instances per page of any published work since ‘The Pilgrim’s Progress’.

[vii] Sections within the REF2014 environment template (section b) and impact template (section c), respectively.

[viii] Digital Science (July 2015) REF 2014 Impact Case Studies and the BBSRC(p.2)

[ix] Whether this means UKRI will seek to replace Researchfish will presumably be up for discussion.

[x] Marxist wordplay, partly intentional.


We need to talk about research impact (again)


The University of Salford’s Impact Coordinator, Chris Hewson, discusses why we need to talk about research impact:

Over the last eighteen months, much has been written and said about impact, and how Higher Education Institutions (HEIs) can effectively, and efficiently, place themselves on a secure footing in preparation for the next Research Excellence Framework (REF). Nonetheless, it could be argued that the fevered animation generated by REF2014 has led to a prolonged and ongoing hangover. For most academics and administrators the experience of co-producing impact case studies was a forensic and thought-provoking, albeit ‘seat of the pants’ and largely extemporised, one. The refrain consistently repeated in strategy offices across the land goes something like this: ‘…there is absolutely no chance we’re going to execute our REF impact strategy in such an unsystematic and post-hoc fashion come 2020.’

But we are, aren’t we? As Julie Bayley and Casper Hitchens note, “the burden of effort and pressure to ‘find impact’ led to impact fatigue and a tarnished view of the concept” [i]. Their remedies are sound, and were arrived at independently by a number of HEIs in the aftermath of 18th December 2014: the need for greater planning and (ongoing) data collection, the institutional normalisation of impact, the co-ordination of both internal and external engagement processes, and so forth. The authors playfully mimic the language of HEFCE, noting the dawning of “an opportunity to significantly and demonstrably… change how we achieve impact.”

Read more…


New REF regulations for published research papers from 1 April

Under the Higher Education Funding Council for England (HEFCE) Open Access Policy, all peer-reviewed journal articles and conference papers accepted for publication from 1 April 2016 must meet open access requirements to be eligible for the next REF.

The University has developed an internal support approach which will minimise additional work for staff and postgraduate research students. When a paper is accepted for publication, the author must deposit the accepted manuscript into USIR as soon as possible. Colleagues in the Library will then either make sure that all other requirements under the HEFCE Policy are met, or contact the author to advise on next steps.

Read more…