Posts in SIRC Category

How to Write a 4* Journal Article

Professor Mark Reed, Professor of Socio-Technical Innovation at Newcastle University

In December, Prof Mark Reed, Professor of Socio-Technical Innovation at Newcastle University and the man behind Fast Track Impact, tweeted some thoughts on how to write a 4* paper for the REF and wrote a blog about it. This post is published here with the author’s permission.

How do you write a 4* paper for the Research Excellence Framework (REF)? It is a question I’ve asked myself with some urgency since the Stern Review shredded my REF submission by not allowing me to bring my papers with me this year to my new position at Newcastle University.

Obviously the answer is going to differ depending on your discipline, but I think there are a few simple things that everyone can do to maximize their chances of getting a top graded research output.

I’m going to start with the assumption that you’ve actually done original, significant and rigorous work – if you haven’t then there is no point in reading any further. However, as I am increasingly asked to pre-review papers for colleagues across a range of disciplines, I am seeing examples of people who write up work as a 2* or 3* paper that has the potential to get a better score. I should point out that I believe that there is an important role for 1* and 2* papers, and that I regularly write these on purpose to address a problem of national significance and frame it for the specific, narrow audience that is likely to be able to benefit most from my work. However, whether I like it or not, as a Professor in a research-intensive University, there is an expectation that I will be submitted as a 4* researcher, which means I need a few 4* papers as well.

You can see some more detailed thoughts on what I think makes 4* for different types of paper in this Tweet:

https://twitter.com/profmarkreed/status/801348612345253888/photo/1

As you’ll see from the discussion under that tweet though, my more detailed thoughts probably only apply to Units of Assessment across panels A-C, and probably aren’t relevant to the arts and humanities.

Having said this, I think there are a number of things we can all do to maximize the chances of our work being viewed favourably by REF panelists.

  1. Write to the criteria: when I was learning to drive, my instructor told me that in the test I should make sure I moved my head when I was looking in the rear-view mirror, to make sure the examiner noticed I was using my mirrors. We’re all used to writing to the criteria of funding calls, and in fact we are all perfectly used to writing papers to the criteria of our target journals. In the last REF, research outputs were judged against three criteria: originality, significance and rigour. Whatever the interpretation of these criteria in your discipline, have you made it explicit to REF panelists reading your work exactly what is original, and why it is so original? Have you explained and effectively justified the significance of your work? And have you included evidence that your methods, analysis and interpretation are rigorous, even if you have to use supplementary material to include extra detail about your methods and data to get around journal word limits?
  2. Get REF feedback before you submit your work for publication: find out who is going to be reviewing research outputs for REF internally within your Unit of Assessment at your institution and ask them to review your work before you submit it. They may be able to make recommendations about how you might improve the paper in light of the REF criteria. Sometimes a little extra work on the framing of your research in relation to wider contexts and issues can help articulate the significance of your work, and with additional reading and thinking, you may be able to position your work more effectively in relation to previous work to demonstrate its originality more clearly. Adding a few extra details to your methods and results may reassure readers and reviewers that your approach is indeed rigorous. This is not just about doing world-leading research; it is about demonstrating to the world that your work is indeed world-leading. For me, these criteria are nothing new and are worth paying attention to, whether or not we are interested in REF. Meeting these three criteria will increase the chances that you get through peer review and will increase the likelihood that your work gets cited.
  3. Analyse and discuss good practice in your own area: the only way to really “get your eye in” for REF is to actually look at examples of good and poor practice in your own area. Below, I’ve described how you can design an exercise to do this with your colleagues. You can do it yourself and learn a lot, but from my own experience, you learn a lot more by doing this as a discussion exercise with colleagues who work in your area. If you can, take notes from your discussion and try and distill some of the key lessons, so you can learn collectively as a group and more effectively review and support each other’s work.

How to organize a discussion to work out what makes a 4* paper in your area:

  • Identify top scoring institutions for your Unit of Assessment (UOA): download the REF2014 results, filter for your UOA (columns E or F), then filter so it only shows you the outputs (column J), and then filter for 4* (column L), showing only the institutions from your UOA that had the highest percentage of 4* outputs. Now, for those institutions, look across the table (columns L-P) to see which has the highest proportion of outputs at either 3* or 4*. For example, an institution may have 80% of its outputs graded at 4* and 15% graded at 3*, meaning that 95% of its outputs were graded at 3-4*. (A short script for automating this step is sketched after this list.)
  • Download a selection of papers from the top scoring institutions: go to your UOA on the REF website, find and click on the institutions you’ve identified in step 1, under “view submission data”, click on “research outputs”, copy and paste output titles into Google Scholar (or your search engine of choice) and download the articles. You may want to select outputs randomly, or you may want to go through more selectively, identifying outputs that are close to the areas your group specialize in
  • Repeat for low scoring institutions so you can compare and contrast high and low scoring outputs
  • Discuss examples: print copies of the high and low scoring outputs, labeled clearly, and in your next UOA meeting, let everyone choose a high and a low-scoring example. Give them 10-15 minutes to quickly read the outputs (focusing on the title, abstract, introduction, figures and conclusions so you’re not there all day) and then ask the group (or small groups if there are many of you) to discuss the key factors that they think distinguish between high and low scoring outputs. Get your group(s) to distill the key principles that they think are most useful and disseminate these more widely, so that anyone who wasn’t present can benefit.
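
For groups that prefer to script the institution-ranking step rather than filter the spreadsheet by hand, a minimal sketch along the following lines would do the same job. It is illustrative only: it assumes the REF2014 results have been exported as a CSV (hypothetically named ref2014_results.csv here), and the column headers used below (“Institution name”, “Unit of assessment name”, “Profile”, “4*”, “3*”) are assumptions that should be checked against your own download.

    # A minimal sketch of the institution-ranking step using pandas.
    # The file name and column headers are assumptions based on the published
    # REF2014 results download; check your own copy and adjust them to match.
    import pandas as pd

    UOA_COL = "Unit of assessment name"   # columns E/F in the spreadsheet
    PROFILE_COL = "Profile"               # column J: Outputs / Impact / Environment / Overall
    FOUR_STAR = "4*"                      # column L
    THREE_STAR = "3*"                     # column M

    def rank_institutions(csv_path: str, uoa: str) -> pd.DataFrame:
        """Rank institutions in one UOA by their percentage of 4* outputs."""
        df = pd.read_csv(csv_path)
        outputs = df[(df[UOA_COL] == uoa) & (df[PROFILE_COL] == "Outputs")].copy()
        # e.g. 80% at 4* plus 15% at 3* gives 95% of outputs at 3-4*
        outputs["3-4*"] = outputs[FOUR_STAR] + outputs[THREE_STAR]
        return outputs.sort_values([FOUR_STAR, "3-4*"], ascending=False)

    # Example: list the ten strongest submissions in a (hypothetical) UOA
    ranked = rank_institutions("ref2014_results.csv", "General Engineering")
    print(ranked[["Institution name", FOUR_STAR, THREE_STAR, "3-4*"]].head(10))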

It would be great if I could tell you that these are my “three easy ways to get a 4* paper” but doing work that is genuinely original, significant and rigorous is far from easy. If you have done work that is of the highest quality though, I hope that the ideas I’ve suggested here will help you get the credit you deserve for the great research you’ve done.


Engineering industry event – JMEE: Enhancing the Participation of Industry in Research Projects in Telecommunications and Energy Sectors

Attendees at the JMEE engineering event

Earlier in April, Professor Haifa Takruri MBE, Director of the JMEE (Joint MSc Electrical Engineering) programme, organised a special industry event which presented the project’s progress to date. The workshop, entitled ‘JMEE: Enhancing the Participation of Industry in Research Projects in Telecommunications and Energy Sectors’, covered the processes involved in developing the JMEE programme, as well as knowledge sharing, academia-industry collaboration and EU-Palestine cultural exchange.

A fantastic example of industry collaboration in action, the event was attended by a number of high-profile engineering partners. Mr Nigel Platt, System Engineering Manager at Siemens Energy, presented on AC and HVDC interconnections for offshore wind farms, from platform installation to the transfer of energy to land. Nigel answered audience questions about wind farm designs, voltage transfer and average output yield on the farms.

Professor Andy Sutton, Principal Network Architect at BT and a Visiting Professor in the School of Computing, Science and Engineering, presented state-of-the-art research and standards development in 5G telecommunications technology, demonstrating how future IMT technology development is shaping the strategies for 2020 and beyond.

Dr Sam Grogan, Pro-Vice-Chancellor for Student Experience, brought the discussion back to the student experience by speaking about the work the University is doing both locally and internationally to develop the entrepreneurial skills of students.

The talks were followed by an intense, technical discussion that showed the vast experience and understanding of the sector among the speakers and participants. After lunch the JMEE team visited Siemens’ railway maintenance facility in Ardwick, where delegates saw the maintenance methodology for the new electric and diesel trains and gained an insight into the capacity and operations required to keep commuter services running in the Manchester region.

Haifa, who recently received an outstanding achievement award for her work in engineering, said: “It was a great pleasure to host the JMEE workshop at our Media City campus. I am grateful to the speakers for sharing their industrial knowledge and experience with the consortium, and to EU TEMPUS for funding the JMEE project.”


Prof Haifa Takruri-Rizk recognised as inspiration for female engineers

Prof Haifa Takruri-Rizk

Prof Haifa Takruri-Rizk collecting her award

Professor Haifa Takruri-Rizk from the School of Computer Science and Engineering has been recognised once again for her work to attract more women into engineering. Prof Takruri-Rizk was awarded an Outstanding Achievement Award at the recent North West Engineering Excellence Awards, held in Manchester.

The joint award is from the Institution of Engineering and Technology (North West), the Institution of Mechanical Engineers and the Institution of Chemical Engineers (Manchester branches).

Earlier this year, she was the key speaker at an event organised by Barclays Academy to inspire hundreds more schoolgirls to follow careers in electronic engineering and computer science – the latest in a raft of ‘mentoring’ work she undertakes, including the annual summer school for young women.

Through her expertise in electronics and mobile networking, and initiatives to address the participation of women and ethnic minorities, Haifa has worked with the BBC, Opportunity Now, the Royal Academy of Engineering, Skillset, the Equal Opportunities Commission, the Women’s Engineering Society, the UK Resource Centre for Women in SET, the Science, Engineering, Manufacturing and Technologies Alliance (SEMTA) and many others. Furthermore, in 2009 she was awarded an MBE for services to women and black and minority ethnic people in science, engineering and technology education.

 


Salford Professors launch phonebox book at the National Telephone Kiosk Collection

Profs Nigel Linge and Andy Sutton with their publication in gift shop

Nigel Linge (left) and Andy Sutton (right)

Professor of Telecommunications Nigel Linge and Visiting Professor Andy Sutton, both from the School of Computing, Science and Engineering, last week launched their second book ‘The British Phonebox’ at Avoncroft Museum in Bromsgrove.

The Avoncroft Museum of Historic Buildings hosts the National Telephone Kiosk Collection and, as Nigel said, “when you have written a book about phone boxes, where else would you choose to launch it but at the museum that is the home of the kiosk?” Despite the fact that phone boxes have declined in number and are used less and less each year, the older red ones have become icons of Britain, recognised the world over. Nigel and Andy’s book not only traces the origins of the British phone box from its birth in 1884 but also includes details and photographs of all the major versions that have appeared on our streets, and proves that the phone box still has a future by showcasing new designs that are being introduced this year.

The British Phonebox is published by Amberley Publishing. Their first book, ‘30 Years of Mobile Phones in the UK’, was also published by Amberley in 2015.

Profs Nigel Linge and Andy Sutton by a phonebox


ESRC Festival of Social Science 2017 – Call for Proposals

ESRC Festival 15th Year Banner

Building upon the successful collaboration from last year, the University of Salford will partner with the Economic and Social Research Council, the University of Manchester and Manchester Metropolitan University to deliver the ESRC Manchester Festival of Social Science.

The aim of the Festival is to showcase Manchester social science research to a broad non-academic audience. Last year we hosted an eclectic blend of activities designed to celebrate the social sciences, including discussions and debates, exhibitions, school visits, workshops, and lots more.

The call for applications is now open. The Festival runs from 4-11 November and will involve academics working alongside community and cultural partners to create engaging and inspiring research-led events, aimed at a broadly non-academic audience. The goal is to provide an insight into the many ways social science contributes to social, economic and political life across our cities, regions and beyond.

Any researcher or team can apply to hold an event under the ESRC Festival banner. Applicants can also request up to £1,000 sponsorship from the University of Salford to hold an event as part of the Festival. This will also be an excellent opportunity to tag these events to the University’s 50th anniversary celebrations. Events must include social science and seek to engage groups outside of academia including young people, third sector organisations, business, local government, policy makers and the general public.

We particularly welcome applications that:

  • Seek to bring together two or more festival partners
  • Seek to deliver interdisciplinary events
  • Consider the role and future of social science as a discipline
  • Involve early career researchers
  • Address issues pertinent to the Manchester city-region

For inspiration and ideas for the kind of event you might run, you can find out about the 2016 events at www.esrcmanchesterfest.ac.uk

Please note, applications SHOULD NOT be made directly to the ESRC, but rather via the University of Salford. The application deadline is 4pm on Friday 5th of May. The application form and guidance can be requested through research-impact@salford.ac.uk.

Further details can be found on the ESRC website, including eligibility criteria: www.esrc.ac.uk/public-engagement/festival-of-social-science/apply-to-organise-an-event/

 


Publishing in Scholarly Journals

Peer review of scholarly writing

As a researcher, sharing your work is essential to furthering the discussion, development and, potentially, even the funding of your findings. The sheer quantity of guides available on “how to write” and “how to target X journal” perhaps signifies the importance of targeting the right place and the best audience for your research.

Before reaching the stage of submitting in the hope of publication, many publishers expect researchers to have already made some key considerations:

  1. Is your research original, engaging, innovative?
  2. Who do you expect to be the audience for your research?
  3. Which journal(s) do you think might be interested in accepting your article for publication and does your article fit with their aims, scope and style?
  4. What are your open access needs?
  5. Is your manuscript well written (free from grammatical errors, with a solid narrative and a clear abstract and conclusions) and in accordance with the journal’s style guide?

Your research peers and, above all, your supervisor are the best place to start for advice on where to publish and whether your manuscript is ready. Then, once you think you have found the right journal for your article, you should read its author guidelines and make sure you can freely submit to it, as some journals are invitation-only.



Salford ICZs at Work in a Research Partnership with NEC, BT & EE

ICZs in Action BT, EE, NEC

Photograph – Nick Harrison

Telecommunications student Odum Rowani is conducting a leading-edge study of how weather affects mobile networks in partnership with top engineers from NEC, BT and EE.

Odum, who graduated with an MSc in Data Telecommunications Networks, is researching for his PhD the effects of variations in global weather conditions on the quality of data transmission in mobile networks.

And he has the perfect test-bed for his work at the University of Salford after telecom giants NEC, BT and EE chose Salford as a research partner to test new 4G evolution and 5G related network technology.

Odum, who is from Nigeria, said: “A challenge for engineers is how to connect the evolved 4G and 5G cell sites back to the operator’s core network, and one solution is the use of V-band point-to-point radio systems.”

Much testing is still needed on the optimum deployment and robustness of ‘point-to-point’ transmissions, which use millimetre wave radio frequencies in the 60GHz band; in particular, how they may stand up to the rigours of the British weather.

Using the University of Salford as a base, NEC, BT and EE have created a research site to measure the performance of the V-band radio system over a 12-month period when exposed to rain, wind, fog and ice.

“This will be one of the most detailed tests of this type done anywhere in the world to date, so we are delighted it will be hosted in Salford with our partners NEC, EE and British Telecom,” explained Professor Nigel Linge, one of Odum’s professors.

“Millimetre wave point-to-point links operate at very high frequencies to transmit high volumes of data over relatively short distances.  However, the high frequency does mean that it is possibly affected by climatic conditions – the question being by how much.”
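
As a rough indication of the kind of calculation at stake, rain fade on millimetre wave links is often estimated with a simple power-law model (specific attenuation γ = k·R^α in dB/km for a rain rate R in mm/h, multiplied by the link length), the form used in the ITU-R rain attenuation recommendations. The sketch below is illustrative only: the coefficients and the one-kilometre hop are placeholder assumptions, not the published 60GHz figures, and the Salford trial is designed to measure these effects empirically rather than rely on such models.

    # Illustrative sketch of a power-law rain-fade estimate for a short
    # millimetre wave hop. The coefficients k and alpha and the link length
    # are placeholders for illustration, not published 60GHz values.

    def specific_attenuation(rain_rate_mm_per_h: float, k: float, alpha: float) -> float:
        """Specific attenuation (dB/km) for a given rain rate (mm/h)."""
        return k * rain_rate_mm_per_h ** alpha

    def link_rain_fade(rain_rate_mm_per_h: float, length_km: float,
                       k: float = 0.9, alpha: float = 0.75) -> float:
        """Total rain fade (dB) over the link: attenuation per km times length."""
        return specific_attenuation(rain_rate_mm_per_h, k, alpha) * length_km

    # Example: a hypothetical 1 km point-to-point hop in heavy rain (25 mm/h)
    print(f"Estimated rain fade: {link_rain_fade(25.0, 1.0):.1f} dB")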

The University has installed a radio system complete with transceivers and antennas on the Newton Science and Engineering building and the Maxwell Building at its Peel Park Campus and will monitor transmissions until early 2018.

Stephen Walthew, Manager – Transport Networks at NEC Europe, said Salford was a perfect choice for the testing:  “We were looking for an urban area, somewhere the weather is very variable and where there is expertise in network engineering. Given our long-standing relationship with Professor Linge and his colleagues, we are delighted the University of Salford can host the tests.”

“The 60GHz connection has the opportunity to become the solution of choice for high capacity backhauling, so the more scientific evidence we can collect about its performance, the better we can make decisions about design and deployment.”

 


Student named rising star in Cyber Security

Milda Petraityte

A Computer Science student at the University of Salford has been named a rising star in her profession by Cyber World magazine.

Milda Petraityte juggles studying part time for an MSc in Information Security Management with working at global professional services firm KPMG as a cyber security consultant.

She said: “I’m flattered and slightly shocked to have been named as Cyber World’s Rising Star. I have been working with my lecturer Dr Ali for the past year and he has been very supportive of me. Studying in the university I enjoyed his module of Cyber Security in Practice where we focused on IT systems pen testing and forensic investigations”.

Milda has gained a wealth of knowledge and experience working at KPMG, for example advising clients on various cyber security aspects and helping to deal with general company issues on a daily basis.

She added: “My job allows me to understand the world of cyber from a business perspective. It is an exciting time to be working in the Cyber Security field. It is a booming industry, and in my job I am lucky in that I learn something new every day”.

Dr Ali Dehghantanha, Lecturer in Cyber Security and Forensics in the School of Computer Science and Engineering, said: “Milda is one of the most talented and hardworking students that I have ever had in my academic career. She volunteers for completing the most difficult tasks and has managed to meet deadlines in spite of her heavy workloads. It is a great pleasure to work with such a talented candidate”.


The Impact Environment: REFlections on the Stern Review (Part 1)

Stern

By Dr Chris Hewson, Impact Coordinator

Upon its release last Thursday, the Twittersphere became the locus for a series of overlapping debates on Lord Nicholas Stern’s Review of the REF (see: #SternReview) [i]. This was heartening, chiming with my previous post ‘We need to talk about research impact (again)’ on the need for ‘robust discussions’; a refrain I will seek to expand upon in future pieces [ii]. The report presents a considered and balanced perspective, seeking to develop the well-regarded aspects of REF2014, whilst addressing the three blights of disciplinary siloing, resource burden, and permissible yet unprincipled ‘gaming’. In what follows, I consider the wider canvas upon which the report paints, interspersing this with observations on the structure of the proposed REF2021 ‘impact environment’. In a follow-up post, I’ll build on these points, considering how the report seeks to reconfigure impact case study submission, and how this may have knock-on effects with respect to how Higher Education Institutions (HEIs) manage, promote, and report knowledge exchange.

Full submissions and portability

The beating heart of the public debate was Recommendation 3: “Outputs should not be portable [rather, tied to] the institution where the output was demonstrably generated” (para. 73); thus bringing outputs into line with existing rules on HEIs claiming impact. A clear worry was rapidly identified: that this would remove one obvious source of leverage still available to early career researchers in a tight and often volatile job market. This concern was tied to a series of parallel discussions around Recommendation 1, “All research active staff should be returned” (para. 65), and the HR manoeuvrings this could unleash in deciding who counts as ‘research active’ (and indeed what counts as ‘research activity’). This has the potential to usher in new modes of marginalisation, and as Richard Watermeyer suggests, this “profound problem… could end up diminishing what universities recognise as the role and contribution of the researcher: primarily, the successful procurement of research funds and prominence.” Nevertheless, it is likely that the devil will be in the detail, with the work of high-performing academics – who may be able to submit as many as six outputs – conceivably mitigating the risk of including a ‘long tail’ of less prolific researchers [iii]. Much will rest on how individual HEIs establish and interpret ‘who should be doing what’ within given research units, however configured. I will not dwell on these points, except to note that they have been thoughtfully covered by others, including Athene Donald, Martin Eve, Adam Golberg and Liz Morrish [iv].

Enhancing impact

Mark Reed has provided a sympathetic summary of Stern’s impact-focused recommendations, arguing that the report successfully addresses a perceived narrowing of impact that occurred in the post-REF2014 period. To those working in an impact support role this ossification was always largely a matter of perception. Nevertheless, as a contribution towards a concerted ‘push-back’, Stern is of considerable assistance. As Reed notes, “it was not HEFCE who constrained the definition of impact; it was the way that the academy interpreted HEFCE’s very broad definition in strategic and instrumentalist terms.” It is probable that the way the REF was/is managed within HEIs adds to this hive of semantic circumspection. One would therefore hope that whilst Stern maintains that “all panels should have the same broad approach to impact” (para. 82), in the second iteration of the REF the individual sub-panels – the “number and shape” of which Stern argues “was about right” (para. 63) – are allowed freer rein to define what impact is ‘for them’. The indications, which I will cover in my next post, are that a reassertion of the value of impacts on cultural life, via public engagement, and through pedagogy will bolster this fresh optimism. One is also left with an irony, in that more prescriptive guidance could ‘lock in’ a broader overall interpretation of how research impact can and should be presented.

Unifying templates

Perhaps the area of greatest interest is Stern’s recognition of the oft-touted suggestion to merge the environment and impact templates. In one move this increases the value of impact without changing the 65%-20%-15% structure of the REF (impact case studies now worth 20% of an HEI submission, rather than 16%, with the combined templates remaining at 15%). This sits alongside proposals to introduce institutional as well as Unit of Assessment (UoA) Environment templates, removing a previously unavoidable layer of repetition (and institutional boilerplate), and establishing a means of assessing “steps taken to promote interdisciplinary and other joint working internally and externally and to support engagement and impact, beyond that which is just the aggregate of individual units of assessment” (para. 88). The proposals to allow the (tick-box) identification of interdisciplinary outputs, and to document the role of ‘interdisciplinary champions’ (para. 100) – whilst probably not assuaging critics such as Derek Sayer – are also clear moves in the right direction. Moreover, this latter suggestion will likely provide a welcome springboard for public engagement champions, and those in similar, often semi-official HEI roles.

The writing was on the wall for the impact template the moment the HEFCE-commissioned RAND review detailed respondents who “spoke of the ‘fairy tale’-like nature of the impact templates, which they felt to be ‘a cosmetic exercise’… Whilst the information the template provided was good to have, there was no way of verifying claims that were made without having a site visit, and there was no confidence that the impact template reflected reality” [v]. As an impact manager, one might also view this tweak as a barely concealed attempt to make the REF ‘UKRI ready’, actively fulfilling the need expressed by RCUK to move knowledge exchange out of the periphery, “embedding throughout the research base a culture in which excellent research departments consistently engage with business, the public sector and civil society organisations, and are committed to carrying new ideas through to beneficial outcomes”. On a practical level, outlining a UoA’s research and knowledge exchange trajectory within a unified document is also manifestly more straightforward to execute. As Stern notes: “impact and environment should be seen in a more integrated way and at a more institutional level… becom[ing] more strategic and forward looking whilst retaining a strong evidence base in past performance” (para. 126). Come the forthcoming consultation, a key point of contestation will likely be the credit split between institutional and UoA components, the report only hinting at the discussions to come: “a share of QR funding should be awarded to the institution based on its Institutional Environment statement and the institutional-level impact case studies which it submits. This innovation will require careful testing and we recommend that the funding bodies explore options for piloting the institutional level assessment to test this proposal” (para. 91).

Looking back, looking forward

An allied issue will be how, without increasing the burden [vi], the new template structure can adequately assess “the future research and knowledge exchange strategy of the HEI, as well as the individual Units of Assessment, and the extent to which both have delivered on the strategies set out in the previous REF” (para. 88). In seeking to collapse institutional boundaries, via a two-tier submission, opportunities open up for a more rounded expression of an HEI’s medium to long-term aims and objectives. However, at the same time this affords an opportunity to senior managers, who may seek to actively dismantle and distance themselves from existing institutional plans, as the refreshed REF itself becomes a strategic determinant over the intervening four-year period. In a surprising – but not unwelcome – foray into detail, the report goes as far as proposing fourteen ‘headers’ for the Institutional and UoA Environment statements, including brief mention of ‘progress’ and ‘strategic plans’ previously outlined (para. 94). This could be beefed up, asking HEIs to reflect in more direct and tangible terms on existing – and one might add publicly available – ‘research strategy’ and ‘(impact) strategy and plans’, as contained within REF2014 submissions [vii].

Data sharing

How data is shared across the research system feeds into the strategic benchmarking implied above. Stern recognises this as a deep-seated challenge, advising that “attention will have to be paid to the quality and comparability of databases… an issue which applies for the sector as a whole and the new UKRI” (para. 126). In the current system the trail from source funding, via published research, towards consolidation within a REF impact case is somewhat haphazard. A recent report from the BBSRC exemplifies this predicament, citing “a disparity between the institutional distribution of BBSRC research funding and the distribution of [REF] case study acknowledgments… a result of local expectations and practice, driven by researchers and research managers drafting case studies” [viii]. Put simply, for the purposes of accountability different parts of the research system require different forms of output (evidence), each founded upon different forms, or configurations, of input (resource). In untangling this, and generating a solution that works for all parties including central government, the report is vague, noting only that HESA will be consulted with respect to the number of outputs required per UoA (para. 70) [ix]. However, the general thrust of the report clearly fits with a stated desire that UKRI and its board should use the REF as a strategic catalyst; a debate re-hashed from the years preceding REF2014, when harried HEI strategy departments wondered aloud what exactly David Willetts was going to ‘do’ with c.7,000 impact case studies (answer: not a lot). As Ant Bagshaw, writing for Wonkhe, maintains, this premise “is rather cheeky both in its tasking of UKRI to be more ‘imaginative’ – the body doesn’t exist yet… also it’s a clear request for cash [from]… researchers with the authorial paws on the document… In the Brexit context… that request is all the more important.” Whatever the solution, for good or ill more effective data-driven benchmarking could be utilised as a means to ‘efficiently’ exclude some HEIs from funding schemes under the UKRI umbrella (as per the DTC model). This is a point one can be sure the established HEI associations will foreground within written submissions to the forthcoming consultation.

A conclusion, and a beginning

The report was notable in that, akin to an obverse King Canute, Stern was able to drive back the metric tide purportedly backed within some governmental circles. In line with the HEFCE-sponsored report led by James Wilsdon, Stern calls for a responsible approach: “Panels should set out explicitly how they have used bibliometric data in their working methods” (para. 76). It might be for consultation respondents to press on how panels should set out, in advance, how they will use research metrics. As Wilsdon notes, in an excellent summary of the report’s key judgements, Stern’s approach – “maintaining the primacy of peer review, using carefully-selected metrics in the environment section of the REF, and improving data infrastructure and interoperability – is completely in line with the findings in The Metric Tide. And a new Forum for Responsible Metrics, involving all the major research funders, will take forward the detailed work needed to get this system up and running for REF 2021.” Any pre-review of the use of metrics would be a key task for the forum, not least as the door has been left ajar for some limited sampling if “subject panels are able to make the case, explicitly supported with reference to robust evidence, that bibliometric data could be used to reduce the workload” (para. 71).

Overall, one is left with a sense that REF2021 will see a greater percentage of research, researchers, and research outcomes submitted within discrete and focussed – rather than strategically engineered – returns; all supported beneath an interdisciplinary superstructure [x]. It is indicated that “by the end of the year a formal consultation should be issued so that the community can offer their views on the proposed process and the future REF formula. The decisions arising from this consultation should be published in the summer of 2017” (para. 117). One imagines that after a brief flurry of commentary, a significant amount of covert legwork will take place, as stakeholders position themselves for the horse-trading that will follow.

Notes

[i] To give the report its full title “Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.”

[ii] Alas, my call for a blanket ban on the word impactful fell on deaf ears – albeit one solitary appearance, in the call “to make the UK research base even more productive and impactful” (para. 59).

[iii] Whether researchers can be submitted with zero outputs remains a point for discussion. It is projected that UoAs will be required to submit “two outputs on average per submitted full-time equivalent (FTE) individual” (para. 67).

[iv] With every indication that the ‘Stern’ puns will not be going away at any time soon.

[v] Manville et al. (March 2015) Evaluating the Assessment Process for the Impact Element of Research Excellence Framework 2014, RAND Europe (p. 38)

[vi] The word ‘burden’ appears 29 times, possibly the most instances per page of any published work since ‘The Pilgrim’s Progress’.

[vii] Sections within the REF2014 environment template (section b) and impact template (section c), respectively.

[viii] Digital Science (July 2015) REF 2014 Impact Case Studies and the BBSRC (p. 2)

[ix] Whether this means UKRI will seek to replace Researchfish will presumably be up for discussion.

[x] Marxist wordplay, partly intentional.


Academic elected to prominent international committee


Computer Science Lecturer Stefan Pletschacher joins the ALTO Editorial Board

An academic from the School of Computing, Science and Engineering, Stefan Pletschacher, has been elected to a prominent committee overseeing the main format used by the world’s most important libraries to represent and make available their digitised holdings.

Stefan, a member of the PRImA (Pattern Recognition & Image Analysis) Lab, has been elected by his peers to the Editorial Board of ALTO. ALTO (Analyzed Layout and Text Object) is an XML-based file format used by libraries and software companies worldwide as the representation format for digitised content. As such, it plays a key role in the ongoing efforts to make mankind’s printed heritage available online. It is maintained by the Library of Congress and overseen by the international Editorial Board.
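
To give a flavour of what the format looks like in practice, the short Python sketch below parses a minimal, hand-written ALTO-style fragment and extracts the recognised text. It is illustrative only: the sample document is heavily simplified, and real ALTO files use a version-specific Library of Congress namespace and carry much richer layout metadata (coordinates, styles, confidence values) than is shown here.

    # Illustrative only: parse a minimal, hand-written ALTO-style fragment and
    # extract the recognised text. Real ALTO files use a versioned Library of
    # Congress namespace and carry much richer layout metadata.
    import xml.etree.ElementTree as ET

    SAMPLE_ALTO = """
    <alto xmlns="http://www.loc.gov/standards/alto/ns-v3#">
      <Layout>
        <Page ID="P1">
          <PrintSpace>
            <TextBlock ID="B1">
              <TextLine ID="L1">
                <String CONTENT="Hello"/>
                <SP/>
                <String CONTENT="world"/>
              </TextLine>
            </TextBlock>
          </PrintSpace>
        </Page>
      </Layout>
    </alto>
    """

    NS = {"alto": "http://www.loc.gov/standards/alto/ns-v3#"}

    def extract_text(alto_xml: str) -> str:
        """Return the plain text of an ALTO document, one line per TextLine."""
        root = ET.fromstring(alto_xml)
        lines = []
        for text_line in root.findall(".//alto:TextLine", NS):
            words = [s.attrib.get("CONTENT", "") for s in text_line.findall("alto:String", NS)]
            lines.append(" ".join(words))
        return "\n".join(lines)

    print(extract_text(SAMPLE_ALTO))  # -> "Hello world"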

The Editorial Board is responsible for standardising the implementation of ALTO so that it is generic enough to cover a variety of real-world uses and practical applications by software developers, while at the same time avoiding ambiguity and misinterpretation when taking into account differences in writing systems and languages worldwide.
