Posts tagged: HEI

Unclear education roles to support practice learning – A blog post by Dr Jackie Leigh @JackieALeigh

HEIs are responding to the Nursing & Midwifery Council (NMC) consultation on Standards of Proficiency for Registered Nurses. HEIs are also required to give the NMC their views on the Education Framework: Standards for Education and Training.

The NMC refer to their standards as being ambitious, setting out the enhanced knowledge and skills that people can expect from nurses in the future. It will be interesting to see if this view is reflected in the results of the consultation.

It is interesting to compare the two documents in terms of practice learning, particularly in relation to who will supervise and assess nurses in clinical practice, how they will do so, and what the educational requirements for these roles should be.

A key message, and one that will not change from the current pre-registration standards, is the fundamental requirement for partnerships between HEIs and healthcare organisations to provide practice-based learning for the student nurse.

What is new is the introduction of the Five Pillars for education and training:

  1. Learning culture
  2. Educational governance and quality
  3. Student learning and empowerment
  4. Educators and assessors
  5. Curricula and assessment

What seems to be absent from this new consultation and draft document are the prescriptive elements for the education and ongoing continuing professional development of educators and assessors of student nurses in practice. The current requirements for mentorship are set out in the 2008 NMC Standards to Support Learning and Assessment in Practice: Preparation for Mentors, Practice Teachers and Teachers, which have led to the proliferation of credit and non-credit-bearing programmes that prepare the qualified nurse for the role of mentor. Prescriptive annual updates are also required in order to maintain ‘live’ mentorship recognition.

In the current NMC education framework consultation document, new roles are introduced, such as practice supervisor (Pillar 3) and educator and assessor (Pillar 4). When reading the document, is the difference between these roles, and the preparation required for each, clear?

Education Framework

The NMC state:

“Our education framework and the new requirements for learning and assessment provide flexibility for approved education institutions, practice placement and work placed learning providers in developing innovative approaches to education for nurses and midwives while being accountable for the local delivery and management of NMC approved programmes in line with our standards” (NMC 2017:5). Is this statement permission from the NMC for entrepreneurialism, or are they being vague, with no real ideas of their own?

If we take the position that the NMC are offering some flexibility regarding practice-based learning, then the time is right for HEIs and healthcare organisations to work collaboratively and to set their own benchmark for quality teaching and learning. Reconsidering models of student support is imperative. This includes using coaching as opposed to mentoring, and redefining the practice roles required. Be creative and entrepreneurial, adopting service improvement and transformation tools and techniques.

Flexible should not mean reduced quality. Indeed, Health Education England, whose wider remit includes ensuring that there are high-quality learning environments for all healthcare learners in England, makes clear its expectations of what constitutes a quality clinical learning environment.

Within Greater Manchester we are in a strong position, already adopting coaching approaches to support student clinical leadership development. The Greater Manchester Practice Education Group, attended by all four HEIs and healthcare practice organisations, provides the platform for leading innovations in healthcare delivery models. What is clear is the need for academic leadership to deliver any new models of education and to create the culture shift required.

What do you feel are the top three leadership behaviours required to effect change and to change cultures for practice learning?

University of Salford launches first student safety and wellbeing accreditation scheme developed by the Design Against Crime Solution Centre

A new accreditation scheme, launched by the University of Salford and developed by the Design Against Crime Solution Centre with the Head of Security at Salford, will make it easier for prospective students and their parents to identify safe universities in the UK.

All higher education institutions across the UK are now being encouraged to join ProtectED, an accreditation scheme assessing the work done by universities to ensure their students’ safety, security and wellbeing.

They can then work towards accreditation by detailing the services and structures they have in place to enable students to avoid problems and achieve their full potential.

ProtectED is the first accreditation scheme in the UK’s higher education sector to comprehensively consider practices across the areas of student safety, security and wellbeing.

It is founded on the belief that HEIs have a critical role to play in student safety, security and wellbeing — one that does not end at campus boundaries but encompasses the wider student experience.

Professor Helen Marshall, Vice-Chancellor of the University of Salford, said: “An issue which the higher education sector has grappled with for years is that institutions have varied and different ways of considering the safety and wellbeing of their students, without a higher education specific code of practice and benchmark for policies and best practice.

“These are huge issues to students and parents, but up until now there has been no standard way of benchmarking and assessing how effectively universities manage the issue. I really welcome this work developed by our dedicated and internationally-recognised security and community relations team at Salford.”

Through the accreditation process, ProtectED will gain insight into issues and collect evidence on what works. This will be anonymised, aggregated and analysed, and findings shared with members, enabling them to focus resources on effective strategies that provide demonstrable benefits.

ProtectED accreditation focuses on five areas: Core Institutional Safety and Security – covering campus security measures; Wellbeing and Mental Health; International Students; Harassment and Sexual Assault; and the Student Night Out.

There are 2.3 million university students in the UK’s 162 HEIs — more than the population of Qatar. Office for National Statistics figures show full-time students are more at risk than the general population of being victims of crime, while an NUS survey of more than 1,000 students found 78 per cent had experienced mental health issues during the previous year.

ProtectED brings together university staff and students in tackling these issues, and requires HEIs to implement practical measures. For example, ProtectED universities will deliver training and awareness-raising initiatives to highlight the support available to students, and to facilitate conversation around sensitive subjects such as mental ill-health and sexual assault.

Research suggests that international students are particularly concerned about safety in their choice of where to study overseas.

Helen Clews, External Relations Adviser for the British Council and member of the ProtectED Advisory Board, said: “Personal safety in the UK for students, their dependents, visitors and workers coming to the UK is a duty of care the British Council takes very seriously and we work with partners such as ProtectED to help international students take care of themselves and settle happily into their community.”

Student retention is another significant issue. According to the Higher Education Statistics Agency, 26,000 UK students failed to complete their first year in 2010/11. ProtectED is based around the need for effective prevention, early intervention and timely support, raising levels of student satisfaction and enabling more students to complete their studies.

Mark Sutton, chairman of the Association of University Chief Security Officers (AUCSO), said: “The ProtectED code of practice gives a clear opportunity to benchmark processes and procedures that will allow universities to focus on sector best practice, continuous improvement and the student experience. It will raise standards throughout HE and therefore I fully support this excellent initiative.”

Ben Lewis, chairman of the Association of Managers of Student Services in Higher Education (AMOSSHE), said: “ProtectED gives real potential for institutions to think more strategically about how they structure their security and support services, how they work with one another and how they can improve all aspects of the student experience. AMOSSHE is fully supportive of the work being led by ProtectED and the team at Salford University.”

Dave Humphries, Director of Partnerships & Interventions at the Security Industry Authority, said: “As the UK Government’s regulator of private security, we support the ProtectED initiative as it is an innovative way to ensure higher university security standards. We have been pleased to work alongside colleagues at the University of Salford.”

Institutions wanting to join must sign up to the five key ProtectED Principles, committing to adopt them within their policies, structures, processes and culture.

To gain accreditation, applicant institutions must self-assess their own policies, processes and practice against the ProtectED Code of Practice. This is followed by peer review and a verification visit by a ProtectED approved assessor and student assessors.

Membership is open from Monday 6 February 2017, with the first group of ProtectED Accredited Institution award holders expected to be certified in early 2018.

For more information, visit www.Protect-ED.org, follow @ProtectED_HEI or email info@protect-ed.org

ENDS

Notes to Editors:

  1. ProtectED has benefitted from the support and guidance of organisations including the Association of University Chief Security Officers (AUCSO), the British Council, the Security Industry Authority (SIA), the Association of Managers of Student Services in Higher Education (AMOSSHE), the University Mental Health Advisers Network (UMHAN), Greater Manchester Police, student insurers Endsleigh, the International Professional Security Association (IPSA), the National Landlords Association, and College and University Business Officers (CUBO).
  2. The launch of the ProtectED Code of Practice is especially timely given the publication in October 2016 of the Universities UK ‘Changing the Culture’ task force report, which examines violence against women, harassment and hate crime affecting university students. For example, the National Union of Students (NUS) ‘Hidden Marks’ report (2010) found that 68% of female students experienced one or more incidents of sexual harassment at university — a problem that has been increasingly reported upon in recent months. Further, the NUS ‘No Place for Hate’ survey (2012) found that 18% of students from ethnic minority backgrounds described experiencing at least one racial hate incident whilst at university. The Universities UK Task Force report clearly signals that HEIs can no longer continue to ignore these issues. The ProtectED Code of Practice incorporates all of the report’s recommendations and goes further in addressing staff-to-student sexual harassment, hate crime and cyber bullying.
  3. The wide-ranging measures contained in the ProtectED Code of Practice (the indicators universities must meet to achieve accreditation) were developed using an evidence-based approach. To better understand the issues facing contemporary HEIs and their students, the ProtectED team conducted a literature review of the mental health and wellbeing of students and young adults. They also ran focus groups with University Security Managers, Police Higher Education Liaison Officers and Students Union Sabbatical Officers, and surveyed 800 university NUS students.
  4. Eric Baskind, senior lecturer in law and consultant in violence reduction at Liverpool John Moores University, and a member of the ProtectED Advisory Board, said: “ProtectED provides institutions with an excellent tool for implementing best practice procedures and improving campus safety and thereby enhancing the student experience. It is an excellent initiative and has my full support.”
  5. It is proposed that the ProtectED accreditation scheme will eventually be expanded to cover UK further education (FE) colleges, as well as universities in other parts of Europe.

For press enquiries please contact: Conrad Astley, Senior Press and PR Officer, University of Salford at c.l.astley@salford.ac.uk  / +44 (0) 161 2956363


The Impact Environment: REFlections on the Stern Review (Part 1)

By Dr Chris Hewson, Impact Coordinator

Upon its release last Thursday, the Twittersphere became the locus for a series of overlapping debates on Lord Nicholas Stern’s Review of the REF (see: #SternReview) [i]. This was heartening, chiming with my previous post ‘We need to talk about research impact (again)’ on the need for ‘robust discussions’; a refrain I will seek to expand upon in future pieces [ii]. The report presents a considered and balanced perspective, seeking to develop the well-regarded aspects of REF2014, whilst addressing the three blights of disciplinary siloing, resource burden, and permissible yet unprincipled ‘gaming’. In what follows, I consider the wider canvas upon which the report paints, interspersing this with observations on the structure of the proposed REF2021 ‘impact environment’. In a follow-up post, I’ll build on these points, considering how the report seeks to reconfigure impact case study submission, and how this may have knock-on effects with respect to how Higher Education Institutions (HEIs) manage, promote, and report knowledge exchange.

Full submissions and portability

The beating heart of the public debate was Recommendation 3 “Outputs should not be portable [rather, tied to] the institution where the output was demonstrably generated” (para. 73); thus bringing outputs into line with existing rules on HEIs claiming impact. A clear worry was rapidly identified: that this would remove one obvious source of leverage still available to early career researchers in a tight and often volatile job market. This concern was tied to a series of parallel discussions around Recommendation 1 “All research active staff should be returned” (para. 65) and the HR manoeuvrings this could unleash in deciding who counts as ‘research active’ (and indeed what counts as ‘research activity’). This has the potential to usher in new modes of marginalisation, and as Richard Watermeyer suggests this “profound problem… could end up diminishing what universities recognise as the role and contribution of the researcher: primarily, the successful procurement of research funds and prominence.” Nevertheless, it is likely that the devil will be in the detail, with the work of high-performing academics – who may be able to submit as many as six outputs – conceivably mitigating the risk of including a ‘long tail’ of less prolific researchers [iii]. Much will rest on how individual HEIs establish and interpret ‘who should be doing what’ within given research units, however configured. I will not dwell on these points, except to note that they have been thoughtfully covered by others, including Athene Donald, Martin Eve, Adam Golberg and Liz Morrish [iv].

Enhancing impact

Mark Reed has provided a sympathetic summary of Stern’s impact-focused recommendations, arguing that the report successfully addresses a perceived narrowing of impact that occurred in the post-REF2014 period. To those working in an impact support role this ossification was always largely a matter of perception. Nevertheless, as a contribution towards a concerted ‘push-back’, Stern is of considerable assistance. As Reed notes, “it was not HEFCE who constrained the definition of impact; it was the way that the academy interpreted HEFCE’s very broad definition in strategic and instrumentalist terms.” It is probable that the way the REF was/is managed within HEIs adds to this hive of semantic circumspection. One would therefore hope that whilst Stern maintains that “all panels should have the same broad approach to impact” (para. 82), in the second iteration of the REF the individual sub-panels – the “number and shape” of which Stern argues “was about right” (para. 63) – are allowed freer rein to define what impact is ‘for them’. The indications, which I will cover in my next post, are that a reassertion of the value of impacts on cultural life, via public engagement, and through pedagogy will bolster this fresh optimism. One is also left with an irony, in that more prescriptive guidance could ‘lock in’ a broader overall interpretation of how research impact can and should be presented.

Unifying templates

Perhaps the area of most considerable interest is Stern’s recognition of the oft-touted suggestion to merge the environment and impact templates. In one move this increases the value of impact without changing the 65%-20%-15% structure of the REF (impact case studies now worth 20% of an HEI submission, rather than 16%, with the combined templates remaining at 15%). This sits alongside proposals to introduce institutional as well as Unit of Assessment (UoA) Environment templates, removing a previously unavoidable layer of repetition (and institutional boilerplate), and establishing a means of assessing “steps taken to promote interdisciplinary and other joint working internally and externally and to support engagement and impact, beyond that which is just the aggregate of individual units of assessment” (para. 88). The proposals to allow the (tick-box) identification of interdisciplinary outputs, and to document the role of ‘interdisciplinary champions’ (para. 100) – whilst probably not assuaging critics such as Derek Sayer – are also clear moves in the right direction. Besides, this latter suggestion will likely provide a welcome springboard for public engagement champions, and those in similar, often semi-official HEI roles.

The writing was on the wall for the impact template the moment the HEFCE-commissioned RAND review detailed respondents who “spoke of the ‘fairy tale’-like nature of the impact templates, which they felt to be ‘a cosmetic exercise’… Whilst the information the template provided was good to have, there was no way of verifying claims that were made without having a site visit, and there was no confidence that the impact template reflected reality” [v]. As an impact manager, one might also view this tweak as a barely concealed attempt to make the REF ‘UKRI ready’, actively fulfilling the need expressed by RCUK to move knowledge exchange out of the periphery, “embedding throughout the research base a culture in which excellent research departments consistently engage with business, the public sector and civil society organisations, and are committed to carrying new ideas through to beneficial outcomes”. On a practical level, outlining a UoA’s research and knowledge exchange trajectory within a unified document is also manifestly more straightforward to execute. As Stern notes: “impact and environment should be seen in a more integrated way and at a more institutional level… becom[ing] more strategic and forward looking whilst retaining a strong evidence base in past performance” (para. 126). Come the forthcoming consultation, a key point of contestation will likely be around the credit split between institutional and UoA components, the report only hinting at the discussions to come: “a share of QR funding should be awarded to the institution based on its Institutional Environment statement and the institutional-level impact case studies which it submits. This innovation will require careful testing and we recommend that the funding bodies explore options for piloting the institutional level assessment to test this proposal” (para. 91).

Looking back, looking forward

An allied issue will be how, without increasing the burden [vi], the new template structure can adequately assess “the future research and knowledge exchange strategy of the HEI, as well as the individual Units of Assessment, and the extent to which both have delivered on the strategies set out in the previous REF” (para. 88). In seeking to collapse institutional boundaries, via a two-tier submission, opportunities open up for a more rounded expression of an HEI’s medium- to long-term aims and objectives. However, at the same time this affords an opportunity to senior managers, who may seek to actively dismantle and distance themselves from existing institutional plans, as the refreshed REF itself becomes a strategic determinant over the intervening four-year period. In a surprising – but not unwelcome – foray into detail, the report goes as far as proposing fourteen ‘headers’ for the Institutional and UoA Environment statements, including brief mention of ‘progress’ and ‘strategic plans’ previously outlined (para. 94). This could be beefed up, asking HEIs to reflect in more direct and tangible terms on the existing – and one might add publicly available – ‘research strategy’ and ‘(impact) strategy and plans’ contained within REF2014 submissions [vii].

Data sharing

How data is shared across the research system feeds into the strategic benchmarking implied above. Stern recognises this as a deep-seated challenge, advising that “attention will have to be paid to the quality and comparability of databases… an issue which applies for the sector as a whole and the new UKRI” (para. 126). In the current system the trail from source funding, via published research, towards consolidation within a REF impact case is somewhat haphazard. A recent report from the BBSRC exemplifies this predicament, citing “a disparity between the institutional distribution of BBSRC research funding and the distribution of [REF] case study acknowledgments… a result of local expectations and practice, driven by researchers and research managers drafting case studies” [viii]. Put simply, for the purposes of accountability different parts of the research system require different forms of output (evidence), each founded upon different forms, or configurations, of input (resource). In untangling this, and generating a solution that works for all parties including central government, the report is vague, noting only that HESA will be consulted with respect to the number of outputs required per UoA (para. 70) [ix]. However, the general thrust of the report clearly fits with a stated desire that UKRI and its board should use the REF as a strategic catalyst, a debate re-hashed from the years preceding REF2014, where harried HEI strategy departments wondered aloud what exactly David Willetts was going to ‘do’ with c.7,000 impact case studies (answer: not a lot).
As Ant Bagshaw, writing for Wonkhe, maintains, this premise “is rather cheeky both in its tasking of UKRI to be more ‘imaginative’ – the body doesn’t exist yet… also it’s a clear request for cash [from]… researchers with the authorial paws on the document… In the Brexit context… that request is all the more important.” Whatever the solution, for good or ill, more effective data-driven benchmarking could be utilised as a means to ‘efficiently’ exclude some HEIs from funding schemes under the UKRI umbrella (as per the DTC model). This is a point one can be sure the established HEI associations will foreground within written submissions to the forthcoming consultation.

A conclusion, and a beginning

The report was notable in that, akin to an obverse King Canute, Stern was able to drive back the metric tide purportedly backed within some governmental circles. In line with the HEFCE-sponsored report led by James Wilsdon, Stern calls for a responsible approach: “Panels should set out explicitly how they have used bibliometric data in their working methods” (para. 76). It might be for consultation respondents to press on how panels should set out, in advance, how they will use research metrics. As Wilsdon notes, in an excellent summary of the report’s key judgements, Stern’s approach “maintaining the primacy of peer review, using carefully-selected metrics in the environment section of the REF, and improving data infrastructure and interoperability – is completely in line with the findings in The Metric Tide. And a new Forum for Responsible Metrics, involving all the major research funders, will take forward the detailed work needed to get this system up and running for REF 2021.” Any pre-review of the use of metrics would be a key task for the forum, not least as the door has been left ajar for some limited sampling if “subject panels are able to make the case, explicitly supported with reference to robust evidence, that bibliometric data could be used to reduce the workload” (para. 71).

Overall, one is left with a sense that REF2021 will see a greater percentage of research, researchers, and research outcomes submitted within discrete and focussed – rather than strategically engineered – returns; all supported beneath an interdisciplinary superstructure [x]. It is indicated that “by the end of the year a formal consultation should be issued so that the community can offer their views on the proposed process and the future REF formula. The decisions arising from this consultation should be published in the summer of 2017” (para. 117). One imagines that after a brief flurry of commentary, a significant amount of covert legwork will take place, as stakeholders position themselves for the horse-trading that will follow.

Notes

[i] To give the report its full title “Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.”

[ii] Alas, my call for a blanket ban on the word ‘impactful’ fell on deaf ears – albeit one solitary appearance, in the call “to make the UK research base even more productive and impactful” (para. 59).

[iii] Whether researchers can be submitted with zero outputs remains a point for discussion. It is projected that UoAs will be required to submit “two outputs on average per submitted full-time equivalent (FTE) individual” (para. 67).

[iv] With every indication that the ‘Stern’ puns will not be going away at any time soon.

[v] Manville et al (March 2015) Evaluating the Assessment Process for the Impact Element of Research Excellence Framework 2014, RAND Europe (p. 38)

[vi] The word ‘burden’ appears 29 times, possibly the most instances per page of any published work since ‘The Pilgrim’s Progress’.

[vii] Sections within the REF2014 environment template (section b) and impact template (section c), respectively.

[viii] Digital Science (July 2015) REF 2014 Impact Case Studies and the BBSRC (p. 2)

[ix] Whether this means UKRI will seek to replace Researchfish will presumably be up for discussion.

[x] Marxist wordplay, partly intentional.