
Looking back (and forward)

Danielle Lloyd has been with the Oxplore team on a five-month Ambitious Futures placement. As she leaves for her next challenge, she reflects on working with Oxplore and across Widening Access & Participation at Oxford.

My placement with the Oxplore team is (sadly) coming to an end. It’s been an exciting time to be involved in such an innovative and fast-moving project, and I’ve had the opportunity to learn lots about widening access and participation along the way. I’ve collected a few examples of practices I think are important in outreach work…

Tracking and evaluation

Within the Widening Access and Participation team and across the collegiate university, there is an abundance of excellent outreach work taking place. Evaluating these activities has a range of benefits, including informing future outreach practice, providing evidence for continual investment in outreach, ensuring that activities are engaging and meeting the needs of their target audience, and sharing best practice (both within and outside of the institution).

The breadth and variety of practice at Oxford provide opportunities to track and evaluate many types of outreach, but this can also present challenges. How do we evaluate consistently across the university in a way that is effective and time-efficient? A new evaluation framework seeks to address some of these challenges by offering a flexible approach (including suggested survey questions and evaluation formats) that can be used by all outreach practitioners. This will also integrate with HEAT (the Higher Education Access Tracker), which is a great tool for joined-up evaluation, not just within the university but across all partner institutions.

My experience so far is that whilst evaluation can be challenging and time-consuming, its benefits for effective outreach outweigh the costs.

Collaborative working

Following on from the idea of sharing best practice through evaluation, I have also experienced the importance of sharing resources, knowledge and experience in outreach work. For example, the Oxplore team has been creating learning materials (e.g. engaging workshop plans, colourful flyers and lots of branded goodies) to share with college and departmental outreach officers. Working with the wider outreach community in this way gives us an avenue to share Oxplore with a wide range of young people, but it also gives outreach officers a new way to share academic research through an engaging Big Questions workshop.

Within Undergraduate Admissions and Outreach, a recent move to a bigger office, where the majority of teams now sit together, increases the potential for collaborative working between the outreach, recruitment and communications teams. It can be small things, like sharing a list of annual awareness days for social media marketing, but also bigger things, like sending thousands of flyers to schools and UCAS fairs across the country!


Student involvement

During my time at Oxford, every outreach project I have worked on has included some kind of involvement from student ambassadors, which has a hugely important impact. Students can offer a perspective on Oxford that many staff can’t, and are much more likely to be someone that young people can relate to. At the UNIQ Summer Schools, the Lauriston Lights camp and our own launch day, I saw the ambassadors build a rapport with the participants which engaged and welcomed them in a situation that had the potential to be very intimidating.

The Oxplore team with student ambassadors Amy, Alastair, Serena and Rebecca

This is just a small sample of the lessons I’ve learnt with Oxplore, and across Widening Access and Participation. I intend to take it all with me to my next role in an FE college (and beyond!).


Measuring success

While much of the team’s planning and development at this stage is to do with getting the portal up and running, we can’t forget the end-game. We must be able to evaluate our success – to see how much impact a digital outreach portal can have – in order to improve it over time.

With so many factors to take into account, evaluation for this project is tricky. As a team, we need to measure the technical performance of the site, including how easy the interface and navigation are to use, not to mention the quality of the content. At the same time, we need to capture measurements that help us understand whether the site is benefiting users: whether it’s positively impacting their attitude to learning and helping foster an inquisitive mindset. On top of these elements, we also need to collect data to establish who is using the site and whether they’re part of our core target audience. You can see it’s complicated!

To help us keep track, we’ve produced an evaluation grid to provide an overview of all the data we need to capture and how it will be used: cross-referencing qualitative and quantitative data for quality assurance and impact assessment. The data to be collected ranges from conventional digital analytics, such as site dwell time and the number of pages visited, to user-specific details such as postcode and feedback from user testing.

Google Analytics will help us capture data around site performance, but we’ll be trialling the University of Bath’s NERUPI framework (Network for Evaluating and Researching University Participation Interventions) to establish the impact of the site on end users. The NERUPI approach is being trialled across all University outreach activity from September this year. For our project in particular, the framework will need to be flexed and adapted, as it is usually used to measure the effectiveness of more traditional face-to-face outreach interventions, such as school liaison. Our challenge is to create metrics for the portal that are equivalent to one traditional intervention. At the moment, the hypothesis is that 20 hours of site use should be equal to one significant face-to-face intervention.
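To make the working hypothesis concrete, here is a minimal sketch of how dwell-time data might be converted into “intervention equivalents”. Everything here is illustrative: the function name, the sample figures and the idea of a per-user export are assumptions for the example, not part of our actual evaluation pipeline, and the 20-hour figure is only our current hypothesis.

```python
# Working hypothesis: ~20 hours of site use is equivalent to one
# significant face-to-face intervention. This constant is a placeholder
# and would be revised as the evaluation matures.
HOURS_PER_INTERVENTION = 20


def intervention_equivalents(dwell_hours: float) -> float:
    """Map cumulative site dwell time (hours) to equivalent interventions."""
    return dwell_hours / HOURS_PER_INTERVENTION


# Hypothetical per-user dwell times, e.g. exported from analytics
dwell = {"user_a": 5.0, "user_b": 24.0, "user_c": 41.5}

for user, hours in dwell.items():
    print(f"{user}: {intervention_equivalents(hours):.2f} equivalent interventions")
```

The point of a sketch like this is simply that the conversion is a single, transparent ratio, so the hypothesis can be swapped out easily if the trial suggests a different figure.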

The person on the right will not be us! — “What kind of evaluation do you need” by Chris Lysy of Fresh Spectrum

A clear data collection schedule, broken down by project phase, is in progress. This should (we hope!) help us stay on the ball with so much information to review.