Measuring success

While much of the team’s planning and development at this stage is to do with getting the portal up and running, we can’t forget the end-game. We must be able to evaluate our success – to see how much impact a digital outreach portal can have – in order to improve it over time.

With so many factors to take into account, evaluation for this project is tricky. As a team, we need to measure the technical performance of the site, including how easy the interface and navigation are to use, not to mention the quality of the content. At the same time, we need to capture measurements that help us understand whether the site is benefiting users: whether it's positively impacting their attitude to learning and helping to foster an inquisitive mindset. On top of these elements, we also need to collect data to establish who is using the site and whether they're part of our core target audience. You can see it's complicated!

To help us keep track, we've produced an evaluation grid giving an overview of all the data we need to capture and what it will be used for, cross-referencing qualitative and quantitative data for quality assurance and impact assessment. The data to be collected ranges from conventional digital analytics, such as site dwell time and the number of pages visited, to user-specific details such as postcode and feedback from user testing.
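To give a flavour of what the grid captures, here's a minimal sketch in Python of how one row might be modelled. The metric names, sources and categories below are our own illustration for this post, not the finished grid:

```python
from dataclasses import dataclass
from enum import Enum

class DataType(Enum):
    QUANTITATIVE = "quantitative"
    QUALITATIVE = "qualitative"

class Purpose(Enum):
    QUALITY_ASSURANCE = "quality assurance"
    IMPACT_ASSESSMENT = "impact assessment"

@dataclass
class GridEntry:
    """One row of the evaluation grid: a metric, where it comes from, and why we collect it."""
    metric: str
    source: str
    data_type: DataType
    purpose: Purpose

# Illustrative rows only -- the real grid covers far more measures.
grid = [
    GridEntry("Site dwell time", "Google Analytics", DataType.QUANTITATIVE, Purpose.QUALITY_ASSURANCE),
    GridEntry("Pages visited per session", "Google Analytics", DataType.QUANTITATIVE, Purpose.QUALITY_ASSURANCE),
    GridEntry("Postcode", "User registration", DataType.QUANTITATIVE, Purpose.IMPACT_ASSESSMENT),
    GridEntry("User-testing feedback", "Testing sessions", DataType.QUALITATIVE, Purpose.IMPACT_ASSESSMENT),
]
```

Tagging each metric as qualitative or quantitative, and mapping it to quality assurance or impact assessment, is exactly the cross-referencing the grid exists to do.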

Google Analytics will help us capture data around site performance, but we'll be trialling the University of Bath's NERUPI (Network for Evaluating and Researching University Participation Interventions) framework to establish the impact of the site on end users. The NERUPI approach is being trialled across all University outreach activity from September this year. For our project in particular, the framework will need to be flexed and adapted, as it's usually used to measure the effectiveness of more traditional face-to-face outreach interventions, such as school liaison. Our challenge is to create metrics for the portal that are equivalent to one traditional intervention. At the moment, the hypothesis is that 20 hours of site use should be equal to one significant face-to-face intervention.
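To make the 20-hour hypothesis concrete, here's a rough Python sketch of how tracked site use might be converted into "intervention equivalents". The CSV layout (user_id, session_hours columns) is a hypothetical stand-in for whatever export we end up pulling from Google Analytics:

```python
import csv
from collections import defaultdict

# Working hypothesis: 20 hours of site use ~ one face-to-face intervention.
HOURS_PER_INTERVENTION = 20

def intervention_equivalents(csv_path: str) -> dict[str, float]:
    """Sum each user's session hours and convert the total to equivalent interventions.

    Expects a CSV with 'user_id' and 'session_hours' columns -- an assumed
    format, not an actual Google Analytics export.
    """
    hours = defaultdict(float)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            hours[row["user_id"]] += float(row["session_hours"])
    return {user: total / HOURS_PER_INTERVENTION for user, total in hours.items()}
```

On this model, a student with 20 logged hours counts as having received one significant intervention; the threshold is just a constant we can revise as the trial tells us more.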

[Cartoon: "What kind of evaluation do you need" by Chris Lysy of Fresh Spectrum. The person on the right will not be us!]

A clear data collection schedule, broken down by project phase, is in progress. This should (we hope!) help keep us on the ball with so much information to review.
