Value & Impact toolkit first steps: value

As an introduction to evaluation, this section provides essential first steps for evaluation planning. It has been adapted from Schuh and Upcraft’s ‘framework for evaluation’ (2001) and developed further through the experiences of the universities involved in the pilot phase of the Value & Impact project. It uses questions to probe the purpose, type, scope, setting and methods of an evaluation project, and defines the steps to follow in planning and implementing an evaluation.

This page about value covers the background to value for money; its three elements (economy, efficiency and effectiveness); the development of value indicators; and suggested methods for reaching an overall value for money judgement.

Value diagram

Background

A SUMS Consulting report (2009) notes that ‘(t)he evaluation of the performance of student support services, and demonstration of their value and contribution to the wider university, is becoming increasingly important across the sector.’ Performance management for student support services relies on a robust strategy with defined objectives, which acts as a precursor to developing key performance (or value) indicators for the service as a whole or for specific service areas. Once a performance management system is in place, the outputs it produces can be used to demonstrate the value of the services across the university and to all stakeholders, and to confirm their fit with the university’s strategic aims and objectives.

This section describes what value for money is and how it can be demonstrated, and suggests approaches to undertaking value for money evaluations. Value for money, however, is more than just undertaking an evaluation – it also depends on acting on the findings of that evaluation. Achieving value for money requires an attitude and culture that seek continuous improvement and integrate value for money principles into management practices.

Demonstrating value for money, however, is not a straightforward process: there are no ‘off-the-shelf’ methods readily available. Value for money is context-driven, and the concepts and processes described here should therefore be adapted to the context within which they are to be used. Examples are provided to demonstrate how these concepts and processes can be used and adapted.

Evaluating and measuring value for money is a challenge. Some elements, such as quality and sustainability, are more subjective and more difficult to measure than others. ‘Value’ can take years to materialise, and what represents value for money at one point in time may not do so a year later. Value for money is also specific to context: a key component in evaluating effectiveness is relevance to, alignment with, and impact on, the strategic objectives of the organisation, so what is value for money for one university or service may not be the same for another. A strong element of good, informed judgement is therefore required when considering whether value for money has been satisfactorily achieved and how it might be improved, and ongoing monitoring is needed to ensure ‘currency’.

Economy

Economy is the consideration of cost. It is a measure of what goes into providing a service, such as the cost per hour of a demand-driven service, or the rent per square metre of accommodation. The whole-life costs of inputs, such as the direct and indirect costs of acquiring, running and disposing of assets or resources, should be considered.

Efficiency

Efficiency is making the most of the money that is spent on a service. It is a measure of productivity: in other words, how much you get out in relation to what is put in. This might include the number of students seen per counsellor, either individually or through group work, or the contribution the counselling service makes to the student learning experience and the university’s corporate strategy.

Effectiveness

Effectiveness is making sure that services meet the needs of students. It is about understanding the intended outcomes of services and activities and developing impact or value indicators to determine whether those outcomes have been met. Effectiveness is therefore a measure of the impact that has been achieved, and is essential to demonstrating the achievement of value for money: an activity may be efficient and economic, but it does not represent value for money unless it is effective in delivering its intended objectives. Examples might be an increase in student retention as a result of specially targeted support initiatives, or an increase in the number of students seeking financial advice or support as a result of improved marketing and publicity of a bursary scheme. Effectiveness measures can be either quantitative (a reduction in the number of students not completing their courses) or qualitative (an increase in student satisfaction with support services).
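
As a purely illustrative sketch (the service, figures and measures below are hypothetical and not drawn from the toolkit), simple indicators for each of the three elements might be calculated as follows:

    # Illustrative only: hypothetical figures for a counselling service, showing
    # one possible measure for each element of value for money.

    # Economy: what goes into providing the service
    annual_cost = 180_000            # direct and indirect costs (GBP), hypothetical
    appointments_delivered = 4_500
    cost_per_appointment = annual_cost / appointments_delivered

    # Efficiency: how much comes out in relation to what is put in
    counsellors_fte = 5.0
    students_seen = 1_200
    students_per_counsellor = students_seen / counsellors_fte

    # Effectiveness: whether the intended outcome has been achieved, for example
    # a change in retention among students who used the service
    retention_before = 0.88
    retention_after = 0.92
    retention_change = retention_after - retention_before

    print(f"Economy: cost per appointment = £{cost_per_appointment:.2f}")
    print(f"Efficiency: students seen per FTE counsellor = {students_per_counsellor:.0f}")
    print(f"Effectiveness: change in retention rate = {retention_change:+.1%}")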

A word of warning

Some indicators will be of limited benefit depending on the size of the departmental unit. For example, cost relative to total cost of running the university may be less helpful in small units.

Indicators

Performance management for student support services relies on a robust strategy with defined objectives, which will act as a precursor to developing value indicators for the service as a whole, individual service areas or specific activities.

A widely accepted method of creating and disseminating strategic objectives is the Balanced Scorecard approach (Kaplan and Norton, 1992). The approach emphasises that evaluating value for money is more than just a financial calculation – it is a holistic evaluation of value, hence the balanced approach. The Balanced Scorecard uses four perspectives through which strategy can be viewed and from which objectives can be created. These are:

  • Customer perspective (C)
  • Internal business processes (B)
  • Learning and growth (LG)
  • Financial (F)

It also provides a mechanism through which performance can be tracked holistically using value indicators supported by both quantitative and qualitative data. Once these indicators have been identified and the data collected, analysed and interpreted, a value for money judgement can be reached.

In developing value indicators, it is essential to define the scope of the support services function and to identify the core questions, reflecting the requirements of a modern, value for money support services function, that the indicators are intended to help explore. These core questions need to link to the overall student services strategy and be agreed before any indicators are developed to test whether or not they are being met. Core questions linked to the four perspectives of the Balanced Scorecard might include the following (an illustrative sketch of how indicators could be recorded against such questions follows the list):

  • Are student users satisfied with the services provided? (C)
  • Is the support services function cost-effective? (B/F)
  • Are support services processes operated in an efficient and timely manner? (B/C)
  • Is the support services function helping to effectively promote and manage students’ transition into and through their time at university? (B/LG)
  • Is the support services function helping the university to effectively support students’ academic and personal development and performance? (B/LG)
  • Does the support services function help to ensure the university appropriately identifies and supports those students with specific needs? (B)
  • Is the support services function proactively planning for future resource needs and taking appropriate action to address gaps? (F/LG)
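
As a purely illustrative sketch (the indicators, data sources and structure below are invented for this example rather than prescribed by the toolkit), each core question could be linked to one or more value indicators, tagged with the Balanced Scorecard perspective(s) it reflects:

    # Illustrative only: hypothetical value indicators, each linked to a core
    # question and tagged with the Balanced Scorecard perspective(s) it reflects.
    indicators = [
        {
            "core_question": "Are student users satisfied with the services provided?",
            "perspectives": ["C"],
            "indicator": "Percentage of users rating the service good or very good",
            "data_source": "Annual user feedback survey",
        },
        {
            "core_question": "Are support services processes operated in an efficient and timely manner?",
            "perspectives": ["B", "C"],
            "indicator": "Average working days from referral to first appointment",
            "data_source": "Case management system",
        },
        {
            "core_question": "Is the support services function cost-effective?",
            "perspectives": ["B", "F"],
            "indicator": "Cost per student supported, compared with a sector benchmark",
            "data_source": "Annual budget and benchmarking data",
        },
    ]

    for item in indicators:
        tags = "/".join(item["perspectives"])
        print(f"({tags}) {item['indicator']} - source: {item['data_source']}")

Recording indicators in this way makes it easier to check that evidence is being gathered against every perspective, not just the financial one, and that both quantitative and qualitative data sources have been identified.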

Evaluating value for money

Suggested methods for reaching an overall value for money judgement

The following methods can be used as possible options for reaching an overall judgement:

  1. Carry out a self-evaluation based on the checklist (including the key value indicators identified), scoring each element on a scale from -5 to +5 depending on performance. Total the scores to arrive at an overall judgement (a worked illustration follows this list).
  2. Carry out a self-evaluation but use ‘traffic light indicators’: red (no evidence of achievement), green (evidence of achievement), and amber (partial evidence of achievement) depending on performance. Arrive at an overall traffic light based on the whole.
  3. Carry out a self-evaluation giving an outcome of ‘met’, ‘not met’ or ‘point of excellence’ depending on performance. (This scale is based on the British Council Evaluation of English Language Teaching.)
  4. Self-evaluate using low / medium / high confidence ratings and low / medium / high importance recommendations, based on a financial audit model.
  5. Self-evaluate and list areas for action. Compare the number and significance of action points with the overall number of elements / indicators / standards in the checklist. The following action points might be identified: more systematised record keeping; refresh overall service purpose and mission; introduce overall user satisfaction rating within feedback process.

The above suggestions have been developed and provided by Student Services at the University of Sheffield.
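
To illustrate methods 1 and 2 above (the indicators, scores and traffic light thresholds below are hypothetical, chosen for this example rather than taken from the toolkit), scores on the -5 to +5 scale can be totalled to reach an overall judgement and then translated into traffic lights:

    # Illustrative only: hypothetical self-evaluation scores on a -5 to +5 scale
    # (method 1), with a simple translation into traffic lights (method 2).
    scores = {
        "Student satisfaction with support services": 4,
        "Cost-effectiveness of the support services function": 1,
        "Timeliness of support services processes": -2,
        "Support for students with specific needs": 3,
    }

    def traffic_light(score: int) -> str:
        """Map a score onto a traffic light; the thresholds used here are arbitrary."""
        if score >= 3:
            return "green"   # evidence of achievement
        if score >= 0:
            return "amber"   # partial evidence of achievement
        return "red"         # no evidence of achievement

    for indicator, score in scores.items():
        print(f"{indicator}: {score:+d} ({traffic_light(score)})")

    total = sum(scores.values())
    print(f"Overall judgement (method 1): {total:+d} out of a possible +{5 * len(scores)}")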

Pilot comments

"Although we knew we wanted to concentrate on each of the areas we had selected, it was a bit disconcerting to think through how we would achieve it. What helped with this was meeting with the pilot researcher and discussing this, and then using the toolkit's impact study template to iron out the objective(s), outputs, intended outcomes, impact indicators and source of data."

University of Brighton

"The necessity of managing expectations regarding scope and scale, and marshalling appropriate resources is a key learning point."

University of Surrey
