
The ABE (‘Always be Evaluating’) of OD & D


By Catherine Morris of the Department for Communities and Local Government, who recently completed Mayvin’s Advancing Your Practice programme with our client, the UK Civil Service OD & D Service.

The programme is aimed at experienced Organisational Development and Design (OD&D) and change practitioners working in business- or client-facing roles and dealing with complex change.

Throughout the programme, practitioners learn about OD&D in the service of turning real, current and urgent business needs – across the Civil Service (CS) and within departments – into OD&D challenges, and taking steps individually and collectively to make a difference in relation to those needs.

“Can you suggest some Organisational Design and Development (OD&D) measures for an HR dashboard?” was the question I was asked recently. Of course, I thought – why not? As a communications specialist, I was already familiar with the differences between outputs, outtakes, outcomes – and organisational impact – because the Government Communications Service (GCS) had recently introduced a new evaluation framework for communication campaigns. Surely it was just a case of translation and application – or was it?

‘You get what you measure’

Dashboards are a familiar sight in most organisations. They are succinct, visually appealing ways of displaying often quite complex data, allowing decision-makers to assess progress against organisational aims. ‘You get what you measure’ is an oft-quoted phrase – and we are culturally conditioned by a system that rewards metrics and hard measures. Our managers have been led to expect certainty, predictability and control, and even our language is shaped by industrial terms (‘cogs in a wheel’, ‘shift the dial’, ‘pull the levers’…).

Cause and effect

However, the tangible things that organisations seek – such as performance improvement and profitability – are often driven by intangible, subjective factors such as motivation, engagement, leadership and culture (see the Burke & Litwin (1992) model). Neuroscience confirms what we’ve probably known all along – that it’s generally our heart (our emotions) rather than our head (our logic) that actually drives us to change our behaviours and take action. Taking this a step further, if we recognise that the outcomes of any OD&D intervention depend on the thoughts, feelings and actions of people, emerging through constant interplay, how can we be certain that we have really influenced or caused a particular change?

 

Transactional vs transformational

We know the world is not about simple cause-and-effect chains – but dashboards still predominate. And, to be fair, they are definitely useful for tracking some of the more transactional measures, such as sales figures, complaint reduction and the time it takes to get ideas to market. But we need to navigate carefully if we want to truly assess the transformational value of our OD&D work. Having recently completed the Mayvin ‘Advancing Your Practice’ programme, I’ve outlined some pointers that helped me on my ‘evaluation’ journey.

1. Always be evaluating – a Mayvin maxim, and an essential one: the world is always changing, so whatever you intended to measure at the beginning of a project will most likely have changed or moved on.

2. Engage all stakeholders – work with those involved across the whole system to develop collaborative interventions that focus on outcomes and impact; get senior people (or their representatives) involved in helping to track and measure the indicators that they feel will make the difference in achieving their objectives.

3. Have an agreed understanding of OD&D. How do your senior team understand OD&D? Their perceptions will influence their expectations of your work and what it can achieve. We need to confidently express the value and ambition of OD&D as supporting organisational health and well-being and helping align humanistic values to wider business goals.

4. Recognise evaluation is a political process – there are ‘multiple truths’ and you need to understand who is asking for the evaluation and what they are looking for.

5. Be clear on your definition(s) of success – OD&D is about systemic organisational improvement and, given the difficulties of separating correlation from causation, it makes sense to ask everyone involved what success looks like to them and work towards a consensus.

6. Check alignment of interventions – ask those commissioning work how their projects are connected or aligned with other work – to encourage a holistic approach to organisational development.

7. Focus on the transformational factors as well as transactional measures. It’s easy to focus on the hard, crunchy numbers (e.g. productivity measures) rather than the more intangible stuff (for example, clarity of mission, strategy, leadership and culture), but it’s these elements that can really transform organisations.

8. Clarify what you are measuring – is it about needs (what issue should the intervention target, to inform its design and planning?); processes (how is the intervention being implemented?); outcomes (what effect or impact is the intervention or programme having?); or efficiency (what are the costs and benefits)?

9. Use different measures for ‘hard’ and ‘soft’ data – recognise that information comes in different forms and avoid committing ‘bad science’ (Ben Goldacre) or data misrepresentation.

10. Question if and when evaluation is necessary – based on usefulness (is the information obtained going to be useful to key stakeholder groups?); feasibility (can the evaluation be carried out in a practical/cost-effective manner?); ethicality (will evaluation be conducted fairly?); technical adequacy (do you have the technical support to ensure appropriate and proportionate evaluation?). Sometimes it’s just not worth doing…

 

Catherine's perspective on the impact of the Mayvin UK Civil Service programme:

I’ve particularly enjoyed the opportunity for reflective learning with an active community of OD&D practitioners, and working as a consultant in another organisation to complete the assignment. As regards my practice, I am far less prone to ‘solutionising’ and have become more comfortable working in ambiguity, open to the messiness of ongoing exploration and socially generated ideas and solutions. Whilst the tidiness of ‘hard’ models of change will always prove attractive, the reality of managing change in complex systems that are constantly adapting means re-thinking the ways we currently measure and evaluate our success.

Find out more about Mayvin's work with the UK Civil Service.
