There is a problem with this blog that stems from its source, a Savings Learning Lab document, ‘Human Centred Design in International Development - A review of what works and what doesn’t’ by Rathi Mani-Kandt (the author of this blog) and James Robinson, 2021.
The central issue is that the shortfalls that Rathi and James report aren’t problems with the HCD approach at all. Instead they are problems with the planning and execution of the projects they have chosen to report. So the problem lies with the practitioners of the projects and their leadership, not with the process of HCD. The blog isn’t a review of what works and what doesn’t – it’s a review of ‘what’s worked and what hasn’t’. Quite different. The blog and the underlying report are specific and anecdotal, not general and conclusive - and I think it’s important to recognise this.
So let me pull out two or three points from the blog:
(i) Rathi says HCD is a process driven by empathy (so does the report), but it’s not: it’s a process driven by evidence - and that’s a massively important point of difference. HCD is a live, changing, practitioner-led discipline, but it hasn’t let go of its core commitment to evidence. HCD is not some magical, inductive, creative process.
HCD uses qualitative research methods that place the human at the centre of the analysis, drawing on observational and other techniques from disciplines such as anthropology and psychology (amongst others), and from traditions of participatory design, to gain insight into people’s authentic needs and create design solutions to meet them.
Empathy has a role, but only a role. The integrity of the process does not rely on the researcher empathising with the people they are researching; it relies on using observation and discussion to gain insight into needs and experiences. This moves the emphasis considerably – and if the culture is not yours, you’d better be sure to have locals on the team.
I’ve been reflecting on how a confusion between evidence and empathy arose, and I wonder if it doesn’t lie with the work of one or two of the big-brand US design agencies who have been active in the development space. They may not reflect best practice in HCD, or to be more generous, they may just reflect their own ‘in-house’ forms of HCD.
(ii) So back to the blog. Rathi says HCD is ‘especially effective when addressing a narrow or focused problem with a clear path to change’. Well, that’s true in part. HCD can effectively tackle these issues, but that’s not the only, or indeed major, value of HCD. Broad, deep, complex ‘wicked problems’ (as they are known in the discipline) are very amenable to the broad contextual framing that HCD brings to systems problems. HCD is nothing if not a discipline based on systems thinking. Service Design, another HCD term for systems-based problem framing, addresses just those things.
Whilst marketing comms, sign ups and service usage are, as Rathi says, all tractable by HCD, they are by no means representative of what HCD can and does achieve.
(iii) So ‘what hasn’t worked’. Well, Rathi tells us that poverty, income inequality and restrictive social norms are not the ideal problem candidates for HCD. Frankly I’m not surprised. But I would be surprised if HCD as an investigative methodology couldn’t be an extremely useful tool amongst others in tackling these issues. I would be even more surprised if a funder had ever commissioned an HCD project team who claimed to be able to tackle poverty.
The blog asserts that the origin of HCD lies in the private sector (not true, by the way: it lies in the military and in early computing) and that this means success is inevitably quite short-lived and limited. This makes me a little hot under the collar because it is so palpably wrong. The very selective examples given to make the point (such as conversion and retention) just aren’t representative of the range of success metrics that can be achieved.
Let me take a big example. Done right, HCD is quite capable of identifying systemic problems in an organisation that point to a need for restructuring to deliver better long-term outcomes for *whomever*. That’s nothing whatsoever to do with conversion rates! In fact, a team I worked with achieved precisely this in Central Asia a few years ago.
Finally, the blog reports a series of well-known best-practice guidelines, which it uses as exemplars of problems with the HCD approach.
So, looking at those:
• Yes, it is useful for HCD practitioners to stay engaged with a project – that’s why parachuting in external experts and not skilling up local practitioners is generally fruitless, and for the same reason it is not a good idea to try to run an HCD project in Africa from a campus on the West Coast of the US.
• Yes, it is important to make sure that solutions are contextually relevant and achievable; it’s also important to make sure they are credible.
• Yes, HCD can be fast paced - but it doesn’t have to be - and if the process is wrong for the organisation it is aiming to support, it will fail.
• Yes, for HCD outcomes to be sustainable the culture of an organisation needs to be supportive, and preferably customer centric.
• Yes, ‘Desirability’ is not the central point of HCD in development, or indeed almost anywhere.
• Yes, Viability and Feasibility are critical to account for (early) in solution development.
There are no surprises here, other than that the author seems surprised. I am, however, surprised that a 101-level understanding of HCD methods is described as ‘news’.
I’d love to open a discussion about the real and extensive capability of HCD to tackle issues in International Development, and perhaps also consider why some projects appear to have lacked the leadership and experience necessary to deliver on this, and what we can learn from and do about this.