Problems with learning as an approach to system change
Lankelly Chase Foundation’s (LCF) Place Action Inquiry is driven by a conviction that those living and working closest to social problems are best placed to direct change. LCF’s work across places therefore consists of many different people with many different perspectives working toward many different goals. Our original approach described our intention to make learning the golden thread linking all this disparate activity together and giving us a common sense of purpose. In reality, despite a dedicated team of researchers working as ‘learning partners’, learning has been one of the most difficult things to get right. This article draws on lessons from using our ‘noticing framework’ as a standardised model for capturing learning and reflects on what this might mean for approaches to place-based funding and system change.
In the early days of working as a Learning Partner, we were conscious that without an infrastructure to capture and disseminate learning from the huge volume of activity occurring in the Inquiry, learning was slipping through our fingers like sand. In conversation with Lankelly Chase staff, we developed an idea of rolling out a ‘noticing framework’ across the Place Inquiry — a set of questions broad enough to contain the Place Inquiry’s diversity, yet linked to time-tested methodologies (Most Significant Change, Outcome Mapping and Action Learning).
- What has happened in the last month that feels significant?
- What are the main actions you have undertaken in the last month?
- Have you learned anything as a result about your own practice?
- Have you learned anything about your system?
- Have you learned anything you could share with others?
Working as a dedicated learning team, we could use this noticing framework to survey activity, assess cross-cutting themes and provide a direct link between Lankelly Chase and the places with which it held relationships. We always intended this framework to be the starting point for a conversation that would be adapted to individual contexts. Nevertheless, it was often understood by others as a standardised, blanket approach to learning extraction. So, to turn the tables on ourselves, what did we learn from this?
The worst way to learn is to ask people what they have learned
Three of the questions asked people directly what they had learned. Very quickly, it became apparent that this is not the way people think. Learning was implicitly framed as a definitive point of knowledge, perhaps indicated by the removal of uncertainty, and perhaps linked to concrete changes in practice. People rarely reached such a position of clarity, and struggled to determine what separated ‘learning’ from the other anxieties and uncertainties in their minds. While we often received insightful responses, they were frequently either extremely broad (‘people can be trusted more than you think’) or highly personalised (‘I need to organise my calendar better’).
Learning minus purpose equals bureaucracy
For some, completing a monthly learning survey brought a welcome opportunity for reflection. For others, it became a regular inconvenience. While we focussed our energy on transferring learning to Lankelly Chase staff to develop their understanding and decision-making capacity, those completing the survey never saw the benefits. We were often challenged about this: what’s in it for the places? Rather than using the noticing framework as a one-way learning extraction exercise, how could we bring learning into play locally and use it to change practice? This was a destination we had intended to reach eventually rather than design in at the start, so it was difficult to respond quickly enough to keep people engaged.
Learning depends on context for meaning
Despite the difficulties noted, the responses we received were often very interesting and illuminating about the work happening in places. As soon as we came to aggregating and summarising them, however, learning would lose its impact. In feeding back learning to Lankelly Chase every month, we faced a choice between presenting a huge amount of information from places, leading to information overload, and offering a synthesised thematic summary which felt depersonalised and unduly removed from the work. Focussing on a single place in these meetings worked much better, since this allowed us the space and focus to situate learning in the dynamic and individual experiences of places. The idea of connecting learning to decision making on a monthly basis brought the appeal of agility and responsiveness. In practice, it meant stripping away learning’s meaning through standardisation and aggregation.
People do not learn well by themselves
The fifth question, ‘have you learned anything you could share with others?’ was the most difficult for people to answer and was often left blank. People lacked enough understanding of one another’s contexts and did not feel a strong enough sense of shared identity or mission to be confident about what others could learn from them. While people wanted to learn with others, they did not wish to do so through the medium of a disembodied survey question. When learning was the basis of a group discussion with peers (such as our regular peer learning meetings), the quality of learning was often much higher. Social learning opportunities allowed for a better process of discovery and self-realisation which also created a stronger sense of community and shared mission. Once people generated a better understanding of common priorities and cross-cutting issues, they had something more tangible to learn about.
What does this mean for an Action Inquiry predicated on leveraging learning into system change?
As is often the case with reporting our learning, the above points dwell on the negatives. None of this means we regret the noticing framework: as mentioned, it was always intended as the beginning of a conversation rather than the end. We know that people want more opportunities to learn together, and that in order to enable this, we have work to do developing a stronger sense of identity and mission in the Place Inquiry. The framework has also increased the visibility of, and appetite for, learning locally, and the place (rather than merely Lankelly Chase) is now a destination as well as a source of learning.
Our experience might also be instructive for those interested in reforming funding for system change, complexity and place-based working. Learning is often put forward as an alternative to the control mechanisms which funders employ, like grant monitoring and KPIs, balancing accountability and monitoring with the need for local adaptation. In this approach, learning can be codified, transmitted between people and implemented in different contexts, and learning rather than control becomes the basis for decision making. The idea of replacing performance controls with flows of learning, while keeping other organisational and decision-making structures intact, is intuitively appealing and reassuringly achievable.
Our experience suggests learning offers no such fix. In accord with social learning theory, we are finding that learning only occurs in meaningful interaction with others and cannot be straightforwardly transferred to other contexts. As learning partners in this context, we face a different and perhaps trickier job: we are less intermediaries between LCF and places, or custodians of a technical learning process, and more concerned with creating the conditions for others to engage in meaningful social learning by crossing boundaries (e.g. between LCF staff and places) and decentring decision-making processes. Already we are finding this draws attention to more fundamental issues in the power relationships between funders and place actors, and in the composition of actors involved in decision-making processes. In the world of place-based funding and ‘system change’, learning is a necessary but not sufficient response.