In chipping away at my “reports to read” pile, I found an excellent one from Mercy Corps and Engineers Without Borders Canada.

“Navigating Complexity: Adaptive Management in the Northern Karamoja Growth, Health, and Governance Program” provides a great case study of how adaptive development ideas play out at the program management level. It earned kudos right off the bat for giving a clear definition of what it means by “adaptive management”:

(i) a high level of experimentation, where some initiatives will work while others will not;

(ii) excellent monitoring processes feed a continual flow of information that sheds light on the operating environment (e.g., gathered from successful and failed experiments); and

(iii) the ability of the organization to change strategies, plans, and activities rapidly in response to this new information.

(Emphasis mine.)

The Growth, Health, and Governance (GHG) program had a few general principles and concrete practices that enabled this. On the principles side: keep a light touch, stay largely facilitative, and be “invisible to the end user.” Managers focused their messaging to teams on flexibility, allowing dissent and creative tension, admitting failure, and critical thinking. All of this requires greater emotional intelligence and soft, staff-coaching skills from development managers.

The highlight on admitting failure is not surprising in an EWB-C report, as they’ve really carved out a name for themselves around the practice. However, external admissions of failure are largely about branding (“look, we’re a learning organization!”) and sector learning (though the “lessons” are often surface-level and somewhat trite). This report talks more about the impact that internal admissions of failure have on organizational culture and decision-making. Admitting failure to your team members is a different animal entirely—and potentially more important for adaptive management.

Perhaps best of all, the report shared a few of the tools used by GHG’s adaptive managers. Moving from those used infrequently to those on the tightest loop:

  • Results chains: illustrate the mission, logic flow, and indicators for each team on the project; remain relatively stable over time. (Sample provided in the report.)
  • Strategic reviews: each sector team presents to others for peer review and critique, once every 3-6 months.
  • In-house studies: quick turnaround research, with study design, data collection, and analysis all conducted by the team members themselves—not outside consultants.
  • After-action reviews: a simple template focused on comparing intended versus actual results, following key events or meetings.
  • Weekly reports: 1-2 page reports to promote reflection and compile information across the organization; submitted in a table format, one per sector-team. (Sample provided in the report.)

Finally, the report comes with a slew of recommendations. A few that struck me:

  • Donors should loosen strict experience requirements for managers, so as to avoid repeating the status quo with merely “15 years of development management experience”. (I would add that job descriptions should focus more on demonstrable qualities, rather than using specific past experiences as recruiting heuristics.)
  • Programs should keep larger (and fewer) budget lines, defined at the level of outcomes rather than activities, to allow flexibility.
  • Team members should be encouraged to improve “situational awareness” and socio-political intelligence gathering, while being wary about data-collection fatigue among partners and local actors.

My favorite aspect of this report is that it connects all levels of adaptive management—from the abstract principles down to the reports that staff submit—and shows how they reinforce one another. In management literature, this is called alignment: all aspects pulling together in the same direction and in the right rhythms. That doesn’t mean perfect efficiency or some ideal of frictionless execution, but it’s a useful way to think about optimization for multifaceted operations.

The caveat here is that the report was commissioned by Mercy Corps as a study of their own project. Safe money bets that space constraints, political savvy, and basic self-interest resulted in the omission of many nuances—some of them not so pretty. Everything looks better from the outside. That said, the report contributes meaningfully to our evolving understanding of adaptive development management. See recent work from ODI for more on the same.

Photo credit: “Navigating Complexity” report.
