It’s a cliché to claim that data is the “new oil”, a resource to be mined. We collect it from the field, refine it with experts, and utilize it for decision-making. However, we rarely ask what this extractive model does to the workers and communities that provide the raw materials. This is a summary of how and why we developed an Ideas Engine to collect and share insights.
The flow of data remains largely one-way. We ask local actors to report on vaccination coverage, disease outbreaks, or supply shortages. Yet, all too often, this valuable information travels up the chain without ever returning to the people who generated it in a way they can use.
What if the act of reporting was, in itself, an act of learning? What if the input mechanism was designed not just to feed a database, but to inform the practitioner? What if it recognized the significance of qualitative experiences that are usually dismissed as anecdotes?
This shift in perspective is the driving force behind The Geneva Learning Foundation’s Ideas Engine, first launched in July 2020 with a group of more than 600 practitioners who designed the COVID-19 Peer Hub with support from the Bill & Melinda Gates Foundation (BMGF).
This mechanism is helping us move beyond the traditional survey model to create a system of reciprocal value. Every piece of data shared becomes a tool for empowerment, connection, and locally-led change.
Ideas Engine: moving beyond mining the frontline
Epidemiologists are trained to dismiss experience as anecdotal, to minimize bias, and to extract clean data. We treat the local actor as a sensor or a passive instrument to measure coverage or disease incidence. But a local actor is not a sensor. She is a professional with the capacity to think, act, and learn. And yet, data reported by local actors are treated with suspicion, generally assumed to be unreliable for multiple reasons.
When we treat a community volunteer or a district medical officer merely as a source of data, we do more than miss the context. We strip them of their agency. We reduce a thinking, adapting professional operating in a complex adaptive system to an anonymous row in a dataset.
This is an epistemic injustice. It assumes that knowledge resides in the center, with experts who analyze the data, while those at the periphery become anonymous sources or informants.
When we treat people and communities as data sources, we also fail to capture the tacit knowledge that explains the numbers. We miss the story of how a nurse in Kano negotiated with a community leader to allow vaccinators entry. We miss how a district officer in Bihar adapted cold chain logistics during a flood.
The Insights mechanism that led to developing the “Ideas Engine” is not a survey tool designed to extract information for the center. It is a pedagogical pattern designed to build power at the periphery. It supports local actors’ inherent capacity to learn from each other, while offering global actors a rare opportunity: the chance to listen, to act on what they hear, and to question governing assumptions that drive global strategies.
Our Insights mechanism is designed to capture this layer of reality. It operationalizes what learning theorists like Diana Laurillard describe as a conversational framework, but applies it outside classrooms and at a massive scale. Instead of a teacher-student dialogue, we facilitate a peer-to-peer dialogue across borders. This draws on George Siemens’s connectivism, where learning happens by connecting nodes of information across a network. We then add a critical layer of structure to ensure those connections lead to action. This embodies Cope and Kalantzis’s vision of active knowledge production, where the learner is not a consumer of content, but a creator of it. Last but not least, we draw on insights from the work of Karen E. Watkins and Victoria Marsick to map the capacity for change, or “learning culture”, that sets the outer boundaries within which local actors operate.
This mechanism remixes these theoretical frameworks and brings them to life on the outer cusp of chaos. It operates in humanitarian emergencies, disasters, war zones, and extreme poverty, engaging tens of thousands of participants where traditional systems fracture.
Reciprocity as justice, not transaction
In traditional marketing, there is a concept called give-to-get. You give a free resource to get an email address. This is transactional. Our philosophy is different. We believe that giving back is a requirement of justice.
When a health worker in a conflict zone takes thirty minutes to share a story about overcoming vaccine hesitancy, they are performing unpaid labor for the global good. If we do not return that value to them rapidly and in a usable form, we are participating in the same extraction we claim to oppose.
Learn more: Why answer Teach to Reach Questions?
Our Insights mechanism is therefore built on a specific architecture of reciprocity. It cycles value back to the contributor at every stage of the process. This ensures that the mechanism serves the practitioner first, and the hierarchy is positioned in support of the practitioner. This distinct ethical framework is what allows us to maintain high levels of engagement and trust over time.
The architecture of the Ideas Engine: from reflection to action
The mechanism is a complex assembly of pedagogical scripts, technical workflows, and community engagement loops. It functions as the heartbeat of our entire learning system, feeding both the Teach to Reach events and the Impact Accelerator.

1. The input: reflections, not reporting
Standard data collection asks for statistics. How many children did you vaccinate? This triggers compliance. Our Teach to Reach Questions ask for narratives. Tell us about a time you faced a challenge. What did you do?
This phrasing is intentional. It forces the user to pause and reflect on their own practice. This is metacognition. It transforms them from a data subject into a knowledge producer.
2. The immediate return: collections of experiences
Our insights team reads every contribution. The team then does the grueling work of producing a collection of Shared Experiences. This is a compendium with hundreds and sometimes thousands of peer stories. It is filtered only to remove nonsensical or AI-generated content.
We strive to share this back with the community as quickly as possible. A health worker facing a cholera outbreak today cannot wait for a peer-reviewed paper next year. This validates tacit knowledge. It tells the health worker that their experience matters enough to be shared with the world rapidly.
3. The synthesis: thematic insights reports
While the raw collection is fast, we then use more conventional qualitative research techniques to produce thematic insights reports, also known as “eyewitness reports”. Each report distills many contributions into short summaries of what we learned from them, on a specific topic or challenge. Written for the community, they identify patterns that no single individual could see on their own. These reports also turn out to be surprisingly relevant and useful for non-local actors.

4. The dialogue: dynamic event-driven knowledge translation
Knowledge in action is dynamic, by definition. The Ideas Engine is about turning knowledge into action. This is why we host Insights Live. These are rapid-fire livestreamed sessions where the data comes alive.
We invite the contributors themselves to take the floor as our guests of honor. This is more akin to jazz improvisation than to the rigid, classical orchestration of a presentation webinar. We invite global partners and funders to listen. This reverses the usual power dynamic. We then turn these livestreamed events into podcasts. This ensures that even those with low bandwidth or no time to watch a screen can access the learning.
5. The application: closing the loop
Knowledge is useless if it cannot be shared. We provide tools for dissemination. For example, we prepare short slide decks that offer even more concise summaries readers can use to present insights to their colleagues and teams.
Crucially, this includes a feedback facility. We track not just who downloaded the deck, but who presented it and what their colleagues said. This allows us to measure the ripple effect of the insight, including actual use and, in some cases, how the use of an insight led to changes in practice and tangible improvements in health outcomes.
Does the Ideas Engine actually make a difference?
Does this actually work? Is it better than a survey? The data suggests yes.
In an independent analysis by the University of South Australia’s Centre for Change and Complexity in Learning, researchers examined our Ideas Engine, a core component of this mechanism during the COVID-19 Peer Hub. The report revealed the scale of engagement that this proprietary method generates.
- Scholars contributed 1,103 ideas and 3,061 comments. This is an average of 2.77 comments per idea.
- 80.2% of participants reported using the Ideas Engine.
- Of those who used it, 92.9% reported finding ideas that were useful for their work.
- Perhaps most importantly, an analysis of citations showed that two-thirds of the citations in action plans referred to ideas from peers working at different levels of the health system.
This proves that the mechanism does not just collect data. It successfully bridges the gap between knowledge and action by connecting practitioners across hierarchies.

