ICT for Civic Data — Crash Course 2026
# From Example to Proposal
Self-Paced Review — Section B
civicliteraci.es
---

# Why This Matters

---

## Turn a case into a proposal

The goal of Section B is to go from a **concrete case study** to a **compelling, grounded proposal**, one that answers a floaty RFP with a specific, realistic story.

The RFP asks for broad things: *"strengthening CLI's ability to anticipate and prepare for climate-related disasters at the community level."* Your job is to make that concrete.

A strong proposal doesn't try to answer everything. It picks **one angle** grounded in a real example, and builds a story around it that the funder can see, trust, and fund.

Note: This is slide 25 from the outline. The RFP is deliberately broad: it mentions data, AI, community resilience, and preparedness without specifying exactly what it wants. Most data-focused RFPs in the civic space are like this. The respondent's job is to bring specificity.

---

# Before You Start

---

## You can't answer floaty with floaty

The funder is vague, but **you must be concrete**. The RFP says: *"Proposals should describe how data analysis and geospatial tools can be used to confirm and enrich existing local knowledge about community-level risk, infrastructure gaps, and population vulnerability."*

That is a broad wish list. You cannot address all of it:

- The funder doesn't fully know their own needs
- A single project won't cover everything
- The budget won't allow it

Your advantage is **specificity**. Pick a clear thread, build evidence around it, and show what it would look like in practice. Concreteness beats comprehensiveness.

Note: Slide 26. This echoes Section A's insight that most data-focused RFPs in the civic space are strategic, not technical. The funder knows data and AI matter but can't articulate exactly how. Your response fills that gap, but only if it is grounded and specific, not a mirror of the funder's own vagueness.

---

## You can start from a dataset or from a question
### From a dataset
You find an interesting dataset and ask: **what questions can this data answer?** Risk: you only ask questions the data can answer. You may miss the most important questions entirely because no dataset covers them. Example: you find flood records with coordinates → you ask "where do floods happen?" but never ask "who is affected?"
### From a question
You start with a problem and ask: **what data would I need?** Risk: the data may not exist. You spend time framing a question that cannot be answered with available evidence. Example: you want to know community evacuation capacity, but no one collects that data.
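A minimal sketch of both starting points, assuming a hypothetical FloodArchive-style export named `flood_records.csv`; the file name and column names are illustrative, not the dataset's real schema:

```python
import pandas as pd

# Dataset-first: load what you found and ask what questions it can answer.
floods = pd.read_csv("flood_records.csv")        # hypothetical FloodArchive-style export
print(floods.columns.tolist())                   # e.g. latitude, longitude, began, severity
print(floods[["latitude", "longitude"]].head())  # "where do floods happen?" is answerable
# ...but nothing here says WHO is affected: that question needs other data.

# Question-first: start from the question and check whether the data exists.
needed = {"evacuation_capacity", "shelter_locations"}
missing = needed - set(floods.columns)
print(f"Not answerable from this dataset alone: {sorted(missing)}")
# The Define <-> Find loop: refine the question, go find data, refine again.
```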
Both are valid. In practice, you iterate between them: the **Define ↔ Find** loop.

Note: Slide 27. From Lecture 1: two starting points for data work. In the crash course, students experienced both: some started with the FloodArchive dataset and looked for angles; others started with a disaster preparedness question and looked for data. The Define-Find loop from Part 0 applies directly here.

---

## The funder is your client

The RFP is about helping **the funder**, in this case CLI. The opening line says: *"CLI's field staff possess deep expertise in the regions where they operate, but the organisation currently lacks the technical capacity to systematically complement that expertise with data analysis and modern tools."*

The funder's problem is clear: they have expertise but lack data capacity. Your proposal helps **them** build that capacity.

Downstream benefit to communities is important, but the story **cannot skip the funder** to focus only on beneficiaries. The funder needs to see themselves in the proposal, as the user of what you build, not just the one who pays for it.

Note: Slide 28. This is a common mistake in civic-sector proposals: writing as if the beneficiaries are the audience. They are not. The funder is reading the proposal, and they need to understand how YOUR work helps THEM do their job better. The community benefits, but through the funder's improved capacity.

---

# Walkthrough

---

## Start from concrete examples

Abstract thinking about "disaster preparedness" produces abstract proposals. **Case studies ground your thinking.**

A case study is a real situation: a place, a hazard, a population, a gap. It gives you something specific to point at. From a case study, you extract a **common thread**: something that recurs across contexts and that your approach can address. That common thread becomes your **angle**, the central argument of your proposal.

The sequence:

1. **Case study** — a concrete situation you can describe
2. **Angle** — the insight that connects the case to a broader need
3. **Proposal** — the plan that turns the angle into action

Note: Slide 29. In the crash course, students started with Indonesia as the worked example. The case study was not chosen randomly; it was selected because Indonesia has severe flood risk, remote communities, and available open data. Starting from a real case prevents the "boil the ocean" problem where proposals try to solve everything abstractly.

---

## Case study → angle → proposal
### Case study
**Indonesia flood monitoring.** Remote communities face recurring floods. Health facilities are scattered across islands. No systematic way to know which facilities are in flood-prone areas. Field staff rely on experience and informal networks. The gap: **CLI has field knowledge but no tools to map risk systematically.**
### Angle → Proposal
**Angle:** Prevention + staff safety + emergency response. If field staff know which facilities are at risk *before* a flood, they can prepare.

**Proposal:** A facility risk dashboard that overlays flood history, health facility locations, and population data. Field staff see which areas need attention. The organisation gains a planning tool it currently lacks.
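As a feasibility sketch of the proposal's core overlay (not the dashboard itself), the snippet below flags facilities near recorded flood extents with GeoPandas. The file names, the `name` column, the 500 m buffer, and the UTM projection are all illustrative assumptions:

```python
import geopandas as gpd

# Illustrative inputs: facility points (e.g. from OpenStreetMap) and
# historical flood extents (e.g. polygons derived from FloodArchive records).
facilities = gpd.read_file("health_facilities.geojson")  # hypothetical file
floods = gpd.read_file("flood_extents.geojson")          # hypothetical file

# Work in a projected CRS so distances are in metres (UTM zone 48S is an assumption).
facilities = facilities.to_crs(epsg=32748)
floods = floods.to_crs(epsg=32748)

# Flag facilities that fall within 500 m of any recorded flood extent.
flood_zone = floods.geometry.buffer(500).unary_union
facilities["flood_exposed"] = facilities.geometry.within(flood_zone)

# This table, not the map itself, is what field staff act on.
print(facilities.loc[facilities["flood_exposed"], ["name"]])
```

The dashboard is just a view over this table; the reusable asset is the join itself, which staff can rerun as new flood records arrive.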
The case study is specific. The angle connects it to the funder's need. The proposal makes it actionable.

Note: Slide 30. The Indonesia example is the worked case study used throughout the crash course. The sequence matters: starting from the case study (not from the tool) ensures the proposal addresses a real problem. The dashboard is the deliverable, but the value is the methodology: using data to complement field expertise, exactly as the RFP asks.

---

## What makes a good angle

A good angle is:

- **Grounded** — rooted in a real case study, not abstract aspiration. You can point to specific data, specific places, specific gaps.
- **Realistic** — achievable within the project's budget and timeline. A 6-month project cannot build a national early warning system.
- **Approachable** — an entry point for an organisation learning to use data. The funder's staff should be able to understand and use what you build.
- **Impactful** — it opens new possibilities the funder did not have before. Not incremental improvement, but a new capability.

The angle is the bridge between "here is a problem" and "here is what we can do about it." It is the core argument of your proposal.

Note: Slide 31. The four criteria come from the crash course discussions. "Approachable" is particularly important for civic-sector organisations that are early in their data journey. The RFP itself says CLI "currently lacks the technical capacity." A proposal that requires advanced data science skills the funder doesn't have will fail in implementation.

---

## Funders expect something they can see and use

Many funders expect something **"shiny"**: a dashboard, a map, a monitoring tool. Something they can show to their board, their donors, their partners.

The RFP signals this: *"Describe how the proposed project would be carried out, including how data would be used to support and enrich existing assessments."* It wants something visible that demonstrates the approach.

The real value is the **strategy behind the deliverable**: the methodology, the data pipeline, the capacity built in the organisation. But the deliverable makes the proposal feel concrete. Think of it as packaging: the dashboard is the box, but the contents are the data skills and processes the organisation gains.

Note: Slide 32. In the crash course, every group built a visible artifact: a map, a dashboard, a data portal. These served double duty: they demonstrated the proposal's feasibility (show, don't tell from Section A), and they gave the funder something tangible to evaluate. The deliverable is not the point, but it is the proof.

---

## Three roles shape every project
| Role | Responsibilities | Example |
|---|---|---|
| **Funder** | Funds the project, coordinates stakeholders, provides field access | CLI provides funding, connects you to field teams, gives access to regions |
| **Implementing partner** | Field presence, local knowledge, training delivery | Local NGO with staff in the target communities |
| **Technical partner** | Data analysis, tool development, methodology, capacity building | Your team: builds the dashboard, trains staff, documents the process |
The technical partner (you) provides **specific expertise**, not overall project leadership. The funder and implementing partner own the context. Your job is to make data work useful for them.

Note: Slide 33. This reflects the decolonization discussion from Section A (slide 23): the shift toward local organisations with local staff leading implementation. The technical partner is a service provider, not the project owner. Understanding this dynamic is critical for writing a proposal that positions you correctly.

---

# Behind the Approach

---

## Don't pitch incremental improvement

**Weak:** "We'll help you do what you already do, but with data." This tells the funder nothing new. They already know they could use data. The RFP itself says they want to *"complement the expertise of field staff with data-driven approaches."* Simply echoing this back is not a proposal.

**Strong:** "Here is something you thought was impossible, and here is how data makes it possible." Show the funder **new capabilities**: things they did not know they could do.

The Indonesia example: field staff already knew some areas flooded. The dashboard shows them *which health facilities are in flood zones*, information they could not produce without combining datasets. That is a new capability.

Note: Slide 34. The distinction between incremental and transformative proposals came up repeatedly in the crash course. Incremental proposals compete on price and efficiency. Transformative proposals compete on vision and evidence. For a strategic RFP like this one, the funder is looking for ideas they haven't had yet.

---

## Find the most compelling point

Keep data presentation **simple**. A proposal is storytelling (Section F, slide 97): one message per chart, the title states the finding.

If you believe evaluation matters, **embed it in the methodology**, but don't make it the headline. "Better evaluation" is not a compelling pitch on its own.

The most compelling point is usually: **"Look at this specific thing your data can already show you."** A single, well-chosen example beats a comprehensive framework. The funder's reaction should be: *"I didn't know we could do that."*

This is the "show don't tell" principle from Section A: demonstrate the approach, don't just describe it.

Note: Slide 35. From the crash course: students who tried to show everything (all countries, all hazards, all facilities) produced weaker proposals than those who focused on one country, one hazard type, and told a tight story. Simplicity signals clarity of thought.

---

## Show new capabilities

Many organisations already collect data but **don't know what to do with it**. The RFP hints at this: CLI's field staff have *"deep expertise"* but the organisation *"lacks the technical capacity to systematically complement that expertise."*

Your proposal shows what becomes possible **with what they already gather**:

- Field reports mention flooding → combine with geospatial data to map risk zones
- Staff know which facilities are hard to reach → overlay with population data to prioritise
- Regional offices track incidents informally → a dashboard aggregates and visualises across regions

You are not asking the funder to collect new data. You are showing them the value of **data they already have**, combined with open sources they didn't know existed.
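One of those open sources can be shown concretely: OpenStreetMap, queried through the Overpass API. The sketch below pulls health facility points for an illustrative bounding box around central Jakarta; the tag filter and box are assumptions, and a real project would cache results and respect the public server's rate limits.

```python
import requests

# Query OpenStreetMap's Overpass API for health facilities in a bounding box
# (south, west, north, east). The box below roughly covers central Jakarta
# and is purely illustrative.
query = """
[out:json][timeout:60];
(
  node["amenity"~"hospital|clinic"](-6.3,106.7,-6.1,106.9);
  way["amenity"~"hospital|clinic"](-6.3,106.7,-6.1,106.9);
);
out center;
"""

response = requests.post(
    "https://overpass-api.de/api/interpreter", data={"data": query}, timeout=90
)
response.raise_for_status()

# Each element carries OSM tags; ways report a computed "center" coordinate.
for element in response.json()["elements"]:
    name = element.get("tags", {}).get("name", "(unnamed)")
    lat = element.get("lat") or element.get("center", {}).get("lat")
    lon = element.get("lon") or element.get("center", {}).get("lon")
    print(name, lat, lon)
```

Facility points pulled this way are the kind of open source the Indonesia overlay consumes alongside CLI's own field reports.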
Note: Slide 36. This connects directly to the enrichment exercise in Section E (slides 88-92): combining the funder's existing knowledge with open data sources (FloodArchive, Overpass API, population datasets) to produce something neither could produce alone. The proposal demonstrates this combination.

---

# FAQ

---

## How do you frame data projects with communities?

A question that arose in the crash course: **should the proposal teach communities to collect data?**

The goal is not teaching communities new things. It is helping them **demonstrate that what they already know and do is powerful**. Communities in climate-vulnerable regions already have knowledge: they know which areas flood, which paths are impassable in the rainy season, where people go when they evacuate. This is knowledge that formal data systems often miss.

The proposal should **surface existing knowledge**: use data to highlight the impact of what communities already do, not to replace it with external systems. The data validates and amplifies community expertise, making it visible to decision-makers who allocate resources.

Note: Slide 37. This discussion connects to Section A's decolonization theme (slide 23). The shift from "we bring knowledge to communities" to "we help communities demonstrate their existing knowledge" is fundamental. Data tools should amplify local expertise, not substitute for it.

---

## Should evaluation be part of the proposal?

If you think evaluation matters (and it often does), **embed it in the methodology**, not in the headline. "We will help you measure the impact of your preparedness work" is important but not compelling as a lead pitch. The funder knows they need evaluation. Pitching it as the main offering tells them nothing about what you will actually build.

Instead: include evaluation metrics as part of the methodology section. The dashboard tracks facility risk over time → this naturally produces evaluation data. The monitoring tool logs response times → this feeds into programme evaluation.

Evaluation is a **by-product of good methodology**, not the product itself. When embedded well, it strengthens the proposal without distracting from the core story.

Note: Slide 38. This came from a student question during the crash course. Several students wanted to make evaluation central to their proposals. The advice was consistent: evaluation strengthens a proposal when it is woven into the methodology, but weakens it when presented as the headline, because it implies you are selling measurement, not action.