IDR | Is Data Failing Us?
By Natasha Joshi, Chief Strategy Officer, Rohini Nilekani Philanthropies
Last year, we at Rohini Nilekani Philanthropies interviewed 14 social sector leaders and funders, asking what they thought were the biggest issues in society today and what role philanthropy could play in addressing them.
It was no surprise that mental health, absence of climate resilience, urban infrastructure strain, rural distress, and unemployment-induced crime came out on top.
The group also evaluated India’s funding landscape and arrived at some interesting observations. First, even though individual and family wealth has multiplied in India, it has not led to a proportionate increase in giving. Second, and this might explain the first, there was frustration with donors wanting to draw straight lines between the money they give and the benefit accrued on the ground. This, our respondents said, has led to a standardisation and over-quantification of complex human work. The final point of feedback was that donors have started confusing numerical scale for social change.
These three reflections led to one insight: Social change needs activism, but we have gotten stuck with ‘datavism’, that is, a push to define all outcomes of social programmes in quantifiable, visible ways.
This shift is regrettable. Given the limits of data, an over-reliance on it to make sense of social programmes leaves us neither here nor there. Unlike the bazaar, we can neither drive nor measure outcomes, because the incentives, feedback loops, mandates, and singular focus on the bottom line that make market action possible are missing here. And we do not have the writ or budgets of the sarkar to make social programmes work universally at scale.
Pushing nonprofits to express the value and power of social programmes in bazaar or sarkar terms just results in the sector losing its true merit, which has always been samaaj-driven work: work that puts values, relationality, and care at its centre.
Philosopher and professor C Thi Nguyen puts it perfectly when he writes, “These limitations [of data collection and the content of big datasets] are particularly worrisome when we’re thinking about success—about targets, goals, and outcomes. When actions must be justified in the language of data, then the limitations inherent in data collection become limitations on human values.”
What data does not tell us
The word ‘data’ was first used in 1946 to mean transmissible and storable computer information. Today, the word has become synonymous with information and, by extension, understanding. This is obviously problematic.
Data does not equal understanding, which is a deep human capacity that goes beyond articulation. We often understand, even when we cannot explain.
Data, information, and knowledge are not the same. Data is information a computer can process. It is a specific rendition of reality, but it is precisely that: a rendition. Not the whole, and often not even accurate. Still, data-backed policies, data-informed curricula, and so on are seen as the gold standard for decision-making.
Data is limited in many ways, including how it is collected, by whom, and for whom. There is no such thing as clean and unbiased data when it comes to social programmes. The objective use of data is also rare, because who uses it and for what is again a question of incentives, power dynamics, and prior experiences.
A friend who works at a think tank once remarked, “In the West, they talk of data-backed policymaking. In India, we do policy-backed data-making.”
When source and application are both compromised, why has datavism taken over in the field of development? One obvious reason is that market economics is the dominant method for valuing goods and services, so the same method is being deployed to ascertain value in the social realm. Care, meanwhile, confounds this method, because it violates many economic principles, including the assumption that people always pursue their self-interest over the interests of others.
But what do we stand to lose when we privilege data science over human understanding?
C Thi Nguyen explains this through ‘value capture’. It is the process by which “our deepest values get captured by institutional metrics and then become diluted or twisted as a result. Academics aim at citation rates instead of real understanding; journalists aim for numbers of clicks instead of newsworthiness. In value capture, we outsource our values to large-scale institutions. Then all these impersonal, decontextualizing, de-expertizing filters get imported into our core values. And once we internalize those impersonalized values as our own, we won’t even notice what we’re overlooking.”
One such thing being overlooked is care.
Interpersonal caregiving makes no sense from a market lens. The person with power and resources voluntarily expends them to further another person’s well-being and goals. The whole idea of care is oceanic and hard to wrap one’s head around. ‘Head’ being the operative word, because we are trying to understand care with our brains, when it really exists in our bodies and is often performed by our bodies.
Data tools have only inferior ways of measuring care and, by extension, of designing spaces and society for it.
Outside of specific, entangled relationships of care, humans also have an amorphous ability to feel that they are part of a larger whole. We are affiliated to humanity, the planet, and indeed the universe, and feel it in our bones rather than know it to be true in any objective way.
We see micro-entrepreneurs, inventors, climate stewards, and scores of people, both rich and poor, across circumstances, who engage in collective care to make the world a better place. This kind of pro-sociality doesn’t always show up in ways that are tangible, immediate, or measurable.
Datavism, which we seem to have learned from bazaar, has convinced capital allocators that the impact of social programmes can and should be expressed arithmetically. And, based on those calculations, acts of care can be deemed successful or unsuccessful.
What datavism misses
Datavism tends to favour marginal improvements in measurable outcomes while overlooking social costs, because the former are easy to assess in the short run, while the latter only show up over time.
With the emergence of artificial intelligence (AI), we risk seeing a proliferation of what economist Daron Acemoglu calls ‘so-so technologies’: technological advances that disrupt employment and displace workers without generating much of a boost in productivity or quality of service. Think self-checkout kiosks at grocery stores or automated customer service over the phone.
Datavism reduces the creative potential of technology as well. This is explained by author Lata Mani in The Integral Nature of Things: Critical Reflections on the Present. By seeing technology only as a tool, datavism ignores the fact that technology “reorganises perceptions and generates its own longings”, and becomes part of the social process instead of just a mediator.
Technology is now being applied to almost every programme in the philanthropic sector, but questions of whether and how it enables relationships of care are mostly absent.
If technology is truly meant to serve us, then putting care in the mix feels non-negotiable.
To do this, we must stop treating emotion as the enemy of objectivity. As we ride the limitless curve of technological change, everyone is preoccupied with what tomorrow holds. But leading futurists will tell you that getting in touch with one’s emotions and desires is the best predictor of how we will engage with whatever the future brings. In Imaginable, Jane McGonigal describes this as getting one’s mind unstuck, which means practising hard empathy as a way of understanding human wants.
Data can have diminishing returns on understanding
The question isn’t whether data helps us make sense. Of course, it does. The real question is whether our sensing tools fit the environments we’re trying to understand.
If understanding is your goal, data works better when the problem statement is narrow, and the environment is simple and controlled.
As the environment becomes more complex, the link between data and understanding grows more complicated. Data gives you some grounding, but much of the understanding comes from locus, experience, and trial and error. Such understanding takes time and kicks in non-linearly. Do we have this patience?
In the essay ‘The End of Understanding’, Stanford University lecturer and science journalist Grace Huckins says, “Never has it made sense to ask whether science is about developing new technologies and interventions or about understanding the universe—for centuries, those two goals have been one and the same. Now that big data and AI have dissociated those two objectives, we have the responsibility to decide which matters most. Data has given us permission not to understand the world around us.”
Social science is its own realm
As we face the future, it is imperative to shore up social capacities, so we are resilient enough to tackle the unknown unknowns when they arrive. The social sector has always been the best site for this investment. The very recent, once-in-100-years pandemic showed us how civil society was the first responder, and thick networks of community and care took us past the initial months when science was on the back foot.
What is the role, then, of philanthropy in building the social sector up?
Anecdotally, social change leaders will tell you that progress in development has plateaued, and that navigating the polycrisis requires a deeper contention with systemic challenges, power dynamics, identity, and incentives, which datavism wholly ignores.
Donors need to know the result of their donations, and nonprofits want to understand the effects of their programmes. So, in that way, monitoring and evaluation is integral to field effort. While frameworks to understand and track processes in complex adaptive systems exist, they are cumbersome and require high expertise. In fact, that is the most popular critique offered by datavists. But instead of throwing the baby out with the bathwater, can one modify what is considered rigorous and who is considered an expert?
Can we collectively explore how field and forum can be combined? Emotions such as pride, honour, disgust, and vengeance need a place in our way of making meaning when it comes to field-based work. More initiative is necessary to bring social science and impact measurement together, in a way that foregrounds care, dignity, and joy.
From datavism to abundance
If we make our ways of seeing and understanding more abundant, it might free capital to flow more easily in all directions, instead of being pushed out through the narrow funnel of data and ‘impact’ alone.

It’s important to end by saying that the intention of this article is not to knock the hard work of all the people who collect, analyse, and present important data; we do it too. The idea is to firmly recognise the limitations of data and de-centre it when we talk of people, places, and species that exist beyond its totalising logic. The idea is to know that, in this field, there are always more questions than answers.
