
Ethical Research for the Next Billion Users

Towards building a code of ethics for researching the ‘Next Billion Users’. This is part of our series, Next Billion Stories, in collaboration with strategic design studio Obvious.

Healthy Crossroads in Pregnancy Care was a project that aimed to understand everyday practices of pregnancy care in rural India at the community level. While we were holding a session with community health workers, one of the women narrated an incident about a colleague who went beyond the line of duty to protect a pregnant woman in a socially difficult situation. The incident offered us a detailed vignette of the sort of work that goes unseen in the larger systems of healthcare infrastructure.

Fictionalised scenarios of frontline health workers capture nuances while protecting their specific identities. Credits: Swati Sharma, Zainab Akbar, Anushri Ghode, and Haritha Pitla.

While we felt it was important for us to make such work known, we had to seriously consider the possible implications of our actions.

Sharing the story publicly would mean potentially putting the community health worker in danger of disciplinary action by her superiors. Yet, the story was very powerful, and would certainly help in the design of any future systems.

After deliberation, we fictionalised the story and completely de-located it to prevent the healthcare worker from being identified. This dilemma is just one of many that can arise while conducting research with the next billion users. While this community is currently being targeted by corporations and developmental organisations alike, we still lack a framework of ethics to guide how research is done with them, particularly in commercial contexts.

Lessons From Academic Research

Academic research, to a large extent, is conducted with processes and guidelines in place to protect the people whom we research. Consider, for example, the Minimum Ethical Standards adopted by the Information and Communication Technologies and Development (ICTD) research and practitioner community, which outline the ethical considerations any research with a development agenda must address.

Similarly, most academic institutions have established research ethics review boards. More importantly, research that engages end-users, e.g. participatory design research or Human-Computer Interaction (HCI) research, has begun to articulate emerging ethical concerns and ways to address them, as exemplified by the ‘In-Action Ethics’ framework proposed by Christopher Frauenberger, Marjo Rauhala and Geraldine Fitzpatrick.

Towards Ethical Research in a Commercial Context

However, a common framework for commercial user-research amongst the Next Billion Users is currently missing.

We recognise that arriving at such a framework must be a collaborative journey that takes onboard the views of multiple stakeholders.

Towards this, we have begun a series of collaborative workshops in which we facilitate groups of diverse user researchers and designers to articulate ethical challenges, as well as their individual best practices for addressing them. We present here some early patterns from two workshops (conducted at Design Up 19 and Obvious).

1. Enriching vs Extractive

The foundation of any ethical research framework is its approach, which must be enriching rather than extractive from the outset. This means discarding stereotypes of researcher-respondent relationships and creating a collaborative system in which everyone is a co-creator. It requires thinking through ways to engage with user communities to determine what should be researched, how that research should be conducted, and how the data should be shared.

Working with community health workers to determine what areas and issues should be researched, and how. Source: HCPC.

2. Respect for Trust

There is an inherent tendency amongst respondents from the next billion to easily trust figures vested with apparent authority. They will accede to most researcher requests for information without worrying about the implications.

Situations such as these push researchers to think through their roles: is one merely a collector of data, a facilitator, or even an educator and activist for better data and digital literacy?

3. Sensitivity to Context

Tanushree Jindal, design researcher at Obvious, speaks of her experience working with nurses in rural Punjab as part of research for a healthcare app.

“The nurses are part of tight-knit social communities and we were very mindful of the fact that nothing we did or said should in any way impact the way they were regarded by their community. During testing, for example, patients watching the nurse enter data in the app thought she was playing a game on her phone and ignoring them – and we knew it was incumbent on us to find a solution to this.”

Tanushree Jindal, design researcher at Obvious

4. Recognising the Delta between Mental Models

It is key to recognise the difference between the mental models of researchers and those of the next billion. This is particularly important in areas of safety and privacy, where users’ current paradigms may not allow them to fully appreciate the risks posed by their participation in a study. Traditional forms of seeking consent make little sense in these research scenarios, providing a box-ticking exercise for the research team rather than meaningful participation from the subject.

5. Managing Data and its Legacy for Multiple Stakeholders

Increasingly, user research has multiple audiences and stakeholders, particularly if we consider end-user communities as having a stake in the data they share. Additionally, data that is stored over time will invariably be used in ways beyond the original scope of the research project. The question of who will manage the data, and how it is used beyond the time and scope of the project, becomes crucial. Some possibilities are to give control to the end-user communities, and/or to appoint a third-party gatekeeper and arbitrator.

6. The Tightrope between Personal Code and Organisational or Business Needs

Oftentimes, the needs of the organisation or client are at loggerheads with a researcher’s code of protecting users and their data.

This appears to be largely due to a lack of awareness about the role of user research and the qualitative value it brings through nuanced stories and personas, in which the raw data linked to individual users is not important.

Researchers, however, have to tread the balance between what the organisation or client expects from the research and what they should be delivering. Educating clients about the value of qualitative user research is one way forward.

Future Forward: In Search of a Pattern Language

Whether we should have a common code of ethics that every researcher follows, or follow in the footsteps of academic research and submit to ethics review boards, is an open question. However, we believe the answer to the complex problem of ensuring ethical research with and for the next billion users needs to be approached differently.

In their seminal book A Pattern Language, Christopher Alexander and his team documented proven, simple, humanist solutions to complex design problems, a work compiled over ten years. We believe a similar approach for design research for digital products may provide the data researchers need while protecting the identity and interests of respondents. This demands that the community of user researchers collectively develop a pattern language of ethical research, in which common and uncommon problems and challenges are brought forward, along with possible approaches to resolving them, accessible to all.


The Hard Copy is a resource for building and growing digital–first brands. Sign up to get case studies and advice in your inbox every week.
