The context: a resolve to save lives
Two hundred million Indians have hypertension, or high blood pressure. Yet most people with high blood pressure live with no awareness of their condition: only 25 to 42 per cent of rural and urban Indians know that they are hypertensive.
In November 2017, India’s Ministry of Health and Family Welfare announced the India Hypertension Management Initiative (IHMI), a collaborative project with the mission to reduce deaths caused by hypertension. They were joined by Resolve to Save Lives (RTSL), an initiative of Vital Strategies that works to strengthen public health systems all over the world, especially in low- and middle-income countries. Hypertension cannot be cured, but it can be treated quite well with inexpensive medication. Of the many steps needed to treat billions of people globally, the first and most necessary was the collection of information: tracking hypertension is essential for actionable reporting and improved patient management.
In thinking about the long-term goal to bring effective, low-cost treatment to a large population of people, RTSL asked: how do we efficiently monitor hypertension in order to prevent escalation to avoidable health complications?
How do you test a product in a real-world healthcare setting in remote rural India?
The first piece of information recorded by a healthcare worker (a nurse or an ASHA worker) in an Indian clinical setting is the patient’s blood pressure. It was immediately apparent that capturing this data on a digital platform could both create personal health records and map longitudinal data for the population, offering insight into control rates.
RTSL developed an initial high-fidelity prototype for the primary healthcare worker (a nurse, in the Indian context), who could use the app to track each patient’s blood pressure and, where required, the medications prescribed for treatment. Testing the prototype app (and the idea) in Punjab proved that the idea had merit. This was the point at which we were brought in to collaborate with RTSL on Simple, an app that nurses could use on their smartphones.
We hit the ground running, using clickthrough prototypes that enabled the user group (the nurses) to give us feedback that improved the product design. This is a cost-effective, critical strategy we use at every stage of a project. The high-fidelity prototype established the visual layout for capturing a patient’s basic information: name, age, blood pressure, last visit, and prescribed medication.
This project presented us with some atypical challenges. We were working with hospitals in rural India where phones are often offline and drain their batteries searching for a network, so all data collected had to be stored locally on the device. And unlike product research for an e-commerce app or ride-sharing platform, where users are easy to find (sometimes even among our own social circles) and readily available for feedback, it was challenging for the nurses to find time to help us test our prototypes. These nurses work in a high-intensity environment, attending to hundreds of patients every day, so we needed to be cognisant of the demands we made on their time. We tried to test without interrupting patient care, learning as much as possible through interaction and observation. It was crucial that we could bring back something at each step for testing that actually worked.
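The offline constraint described above points to a common pattern: write every reading to local storage first, and sync opportunistically when the network returns. The sketch below is purely illustrative, assuming hypothetical names (`LocalRecordStore`, `BloodPressureReading`) rather than the actual Simple implementation.

```typescript
// Hypothetical offline-first store: every reading is saved on the device
// immediately, then drained to a server whenever connectivity returns.
interface BloodPressureReading {
  patientId: string;
  systolic: number;
  diastolic: number;
  recordedAt: string; // ISO timestamp
  synced: boolean;
}

class LocalRecordStore {
  private readings: BloodPressureReading[] = [];

  // Save locally right away; never block on the network.
  record(patientId: string, systolic: number, diastolic: number): BloodPressureReading {
    const reading: BloodPressureReading = {
      patientId,
      systolic,
      diastolic,
      recordedAt: new Date().toISOString(),
      synced: false,
    };
    this.readings.push(reading);
    return reading;
  }

  // Readings still waiting to reach the server.
  pending(): BloodPressureReading[] {
    return this.readings.filter((r) => !r.synced);
  }

  // Called when the device regains a connection; `upload` stands in for
  // whatever transport the real app would use. Returns how many were sent.
  async drain(upload: (r: BloodPressureReading) => Promise<void>): Promise<number> {
    let sent = 0;
    for (const r of this.pending()) {
      await upload(r);
      r.synced = true;
      sent += 1;
    }
    return sent;
  }
}
```

Because every write lands on the device first, a nurse can keep registering patients through a full day without signal, and nothing is lost if the phone never reconnects during clinic hours.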
Given these constraints, we decided we needed an app that would allow us to learn as much as we could. We built more detailed prototypes that we tested with nurses in a studio setting in Bangalore. This extensive user research helped us evolve our understanding of the process and iterate accordingly — only the features that succeeded in these experimental prototypes would make their way into the final version.
Responding to new challenges
In our work, we often realise that even the best design practices need recalibration and reinvention. We needed an effective alternative for Simple. Could we build a beta product to test in the real world, one that simulated real-life scenarios? We needed a scenario in which the nurses could use features in the app and indicate necessary modifications on the go. If a nurse could not find the correct record for Amar Singh, for example, we needed to resolve this and bring back an updated feature for her to give us further feedback on. This meant that before we entered the field, we needed a workable app on which to conduct the user testing. We couldn’t go back with only slightly better prototypes. The product we tested needed to be real: it had to handle real data and be available for download on the Play Store.
This beta app needed to be engineered with basic features so we could conduct actual testing in the field, in this case healthcare centres in Bathinda, Punjab. A nurse needed to be able to create a record for every patient, pull up records for returning patients, and update relevant information such as age (date of birth) and blood pressure. Finally, they needed to enter medication details for each patient’s treatment. The nurse could also use an ID card that we designed. They needed to be able to correct mistakes and update recorded information, and enter the next scheduled check-up for each patient. However, when we went into the field, we realised that our assumptions about how this information could be recorded were still off the mark.
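The workflow above (create a record, pull up a returning patient, update readings, add medication, schedule the next visit) can be sketched roughly as a small registry. Every name here is hypothetical, a minimal model of the operations described, not the real app’s data model.

```typescript
// Illustrative patient registry covering the beta app's core operations.
interface Patient {
  id: string;
  fullName: string;
  dateOfBirth: string; // age is derived from this
  bloodPressures: { systolic: number; diastolic: number; takenAt: string }[];
  medications: string[];
  nextVisit?: string; // date of the next scheduled check-up
}

class PatientRegistry {
  private patients = new Map<string, Patient>();
  private nextId = 1;

  // Create a record for a new patient.
  create(fullName: string, dateOfBirth: string): Patient {
    const p: Patient = {
      id: String(this.nextId++),
      fullName,
      dateOfBirth,
      bloodPressures: [],
      medications: [],
    };
    this.patients.set(p.id, p);
    return p;
  }

  // Pull up records for a returning patient by (partial) name.
  findByName(name: string): Patient[] {
    const needle = name.toLowerCase();
    return Array.from(this.patients.values()).filter((p) =>
      p.fullName.toLowerCase().includes(needle),
    );
  }

  recordBloodPressure(id: string, systolic: number, diastolic: number): void {
    const p = this.mustGet(id);
    p.bloodPressures.push({ systolic, diastolic, takenAt: new Date().toISOString() });
  }

  prescribe(id: string, medication: string): void {
    this.mustGet(id).medications.push(medication);
  }

  scheduleNextVisit(id: string, date: string): void {
    this.mustGet(id).nextVisit = date;
  }

  private mustGet(id: string): Patient {
    const p = this.patients.get(id);
    if (!p) throw new Error(`no patient with id ${id}`);
    return p;
  }
}
```

Even a model this small makes the field lessons concrete: fuzzy name lookup, for instance, matters because a nurse searching for “Amar Singh” may not type the name exactly as it was first recorded.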
The Experimental Version
And this is precisely what we built. We engineered a high-fidelity, front-end-only, “seemingly real” app with React Native and ClojureScript: an app that didn’t worry too much about performance or security but allowed us to enter real data. We ran ten simultaneous, rigorous control studies in the field and the studio until the problems tapered off. As soon as we knew something worked, we built the feature into the production app (a native app). We learnt a little more and made ongoing changes, but our process was much faster. Compared with testing click-through prototypes, turning design into code allowed us to think ahead to all possible use cases, states, and flows, from both the user’s point of view and a technical one. By the end of the user study, we had a design solution that was validated and ready in its entirety.
We took all our successful experiments and created an app that had been rigorously tested on the ground, while simultaneously gathering important, large data sets.
We launched Simple in November 2018, phasing the roll-out to five clinics at a time. The app is now at work with real patients in 129 facilities, including facilities in the Bathinda, Mansa, and Gurdaspur districts of Punjab, and 80 private clinics in Mumbai in collaboration with PATH. We have quantitative data for 10,000 patients, and feedback from nurses on the ground helps us continue to iron out kinks as they emerge.
Simple will be launched in many more locations in the months to come. And at each stage, we intend to keep learning and keep improving!