Exploring the world of self-tracking: who wants our data and why?
Benjamin Franklin used to keep charts of his time spent and virtues lived up to. Today, we use technology to self-track: our hours slept, steps taken, calories consumed, medications administered. But what happens when we turn our everyday experience — in particular, health and wellness-related experience — into data?
“Self-Tracking” (MIT Press) by Gina Neff and Dawn Nafus examines how people record, analyze, and reflect on this data — looking at the tools they use and the communities they become part of, and offering an introduction to the essential ideas and key challenges of using these technologies. In considering self-tracking as a social and cultural phenomenon, they describe not only the use of data as a kind of mirror of the self but also how this enables people to connect to, and learn from, others.
They also consider what’s at stake: who wants our data and why, the practices of serious self-tracking enthusiasts, the design of commercial self-tracking technology, and how people are turning to self-tracking to fill gaps in the healthcare system. None of us can lead an entirely untracked life today, but in their book, Gina and Dawn show us how to use our data in a way that empowers and educates us.
We caught up with Gina to explore the self-tracking movement:
Ed.: Over one hundred million wearable sensors were shipped last year to help us gather data about our lives. Is the trend and market for personal health-monitoring devices ever-increasing, or are we seeing saturation of the device market and the things people might conceivably want to (pay to) monitor about themselves?
Gina: Looking at direct-to-consumer wearables and mobile apps for health and wellness in the US, we see a lot of tech developed with very little focus on impact or efficacy. I think to some extent we’ve hit the trough in the ‘hype’ cycle, where the initial excitement over digital self-tracking is giving way to the hard and serious work of figuring out how to make things that improve people’s lives. Recent clinical trial data show that activity trackers, for example, don’t help people to lose weight. What we try to do in the book is help people figure out what self-tracking can do for them, and advocate for people being able to access and control their own data to help them ask — and answer — the questions that they have.
Ed.: A question I was too shy to ask the first time I saw you speak at the OII — how do you put the narrative back into the data? That is, how do you make stories that might mean something to a person, out of the vast piles of strangely meaningful-meaningless numbers that their devices accumulate about them?
Gina: We really emphasise community. It might sound clichéd but it truly helps. When I read some scholars’ critiques of the Quantified Self meetups that happen around the world, I wonder if we have actually been to the same meetings. Instead of some kind of technophilia, there are people really working to make sense of information about their lives. There’s a lot of love for tech, but there are also people trying to figure out what their numbers mean, whether they are normal, and how to design their own ‘n of 1’ trials to make themselves better, healthier, and happier. Putting narrative back into data really involves sharing results with others and making sense together.
Ed.: There’s already been a lot of fuss about monetisation of NHS health records: I imagine the world of personal health / wellness data is a vast Wild West of opportunity for some (i.e. companies) and potential exploitation of others (i.e. the monitored), with little law or enforcement? For a start — is this health data or social data? And are these equivalent forms of data, or are they afforded different protections?
Gina: In an opinion piece in Wired UK last summer I asked what happens to data ownership when your smartphone is your doctor. Right now we afford health-related data different privacy protections than other forms of personal data. But very soon trace data may be useful for clinical diagnoses. There are already programmes in place that use trace data for early detection of mood disorders, and research is underway on using mobile data for the diagnosis of movement disorders. Who will have control of, and access to, these potential early alert systems for our health information? Will it be legally protected to the same extent as the information in our medical records? These are questions that society needs to settle.
Ed.: I like the central irony of “mindfulness” (a meditation technique involving a deep awareness of your own body), i.e. that these devices reveal more about certain aspects of the state of your body than you would know yourself: but you have to focus on something outside of yourself (i.e. a device) to gain that knowledge. Do these monitoring devices support or defeat “mindfulness”?
Gina: I’m of two minds, no pun intended. Many of the Quantified Self experiments we discuss in the book involved people playing with their data in intentional ways and that level of reflection in turn influences how people connect the data about themselves to the changes they want to make in their behaviour. In other words, the act of self-tracking itself may help people to make changes. Some scholars have written about the ‘outsourcing’ of the self, while others have argued that we can develop ‘exosenses’ outside our bodies to extend our experience of the world, bringing us more haptic awareness. Personally, I do see the irony in smartphone apps intended to help us reconnect with ourselves.
Ed.: We are apparently willing to give up a huge amount of privacy (and monetizable data) for convenience, novelty, and to interact with seductive technologies. Is the main driving force of the wearable health-tech industry the actual devices themselves, or the data they collect? i.e. are these self-tracking companies primarily device/hardware companies or software/data companies?
Gina: Sadly, I think it is neither. The drop-off in engagement with wearables and apps is steep, with the majority falling into disuse after six months. Right now one of my primary concerns as an Internet scholar is the apparent lack of empathy companies have for their customers in this space. People operate under the assumption that the data generated by the devices they purchase is ‘theirs’, yet companies too often operate as if they are the sole owners of that data.
Anthropologist Bill Maurer has proposed replacing data ownership with a notion of data ‘kinship’ — the idea that both technology companies and their customers have rights and responsibilities to the data they produce together. Until we have better social contracts and legal frameworks that give people control of and access to their own data in ways that allow them to extract it, query it, and combine it with other kinds of data, that problem of engagement will continue, and activity trackers will sit unused on bedside tables or uncharged in the backs of drawers. The ability to help people ask the next question, or design the next self-tracking experiment, is where most wearables fail today.
Ed.: And is this data at all clinically useful / interoperable with healthcare and insurance systems? i.e. do the companies producing self-monitoring devices work to particular data and medical standards? And is there any auditing and certification of these devices, and the data they collect?
Gina: This idea that the data is just one interoperable system away from usefulness is seductive, but so, so wrong. I was recently at a panel of health innovators, the title of which was ‘No More Apps’. The argument was that we’re not going to get to meaningful change in healthcare simply by adding a new data stream. Doctors in our study said things like ‘I don’t need more data; I need more resources.’ Right now there are few protections to ensure this data won’t harm individuals’ rights to insurance, or won’t be used to discriminate against them, and yet there are few results showing that commercially available wearable devices deliver clinical value. There’s still a lot of work needed before this can happen.
Ed.: Lastly — just as we share our music on iTunes, could you see a scenario where we start to share our self-status with other device wearers? Maybe to increase our sociability and empathy by being able to send auto-congratulations to people who’ve walked a lot that day, or to show concern for people with elevated heart rates / skin conductivity (etc.)? Given that the logical next step to accumulating things is to share them…
Gina: We can see that future scenario now in groups like Patients Like Me, Cure Together, and Quantified Self meetups. What these ‘edge’ use cases teach us about more everyday self-tracking is that real support and community can form around people sharing their data with others. These are projects that start from individuals with information about themselves and work to build toward collective, social knowledge. Other types of ‘citizen science’ projects are underway, like the Personal Genome Project, where people can donate their health data for science. The Stanford-led MyHeart Counts study on iPhone and Apple Watch recruited 6,000 people in its first two weeks and now has over 40,000 US participants. Those are numbers for clinical studies that we’ve just never seen before.
My co-author led the development of an interesting tool, Data Sense, that lets people without stats training visualize the relationships among variables in their own data, or easily combine their data with data from other people. When people can do that, they can begin asking the questions that matter for them and for their communities. What we know won’t work in the future of self-tracking data, though, are the lightweight online communities that technology brands just throw together. I’m just not going to be motivated by a random message from LovesToWalk1949, but under the right conditions I might be motivated by my mom, my best friend, or my social network. There is still a lot of hard work to be done to get the design of self-tracking tools, practices, and communities for social support right.
Note: This post was originally published on the Policy & Internet blog. It might have been updated since then in its original location. The post gives the views of the author(s), and not necessarily the position of the Oxford Internet Institute.