I arrived at work today only to be told by my mobile phone how long it would take to get home. My phone knows where I shop, live and park; it calculates travel times and verbalises traffic delays. Its luminous screen bounces to life when I exclaim “are you serious?!”, misinterpreting my words as “Hey Siri”. My phone is always listening. My phone has diagnosed my washing machine’s error code. To other Gen Xers like me, this new world we live in is both frightening and intriguing. For we oldies remember uncoiling the landline phone cord and the sound of the rotary dial. How long did it take to dial a number? I really should have cherished those moments. Like spending the day away from home roaming the streets on push bikes, uncontactable by anyone except the fellow cycling crew. The days when, if you didn’t know something, you had to consult the Funk and Wagnalls reference books to find the answer or just be left wondering. Ah, memories! Yes, young people of today, once we were free. Information was our own. I don’t mean to be a harbinger of doom, but it’s time to PANIC.
If you can imagine all of the little decisions we make, such as driving to particular locations, searching the internet or commenting on social media, we also need to imagine that all of these tiny bits of information about ourselves—our practices and our preferences—can be quantified and retained by others. Big data has become the term most associated with this phenomenon: it involves record-keeping across a range of data points, including geolocations, communications, transactions, health data and consumption practices, to name but a few.
The conundrum of big data is its ownership. Recently a judge in Ohio ruled that data from a pacemaker could be used as evidence in an arson case. This obviously raises ethical concerns. Would you give permission to have a life-saving medical device fitted if you knew its data could be used to interpret your movements, or against you in a court of law? Who owns that data? Do you, because it is in your body, or does the medical professional who fitted the device? Who has access to such data? Is it retained, or deleted after a certain amount of time?
Should governments be able to access data from your home appliances in the fight against terrorism? Such questions have been at the forefront of people’s minds this week, after an investigation by The Guardian revealed that private Medicare details were available for sale. The matter is now being investigated by the Federal Police. This, and other failures of computer systems to protect private data, reveals how both governments and businesses are struggling to keep data secure.
Big data is not only about the security of its keepers; another conundrum lies in its ownership. The Australian Census captures information about Australians: to whom should these data be available? The Australian Bureau of Statistics makes data available via a publicly accessible website. Although the data are crunched and broken down into various categories, people outside academia and research environments might not find the information enlightening. As with any quantitative measure, the findings from the 2016 Census can tell us the type of dwelling and household chores of the ‘typical Australian’, yet they cannot tell us anything about the experience of being a ‘typical Australian’. This is where qualitative data—information which is contextualised by individual experience and analysed for patterns and shared experiences with other research participants—holds its own.
I think one of the reasons people do not find the algorithmic capture of micro practices—the search terms they use online, a comment on an article, the use of an app—concerning is that these are seen as generating big data: only numbers. This logic would have it that what makes us unique—our experiences, our movements and our sense of ourselves—is safe. But the collection and storage of big data should not sit outside stringent ethical and legal frameworks. Because, numbers or not, these data are our stories: our practices and our everyday lives. Recent research argues that we cannot ignore ethics or emotion in the face of a seemingly growing rationality. Similarly, ethical and research frameworks must be deployed to make sense of this new landscape.
As a social researcher who explores intimacy, relationships and parenting, I am always asked to demonstrate that my research will not have negative impacts on my participants, that I will anonymise data, and that I will adhere to all of the provisions outlined in the national human research ethics framework. Before any research takes place, a clearly argued plan, including ethical justifications, is presented to an ethics committee for scrutiny. Research takes a long time and is highly governed. And rightly so. Yet suppose I invented an app that encourages people to support one another through relationship issues. I would have access to a lot of data which I could ‘own’ and use to analyse relationship patterns: the most consistent problems, solutions, help-seeking behaviour and so on. My app’s users would not even know their stories or accounts were being analysed. To a researcher, this breaches our ethical code. Why should big data dodge research ethics? Are ethics and legislation overwhelmed by rapid technological advance?
The growth of big data is a challenge for our times, but it is not an ahistorical conundrum. Big data’s reach extends not only to the academic community, which must ask who ‘owns’ data, but to our daily routines and communications, which are influenced and informed by its pull. There are many historical examples of unwilling participation in research, such as the Tuskegee experiment, in which people with syphilis were left untreated, and the Nazi experiments on hypothermia; ethical codes of research therefore highlight the importance of willing participation: consent. Informed consent involves the potential participant weighing up the risks and benefits of taking part in research. Aside from the broader question of whether numerical measurements speak to the experiences of individuals, big data ought to be conceptualised as research and held accountable to the same codes and practices that ethical researchers adhere to in our contemporary world. But technology is fast, and ethical and legal frameworks are slower. Until they catch up, just remember: your information is not yet your own.