Data Duality

Technology has great potential to better our lives and further humanity’s goals. Nearly everyone walks around with a smartphone and can’t imagine how we ever functioned without it. Now we’re entering the age of wearable technology, which seeks to make the interaction between man and machine seamless. The technology we rely on serves a dual purpose: while making our lives easier, it also records our existence through a complex array of sensors. It then transmits that information through what I like to call the “datosphere,” an invisible field of ones and zeros that carries information to a convenient corporate storage center. In this paper, I will explain what happens to our data and the ethical concerns it poses.

Every time we use one of our smart devices, we generate a stream of data. There’s the data from the intended action, but also the byproduct information collected during that action. Generally, this is referred to as metadata. The best example is when we take a picture with our brand-new phone. The data we intend to gather is what makes up the picture itself and is necessary to recreate it digitally. The metadata collected can range from something as simple as the phone model to the location and time of day the picture was taken. The data is then collected, categorized, and stored for analysis. This process and information have all been placed under the label of Big Data.
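To make the split concrete, the picture-taking example can be sketched as a simple data structure. This is only an illustration; the function and field names here are hypothetical, not taken from any real camera API.

```python
# Toy illustration of the data/metadata split: the pixels are what the
# user intends to create, while the metadata is the byproduct recorded
# alongside them. All field names and values are invented for this sketch.

def capture_photo(pixels, device, latitude, longitude, timestamp):
    """Bundle the intended data (pixels) with its byproduct metadata."""
    return {
        "data": pixels,          # what the user meant to capture
        "metadata": {            # what the device quietly records
            "device": device,
            "location": (latitude, longitude),
            "timestamp": timestamp,
        },
    }

photo = capture_photo(
    pixels=[[0, 255], [255, 0]],   # stand-in for real image data
    device="Phone Model X",
    latitude=40.7128, longitude=-74.0060,
    timestamp="2019-03-01T14:32:00",
)
# The picture is the data; the location and time travel with it anyway.
print(photo["metadata"]["location"])
```

Even in this toy form, the point is visible: deleting the picture’s pixels would not erase the record of where and when it was taken.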

Andrea De Mauro et al. define Big Data as “the Information assets characterized by such a High Volume, Velocity and Variety to require specific Technology and Analytical Methods for its transformation into Value.” Without the jargon, this means that once a large amount of valuable data has been collected, it’s analyzed to determine trends and suggest profitable ways the data can be used. For example, let’s say you google popular camping sites near you. Your search is recorded and the data analyzed. The next day when you go online, don’t be too surprised by how many ads you see for camping gear.

When we add wearable tech into the mix, things start to get more personal. Devices such as smart watches, fitness trackers, and fabrics woven with biosensors can track your heart rate and rhythm, location, activity level, and respiratory rate. The question is, what happens to all the collected data? All the very personal, intimate data. Gicel Tomimbang explains that data generated by wearables is unprotected because it is legally classified as consumer-generated data rather than patient-generated data. However, wearable technology gathers the same type of data as a cloud-connected pacemaker, so why isn’t it afforded the same protection?

The culmination of these efforts can be seen in the new health-tracking app Attain, created by Apple and the insurance company Aetna. While the Apple Watch collects data through a series of sensors, Attain records the information and suggests how the user can improve their health. Both Apple and Aetna then have access to the stored data. While both companies have announced their dedication to user privacy, we don’t know what is actually done with the data. By 2020, the Big Data industry is expected to generate $34 billion.

Selling consumer data is a growing industry with little regulation. The main condition is that the data must be deidentified; essentially, our names must be erased. However, in 2018 The New York Times questioned just how hard identifying someone from this data really is. Valentino-DeVries et al. report that they were easily able to identify a 46-year-old math teacher from her supposedly anonymous location data.
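The re-identification risk described above can be sketched in a few lines. The records, names, and coordinates below are entirely fabricated for illustration; the point is only that stripping a name does nothing when a person’s location pattern, such as a home/work pair, is already unique to them.

```python
# Toy sketch of re-identification: the dataset has no names, only an
# opaque user_id plus location patterns. An attacker who already knows
# where a target lives and works can link the "anonymous" record back
# to a real person. All data here is fabricated for illustration.

deidentified_pings = [
    {"user_id": "a1f9", "home": (40.71, -74.00), "work": (40.73, -73.99)},
    {"user_id": "b7c2", "home": (40.65, -73.95), "work": (40.70, -74.01)},
]

# Outside knowledge the attacker holds: a known person's home and work.
known_people = {
    ((40.71, -74.00), (40.73, -73.99)): "Jane Doe, math teacher",
}

def reidentify(ping):
    """Return a matching real identity for a 'deidentified' record, if any."""
    return known_people.get((ping["home"], ping["work"]))

for ping in deidentified_pings:
    name = reidentify(ping)
    if name:
        print(ping["user_id"], "->", name)  # the erased name comes right back
```

The design point is that deidentification removed a column, not the information; any sufficiently distinctive combination of remaining fields acts as a fingerprint.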

Ethically, this creates several challenges. Companies argue that by using the devices, users have provided consent. However, without being fully informed of what our data will be used for, and without being able to decide who can use it and for what purposes, is this really consent at all? Furthermore, if the data being gathered is the same type of information gathered by medical devices, why isn’t it provided the same protections? The more we explore Big Data, the more ethical unknowns we face.

The use of Big Data poses a complicated moral dilemma. By exploring how Big Data is used and the ethical problems associated with it, we can begin to determine what form, if any, regulation of personal data should take.

Sources:

  1. Farr, Christina. “Apple Watch Can Now Detect Your Irregular Heart Rhythms and Other Problems: Here’s How It Works.” CNBC, CNBC, 7 Dec. 2018, www.cnbc.com/2018/12/06/apple-watch-ecg-sensor-review.html.
  2. Farr, Christina. “Apple and Aetna Are Teaming up on a New App to Help Track and Reward Healthy Behavior.” CNBC, CNBC, 29 Jan. 2019, www.cnbc.com/2019/01/28/apple-aetna-team-up-on-attain-health-tracking-app.html.
  3. Hill, Charlotte. “Wearables – the Future of Biometric Technology?” Biometric Technology Today, vol. 2015, no. 8, Sept. 2015, pp. 5–9., doi:10.1016/s0969-4765(15)30138-7.
  4. Mauro, Andrea De, et al. “A Formal Definition of Big Data Based on Its Essential Features.” Library Review, vol. 65, no. 3, 2016, pp. 122–135., doi:10.1108/lr-06-2015-0061.
  5. Tomimbang, Gicel. “Wearables: Where Do They Fall within the Regulatory Landscape?” Wearables: Where Do They Fall within the Regulatory Landscape?, International Association of Privacy Professionals, 22 Jan. 2018, iapp.org/news/a/wearables-where-do-they-fall-within-the-regulatory-landscape/.
  6. Valentino-DeVries, Jennifer, et al. “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret.” The New York Times, The New York Times, 10 Dec. 2018, www.nytimes.com/interactive/2018/12/10/business/location-data-privacy-apps.html.