The three dilemmas of data gathering
December 21, 2017
What companies do with customer data needs to be considered more closely.
“Alexa, order more tissues.”
“Siri, set a reminder to phone the doctor.”
“OK Google, turn on the light in the hallway upstairs.”
After an initial phase of euphoria about how the digital world, with virtual assistants among other gadgets and services, improves our daily lives, we are slowly but surely coming back down to earth. Post-honeymoon, the breakneck speed of digital development has raised a growing number of concerns. The societal debate is now focused, in particular, on our right to privacy and the increasing market power of the digital giants.
European politicians have meanwhile woken up. After a somewhat premature Dutch law limiting the cookies companies could install on computers, the European Union will implement the General Data Protection Regulation in 2018, providing European citizens with more transparency and control of their own data.
The activities of Google, Facebook and the rest of the Big Five are also now under critical scrutiny on several fronts, including market power, payment of taxes and news distribution, resulting in probes and fines from the EU and, in the United States, executives appearing before Congress.
Hopefully, over the medium term, this will lead to a world in which we have much more control over our own data, where those data are put to better use and digital competition is fair, rather than one in which the internet giants decide what's best. However, this is just the beginning: besides privacy and the market power of the Big Five, there are three other ethical dilemmas around data which have barely been addressed so far and which firms should be thinking about if they want to avoid a crisis of trust.
Buying data
The first question is: How far can you go when buying data? For example, car insurance companies offer discounts if people share their driving behaviour or movement data. In the United Kingdom, drivers can install trackers in their cars that log information about their driving in exchange for lower insurance premiums. At first glance, this seems like a fair deal, as the drivers who are more reckless pay more. But in practice, trackers may mean that people on lower incomes are economically compelled to disclose their entire digital life in order to get a discount or free services.
Digital privacy could become reserved for the rich as they would be the only ones who can afford both higher insurance premiums and a costly speeding ticket.
The end of solidarity?
A second dilemma is how far you can go in treating people differently as their digital profiles become more refined. Will people with a high hereditary risk profile still be able to obtain life or health insurance? Can you charge higher prices to customers who are less able to understand the offering at hand? Our solidarity with others, one of the mainstays of modern society, is likely to suffer once we have a completely transparent view of each individual's specific risks. This will have major implications for those amongst us who are less lucky in life.
Influencing behaviour
The third dilemma is: To what extent is it acceptable to use data to actively influence people's behaviour? We are already accustomed to seeing "relevant" advertisements, thanks to retargeting or remarketing, which greatly increase the likelihood that we will click the buy button. Another example is Netflix offering up yet another episode in the middle of the night. The debate about the impact of social media on our daily life is only just starting. The type of news we are exposed to can also be easily manipulated, as recent discussions around the filter bubble and "fake news" have shown. All these recommendations pushed under our noses are based on in-depth knowledge of our unconscious impulses and therefore seem to increasingly deprive us of free will.
Who decides what is enough?
For companies, it’s almost impossible not to take part in this data rat race. By now, every self-respecting firm has to have a refined strategy to tempt its customers into sharing as much personal data as possible. Big data analysts are recruited to crunch these data and rush to create ‘use cases’ or applications to earn more money from these insights.
Few companies can afford the luxury of an open debate around these three ethical dilemmas because market forces push them to play the game. There is therefore a strong risk that we will end up in a situation none of us finds desirable or, in economic terms: the pursuit of individual digital business interests can lead to significant negative externalities for society as a whole. People might feel forced to share data they do not want to share in order to survive. Solidarity could decline, and we could be nudged into doing things we do not really want to do.
In the coming years, limiting the digital debate to only privacy and market power will therefore not be enough. We must broaden the discussion and remember that the dilemmas concerning the gathering and use of data are too important to be left to companies alone.
Annet Aris is an Adjunct Professor of Strategy at INSEAD. She is also a board member of Thomas Cook PLC in London, ASML Holding N.V. in Veldhoven, ProSiebenSat.1 Media SE in Munich, ASR Nederland N.V. in Utrecht and Jungheinrich AG in Hamburg.
Annet was named one of the 50 most inspirational women in the European technology sector for 2016 by Inspiring Fifty. Marking her position as an important role model, she is a permanent member of the Inspiring Fifty: Europe Hall of Fame.
This article is republished courtesy of INSEAD Knowledge. Copyright INSEAD 2017.