As a society, we are more connected than ever. With that connectivity come great opportunities, but also risks, many hidden in plain sight, such as those posed by the increasing use of Artificial Intelligence.
In this article, we will discuss some of the issues raised by today's addiction to convenience and instant gratification: a world where our data is constantly gathered, analyzed, correlated, and used in ways most of us cannot conceive of, perhaps exposing us to dangers we have never imagined. In doing so, we will use an analogy: the legendary deal between Faust and the Devil.
Like us, you have probably moved everything you can to the cloud, from your pictures and music to your money; it's all stored out there somewhere in cyberspace, and it's readily accessible to you and to the people you choose to share it with, all through your smartphone and your many other connected devices. You do so because it is convenient, you believe it is secure, and it fits your busy life. It all seems so effortless compared to how you did things a decade ago.
With little to no thought, and in exchange for instant access to your friends and your 'stuff' at any time of day or night and from anywhere, you readily hit the 'accept all' button on terms and cookie notices, or input your personal data on demand: after all, it's the only way to get what you want done, and everyone else does it. In doing so, however, you may unknowingly consent to providing a constant stream of data to third parties: your network operator, your favorite search engines, social networks, app providers, and an untold number of advertising services and buyers for your data.
In the legend, Faust is a scholar, a man at the pinnacle of his career, yet he wants more. In exchange for greater knowledge and magical powers, Faust makes a deal with the Devil (through his agent, Mephistopheles), selling his soul in the future for power now. With his newly granted powers, Faust can indulge every whim and learn all the world's knowledge. However, in the end, and as agreed, the Devil appears and claims Faust's soul.
For this article, we want to focus on one aspect of the legend—what Faust knowingly traded away in exchange for his version of instant gratification.
In our analogy, your digital "soul" (your data) is what you are bargaining away in exchange for free access and more convenience. But what are you giving away? Do you know what is happening to all that data? Do you appreciate the connection between your digital alter ego and your real-life wellbeing?
In many stories and myths, the cost of having wishes granted is high and appears in unexpected ways. As a consumer, you have little real control over how your data is collected, how it is used, and by whom. In truth, consumers all too often don't care, so long as their wishes are granted "free" and with little effort on their part. However, in this competitive commercial world, the organizations collecting that data may present you with choices that prioritize their needs rather than yours.
What's more, they use technology and psychology to formulate messages tailored for you, constantly whispering in your ear: that you need this product, that you should shop at that store, or that you should "buy now to avoid disappointment."
It all appears reasonable; they seem to know what you want (or should want) and feed on your insecurities and your desire to be happier, better looking, more popular, and so on.
Why care about how it works, as long as it does?
Despite the horror stories about data breaches and the misuse of personal data, most people have little idea of the feeding frenzy around their data and how much of it governments and businesses harvest. Even fewer know how that data is analyzed to form insights into who you are, what you do and why, and how those insights can be exploited.
For this article, Peter explored what a popular search engine provider 'knows' about him: he is a married, middle-aged male and a business owner, who is also interested in astronomy, web-hosting, science fiction, and about 150 other topics. However, the provider has also incorrectly concluded that he is interested in beauty services, coffee makers, country music, and about 50 other things. Two things worry us about this: 1) they know a lot about Peter, and 2) about a quarter of what they "know" is wrong (for now).
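As a rough sanity check of that "about a quarter" figure, the arithmetic works out from the approximate counts in the text (the exact category totals are our assumption; the provider's real numbers are unknown):

```python
# Rough check of the "about a quarter is wrong" claim.
# Counts are approximations taken from the article, not exact figures.
correct_topics = 150 + 4    # astronomy, web-hosting, science fiction, etc.
incorrect_topics = 50 + 4   # beauty services, coffee makers, country music, etc.

total = correct_topics + incorrect_topics
error_rate = incorrect_topics / total

print(f"Roughly {error_rate:.0%} of the inferred interests are wrong")
```

On these assumed counts, roughly a quarter of the profile is simply mistaken, which matters because, as discussed below, decisions are only as good as the data behind them.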
Bearing in mind that business decision-making (especially decision-making that uses artificial intelligence) is only as accurate as the data used to inform it, these organizations will make many incorrect assumptions and ill-informed decisions. Worse, Peter has no insight into how those decisions are made, nor any control over what data is collected about him and why (and if you have ever clicked "accept all" on a cookie banner, you are just as vulnerable as Peter and the rest of us).
In a previous article, we discussed the deliberate misuse of this power to manipulate consumer behavior—so-called “dark psychology.” Still, whether it’s done for good or ill, everyone needs to be careful about how this power is used. AI may eventually solve some of the most intractable problems of our age or create a whole host of new ones.
Like any innovation, artificial intelligence is a tool, a double-edged sword if you will; it is neither moral nor immoral, neither intrinsically good nor evil. That is down to the discretion of those who wield it. It is up to humans to set limits on where AI is applied and how it acts.
Suppose you teach an AI that only the 'ends' are essential. In that case, it will surely come up with some 'means' that are unpalatable to us (utilitarianism). As Facebook discovered, unconstrained AI will find the most efficient ways to achieve an end, but few humans would endorse those means solely because of their "efficiency."
According to behavioral economics, there are over 150 documented ways in which our logic lets us down (cognitive biases); because they are known, they can be anticipated and exploited (even manipulated). Marketers have been doing this for decades, and now organizations are teaching machines to use their vast computing power to focus on the individual, down to the level of personal feelings and motivations.
AI can now anticipate your mood, your daily movements, and even your need for healthcare, and make recommendations that directly influence your decision-making: which route to take as you drive home, which movie to watch, which product to buy, and perhaps even how you vote. But it is still humans who decide how, and whether, an AI is working well (at least for now).
Taking our analogy further, the 'Devil' (AI in the hands of unscrupulous big business or government) has tremendous insight and overwhelming power. You, by comparison, have very little; it is not a bargain between equals. Your data is the food and fuel that powers and informs these systems, and for the most part, today, we all 'agree' to give away our golden goose (to mix metaphors) for perceived convenience or social "popularity."
That doesn't mean you are powerless, just that you need to exercise your power over what is yours: your data. Moreover, doing so needs to become as simple and natural as walking without thinking, or stretching out your hand to grasp your teacup when you want a sip.
To achieve this, we must make a fundamental shift in our thinking and reset the frameworks we live by. For starters, we need to humanize the language of technology so that it is intelligible to all, rather than the incomprehensible jargon and technobabble it currently is. We must also redirect our investments toward more human and life-centric projects rather than profit alone. An excellent example of this is the EU's Next Generation Internet (NGI) initiative and the vital work being done by organizations like the W3C, ISO, the IEEE, and many other interest groups, where, thankfully, a great deal of progress is being made on ethical AI standards and related technology for good. Most importantly, we must put competitive greed aside and foster collaboration.
When we venture outside our silos and bring together our specialized viewpoints into a co-creation process that questions the limits of what is possible and resets the permissible, we will illuminate paths to prosperity otherwise hidden in plain sight. Like Lubna always says, “If you only saw red, and I only saw blue, how would either of us ever see purple?”
We know our analogy is not perfect, but individuals must consciously safeguard their power (their digital selves). Governments and enterprises also have a responsibility not to abuse their overwhelming advantage; if they won't do so voluntarily, we will need policy and regulation with the power to enforce it.
Let's be realistic: we can talk and ask all we want, but we cannot expect change to happen if we do not reset our definition of success. We cannot dream of planetary wellbeing while fiduciary responsibility is owed to shareholders rather than stakeholders, and corporate success is measured only in profit.
We must develop our socioeconomic and geopolitical systems on a broader front; the current models concentrate knowledge, control, and wealth in the hands of a few de facto superpowers. Otherwise, as the world becomes increasingly "smart" and connected, individuals are in danger of becoming irrelevant. We will be well served by shifting our thinking from 'what's in it for me?' to 'what's in it for us all?'.
We call on you, the reader, to get involved, think about what ‘accept all’ may mean, and exercise your right to decide how much of your data to share, with whom, and why.
Remember, just like Faust, be careful what you wish for and what you give away in exchange!
About Peter Dorrington:
Peter is the founder of XMplify Consulting and an expert in using data and behavioral sciences to lead transformation in Experience Management (XM).
Peter has focused on developing and using predictive behavioral analytics to understand why people do what they do, what they are likely to do next, and how businesses should respond. As the inventor of Predictive Behavioural Analytics, Peter is an internationally recognized expert in Customer Experience analysis. He is also an executive advisor and an award-winning blogger.
About Lubna Dajani:
Lubna is a pioneering information and communication technology innovator and design thinker with over 25 years of executive experience with multinational brands. A champion and role model for diversity, inclusion, and women in STEAM, she is a trusted advisor, board member, and mentor to social enterprises and accelerators, including SOSV and Springboard Enterprise. Lubna is also an active contributor to several standards and industry bodies, including IEEE, W3C, and the Sovrin Foundation.
Lubna is committed to applying technology, science, and the arts to elevate the human experience and regenerate planetary wellbeing.
Image by Tumisu from Pixabay