Move very fast and break many things

Digital gangsters and the Big Other

WHEN FACEBOOK TURNED ten in 2014, Mark Zuckerberg, the founder and nerdish face of the social network, announced that its motto would cease to be ‘Move fast and break things’, and become ‘Move fast with stable infra[structure]’. For the company that had debuted on the Fortune 500 list at 482 the previous year, speed was still of the essence, but so was reliability.

This year, Facebook turns fifteen, sits at seventy-six on the Fortune 500, and resembles a ‘clumsy teenager…[who] gets into yelling matches with Dad’ and sulks ‘when caught doing something it shouldn’t’.[i] Its motto now embodies the awkward aspirations of a conscientious adolescent to ‘Bring the world closer together’. With well over two billion regular monthly users and revenue of more than US$40 billion last year, it has the capacity to do this in a way that has been unimaginable for most of human history.

In the process of amassing this extraordinary network of connections, wealth and power, much has been created; but much has been broken as well. Just how much is becoming clearer. Asking for forgiveness rather than permission has been the preferred modus operandi of the digital disruptors. They have plundered data that no one seemed to own, and intruded into traditional supply chains to offer new ways to do things (think Uber, or Airbnb) so that ordinary folk can make or save a few dollars.

As a result, our lives have become richer, our work more efficient, our access to information unprecedented. In the wake of all this, however, there is a trail of broken industries, regulations and conventions, and a rapidly growing number of legal actions. And now the lure of a new multi-trillion-dollar global data-trading industry is exciting governments and financiers.

Some would argue Facebook and its confrères in the age of FAANG – Facebook, Apple, Amazon, Netflix and Google – have put serious chinks in the future of democracy, transformed the nature of capitalism, undermined the capacity of the nation state, challenged long-standing values of trust and privacy, and are now working to change the very nature of what it means to be human.

As Eric Schmidt, executive chairman of Alphabet Inc. (the holding company for Google, YouTube, the Android operating system and many more) until 2018, observed some time ago: ‘The internet is the first thing that humanity has built that humanity doesn’t understand, the largest experiment in anarchy that we have ever had… More powerful than most realise, the world is profoundly altered by its adoption and success.’[ii]

We have become so accustomed to the utility and benefits that these companies provide – ‘the involuntary merger of personal necessity and economic extraction’[iii] – that it is hard to step back and make sense of the civilisational scale and scope of what is at stake. But there is a growing consensus that doing so is urgent, before the winners take all and unaccountably reshape our economic, social and political lives. A British parliamentary inquiry into fake news concluded in February 2019:

While the internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries an insidious ability to distort, to mislead and produce hatred and instability. It functions on a scale and at a speed that is unprecedented in human history.[iv]

This case is made with excoriating detail by Shoshana Zuboff in her magnum opus, The Age of Surveillance Capitalism (Profile, 2019), which goes beyond the oft-repeated concerns about scale and privacy in the Wild West phase of the internet. Zuboff argues that as Google, and then Facebook, sought to create financially viable businesses, they developed the tools to match, track and monitor the activity of their users and turn that data into money. In the process, they transformed capitalism and created a Big Other: an omnipresent aggregator and observer of the minutiae of our lives. In her view, this now threatens the norms of society and demands that, just as in previous periods of unchecked capitalism, something must be done.

In the same month as Zuboff’s book was published, those gathering in Davos at the World Economic Forum pushed data security (including AI and automation) up alongside climate change as one of the greatest global challenges, while Japanese Prime Minister Shinzo Abe, as host of this year’s June meeting of the world’s largest economies, declared, ‘I’d like Osaka G20 to be remembered as a summit that started worldwide data governance.’


FOR MOST OF their lives, the companies grouped under the FAANG banner have been regarded as the good guys, transforming the world as we know it. They opened new ways of engaging with friends and people we have never met, provided information with a click on a screen and created elegant, useful products. They entertained us, making our work more efficient, our lives easier and more interesting.

Those in the creative industries had a more jaundiced view of this transformation, as billions of dollars and tens of thousands of jobs in journalism, music, video and publishing evaporated into the digital mist. Jonathan Taplin detailed this in Move Fast and Break Things (Pan, 2017), which described the personal and social cost of the decimation of these industries.[v] The biggest-ever transfer of wealth out of the cultural sector[vi] relied on taking content and giving it away, while the new gatekeepers levied an invisible toll. At the same time, we readily posted our own content – which others turned into money. The New York Times has estimated that we provide fifteen million hours of free labour every year to fill these platforms.

The utopian promise embodied in Google’s motto ‘Don’t be evil’ helped fuel the early excitement about what the internet might offer – and the potential in that promise seemed to most to be a reasonable trade-off. The algorithmic blur and network of connected data centres made it virtually impossible to know what was in the brew that made it happen, and the use of our data was virtually invisible. But it seemed to work and we learnt to trust.

Then in 2018, the very real limits of the utopian vision became manifest. The companies that had been gestated in warm libertarian fluid revealed their greedy, uncaring souls, provoking a public backlash and even challenges from their employees.[vii] Trust began to evaporate; people no longer so readily believed what they saw online. They began to worry about the consequences of lost privacy, and many closed their accounts.

As Nicholas Carr, the author of The Shallows (Norton, 2010), recently wrote, ‘Once it embraced surveillance as the core of its business, Google changed. Its innocence curdled, and its idealism became a means of obfuscation.’[viii] The ‘digital exhaust’ the FAANGs traded and matched every time we turned on a smartphone, hit a keypad, sent an email, searched the internet, passed a surveillance camera, checked a map, bought a book, booked a hotel, downloaded a movie or uploaded images of a celebration became more than a huge number on a balance sheet: its value, and its potential for exploitation, started to become personal. Why, after you sent an email to your mum discussing a possible island holiday, were you blitzed with ads for tropical retreats? How did they know who would be susceptible to particular brands of fake news? Why did teenagers get ads for pimple cream the day before a big party? Could our choices really be predetermined and shaped by those who own the menu?

In the trade-off between innovation and privacy, most had been prepared to accept the non-negotiable terms the companies proposed in their incomprehensible, overlong user agreements; to click ‘accept’ and move on. We were also complicit in this exchange, compromising the privacy of others, and too often behaving in an uncivil manner. Tim Berners-Lee, who is often described as the father of the world wide web, sought to address this when he issued a manifesto in October 2018 based on the principle of ‘personal empowerment’ by really owning your own data.[ix] As Tim Cook, the CEO of Apple, said at Davos, ‘Right now all these secondary markets for your information exist in a shadow economy that is largely unchecked – out of sight of consumers, regulators and lawmakers.’

Complacency about this shadow economy was itself disturbed last year as a series of political scandals rocked the FAANG’s foundations and provoked reluctant national regulators into action: the future could no longer be left to those who were creating the new world and making up the rules as they went. Cambridge Analytica’s misuse of the Facebook data of about eighty-seven million people, including to influence the Brexit referendum in the UK, was the first serious demonstration of the capacity of Facebook to move on and break really precious things: laws, conventions, institutions. Then Facebook’s role as the unobservant host of lucrative Russian propaganda and fake news designed to influence the 2016 US presidential election became clearer. Owners and executives were called before parliamentary inquiries on both sides of the Atlantic, and those who turned up promised to do better, to develop codes and community standards, and to employ humans to ensure compliance.


IN MAY 2018, the European Commission adopted a new privacy standard, the General Data Protection Regulation (GDPR), which limits the data that can be collected and requires disclosure of its use. The response was swift: no doubt you, like me, received countless emails asking you to accept updated privacy terms. If you bothered to read them, you would have found that they set out clearly what data was being collected and for what purpose – although the scale and scope remained hard to comprehend.

For many of the FAANG companies that have taken advantage of generous tax incentives to use Ireland as their European base, the GDPR demanded a swift recalibration. Google pushed responsibility for compliance back to those whose content found a home on its platforms and in its search engine. Facebook effectively moved the data of 1.5 billion of its non-European users from Dublin to the US by changing the fine print in its user agreements, sidestepping the GDPR, while other US sites ‘went dark’ rather than be subjected to the new regulation. EU officials complained that these companies were trying to avoid the regulation by gaming the system. Nonetheless, dozens of legal actions were filed, and huge numbers of people requested their digital data, which the companies were required to provide within thirty days.

The following month, the Council of Europe adopted a modernised Convention on Data Protection, and the Californian legislature, home to the majority of the FAANG companies and associates that have flourished in their shadow, introduced stronger data protection and privacy laws to take effect in two years. Tim Cook fired an opening shot from behind enemy lines, declaring with bravado that ‘our own information from the everyday to the deeply personal is being weaponised against us with military efficiency’.[x][xi]

In October 2018, Facebook was fined £500,000 for breaching the privacy of more than a million British citizens by allowing Cambridge Analytica to access their data, and failing to correct or address the misuse. Had this occurred under the new GDPR regime the fine would have been much more: 4 per cent of annual global revenue, or roughly US$1.6 billion. In January 2019, the French data regulator CNIL fined Google €50 million for failing to make its data policies clear enough to users; in February, the German Federal Cartel Office ordered Facebook to stop matching the data of its users with Instagram, WhatsApp, Spotify, TripAdvisor and many other embedded apps.

The European Commission had already demonstrated a willingness to impose significant fines for anti-competitive practices. In July 2018, Google was fined €4.3 billion for bundling its search engine and Chrome apps into the Android operating system; the year before it was fined €2.7 billion for manipulating search results to direct outcomes to its related businesses; and in the same year Facebook was fined €110 million for knowingly breaking its commitment to not automatically match WhatsApp and Facebook user data. Inevitably, these actions are then subject to protracted legal appeal, but as an earlier generation of tech companies – including Microsoft and Intel – found, breaching European anti-competition regulations is a costly and distracting business.

In 2014, the European Commission introduced a precursor of the GDPR that included the right to be forgotten if information published online was wrong, ‘inadequate or no longer relevant’. Nearly 800,000 people have requested that Google exercise this right in relation to their data held on three million websites – a request that Google has granted in nearly half of the cases. This pattern is repeated with many companies and organisations that generate content and is raising real issues as people seek to rewrite their past by demanding that sometimes even truthful, but damaging, information about them be removed from the online record. Not surprisingly, the FAANG companies and others that produce content and collect and collate data are building strong legal, corporate and governance divisions to defend their businesses.

In the battle to establish the rules of the game of what Tim Cook calls the ‘data-industrial complex’, Apple – the largest of the FAANG companies, ranked fourth on the Fortune 500 – is playing the unlikely role of white knight. Early this year, it blocked Facebook employees from using the company’s internal apps on their iPhones after Facebook breached the privacy terms of its user agreement by paying children to collect and use their data. When Apple discovered Google staff were also abusing the agreement, they cut off access to Gmail, Google Maps and a staff app used to order coffee. The access was later restored, but Apple demonstrated that even in the virtual world things matter. With more than 900 million iPhones in the hands of customers around the world, Apple has unique power (even as it prepares to move increasingly into the non-material world, predicting that most of its future growth will come not from hardware but from the services it sells – with a margin of more than 60 per cent).


SHOSHANA ZUBOFF HAS studied the technology industry for decades. As one of a handful of female professors with an endowed chair at the Harvard Business School, she had a box seat from which to observe the transformation of American business that has occurred as an economy shaped by corporate mass production gave way in the last decades of the twentieth century to one in which finance was dominant. Now an emerita, she believes we have entered a new era, which she calls ‘surveillance capitalism’, under the watchful eye of the Big Other. This is made possible in a world where almost half of the world’s population is ‘computer mediated in a wide range of daily activities far beyond…work’. The market that is central to modern capitalism and was once thought to be the product of invisible forces is now transparent, granular and knowable – even if the methods are opaque. The essential resource of the information economy is data that can be extracted, repurposed, matched and applied ‘to predict and modify human behaviour as a means to produce revenue and market control’.[xii]

This raw material comes from four key sources: economic transactions; sensors in bodies, things and places; government and corporate databases; and public and private surveillance cameras. This is then supplemented with the ‘data exhaust’ we generate with every click, everything from a Facebook like or Google search to photos, page views and movements.

As Zuboff notes, ‘Once we searched Google, now Google searches us. Once we thought of digital services as free, but now surveillance capitalists think of us as free.’

Match it all together and rich insights into human behaviour become clear. Layer database on database and the predictive power can be used for market research to check, shape and reward behaviour. The scale is extraordinary and could produce a population-wide model of crude behaviour modification, like gold stars on the Big Other’s fridge chart. With so many data sets bristling with information and bumping into each other, the old protection of anonymisation is no longer sufficient to prevent uninvited intrusions into the fabric of our lives, as the UN Special Rapporteur on privacy, Professor Joseph Cannataci, said when he visited Australia last year.

When the Chinese government began trials of its social credit system, which it plans to implement nationally in 2020, it demonstrated its ability to use billions of interactions every day on WeChat and Alibaba to monitor, reward and punish people on the basis of countless micro transactions: wise purchasing decisions and compliant behaviour could produce a reward, such as a free holiday or discount prices; bad decisions such as critical commentary about the state could see privileges withdrawn, and jobs and places in good schools disappear. The trade-off for greater efficiency and more reliable services comes at a very clear price, but early reports from China suggest that most people are prepared to accept the deal in a society that has a different cultural understanding of privacy and a virtually unbroken history of authoritarianism.[xiii]

In the West, we are horrified by this intrusiveness, yet we accept another form of intrusion – as long as it’s invisible. The potential for intrusion has now stretched well beyond the sale of advertising to match our demonstrated interests as revealed from the data exhaust we leave behind or crude matching of our demographics. By adding government-held data to the mix, it becomes even more lucrative.

Surveillance capitalism is now embedded in every other business, and financiers, lawyers and regulators are beginning to shape what the multi-trillion-dollar global data trade might look like, where it might be hubbed and how it might be regulated. In 2013’s Open data: Unlocking innovation and performance with liquid information, McKinsey declared that the global business was already worth US$3–5 trillion. Danny Gilligan, managing partner of Reinventure Group, a corporate venture capital company that counts Westpac as its largest strategic partner, recognised the opportunity and, in a report entitled Global Data Wars, sketched how Australia could play a leading role in this new industry and usurp Singapore as the regional frontrunner.


IT IS HARD to imagine what this might mean in practical terms, so it is helpful to consider this 2014 boast by Google’s chief economist Hal Varian:

If someone stops making monthly car payments, the lender can instruct the vehicular monitoring system not to allow the car to be started and to signal the location where it can be picked up. Insurance companies can rely on similar monitoring systems to check its customers are driving safely and thus determine whether or not to maintain their insurance or pay claims.

In a world where products are run by computers more than mechanics, control is two steps away from the user, and ownership is more like a lease. As Zuboff comments:

Surveillance capitalism is no more limited to advertising than mass production was limited to the fabrication of the Model T Ford. It quickly became the default model for capital accumulation in Silicon Valley, embraced by every start-up and app. It was a Google executive, Sheryl Sandberg, who played the role of Typhoid Mary bringing surveillance capitalism from Google to Facebook.

Zuboff shows how this is no longer restricted to the internet sector. Every corporate and government activity and service is touched by it. As she says, ‘Nearly every product or service that begins with the word “smart” or “personalised” is simply a supply chain interface for the unobstructed flow of behavioural data on its way to predicting our futures in a surveillance economy.’[xiv]

The data that informs these assumptions is derived from our electronically mediated actions, but we rarely materially benefit from its aggregation, extraction and sale; yet the average data aggregator earns an estimated US$10,000 a year per user. Jaron Lanier, a Silicon Valley critic and author of Ten Arguments for Deleting Your Social Media Accounts Right Now (Henry Holt, 2018), argues that we should be paid for the use of our data, and be prepared to pay for services that do not collect our data. Although people are increasingly concerned about the misuse of their data, the corporate approach of asking them to pay to prevent that data being used elsewhere is not gathering much support and, not surprisingly, there are few companies offering to pay to use our data. As one reviewer commented, ‘Lanier’s solution is wildly inadequate to the immensity of the problem. Refusal to participate in digital services that don’t charge their customers is little more than a juice fast for the social media damaged soul.’[xv][xvi] That is part of Zuboff’s critique: it is virtually impossible for us to own our data; it is a concoction created by others and sold back to us in the name of personalisation (like Netflix and Amazon making suggestions for your entertainment).

The emerging world of matched and modified data services is replicating the approach pioneered in the early days of the age of FAANG, in which once unimaginable goodies were made available apparently for free. Now apps are being developed for social good: to prevent and monitor domestic violence, to ensure disadvantaged children receive nutritionally appropriate food, to ensure traffic moves more easily and emergencies are better managed. But as we have seen, it may not be long before the data is used to lock the door of a smart fridge to ensure that the pre-diabetic member of the family cannot open the freezer and take another ice-cream, or for other much more overtly commercial purposes. Human agency is being removed from the individual and given to the data aggregator, with little insight into how or why these conclusions are reached.

Zuboff was one of the early sceptics of technological determinism, arguing that the way technology was developed and deployed was a social construct, one which people made and, if they paid attention, could change. She now argues that after two decades in which surveillance capitalism has flourished unimpeded, reclaiming human agency is essential if digital technology is to be used for the benefit of all, not just the few.


THE CELEBRATION OF the seventieth anniversary of the Universal Declaration of Human Rights in December 2018 was a moment to reflect on what had been achieved as a result of one of the most ambitious documents in history. With the conflagration of the Second World War an intense and bitter memory, the determination to codify essential rights was a remarkable testament to human resilience. The Declaration created a framework for international law and domestic politics, and even when its articles were honoured in the breach the intention was clear and broadly applicable. These seventy years comprised one of the healthiest, richest, most innovative and peaceful periods the world has ever known, when a rising tide transformed the lives of billions of people – generally for the better.

Reflecting on the Declaration in the context of the early twenty-first century is an interesting exercise, especially in a country like Australia, which has still not formally ratified a document it was instrumental in crafting. What would a declaration drafted now look like? With the burning awareness of climate change, an updated Declaration would surely ascribe rights to the Earth itself. A BBC survey showed that most people thought access to the internet was an essential human right. Yet Margaret Atwood, whose dystopian novel The Handmaid’s Tale (McClelland & Stewart, 1985) was translated to the screen as a bitter metaphor for the times, wonders if the very idea of what it means to be human has become a matter of contention as artificial intelligence and robotics disrupt the nexus between mind and body.

The document adopted in 1948 had a long intellectual and political history, grounded in Western philosophy, the value of the person, rule of law and political, economic, social and cultural rights. The commitments to privacy, freedom of thought and expression in the physical world are a good starting point for the virtual world. In his 2018 PEN HG Wells lecture,[xvii] Dave Eggers – author of the dystopian novel The Circle (Knopf, 2013), set in an internet company run by ‘three wise men’ – argued for greater and more effective regulation as well as a need to accept personal responsibility for our own behaviour online.

With a few amendments to the Declaration, he also wrote, it would be possible to ensure that the anarchic digital world was reshaped with recognition of rights, public order and general welfare. This could create a ‘latticework of rights and responsibilities of all humans – one that might keep us responsible to each other and invested in our mutual wellbeing’.[xviii]

Most important of all, though, is that we need to ensure that humans in the twenty-first century will be allowed to enjoy analogue lives… When any government service requires the ownership of a smartphone to access basic services, then their rights are being compromised… So we must put the brakes on moving every last element of our lives into the digital realm. We must ensure humans can live offline as much as humanly possible.


CENTRAL TO A twenty-first century Declaration would be a new right to understand. So much that happens in an algorithmic bubble is not fully understood, even by those involved in crafting the formulas that make the system work. This new level of complexity and opaqueness makes the machine age seem simple – as when a knee bone is connected to a hip bone, a hinge to a door, an inlet valve to an outlet device.

Now the interconnections and network effects of distributed systems, complex computing and electronics make most things we rely on every day virtually incomprehensible. The process by which Netflix delivers a popular television show to the hundreds of millions of people who choose to watch it on screens around the world requires a degree of sophistication, capacity and co-ordination that would not only have been unimaginable a decade ago, but physically impossible. In the post-mortems after the global financial crisis, even the most numerate fund managers admitted that the highly technical financial products that had brought the world’s financial system to its knees were almost impossible to comprehend.

James Bridle is a London artist working at the interface of computing and creativity whose work includes New Dark Age: Technology and the End of the Future (Verso, 2018). He is particularly concerned about the consequences of this mass incomprehension:

Today we hear a lot about the benefits new technologies such as artificial intelligence and mass automation will bring to our lives, but even those tasked with constructing them have little understanding of the effects they will have on our societies. Lack of understanding – the feeling of being lost and powerless in the world – leads to fear, apathy and rage. It’s hardly surprising then that…those are currently the dominant emotions felt across the globe.

The right to understand, then, would be a useful addition to what is expected for us today… Only through mass understanding…might we hope to get a firmer grip on an increasingly strange and inscrutable world.[xix]

In the past, trust was the ineffable but essential ingredient that underpinned human relations and commercial markets. It was essential to the embrace of the FAANG companies, when we clicked without reading the terms of agreement. Contracts have long been a way of mitigating the uncertainty and unreliability of human exchanges: if one was not honoured, there would be consequences. In a data-driven world, Shoshana Zuboff argues that the ease of depending on computer-mediated monitoring to check that something is done ‘eliminates the need for and therefore the possibility to develop trust’. It creates ‘an arid wasteland’. She argues, ‘Consensual participation in the values from which legitimate authority is derived, along with free will and reciprocal rights and obligations, are traded in for the universal equivalent of the prisoner’s electronic ankle bracelet’:[xx]

These [computer-mediated] arrangements describe the rise of a new universal architecture existing somewhere between nature and God that I christen the Big Other. It is a ubiquitous networked institutional regime that records, modifies and commodifies everyday experience from toasters to bodies, communication to thought, all with a view to establishing new pathways to monetisation and profit. Big Other is the sovereign power of a near future that annihilates the freedom achieved by the rule of law. It…supplants the need for contracts, government and the dynamism of a market democracy.[xxi]

Tim Cook argued at the Davos forum – where the prospect of a new trillion-dollar data economy was getting the competitive juices flowing through the arteries of many of the richest companies and countries in the world – that this bleak future could be tackled by a global commitment to adopt four key principles.

First, the right to have personal data minimised. Companies should challenge themselves to strip identifying information from customer data or avoid collecting it in the first place. Second, the right to knowledge – to know what data is being collected and why. Third, the right to access. Companies should make it easy for you to access, correct and delete your personal data. And fourth, the right to data security, without which trust is impossible.

To many of those sensing a lucrative new opportunity, his proposals no doubt sounded like the way to kill the goose that was laying enough golden eggs to make Midas look like a pauper.

When Shinzo Abe’s Osaka G20 communiqué laying down the ground rules for global data governance is released in June, it will be helpful to keep these four points in mind before deciding if human agency has seriously embraced, or even acknowledged, the task of curtailing the excesses of surveillance capitalism.


ABOUT A DECADE ago, I had a meeting at Google’s appropriately hipster waterfront premises in Sydney. The company had given some money to an organisation with which I was associated, and I had been delegated to see if there was a chance of more money, further collaborations. The young Irishman responsible for the project was charming, but distracted. He was focused on a presentation he was preparing for the Federal Cabinet on how Google could transform education in Australia. This was not long after the introduction of NAPLAN testing, when the promise of a national curriculum was generating considerable excitement, and education was popping up on management consultants’ spreadsheets as a lucrative new business opportunity.

For a company like Google, with apparently bottomless pockets and endless ambition, the potential was enormous. It could provide machines, software, data analysis, virtual classrooms, global connections – and capture a generation of children for its goods and services and way of seeing the world. Apple used to call the early adopters in schools and universities ‘Appleseeds’.

As I listened to the shape of the pitch, I could well imagine how it would excite the men and women sitting around the Cabinet table. Not only was it innovative, world’s best practice and cutting edge, but potentially someone else would pick up the bill. I then thought about the brilliant educators I knew who would have given their eye teeth for an opportunity to address the Cabinet and describe a new way of educating a generation of schoolchildren. They would have talked about values, research, inclusion, capacity and cohesion, but would never be given the chance.

The gap between computer-mediated education and chalk and talk was crystallised in a 2001 phrase: ‘digital natives’, those born after 1980. It was initially used to describe the gap between those who had grown up with computers and the internet, and their teachers. It did not take too long before this generation of young people became a marketing target for the new services and platforms the online world made possible. Digital natives became both a description and a slogan flung as an insult, and it spoke to mutual incomprehension. They read differently, learn differently, relate differently, use a different language – get used to it.

While this is no doubt true to some degree, the ability to think critically and the development of skills and knowledge remain important irrespective of the tools used. But the consequences of capturing and holding a generation of children by intervening in the education system can now be measured. The results are mixed: heavy use of social media has been shown to undermine wellbeing as well as to increase anxiety and depression. The methods designed to acculturate young people to a tech-enabled world, to be always on and to accept the values that inform its products and services, are taking a toll. As Grafton Tanner wrote earlier this year in the Los Angeles Review of Books:

Educational apps…have intensified the impulse to condition young people to act like adults as early as possible. Now students are actively integrated into a system that collects data about their behaviour, quantifies it and packages it for parents and the school itself… It teaches students to understand life as being inseparable from digital technology and it normalises both surveillance and the kind of isolating individualism that can cause mental illness.[xxii]

Shoshana Zuboff has an even darker reading of the phrase ‘digital natives’. She is not the first to describe the FAANG companies as imperialistic, modern-day colonialists: taking goods owned by others, claiming them, profiting from them, imposing their laws and modes of behaviour, just as the European conquistadors and conquerors once did.

Zuboff takes this analogy a step further, describing it as a ‘tragically ironic phrase’ and suggesting that the digital natives have a parallel with the first generation of Indigenous people displaced by colonial settlement. As she told The Observer:

Historians call it the ‘conquest pattern’, which unfolds in three phases: legalistic measures to provide the invasion with a gloss of justification, a declaration of territorial claims, and the founding of a town to legitimate the declaration… The first surveillance capitalists also conquered by declaration. They simply declared our private experience to be theirs for the taking, for translation into data for their private ownership… We were caught off guard by surveillance capitalism because there was no way that we could have imagined [the consequences of this] action, any more than the early peoples of the Caribbean could have foreseen the rivers of blood that would flow from their hospitality toward the sailors who appeared out of thin air waving the banner of the Spanish monarchs. Like [them] we faced something truly unprecedented.[xxiii]


MY CHILDREN WERE early adopters of Facebook. It was just a toddler when they were in their later years of high school. It swept their world like a virus; the contagion was swift and deep. To continue the Zuboff analogy, it was not unlike the impact of viruses on Indigenous Australians after the First Fleet arrived: everyone was touched and many succumbed.

As a parent of strong-willed adolescents, I watched with dispassionate interest: teenagers need their space, a zone without intense parental oversight.

It quickly became clear that this was where they shared their most intimate experiences, their fears and hopes, as well as the practical details of life and strikingly creative endeavours. Meanwhile, I listened to countless conversations about whether parents should seek to be their children’s friends on Facebook, and was in the camp that said, ‘No, give them their space.’ Of course, in this exchange, it was up to the teenagers who they would accept as friends, so a fear of rejection kept some of those who wanted to befriend their children online awake at night.

After a few years I decided that if I was to engage with the world as it was changing I needed to join, and so I did. By that time, if Facebook were a child it would have been preparing to go to kindergarten. I filled in the online form, wondered about providing my real birth date and agonised over which photo to choose. I eventually decided that a black-and-white shot taken not long before in the Roman Forum was the image I would like to present in this domain. It was a photo I could be proud of, but not quite how I looked in everyday life. I then forgot about it.

Some weeks later I returned to my page, and there on the screen were ads for wrinkle-reducing creams. I was aghast. There was nothing in the photo that suggested I needed such products. This was an early example of data-matched advertising. I saw the photo; they saw my age and possibly my expenditure on skin-care products and identified me as a target for wrinkle creams. If that was how they were going to treat me, I thought, bugger them. I signed off and didn’t return for years.

When I came back, I chose a photo that shows my hair and the back of my head. I don’t get ads for shampoo or conditioner; though as my hairdresser asks his Google Assistant to select music to play when I am there, the matching and selling may have just moved to a new level of sophistication. On the other hand, they may have written me off as a perverse and reluctant convert, although like seventeen million other Australians I access Facebook more than once a month – quite a lot more actually, several times a day. And I am friends with my kids, although they rarely post anything personal anymore. Like good digital natives they learnt the danger of collaborating with the invaders and have become sophisticated users of the tools, but cautious and protective of their rights.


OVER THIS DECADE, the value of Australian advertising online grew from virtually nothing to more than $7 billion a year according to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry: Preliminary Report, with nearly half going to Google, and a fifth to Facebook. Meanwhile, the classified advertising that once made Australian newspapers extraordinarily profitable fell from $2 billion to $200 million a year.

The ACCC and the Australian Human Rights Commission are now prominent among international regulatory agencies investigating how to ensure that our rights are protected in this new world – exploring modes of regulation and oversight to protect privacy, maintain trust and ensure access.

The ACCC has expressed caution about joining the European GDPR regime, as other non-European countries such as Japan and Brazil have done; but stronger privacy legislation will be essential. New consumer data-protection rules are an important step, but as the UN Special Rapporteur Professor Cannataci commented, the rights of the consumer cannot be conflated with the rights of citizens. Our rights as citizens are more fundamentally challenging to the surveillance capitalists, who will welcome interventions that make their businesses more efficient and profitable, but resist deeper change.

From my reading of the ACCC’s preliminary report and discussion papers, the fundamental structural change at stake – that this is not just another industry but one with the capacity to profoundly change everything – is beyond their purview. Addressing this will require much more than a new regulator, or the occasional digital detox, as the debate about the rushed-through legislation giving law enforcement agencies access to encrypted data has shown. The House of Commons select committee concluded: ‘We must use technology to free our minds and use regulation to restore democratic accountability [and] make sure people stay in charge of the machines.’

A controlled study released this year by researchers at Stanford and New York University, in which thousands of people deactivated their Facebook accounts for a month, found that those who detached were happier, less anxious and depressed, less informed but more open-minded. They used the extra hour of free time a day to spend with family and friends and to watch more television. But at the end of the study most were loath to remain offline, principally because of scale and the promise of connection.[xxiv]

After devoting much of the last decade to researching and writing her book, Shoshana Zuboff is clearly enjoying her moment in the spotlight, which has arrived with pitch-perfect timing as serious questions about the modus operandi of FAANG and their confrères are being asked. She does not deny the benefits of the digital domain, or its possibilities to improve and enrich lives, but she – like an increasing number of people around the world – is concerned that in its current manifestation it will do irreparable harm, benefit the few and create digital serfs of the rest of us. ‘In any confrontation with the unprecedented, the first work begins with naming…the first necessary step toward taming. My hope is that careful naming will give us all a better understanding of the true nature of this rogue mutation of capitalism and contribute to a sea change in public opinion, most of all among the young.’[xxv][xxvi]

25 February 2019


Books and reports discussed

Zuboff, S 2019, The Age of Surveillance Capitalism, Profile, London.

Zuboff, S 2015, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilisation’, Journal of Information Technology, vol. 30, pp. 75–89.

Lanier, J 2018, Ten Arguments for Deleting Your Social Media Accounts Right Now, Henry Holt and Co., New York.

Taplin, J 2017, Move Fast and Break Things, Pan, London.

Wu, T 2016, The Attention Merchants, Knopf, New York.

Carr, N 2010, The Shallows, Norton, New York.

Australian Competition and Consumer Commission 2018, Digital Platforms Inquiry: Preliminary Report, <>.

Australian Human Rights Commission 2018, Human Rights and Technology Issues Paper, <>.

House of Commons, Digital, Culture, Media and Sport Committee 2019, Disinformation and ‘fake news’: Final Report, <>.



[i] Benton, J 2019, ‘Happy Birthday Facebook!’, Nieman Lab, 4 February, viewed 24 April 2019, <>.

[ii] Taplin, J 2017, Move Fast and Break Things, Little Brown, New York.

[iii] Naughton, J 2019, ‘“The goal is to automate us”: welcome to the age of surveillance capitalism’, The Guardian, 20 January, viewed 1 February 2019, <>.

[iv] House of Commons, Digital, Culture, Media and Sport Committee 2019, Disinformation and ‘fake news’: Final Report, <>.

[v] Taplin, J 2017, Move Fast and Break Things, Little Brown, New York, p. 11.

[vi] Schultz, J 2016, ‘Australia must act now to preserve its culture in the face of global tech giants’, The Conversation, 2 May, <>.

[vii] Losse, K 2019, ‘The False Promise of Silicon Valley’s Quest to Save the World’, The New Republic, 7 February, viewed 10 February 2019, <>.

[viii] Carr, N 2019, ‘Thieves of Experience’, Los Angeles Review of Books, 15 January, viewed 24 January 2019, <>.

[ix] Berners-Lee, T 2018, ‘One Small Step for the Web…’ Medium, 29 September, viewed 12 February 2019, <>.

[x] Seetharaman, D 2018, ‘With Facebook at “War,” Zuckerberg Adopts More Aggressive Style’, The Wall Street Journal, 19 November, viewed 2 February 2019, <>.

[xi] Warren, T 2018, ‘Google fined a record $5 billion by the EU for Android antitrust violations’, The Verge, 18 July, viewed 6 February 2019, <>.

[xii] Zuboff, S 2015, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilisation’, Journal of Information Technology, vol. 30, pp. 75–89.

[xiii] Tisne, M 2018, ‘It’s time for a Bill of Data Rights’, MIT Technology Review, 14 December, viewed 19 December 2018, <>.

[xiv] Naughton, J 2019, ‘“The goal is to automate us”: welcome to the age of surveillance capitalism’, The Guardian, 20 January, viewed 1 February 2019, <>.

[xv] Weisberg, J 2018, ‘The Autocracy App’, The New York Review of Books, 25 October, viewed 15 October 2019, <>.

[xvi] Weisberg, J 2016, ‘We Are Hopelessly Hooked’, The New York Review of Books, 25 February, viewed 15 October 2019, <>.

[xvii] Eggers, D 2018, ‘The violations start with us’, The Times Literary Supplement, 18 December, viewed 12 February 2019, <>.

[xviii] Atwood, M, McKibben, B, Enright, A, Bridle, J, Eddo-Lodge, R, Cohen, J, Laing, O, Eggers, D 2018, ‘Human rights for the 21st century’, The Guardian, 8 December, viewed 4 February 2019, <>.

[xix] Atwood, M, McKibben, B, Enright, A, Bridle, J, Eddo-Lodge, R, Cohen, J, Laing, O, Eggers, D 2018, ‘Human rights for the 21st century’, The Guardian, 8 December, viewed 4 February 2019, <>.

[xx] Zuboff, S 2015, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilisation’, Journal of Information Technology, vol. 30, p. 81.

[xxi] Zuboff, S 2015, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilisation’, Journal of Information Technology, vol. 30, pp. 81–82.

[xxii] Tanner, G 2019, ‘Classroom Management: Simon Sinek, ClassDojo and the Nostalgia Industry’, Los Angeles Review of Books, 28 January, viewed 4 February 2019, <>.

[xxiii] Naughton, J 2019, ‘“The goal is to automate us”: welcome to the age of surveillance capitalism’, The Guardian, 20 January, viewed 1 February 2019, <>.

[xxiv] Crawford, K 2019, ‘Tuning out’, Stanford Institute for Economic Policy Research, 29 January, viewed 24 April 2019, <>.

[xxv] Naughton, J 2019, ‘“The goal is to automate us”: welcome to the age of surveillance capitalism’, The Guardian, 20 January, viewed 1 February 2019, <>.

[xxvi] Kulwin, N 2019, ‘Shoshana Zuboff on Surveillance Capitalism’s Threat to Democracy’, New York Magazine, 24 February, viewed 26 February 2019, <>.
