Holiday reading roundup: How the future looked, before the pandemic
As far back as mid-March, people were suggesting that the best thing to do with 2020 was hit the fast-forward button and move on swiftly to 2021. In the long slog since, endless Zoom calls and panels have explored the kind of future we might want to build, as and when we can. This year’s book reviews wrap-up therefore focuses on futurist titles, even though all of them were written before SARS-CoV-2 reared its ugly protein spikes.
The countries that have done best in this crisis have been those that benefited from recent epidemic experience. Their prompt response may be what David Weinberger, co-author of the well-known The Cluetrain Manifesto, means when he writes in Everyday Chaos about a “normal chaos” that looks positively restful compared to our present situation.
Weinberger begins with the complexity hidden behind the most mundane operations — a short drive in a car during which you pull over to let an ambulance past, for example. Even such common events defy our basic assumptions: we think we understand what’s happening, that physical laws determine outcomes, that we can exert control by doing the right things, and that effects are proportional to their causes. Then machine learning and A/B testing blow these up, and people stop caring so much about why, shifting instead to doing what the data says. The book charts this fundamental shift from a world we thought we could understand, even if we didn’t yet, to a world we know we don’t understand, but can operate using machines as levers. ‘New tools’, Weinberger calls them, and he tells us to love the complexity.
A decade or so ago, participants at a futurist conference asked whether artificial general intelligence, correctly deployed, could solve climate change. Hopes like this led science fiction writer Ken MacLeod to coin the phrase “the Rapture for nerds”. In AI in the Wild, Peter Dauvergne assesses this idea more soberly: what, he asks, can AI and machine learning do for global sustainability?
On the plus side, machine learning tools will help improve the efficiency of, and eliminate waste from, all sorts of systems, from electrical grids to agriculture. On the downside, AI will obey the desires of the powers that control it, who will be motivated to hide its failures and costs. Dauvergne believes that AI will accelerate mining and extraction of natural resources, generate “mountains” of electronic waste, and “turbocharge consumerism” via its effect on advertising. Technology is a form of power, and power requires good governance. If we want technology to bring sustainability, we need to put in place the political and economic reforms to make it do so.
Over the years, the consultant and author David Birch has argued that identity is the future of money and that government-backed currencies will be supplemented by alternative currencies issued by communities. In his latest book, The Currency Cold War, he charts a course for digital currencies. Birch is not talking about bitcoin, which he thinks is more likely to simply pave the way for “new kinds of markets that trade in digital assets with no separate settlement”.
A key element of Birch’s prospective future is vastly more currencies — millions of them — than circulate today, some backed by private companies, some backed by governments of all sizes. The average consumer need not worry: apps and algorithms will take care of the conversions. The “cold war” of his title is the battle he foresees between nations seeking to take over the global currency function that the US dollar served in the 20th century. Unlike in the past, digital currencies will compete on speed and convenience.
If you believe, as Birch does, that these upheavals are inevitable, then it’s logical to consider how to manage the change. He proposes that the US and UK should develop a global digital identity infrastructure; create a global e-money licence; provide a digital diligence system that is an alternative to, and less exclusionary than, the know-your-customer (KYC) regimes operating now; and create new payment systems that work with all of these. As he says in the book, and has repeated at numerous events since its release, the push for government-backed digital currencies is not his idea: it comes from “serious” people like Mark Carney, the former governor of the Bank of England.
Even in ordinary times, raising children inevitably involves envisioning their future. In Parenting for a Digital Future, LSE academics Sonia Livingstone and Alicia Blum-Ross watch numerous real-life parents navigate the tricky, shifting digital landscape. The parents they meet — some the same ones they visited four years ago for Livingstone and Julian Sefton-Green’s The Class (2016) — all hope that digital technologies will give their children better lives, but are unclear about how this will happen at a time when two children in the same family, just five years apart, may be grappling with very different technologies.
Today’s 14-year-olds, for example, may choreograph video dances for TikTok, which didn’t exist in 2015 when, at that same age, their 19-year-old siblings were testing out Instagram filters…which in turn didn’t exist in 2010 when today’s 24-year-olds were deciding whether they preferred Twitter, Tumblr or Reddit. Today’s 29-year-olds grew up without smartphones and tablets. As Livingstone and Blum-Ross write, “The question was not just ‘What kind of future will my child have?’ but also ‘What kind of world will they live in?'”
In addition, today’s larger social context poses challenges today’s grandparents didn’t face: rising inequality, the concentration of wealth, the decreasing stability of jobs, and the loss of certainty that education will provide a secure career path. None of these is within any individual parent’s control, but most parents feel the digital world is, which pushes them in conflicting directions: take advantage of new digital opportunities, but limit screen time.
The authors conclude with a series of sensible policy recommendations: support parents; recognise their contributions within schools and educational institutions generally; and increase attention to the design and governance of the digital environment. But will anyone listen?
The suggestion that ‘privacy is dead’ automatically raises the suspicion that the speaker is the CEO of a large Silicon Valley company who wants privacy dead in order to protect his company’s business model. In Life After Privacy, however, US political philosopher Firmin DeBrabander is not that interested in either technology or business — he’s not even all that invested in whether privacy is dead or alive.
Instead, what DeBrabander is really asking is whether privacy is necessary for autonomy and democracy. Unlike thousands of privacy advocates all over the world, he answers ‘no’, even while charting the increasingly pervasive “surveillance economy” and our willingness to hand over intimate details. Privacy has always been endangered, he writes, and yet democracy survives. Rather than enabling democracy, privacy is a by-product of an effective democracy. He seems to mean this as a comforting thought: democracy will survive, even though our privacy is vanishing. A privacy advocate might counter that DeBrabander is quite the optimist, especially since he was writing before the 2020 US presidential election. The more usual observation is that allowing a surveillance framework to be built is dangerous because, if democracy fails, it will be available as a weapon for any police state that comes to power.
The ten years since open data was going to change the world have not been an easy ride. Data collected by government organisations for their own use has proved difficult for outsiders to understand and use. File formats are an issue. Gaps that feed historical bias into new uses and algorithms are an issue. The cost and resources required to maintain, clean, and update the data are issues. Solving these logistical problems takes long enough for the rest of us to forget the potential we imagined we’d be unlocking by now.
In the coffee-table-style book Data Action: Using Data for Public Good, Sarah Williams offers a guide to using data ethically and responsibly, copiously illustrated with both modern and historical data-derived charts, graphs, and other images. John Snow’s cholera map and William Playfair’s innovative 1786 graph showing England’s economic strength share space in the book with The Guardian’s counts of American police killings and machine learning analyses of satellite photographs.
Correctly used, Williams concludes, data can change how we see the world, thereby sparking policy change and civic action. Among her most important warnings: consider whether your planned use of the data will do more harm than good. Not a bad reminder with which to launch 2021.