The following contributions offer deep, broad insights that represent the diversity of thought expressed by leading expert commentators in this canvassing.
If we develop guardrails, the core elements of democracy will be strengthened
Amy Webb, founder of the Future Today Institute, wrote, “There are too many variables in play to predict just one plausible trajectory for the future of our democratic institutions. If we enter a decade of synthetic media without restrictions, increased algorithmic determinism and financial incentives that favor competition over collaboration, the core strengths of our democracies will have eroded. Citizens will be more vulnerable to misleading information and will be served the kinds of content that capture their attention. However, if we develop guardrails, norms and standards now that encourage transparency, authenticity and collaboration, our democratic institutions could be significantly strengthened. I see movement along both trajectories.”
Ongoing “strategic distraction” and organized chaos lead to bitter partisan divisions
Technology has already revolutionized our notion of what democracy means. Barry Chudakov
Barry Chudakov, principal, Sertain Research, said, “By 2030 I expect democracy to still be caught in a dilemma: freedom vs. intrusion. Civil liberties will continue to be a fraught area with digital xenophobes on one side concerned that ‘others’ will seek to harm democracy and so any countermeasures are justified, and civil libertarians on the other side who will argue that the surveillance state has gone too far and pushed democracy toward Big Brother Panopticon totalitarianism. Technology has already revolutionized our notion of what democracy means. It used to mean one person, one vote. Now it means one device, one voice. Every voice will be heard via Twitter, Snap, YouTube, Facebook or Instagram. The question we will still be wrestling with in 2030: Who is this person? How will essential democratic institutions achieve authentication? The fundamental challenge to these institutions is – and will continue to be – identity. That is, the multiplication and falsification of identity, from which flows the falsification and distortion of information. At the same time, as we wrestle with confirming identity, democratic institutions confront the reality of the internet as a vast copy machine, where behaviors and attitudes can be mimicked and adopted like trying on a new shirt. What do we do when these behaviors and attitudes are reprehensible or downright evil? The copy machine remains, and we are left with our outrage – which is not enough. The ongoing threat to democracy is organized chaos. This strategic distraction deploys asymmetric information warfare to inflame social differences into bitter partisan divisions. At the same time, because artificial intelligence systems designed to engage with humans will collect and convey increasing quantities of data, these systems must be built on empathy for the ethical development and deployment of AI.”
“Our use of technology disconnects us from the local realities in which we live”
Douglas Rushkoff, well-known media theorist, author and professor of media at City University of New York, said, “I think the damage has already been done, or at least that the degree to which the public is misinformed remains fairly constant. Direct-mail campaigns from Republicans against John Kerry told voters that Kerry meant to take away their guns and Bibles. People in Czarist Russia were told that Jews conducted blood rites with murdered Christian children. It’s hard to see social media or deepfake videos doing much more damage. So, when I say things will stay about the same between now and 2030, I take into account that they’re already in pretty horrific shape. Democracy, as currently configured, isn’t working so well in America, and tech exacerbates certain problems while also correcting others. The main way that tech impacts democracy is more subtle than disinformation and Russian propaganda. Our use of technology disconnects us from the local realities in which we live. While TV may have misinformed us about what was going on in the non-local world, our digital devices often keep us from even engaging with the local world. We become de-socialized, less empathetic. Less capable of thinking civically.”
“There will be a lot of noise from politicians, not many solutions”
Mike Roberts, Internet Hall of Fame member and pioneer CEO of ICANN, said, “Among the effects of the internet on social discourse are 1) amplification of voices (often without enough thought behind them); and 2) a speeding-up of the action-reaction dimension of expression. We are currently in a phase of reaction to having allowed too much power to accrue to social media platforms. Consensus on remedies is difficult to achieve because of the factors noted above, and also because the problem itself is difficult to deal with. Perhaps the single most difficult aspect is moderation, i.e., censorship of expression – how far is too far, etc. We are lucky that the big platforms evolved in the U.S., with our history of First Amendment protections. So, bottom line, there will be a lot of noise, especially from politicians, not many solutions and not much overall movement.”
Innovation in civic technologies can possibly enhance social cohesion, equity and justice
Alexander B. Howard, independent writer, digital governance expert and open government advocate, said, “Democracies will look a lot like they do today: stable, peaceful and equitable in countries that succeed in maintaining good governance, sclerotic and messy in flawed democracies captured by corporate influence, and devolving toward authoritarianism, or outright dissolving into civil wars, in others. In the U.S., unless fundamental reforms have been enacted in some states that address money in politics, gerrymandering, government corruption and climate change, citizens will understandably remain skeptical about the meaning of their public participation in national elections, turning toward the endless rivers of infotainment and diversion instantly available on ubiquitous screens and projections. Many people will experience civic life through personalized feeds of infotainment from technology companies and media companies mixed with digital services and information from municipal, state and federal governments and updates from our friends and family. Government agencies at every level will have replaced retiring Baby Boomers with automated services, augmented with artificial intelligence, putting a high premium on algorithmic transparency, accountability and accessibility. Many more of the newspapers that play key roles in communities will be gone, and, despite the best efforts of state governments and foundations – and public media – radio and digital nonprofits won’t replace all of their civic function everywhere, creating news deserts. That void will be filled by the descendants of today’s social media platforms and media companies, which will gain more power in shaping both conversations and civic participation. At the same time, continued innovation in civic technologies will have the potential to enhance social cohesion, equity and justice when they are deliberately built and designed with the public they connect and empower, enhancing the capacity of journalists, watchdogs and whistleblowers to make institutions transparent and hold powerful people and organizations to account for abuses of power. The role of schools and libraries as community hubs for information access and civic life will continue to be critical.”
Our brains may not be capable of dealing with emerging technologies of manipulation
Juan Ortiz Freuler, policy fellow at the Web Foundation, wrote, “Technology will be leveraged to increase the number of issues on which citizens are consulted directly. People will have a chance to engage in a greater number of public issues and will have access to more information regarding issues of public interest and how the state operates. Yet, in parallel, the degree to which citizens are surveilled is already increasing. A further-developed surveillance infrastructure will allow governments to easily clamp down on any form of participation that could affect core interests. The ways in which coordination between private-sector companies and governments on national security issues takes place today suggests that ‘signals’ of potential future crimes might increasingly lead to state interventions before any actual crime is committed. Furthermore, if the current trend toward allowing the private sector to both consolidate and run black-box algorithms for personalization and content-curation continues, these companies will take greater control over the shaping of public opinion. We’ve seen this trend, from surfing across blogs to find lists of links, to search engines that deliver a curated list, to artificial intelligence assistants (Siri, Alexa, Cortana) that deliver one specific reply to a query. Developments in augmented reality and virtual reality promise to increase this control further by allowing the companies that develop the tech to embed tailored information in contexts our brains won’t be capable of distinguishing from the natural environment we evolved in over millennia.”
Dominance of digital overlords is devastating to journalism, small businesses, governance
When governments can flip a switch and turn the internet off, it’s hard to see how citizens stand a chance against repression. Andrew Nachison
Andrew Nachison, chief marketing officer, National Community Reinvestment Coalition, commented, “In the U.S. between now and 2030, I see a mix of government inaction and perpetual discord, and a mix of rising citizen activism and activation, enabled by clever and increasingly capable tech platforms, on the one hand, and widening despair, detachment and digital dropouts on the other. I worry that things will get worse, that inequality and corruption, which tech has done nothing to abate, will lead to violence and civil collapse. The dominance of a handful of digital overlords has brought us magical capabilities and services, like being able to search for information on nearly anything, or buy nearly anything you need, or keep up with friends, family and news, all with a few finger taps. But the costs have been devastating to local journalism, small businesses and governance. Facebook turns out to be the world’s most powerful engine for censorship and political manipulation, and there’s no sign it will do enough, on its own, to materially change itself. I also don’t know that breaking up the company will change much. Facebook doesn’t need Instagram or WhatsApp to be Facebook. Unless vastly stronger consumer protections are put in place to protect privacy, ensure transparency and put real control and economic benefit in the hands of content creators and users, Facebook will still be Facebook. Ditto for Google. But that’s just the U.S. story, which is similar in the UK but not everywhere. State censorship and control of the internet seems to be on course to suppress and more or less crush democracy, and even talk of it, in places like China, Russia, Iran and North Korea. When governments can flip a switch and turn the internet off, it’s hard to see how citizens stand a chance against repression. My optimism rests with progressive visions for digital governance and citizenship in outlier countries, like Estonia, and civic tech innovators promoting similar visions. Maybe they will succeed and spread. By 2030? I doubt it. I’m more hopeful for 2130.”
“Advancement is far outstripping our ability to understand and govern it”
Susan Etlinger, industry analyst, the Altimeter Group, responded, “Technology advancement is far outstripping our ability to understand and govern it. Early in this decade, we began to see the implications of what we called ‘big data’ on privacy and human rights. As artificial intelligence and machine learning became more commonplace, different issues came into focus: perpetuation and amplification of bias, the need for transparency, the need for interpretability and auditability of algorithms, and, more broadly, the need for norms and governance structures for intelligent technologies. By the end of 2016, following both the U.S. and UK elections, we began to see how social media platforms could be used to weaponize information at scale and undermine the foundations of democracy. Now, as the decade comes to a close, we are starting to see synthetic data – i.e., data that is artificially created – become commonplace, along with ‘deepfake’ technology that can essentially create any kind of reality the creator desires. Today we have the ability to amass massive amounts of data, create new types of data, weaponize it and create and move markets without governance structures sufficient to protect consumers, patients, residents, investors, customers and others – not to mention governments – from harm. If we intend to protect democracy, we need to move deliberately, but we also need to move fast. Reversing the damage of the ‘fake news’ era was hard enough before synthetic content; it will become exponentially harder as deepfake news becomes the norm. I’m less worried about sentient robots than I am about distorting reality and violating the human rights of real people at massive scale. It is therefore incumbent on both public and private institutions to put appropriate regulations in place and on citizens to become conscious consumers of digital information, wherever and however we find it.”
If people “prefer peace over anarchy, tyranny is the more likely outcome”
Russ White, infrastructure architect and internet pioneer, said, “It is important to begin by noting that a ‘pure democracy’ in itself is not necessarily the best form of government. Direct democracy tends to play into the worst aspects of mass media, particularly the media ecology built around internet technologies, producing mob rule. The question then becomes: Who controls the mob? Generally, this will be the strongest influencer(s), and the platform(s) they ‘live on.’ Given this, if technology companies continue along their current path, by 2030, democracy will be outwardly thriving, but inwardly failed. People will be able to vote, but their votes will be shaped by the commercial interests of the influencers and platform owners, rather than by deep reflection on the nature of humanity and justice. Either the social media platforms and influencers will take the situation in hand and control the mob through technological tyranny, resulting in peace, or they will not, resulting in anarchy. As people always prefer peace over anarchy, tyranny is the more likely outcome. The ideal, but not likely, outcome is that people will start taking responsibility for their knowledge and lives, and a techlash will develop around using technology responsibly. This path would result in (re)forming a republican, federalist government designed to allow maximum variation within beliefs while keeping the peace among various groups. Building this, however, requires acceptance of personal responsibility and social institutions that can take the lead – not likely/available in our current environment.”
People need to be educated about manipulation techniques
Esther Dyson, internet pioneer, journalist, entrepreneur and executive founder of Way to Wellville, wrote, “Tech will both strengthen and weaken democracy, depending on how ‘we’ use it, and depending on how we define ‘we.’ Democracy depends on a shared sense of community and right now we are creating too many warring communities when we should be enlarging them. We also need to educate people on how they can be manipulated through tech and give them the understanding and the tools to manipulate themselves more effectively.”
No authoritative information = no democracy
Isaac Mao, director, Sharism Lab, said, “Information and its channels are everything. Moving toward 2030, if we can’t understand and regulate it well, then disinformation could totally overwhelm people’s limited bandwidth for input. Professional journalism and democratic institutions are eclipsed in such an emergency. There will be no authority of information, which will definitely mean no democracy. Technology is neutral, but will provide many wild ways to mislead people if big technology companies and totalitarian regimes control the information channels with lures and algorithms. Humans’ brains can be easily misled to chase fake news, distorted facts and/or censorship traps without realizing it. They can’t even find credible ways to verify the authenticity of information because every channel can be tainted. Even though individuals have gained the power of sharing, their voices are not easily heard. It’s the biggest threat to our future.”
There will be “anti-institutional, insurrectionist movements” seeking solutions
Ethan Zuckerman, director, MIT’s Center for Civic Media, and cofounder, Global Voices, said, “The problems facing democratic institutions are less about technological change and more about a 40-plus-year slide in trust. Many institutions aren’t working well for citizens of democracies. Technologies are helping people articulate their loss of trust, but they’re also helping people organize outside traditional institutional channels. My prediction is that we’ll see an increasing number of anti-institutional, insurrectionist movements that seek solutions by working around existing institutions and using technical tools as a key part of their movement building.”
Political parties fracture as issue-based microtargeting becomes effective
Loren DeJonge Schulman, deputy director of studies and senior fellow, Center for a New American Security, previously senior adviser to National Security Adviser Susan Rice, said, “My expectation is that citizens will begin to put more of a premium on aligning with candidates or movements that 1) are able to tailor their engagement to the narrow interests of particular voters and 2) allow them to preserve their technology comfort zones while protecting them from technological threats. I believe parties will fracture, as voter and fundraising issue-based microtargeting becomes more feasible and effective. Individual polling could become less reliable as means of access to specific voter blocs declines or fragments across generational or value (e.g., privacy) divides.”
In data-driven democracy, points-based participatory citizenship could be a status symbol
Thomas Frey, founder and senior futurist, DaVinci Institute, said, “Is there a difference between a good citizen and a great one? Is it OK to only do the bare minimum of what it takes to be a citizen? Would we be a better country if we all tried a bit harder? Citizenship means different things to different people. We typically have a back-of-the-mind rating system in place that tallies things like standing and singing during the Pledge of Allegiance, installing a flag on the front porch during holidays and openly thanking our veterans into an overall citizenship quotient. But should there be a more formal ranking system, and more importantly, how would it be used? As a status symbol, the reinvention of citizenship is long overdue, and the possibilities are endless. We are moving quickly into a data-driven world where numeric values will be assigned to virtually everything we do. Here are a few quick examples:
- File our taxes on time and we receive an additional 3,000 points, but for every day we’re late, we lose 200 points.
- Go in for regular health checkups and we receive 1,000 points, but if we shrug off an appointment, we lose 2,000 points.
- Receive a parking ticket and we lose 1,500 points. Once we pay the fine, we get our 1,500 points back.
- When an election is held, you receive 500 points for casting your vote.”
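To make concrete how such a data-driven tally might work, the following is a minimal, purely illustrative sketch. The point values are taken from Frey’s examples above; the event format, function name and scoring logic are assumptions of this sketch, not features of any real or proposed system.

```python
# Hypothetical sketch of a points-based "citizenship quotient" tally.
# Point values come from Frey's examples; everything else is invented for illustration.

def citizenship_score(events, starting_score=0):
    """Tally a running citizenship score from a list of civic events."""
    score = starting_score
    for event in events:
        kind = event["kind"]
        if kind == "taxes_filed":
            # +3,000 points for filing on time; -200 points per day late.
            days_late = event.get("days_late", 0)
            score += 3000 if days_late == 0 else -200 * days_late
        elif kind == "health_checkup":
            # +1,000 points for attending; -2,000 for shrugging off an appointment.
            score += 1000 if event.get("attended", True) else -2000
        elif kind == "parking_ticket":
            # -1,500 points when ticketed; the deduction is restored once the fine is paid.
            score += 0 if event.get("fine_paid", False) else -1500
        elif kind == "vote_cast":
            # +500 points for casting a vote in an election.
            score += 500
    return score


if __name__ == "__main__":
    events = [
        {"kind": "taxes_filed", "days_late": 0},        # +3,000
        {"kind": "health_checkup", "attended": False},  # -2,000
        {"kind": "parking_ticket", "fine_paid": False}, # -1,500
        {"kind": "vote_cast"},                          # +500
    ]
    print(citizenship_score(events))  # prints 0
```

The sketch simply illustrates the mechanics Frey describes; whether any such scoring should exist, and how it would be used, is exactly the question he raises.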
“By 2030, we’re likely to have long lost our willingness to believe most media outlets”
Jamais Cascio, distinguished fellow at the Institute for the Future, wrote, “Although in the longer run we’re likely to develop effective counters to many of the politically pathological technologies, over the 2020s, the explosion of information-manipulation tools will outpace our ability to adapt to and contain those technologies. By 2030, we’re likely to have long lost our willingness to believe most media outlets. Surrounded by falsehoods and fakes, we’re more likely to ignore scandals than be outraged by them. The ease with which convincing fake images, audio and video can be created renders nearly all sources suspect; it’s too easy to dismiss everything as false, and too often correct. However, when something does break through the barriers of skepticism, the reaction will often be disproportionately great. At the same time, we’ll be in the early days of tools and practices that will help filter through the falsehoods and return a measure of trust to the system. They won’t have broad use yet, but we’ll start to see benefits.”
We will adjust, but not without tension and informed public participation
Like the growing pains of democracy during the rise of newspapers, then radio, then TV, the adjustments will not be smooth, but they will be made. Paul Jones
Paul Jones, founder and director of ibiblio and a professor at the University of North Carolina-Chapel Hill, wrote, “Communications technologies, especially at their early adoptions, can be subject to centralization, control and exploitation, creating new identities (imagined communities) and, often, polarization within populations. But in the longer run, as the social formation of each technology is more established, communications enrich our daily lives and become the field and even background of our extended interactions. At the moment, democracy is both under attack and surging in the streets. Not to be caught up in presentism or to be utopian, but to be optimistic – our present technologies point toward more oversight, control and polarization, but in the longer run we have seen both mass media and personal communications tend to empower democratic institutions. By 2030, we will have adjusted to the abuses of data aggregation, of surveillance, of misinformation, and will be honoring – not without tension and required attention – informed public participation. Like the growing pains of democracy during the rise of newspapers, then radio, then TV, the adjustments will not be smooth, but they will be made.”
These worrisome trends need not continue; we have adapted before and can do so again
Andrew Lippman, senior research scientist and associate director, the Media Lab, MIT, wrote, “Two things seem clear: 1) In the U.S. and some other countries, people have lost faith in the traditional institutions that build a common social core. In part, this is due to the multiplicity of outlets that address fringe elements. These were not economically viable in the past, when there was more friction in publishing. 2) The increased use of artificial intelligence to manipulate data, together with the visceral impact of much news, allows falsehoods to penetrate more effectively than in the past. This does not bode well for an informed and thoughtful populace in the near term. However, I am not in a position to gauge how much this is the fault of the internet or of other aspects of society, of which there are many. Nor do I think that the current trends need continue. We have generally been able to adapt to media evolution and invention, so I suspect that we can do so again, although it may take some real work.”
We are undergoing important change in our conception of free speech
David Weinberger, senior researcher at Harvard’s Berkman Klein Center for Internet & Society, said, “Who knows?… We’re undergoing an important change in our conception of what ‘free speech’ means. We could afford to let speech be much freer back when so few voices could actually be heard and the range of opinions was far more constricted. Back then, the filtering out of harmful ideas was accomplished by only giving the mic to a homogenous set of folks. (White men of a certain class, if you were wondering.) Now that everyone has the mic, the filtering – if we decide we actually prefer our free speech to stay within particular boundaries – has to be done by the platforms. So, it’s quite possible – but who knows? – that the online platforms where we hear the bulk of public speech will enforce limits that in the past we would have rejected as overly inhibiting – not only on hate speech, but also on speech that promotes ideas that we consider to be harmful to the public weal. There’s certainly a slippery slope possible here, but, as with all slippery-slope arguments, that’s only a problem if we choose to slide down it. It’s also possible that platforms will segregate according to which sets of views they find harmful, in which case the divisions among us will get yet more severe.”
“Will the nation-state as we know it survive intact? No way to yet tell”
Jeff Jarvis, director of the Tow-Knight Center and professor of journalism innovation at City University of New York, wrote, “The internet is a grand network connecting people with people, people with information, information with information and machines with machines. Already we see, for example, that new voices not represented by institutions including government and mass media can now speak. Thus, we have, for example, #metoo and #livingwhileblack. Thus, we also have a backlash from entrenched forces – read: old, white men – who fear loss of power and who so far would seem to rather destroy institutions than share power in them. Who will win? There is no way to yet tell. We also see globalization not only in commerce – affecting jobs and economies – but also in social interaction. Thus, borders are challenged and so are nations. Is this challenge a reason why we see the rise of nationalism? We see now that wars can be fought with data and without national armies or weapons. We see that virtual currencies can challenge the monetary power of nations. Will the nation-state as we know it survive intact? No way to yet tell. At the same time, governments are trying to regulate the net – which actually means they are trying to regulate the behavior of citizens on the net – goaded on by their own worries and by the spending of political capital by legacy media and other threatened industries and institutions. Can the net, built to withstand the disruption of nuclear attack, withstand efforts to balkanize it by government? Will liberties prevail? Too soon to know.”
Digitization is “the biggest thing since oxygenation”
Doc Searls, internet pioneer and former editor-in-chief of Linux Journal, said, “In these early years of our new digital age, social media (a collection of new and likely epiphenomenal developments) in particular are amplifying homophily: the tendency of people to gather among those with whom they share characteristics, loyalties, affinities and other forces that attract people into tribal groupings. Blaming and demonizing other tribes comes naturally to humans, and we’re at a stage right now when doing that is just too damn easy. We’ll get past it, but in the meantime, tribalism is making enemies of groups that used to merely disagree. This naturally affects governance in all forms, especially democratic ones. We are in the early stages of the Digital Transition: a time when everything that can be digitized is being digitized. This includes all forms of studying, communicating and remembering things. Plus, everything that doesn’t need to be physical: a sum that is huge beyond reckoning. Recently I asked Joi Ito, at that time the head of MIT’s Media Lab, how big this is. ‘Is it bigger than electricity?’ I asked. ‘Movable type? Writing? Speech? Stone tools?’ ‘No,’ he said. ‘It’s the biggest thing since oxygenation.’ That happened around 2.5 billion years ago. And I think he’s right: It’s that big.”
Hope for greater participation in the most fundamental democratic processes
I want to believe that the dark underbelly of the digital world that is distorting democracy will be exposed and its impact lessened over the next decade. Gina Glantz
Gina Glantz, a political strategist and founder of GenderAvenger, said, “I want to believe that the dark underbelly of the digital world that is distorting democracy will be exposed and its impact lessened over the next decade. I hope by 2032 safeguards will have been created so that voting can take place electronically, encouraging much greater participation in the most fundamental of democratic processes.”
“Casual participants vastly outnumber engaged and thoughtful ones”
Larry Keeley, cofounder of Doblin and professor of innovation at Kellogg Graduate School of Management and IIT’s Institute of Design, said, “Technology will, of course, both materially strengthen and weaken participative democracy. The ‘balance’ will depend on individual users. Sophisticated users will be able to harness more and better tools for evaluating political issues, topics, candidates and ‘leaders.’ They will increasingly be able to see integral fact-checking, historic patterns, even be able to use predictive analytics tools to evaluate what that individual is likely to prefer in the future. Indeed, there will be a new class of tool emerging that will allow any of us – even curious elected officials (wherever they may still be found) – to use simulators to manage complex questions, such as: Should we have higher or lower minimum wages? How about a guaranteed minimum income? Should we invest in more or less health care, and focused on which ages in particular? Should we invest in more infrastructure? How much? Should we give everyone free high-speed Wi-Fi? Etc. Of course, at the same time, for unsophisticated users, there will be ever more (and more sophisticated) tools designed to engage, enrage, compel, cater to and amplify one’s previously held views, prejudices or suspicions. These tools will be everywhere. So, I answered that, on balance, technology will hurt participative democracy, simply because I think casual participants vastly outnumber engaged and thoughtful ones. Wish that were not the case. Neil Postman nailed it with his title: ‘Amusing Ourselves to Death’ – and he wrote that book BEFORE the advent of the internet.”
Technology will be used to control citizens; perhaps also to decrease atmospheric carbon
Barbara Simons, past president of the Association for Computing Machinery, commented, “If climate change is not treated as an emergency and as the existential threat to civilization and much life on earth that it is, civilization as we know it will be destroyed. In all likelihood, non-democratic regimes will be created that are fascist in nature because of the limited amount of resources available. Technology will be used to control citizens. Perhaps it also will be used to decrease the amount of carbon in the atmosphere, but that remains to be seen.”
Democracy is challenged by an Asian model of governance in a complex environment
Philippe Blanchard, founder of Futurous, an innovation consultancy based in Switzerland, said, “The democratic model was born as a philosophical response similar to the ‘wisdom of the crowds.’ Collective decisions would be the best way to find answers that meet the needs of the community, as well as to ensure the cohesiveness of the community. We are now living in more complex, multidimensional environments: 1) That complexity means that it is more difficult for the general public to understand the impacts of political decisions. 2) The pace of change (technology, sociology) is conflicting with the institutional pace. In addition, we need to review different elements to ensure the relevancy of democracy: 1) education of citizens and accessibility of information; 2) institutional structures of representation (direct democracy vs. indirect); and 3) regulation. But we also need to understand the fundamental differences in our respective cultures. Greek philosophy structured Western thinking (primacy of the concept, the model as per Plato’s idea), versus Chinese/Asian philosophy, where the context prevails over the concept (Qi, the energy). The Chinese philosophy of efficiency arises only from the question of the ‘coming,’ not of ‘being’ and metaphysics. It does not ask the question of the self, the subject or the separation of theory and practice, but only the question of efficiency from the natural course of things. It is interested in the process, the procedure that leads to a result, rather than the state itself. What interests Chinese philosophy is therefore not the action but the ‘potential of the situation,’ which contains its own transformation. The availability of big data is therefore the best way to assess and influence this potential of the situation. Alongside the availability of the tools, the status of ‘democracy’ as the only relevant governance model is therefore also challenged.”
Will the future serve a wider range of interests than profit incentives?
Anthony Nadler, associate professor of communication studies at Ursinus College, and fellow at Columbia University’s Tow Center for Digital Journalism, said, “One way of thinking about technological development is as a process of discovery and innovation that simply unfolds along a predestined path. But I hope the techlash helps to challenge this way of thinking about the future of technology. When it comes to issues like the growth of online disinformation or exploitation of user data – just to draw on a couple of poignant examples – today’s tech crisis is not simply the inevitable outcome of digital technology. These problems stem from particular choices about how our contemporary digital architecture has been designed to serve the commercial purposes of the dominant players in the market. The question for the next 10 years, then, is not simply a matter of what new technologies will be invented or which technical problems will be solved. It’s going to be a matter of … which groups and whose perspectives will have a decisive input into how technology is designed and what values and goals it will be built to prioritize.”
The remaining sections of this report cover many more predictive comments from technology experts and futurists as they elaborate on the potential future of democracy in the digital age, sharing their views on today’s trends and what they mean as we enter the next decade of digital life. Their comments are gathered under the specific themes that were briefly highlighted at the start of this report. Many of the answers cross over to touch upon multiple aspects of the digital future; most do not neatly address only one aspect of the likely future. Some responses are lightly edited for style and readability.