TECHNOLOGY
Civic Space Future Trend Report
BY POONAM JOSHI
The potential impacts of emerging digital technology1 on civil society are widely discussed, but much more could be done to prepare civil society for a digital world. Whether digital technology will have a positive or negative impact on civic space and philanthropy will depend on a range of factors:
• The feasibility of reforming the practices and business models of the technology
companies;
• The governance of the internet and the degree to which states instrumentalize
technology for their own geopolitical and domestic goals;
• The capacity of civil society — beyond a small group of digital rights experts —
to integrate a focus on digital technology into every aspect of their work.
This briefing explores the trends in emerging digital technology most likely to shape
civic space, and opportunities for civil society advocates to mitigate and capitalize on
the changes ahead.
1 For the purposes of this paper, emerging digital technology refers to hardware and software built using information and
communication technology and/or the internet, such as artificial intelligence applications, facial recognition tools, online social
networks etc.
The digital economy is predicated on the accumulation, analysis, and sale of vast amounts of data by a handful of largely US-based platforms: Facebook, Amazon, Alphabet (the parent company of Google), and Apple. This data, based on the browsing habits, social media profiles, online purchases, and Google searches of users, can be used to send targeted messages to influence the behavior of increasingly specific groups of social media users for private profit, public good, or malign purposes.
The next generation of digital technologies will enable companies to extract data not just from online spaces, but from the built environment. The Internet of Things2 will enable companies to gather personal data from cameras, smartwatches, fitness trackers, toys, and automated travel. Facial recognition technologies and sensors in Smart Cities3 will also allow the harvesting of data from users in public spaces.
While this data could be used to improve fitness, health care, transport, or energy efficiency, without regulation or oversight it could equally aid manipulation or surveillance by malign states, companies, or other non-state actors on an unprecedented scale.
2 In the broadest sense, the term Internet of Things (IoT) encompasses everything connected to the internet, but it is increasingly used to describe devices that "talk" to each other, from simple sensors to smartphones and wearables. By combining these connected devices, it is possible to gather information, analyze it, and create an action that improves the experience for the user.
3 A smart city is an urban area that uses different types of electronic IoT sensors to collect data and then uses the insights gained from that data to manage assets, resources, and services efficiently. This includes data collected from citizens, devices, and assets that is processed and analyzed to monitor and manage traffic and transportation systems, power plants, utilities, water supply networks, waste management, crime detection, information systems, schools, libraries, hospitals, and other community services.
4 Ameliorative actions taken by the tech platforms are documented in Governance Innovation for a Connected World, edited by Eileen Donahoe and Fen Osler Hampson, Centre for International Governance Innovation, 2018.
However, effective regulation of the internet, AI, and future digital technology will only be possible with the cooperation of multiple government agencies and the private sector companies, a challenging outcome to achieve given the divergent values and agendas of Western democracies and authoritarian states on this issue.
The battle over who governs the internet reflects a broader geopolitical struggle for values and influence between authoritarian states and Western democracies. China, Russia, and Iran, in particular, have raised concerns about the decentralized governance of the internet, partly as a reaction to the internet being driven by the US and other Western democracies. A decentralized internet runs counter to the desire of these states to manage information and communications centrally.
One country where the US tech platforms have very little power and influence is China, which has protected its domestic internet market from foreign competitors and built its own set of social networks, including the Twitter-like Sina Weibo and WeChat/Weixin, which is similar to WhatsApp. Censorship is baked into these platforms, as platform operators must monitor online content and remove offending posts or risk losing their operating licenses.
[…]tently limit freedom of expression. For example, in April 2019, the British government proposed sweeping new powers to remove "harmful" content from the internet, which could easily be used as a pretext to censor speech.
At the same time, the US, Israel, and several European countries have a record of exporting surveillance technologies to governments with poor human rights records, leaving them in a weak position to challenge China's actions. In June 2018, a number of European Union member states, including the UK, Poland, Sweden, and Ireland, attempted to block curbs on the export of surveillance equipment to abusive regimes.
In contrast, the European Union has been at the vanguard of safeguarding privacy through the introduction of data protection laws in 2018, challenging the monopoly of US tech giants.7 The EU is also seeking to set global standards on the ethical and legal framework of AI, particularly where it is adopted by public authorities and used in healthcare, policing, and transport.
Comprehensive data protection laws, which should apply to both the government and the private sector, could address many of the human rights risks posed by AI. One model is the European Union's General Data Protection Regulation (GDPR), one of the strongest and most comprehensive attempts to regulate the collection and use of personal data by both governments and the private sector. The GDPR limits data processing to permissible purposes, with protections for sensitive data. It also requires opt-in consent, which limits the use of personal data for training AI systems. Rights provided for by the GDPR, and similar laws, offer a framework to prevent unaccountable uses of AI that impact individual rights, while ensuring a level of control of personal data and accountability for the use of AI and machine learning systems.8
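To make the consent point concrete, here is a minimal sketch in Python, assuming an entirely hypothetical record structure (the UserRecord type and consented_to_training flag are illustrative inventions, not part of the GDPR text or any real compliance system), of how opt-in consent could be enforced before personal data reaches an AI training pipeline:

```python
# Minimal sketch: exclude personal data from model training unless the person
# has explicitly opted in. Field and function names are hypothetical.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class UserRecord:
    user_id: str
    features: list[float]            # data derived from the person's activity
    consented_to_training: bool      # explicit opt-in captured at collection time


def select_training_data(records: list[UserRecord]) -> list[list[float]]:
    """Return only the rows of users who explicitly opted in to AI training."""
    return [r.features for r in records if r.consented_to_training]


records = [
    UserRecord("user-a", [0.1, 0.2], consented_to_training=True),
    UserRecord("user-b", [0.3, 0.4], consented_to_training=False),  # excluded
]
print(select_training_data(records))  # [[0.1, 0.2]]
```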
OPPORTUNITIES FOR ACTION
Regulatory Innovation
Civil society has a critical role to play in informing government efforts to articulate law and policy in the field of AI that speaks […]
Combating Disinformation
Digital rights experts are calling for laws that mandate that all automated accounts are clearly labeled to show the source of an ad, the funding behind it, and the scope of its reach. This could disrupt targeted digital political advertising and the amplification of bot networks.9
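As a purely illustrative sketch of what such labeling and transparency rules might require in practice, the Python snippet below defines a hypothetical machine-readable disclosure record for an automated account running a paid political ad; every field name here is an assumption, not drawn from any existing law or platform API:

```python
# Hypothetical disclosure record for a bot-run political ad: who placed it,
# who funded it, and how far it reached. Field names are illustrative only.
from __future__ import annotations
from dataclasses import dataclass, field, asdict
import json


@dataclass
class AdDisclosure:
    account_id: str
    is_automated: bool               # the "bot" label shown to users
    sponsor: str                     # who placed the ad
    funder: str                      # who paid for it
    spend_usd: float
    impressions: int                 # scope of its reach
    targeting_criteria: list[str] = field(default_factory=list)


disclosure = AdDisclosure(
    account_id="example-account",
    is_automated=True,
    sponsor="Example Campaign Group",
    funder="Example PAC",
    spend_usd=2500.0,
    impressions=180_000,
    targeting_criteria=["age 18-34", "interested in local politics"],
)

# Publishing records like this in an open registry would let researchers and
# regulators audit targeted political advertising and bot amplification.
print(json.dumps(asdict(disclosure), indent=2))
```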
7 The European Union has launched several actions, including challenging Google's alleged monopoly and demanding that Apple pay the Irish government 13 billion euros in back taxes.
8 See: Human Rights in the Age of Artificial Intelligence, Access Now, November 2018, p
30-31. GDPR was enacted in 2016 by the European Union, and went into effect May 25,
2018, across the EU’s 28 Member States.
9 See: https://luminategroup.com/storage/275/Digital-Democracy-Charter.pdf
Civil society groups would benefit from training and tools to enable them to defend civic space at the national level vis-à-vis governmental tech initiatives, such as national AI strategies or laws on surveillance, as well as in the dozen multi-stakeholder fora where global policy is discussed.10 A recent collaboration between ICNL and Stanford's Global Digital Policy Incubator to conduct the first Tech Camp for Civic Space Defenders serves as an example. Particular support is required to enable civil society representation from countries with repressive governments, especially China and Russia, in stakeholder forums.
10 Policy is debated in multilateral bodies like the International Telecommunication Union, engineering groups like the Internet Engineering Task Force, the human rights system of the UN, normative bodies like the Internet Governance Forum or the OECD, and domain name organizations like the Internet Corporation for Assigned Names and Numbers (ICANN) or regional registries.
11 See: https://undocs.org/A/73/348, 29 August 2018, pp. 20-21, and Human Rights in the Age of Artificial Intelligence, Access Now, November 2018, p. 32.
The 2018 report, Malicious Use of AI, predicts that AI will figure prominently in the security landscape of the future and that more can and should be done as a matter of urgency to prevent the use of AI by malign actors. Some commentators12 argue that policymakers have yet to seriously grapple with AI's repressive implications, particularly for expression and assembly in the context of rising authoritarianism and democratic backsliding.
12 Steven Feldstein reports that from 1989 onwards, popular revolts and electoral defeats have become the most common causes of departure for dictators, compared to coups in the period 1946-1988. He argues that because the gravest threats to authoritarian survival come from discontented publics on the streets or at the ballot box, autocrats are embracing digital tactics for monitoring, surveilling, and harassing civil society movements, and for distorting elections, as strategies that are both cost effective and carry less political risk. He also notes that democratic governments may have an incentive to use AI to monitor the activities of political opponents and civil society and take pre-emptive action against potential challenges to their authority. Finally, governments that depend on Chinese technology to control their populations will feel increasing pressure to align their policies with China's strategic interests. See: Feldstein, Steven. The Road to Digital Unfreedom: How Artificial Intelligence is Reshaping Repression, Journal of Democracy, January 2019.
[…] direct or are complicit in hate speech towards certain groups.
13 See: Human Rights in the Age of Artificial Intelligence, Access Now, November 2018, p. 21.
14 Heat mapping involves detecting the strength of signals sent out from mobile devices to create a "heat map" that indicates where protesters are gathering.
Rise of Misinformation
We are now in an era of information warfare, in which repressive states are realizing that manipulation via digital platforms can be a much more powerful tool than suppression or surveillance for achieving their goals. A growing number of states are advancing their goals by spreading misinformation through AI-powered bots,15 deepfakes,16 and the manipulation of social media algorithms, in order to interfere in elections17 and erode public trust in fact-based evidence.
15 A chatbot, or chat bot, is a machine that has a conversation with humans via text or audio. An AI-powered chatbot is a smarter version, which uses natural language processing (NLP) and machine learning (ML) to better understand the intent of the human and provide more natural, near human-level communication.
16 Deepfakes are images or videos that are altered using neural networks and machine learning, making them both realistic
and difficult to detect.
17 Strategies include using bot-driven, large-scale information-generation attacks to swamp information channels with false
or merely distracting information, making it more difficult to acquire real information.
18 CAF’s 2018 report, Machine made goods: Charities, Philanthropy & Artificial Intelligence provides a comprehensive guide
to the practical and ethical implications of technology for philanthropy.
Combating Surveillance
Until recently, activity on the issue of surveillance was limited
to the investigation and exposure of the European, American,
and Israeli tech companies that supplied spyware to repressive
regimes.
19 Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at
the New Frontier of Power. New York: Public Affairs, 2019
• […] users facing challenges to access due to disability, visual impairment, or language, e.g., tailored chatbots that provide advice or services;
Registration Technology
CAF has highlighted several ways AI could be used to improve the registration, oversight, and compliance of nonprofits.23 These include: using AI to scan large volumes of financial data to spot compliance issues early, rather than relying on enforcement as a tool; embedding laws and regulations in smart contracts governing how organizations operate, so that it would not be possible to break them, thus minimizing the need for enforcement; and recording transactions on a blockchain so that accurate, real-time information on spending would be available to everyone and annual reporting would no longer be necessary.
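As a rough sketch of the last idea, and not of CAF's proposal or of any particular blockchain platform, the Python example below (all class, field, and payee names are hypothetical) chains each spending record to the previous one by hash, so that a continuously published ledger is tamper-evident and can be checked in real time:

```python
# Illustrative hash-chained spending ledger: each entry commits to the previous
# entry's hash, so altering an old record invalidates everything after it.
from __future__ import annotations
import hashlib
import json
import time


def _digest(entry: dict) -> str:
    """Stable SHA-256 digest of an entry's contents (excluding its own hash)."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()


class SpendingLedger:
    def __init__(self) -> None:
        self.entries: list[dict] = []

    def record(self, payee: str, amount: float, purpose: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {
            "timestamp": time.time(),
            "payee": payee,
            "amount": amount,
            "purpose": purpose,
            "prev_hash": prev_hash,
        }
        entry = {**body, "hash": _digest(body)}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Confirm no published entry has been altered or removed."""
        for i, entry in enumerate(self.entries):
            expected_prev = self.entries[i - 1]["hash"] if i else "genesis"
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != expected_prev or entry["hash"] != _digest(body):
                return False
        return True


ledger = SpendingLedger()
ledger.record("Community Clinic", 5000.00, "health outreach grant")
ledger.record("Print Shop", 120.50, "annual meeting materials")
print(ledger.verify())  # True while the public record is intact
```

Because each entry commits to the one before it, regulators and donors could rely on continuously published, verifiable figures rather than waiting for annual reports; a shared blockchain would provide the same chained structure across many organizations at once.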
Movement Technology
Informal and movement-based groups need robust, diverse, value-driven software that keeps them safe and can be used securely in dangerous places. Aspiration Tech is the leader in producing tools that support movement building, including open-source software that supports secure communications and collaboration.
A 2017 report for the Ford Foundation sets out a long-term strategy for providing support to develop technology that helps movements. The report identifies needs, including the development and dissemination of software that enables activists to browse securely, divorce their identity and location when using devices, and decentralize how data is held across different countries so access can't be blocked.
21 Large INGOs and aid agencies like UNICEF are experimenting with using blockchain for their internal money flows. Meanwhile, start-ups like Disberse are trying to build platforms that can harness blockchain technology to make cross-border payments more efficient, transparent, and cost-effective.
22 See: https://www.cafonline.org/about-us/caf-campaigns/campaigning-for-a-giving-world/future-good/blockchain
23 See: https://www.cafonline.org/about-us/publications/2016-publications/block-and-tackle-using-blockchain-technology-to-create-and-regulate-civil-society-organisations
www.icnl.org
facebook.com/ICNLAlliance twitter.com/ICNLAlliance