‘Democracy Disrupted: Digitalisation and Human Rights’
On balance, has the growth of digitalisation been harmful to human rights, or helpful? It’s a stark question, but one without a simple yes or no answer, and nobody attending the Friedrich Naumann Foundation and Liberal International conference in Johannesburg on December 6 and 7 was formulating it in such crude terms. But the experts and interested parties gathered were focussed on defining the parameters of the question, on presenting case studies of solutions and failures, and on sharing strategies for promoting democratic ideals and human rights.
Ultimately, the question is moot. Digitalisation, which I’m using as a shorthand for the largely unchecked growth of the social media platforms, the vulnerabilities created by a reliance on third party-controlled technologies, and the capacity for disinformation and targeted violence that the internet brings, is here to stay. Well, until the next big, unpredictable thing. So the task at hand is to identify the fault lines, manage the disruption, and take advantage of the undeniable opportunities digitalisation brings for democracy and human rights.
The first panel of day one asked the question, “Can political and human rights be safeguarded in the digital age?” Both Ann Riedel (chairperson of LOAD e.V., an association for liberal internet policy) and Agnieszka Walorska (founder of Creative Construction, kickstartAI and hidence) expressed a fair degree of worry about the state of legislation, and about the untrammelled power that social media platforms and algorithmic microtargeting seem to have. Walorska was particularly passionate about the addiction to digital platforms that is exacerbated by the use of dark design, and spoke about what she called the “negative freedoms; the freedom from being controlled”.
Also flagged was the threat (and threat is a word that popped up again and again over the duration of the conference) of surveillance states like China (and other countries) exporting their techniques and technologies to nations and governments seeking to clamp down on democratic freedoms. Currently, “Chinese tech companies — particularly Huawei, Hikvision, Dahua, and ZTE — supply artificial intelligence surveillance technology in 63 countries”.
But the primary questions raised by Walorska’s presentation (and also in Riedel’s) were ethical ones, with the Chinese-owned video-sharing platform TikTok her exemplar. How do we protect the most vulnerable of our digital population — children — on a platform most people don’t understand, and which provides limited oversight?
In her presentation, Riedel echoed Walorska, speaking of the human rights that people tend to pay less attention to, such as freedom of expression and the right to privacy. The audience broadly agreed with Riedel’s suggestion that education has a primary role to play, though it can only be part of the solution. Perhaps her most important insight was, “we must understand people’s digital fears”. Adding to this, Walorska suggested that “we need easier privacy laws that everyone can understand”. Speaking of the frightening amount of people’s health data owned by third party companies, she said that a “long term goal should be that humans own their health data, and they lease it to companies”.
Education around digital rights and data privacy is a difficult task, of course. Walorska asked us to vividly picture the average social media platform’s terms and conditions printed out, a high stack of practically unreadable data. Can the majority of people even understand what algorithms do, and how they do it? Both panellists spoke about the huge technological complexities of dealing with digital attacks on human rights, and indeed on political rights, and floated the complex question of how business models need to change to avoid unethical practices. The emphasis on ethics and philosophical analysis was the ideal precursor to day two’s focus on practical experiences of how authoritarian regimes and bad actors deploy digital weapons, and how to counteract this.
This writer’s contribution was to show how Code for Africa uses data to give citizens actionable information, so as to enable them to use evidence-based activism when lobbying governments for change. One representative example would be Code for Africa’s sensor journalism programme, sensors.AFRICA, which deploys low-cost air and water quality sensors to communities and newsrooms, allowing those affected by pollution to directly challenge the relevant authorities with calls for change. Another would be Code for Africa’s strategy of enabling newsrooms to start their own data desks, and providing training in data journalism. Fundamentally, the liberation of data is one of the crucial areas for the promotion and preservation of democracies.
Day two began with a panel entitled “Who checks your status? Digitalisation in authoritarian societies”. The three panellists were Olga Karatch (director of Belarusian civil rights movement Our House), Ana Corina Sosa Machado (democracy activist and daughter of Venezuelan opposition leader Maria Corina Machado), and Johnson Yeung (chair of the board of Hong Kong Civil Hub and board member of Amnesty International Hong Kong).
Machado, speaking about the strictures of Hugo Chávez’s authoritarian rule, and the poverty and abuses of human rights associated with it, described three areas where technology has been used to assert control. The first is cutting off access to information: she highlighted the closure of over 150 traditional media outlets, and spoke about how effective this has been, because it pushes audiences into the digital space, which is much easier to control and track. It’s a sobering reminder of how human rights have been put on the back foot by the erosion of traditional media platforms.
The second is control of infrastructure. The Venezuelan government controls 70% of the network infrastructure, and has a monopoly on internet connectivity — thereby allowing it to track citizens and throttle connectivity. It’s an effective ploy, and one is reminded that India is still, as of the writing of this story, responsible for the longest internet shutdown ever imposed in a democracy (on its 134th day on Monday, 16 December). Only authoritarian regimes such as China and Myanmar have cut off the internet for longer.
The third is the actual technology used to attack human rights, an area where the propaganda lessons of Russia (what a subsequent panellist, Dmitry Litvinenko, referred to as the “Russian university of disinformation”) and the surveillance tech of China play a large part. Machado spoke of how the government uses identity documents and electronic voting systems to collect data, which is then translated into electoral power and used to manipulate election results.
But Machado also reminded us of the upside of the age of digitalisation. Activist communication takes place on social media, as does news gathering to counter the fact that media houses are almost all state controlled. Technology has been critical for the country’s 7 million exiles, enabling them to connect and stay in touch. In Machado’s case, her mother — restricted for the last five years from leaving Venezuela — was able to virtually attend her own and her brother’s graduations. But even about this, she sounded a cautionary note: an average conversation with her mother entails switching between three platforms to avoid being tracked.
Belarusian activist Olga Karatch was denied permission to visit South Africa, and so appeared via video linkup. She began and ended her address with what she called “famous Belarusian jokes”. “If you are in the position of being a dissident, you must make sure you lead an interesting life so that the KGB doesn’t get bored”, and: “Even if you are paranoid, and think that someone is watching you, it doesn’t mean that nobody is watching you.” Both jokes signalled a bravery in the face of oppression, but also spoke to the levels of surveillance and control exerted by the Belarusian government (Belarus ranks 153 out of 180 countries on the Press Freedom Index). Again, a warning note was sounded about Russia’s role in spreading disinformation and surveillance techniques, and about the use of these techniques to limit human rights. For example, all social networks were shut down by the government in 2010. Karatch pointed out a pattern that we can see all over the world: the tools of disinformation are mostly used against women. Karatch’s main lesson was that freedom is a power, which is why governments try to curtail and control it.
Johnson Yeung demonstrated one of the paths that human rights and social justice movements can take to use digitalisation as a force for good. The Hong Kong movement is a largely faceless, networked movement, where consensus building is based on online deliberation and social media capital, and where decentralisation means that there is no clear leader. Predominantly made up of young, digital citizens, it’s a model for how to harness the potential of digital tools. Bots coordinate actions, apps allow citizens to track and dox police officers, and live broadcasting allows real time observation of atrocities and triumphs.
All the concerns discussed above were succinctly encapsulated in the results of the Liberal International survey of liberal parties, presented by vice president Astrid Thors. The top three threats to human rights were fake news, lack of accountability by big tech platforms, and threats to privacy from government surveillance. The best methods to combat these were fact-checking, education, more legislation, and stricter enforcement of existing laws.
Other contributions by panellists all sounded the same note: there are opportunities to use digitalisation for good, but the opposition by governments is formidable. Ukraine’s Dmitry Litvinenko, for example, spoke of Russian control as intended to cause despair and paralyse civil society, and urged that we establish a stronger definition of what disinformation is. The Democratic Alliance’s Francine Higham detailed South Africa’s Bell Pottinger case study, in which a political party used the media’s investigative work into a disinformation campaign run by crooked businessmen and politicians to effect actual change, driving the PR agency Bell Pottinger into administration and disgrace for its racially divisive disinformation.
After two days of case studies and discussion, delegates had a deeper understanding of the scale of current problems, of what future problems to expect in different territories, and of possible solutions and opportunities. The conclusion, though, was inescapable: disinformation, digitally driven authoritarianism, and threats to data privacy are everyone’s problem. It’s going to need massive collaboration between governments, business, media and citizens if we are to ensure that digitalisation enables human rights, rather than destroys them.
“Democracy Disrupted: Digitalisation and Human Rights” was hosted by the Friedrich Naumann Foundation and Liberal International in Johannesburg, South Africa on 6–7 December 2019.
Code for Africa (CfA) is the continent’s largest federation of indigenous civic technology and open data laboratories, with CfA labs in Kenya, Nigeria, South Africa, Tanzania and Uganda, a further five affiliate labs in Cameroon, Ethiopia, Ghana, Morocco and Sierra Leone, and funded projects in a further 12 countries. CfA manages the $1m/year innovateAFRICA.fund and $500,000/year impactAFRICA.fund, as well as key digital democracy resources such as the openAFRICA.net data portal and the GotToVote.cc election toolkit. CfA primarily supports grassroots citizen organisations and the media to help liberate data and empower citizens, but also works with progressive government agencies to improve digital service delivery. In addition to funding and technology support, CfA’s labs incubate a series of trendsetting initiatives including the PesaCheck fact-checking initiative in East Africa, the continental africanDRONE network, and the African Network of Centres for Investigative Reporting (ANCIR) that spearheaded Panama Papers probes across the continent. CfA is an initiative of the International Center for Journalists (ICFJ).