Taking on the tech giants: the lawyer fighting the power of algorithmic systems

In July 2019, Cori Crider, a lawyer, investigator and activist, was introduced to a former Facebook employee whose work monitoring graphic content on the world's largest social media platform had left deep psychological scars. As the moderator described the fallout of spending each day watching gruesome footage, Crider was first struck by the depth of their pain, and then by a creeping sense of recognition.

After a 15-year career defending detainees of Guantanamo Bay, Crider had learned the hallmarks of post-traumatic stress disorder. But unlike Crider's previous clients, the moderator had not been tortured, extradited or detained. They had simply watched videos to decide if they were appropriate for public consumption.

"It's not as if I haven't spent time with graphic material," Crider says, looking at me darkly across a Brixton café table in a brief December window between lockdowns. "I've spent my entire career with torture victims and drone survivors. But there's something about these decontextualised images, videos and sounds coming up again and again and again that does something really weird to your psyche that I don't think we yet understand."

She began to investigate. The moderator introduced her to other moderators, who in turn led her to others. Today, she has spoken to more than 70 moderators scattered across the world. "Every single person I've spoken to has a piece of content that they will never forget, that replays in their mind, which they have flashbacks to and nightmares about," Crider says. One struggles to walk down the street without imagining the heads of nearby pedestrians exploding. Another no longer trusts male family members to look after their child after countless hours watching child sexual abuse footage induced a state of near-permanent paranoia. A third experiences recurring visions of a video in which a young boy is repeatedly mown down by a tank until his remains are subsumed into the tracks. This was the case Crider had been looking for.

A month earlier, Crider co-founded Foxglove, now a four-woman team of lawyers, community activists and tech experts dedicated to fighting for tech justice. It wages legal battles against the increasing use of opaque and discriminatory algorithms in government decision-making; the spread of harmful technologies, such as facial recognition software; and the vast accumulation of power by tech giants.

In December 2019, Foxglove engaged solicitors to pursue Facebook and the outsourcing company CPL in Ireland's High Court, suing for millions in post-traumatic stress-related damages on behalf of numerous moderators, including Chris Gray. (The case is ongoing.)

But the money is incidental to the political point Foxglove hopes to prove: that, against the odds, the tech giants can be beaten, that their workers could be the secret weapon in bringing them to heel, and that the digital world we all inhabit could be transformed by insurrections inside the system.

As the court battle rolls on, the fight against Facebook is entering new terrain. In late January, Foxglove secured a hearing with Ireland's deputy prime minister, Leo Varadkar, so that he could hear from moderators about the personal harm that policing the world's news feed can cause. It is believed to be the first meeting of its kind anywhere in the world and, Crider hopes, the first step in demolishing the wall of silence, underwritten by stringent non-disclosure agreements, that holds tech workers back from collective action against their employers.

"Our objective is never about winning the case," Crider says without a trace of misty-eyed optimism. "Our objective is to change society."

Crider is slight, well-dressed and unnervingly self-assured. She talks quickly, losing herself in endless sentences that unspool in a thick Texan patter undiluted by more than a decade living in London. If she doesn't like a question, she asks one back. If she disagrees with a premise, she dismantles it at length. She is an unruly interviewee who would much rather be the interviewer. "I'm really not interested in tech as such," she announces before I've managed to ask a question, "but what I am interested in is power."

Prior to founding Foxglove, Crider, a small-town Texan by birth, had spent 15 years fighting the war on terror's most powerful players, including the CIA, FBI, MI5 and MI6, agencies she deemed to be acting unlawfully in the name of national security.

In her tenure as legal director of Reprieve, a human rights charity, she freed dozens of detainees from imprisonment and torture at Guantanamo Bay, represented grief-stricken families bereaved by drone bombings in Yemen, and forced excoriating apologies from the British government and security services for their complicity in illegal renditions and torture.

She saw how people, innocent or guilty, could be mangled by systems beyond their control. And she learned how to beat billion-dollar opponents with a fraction of the financial firepower. She describes her work, then and now, as "asymmetric warfare".

But over nearly two decades of observing and intervening, Crider noticed a sea change in the tools used. She watched billions of dollars in military contracts hoovered up by the likes of Google, Amazon and Palantir; the vast expansion of government surveillance of citizens; and the exponential rise of drone warfare, whose victims she came to know and care for.

"Seeing the most basic questions about a human life being decided partly as a result of an algorithmic system, the penny dropped for me," she says, briefly tender. "It felt like something fundamentally different in the way power was operating."

Upon leaving Reprieve in 2018, Crider began meeting people who could teach her about technology: academics, researchers, activists and tech bloviators who "absolutely need to get their asses sued". Then her friend and former Reprieve colleague Martha Dark reached out. Together with the public lawyer Rosa Curling, the three founded Foxglove in June 2019.

The foxglove, a wildflower, contains compounds that, depending on the dose, can kill or cure. It's an analogy for technology that Crider says might be "a little twee". It seeds itself freely, establishing footholds wherever it can. Foxglove, after its namesake, hopes to "crop up where you least expect us".

To date, that has meant a series of high-profile victories against the British government. Foxglove's first major win came last summer. Some months earlier, Foxglove caught wind of the Home Office using an algorithm to influence its visa decisions. The algorithm deemed certain nationalities suspect, making it less likely their visa would be granted. "It was such clear nationality-based discrimination," Crider explains, still angry. Foxglove, ever hungry for a good headline, dubbed it "speedy boarding for white people", and promptly sued.

In the legal back-and-forth that ensued, Crider discovered that, like many algorithms of its kind, it was subject to a feedback loop: the more people from a given country were rejected, the more likely future applicants from that country would be too. The machine was confirming its own biases.
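The feedback loop Crider describes can be sketched in a few lines of code. The toy simulation below is not a reconstruction of the Home Office algorithm, whose internals were never published; the group names, starting scores, threshold and update rule are all invented, purely to illustrate how a system that folds its own refusals back into future "risk" scores can turn a small initial disparity into a widening one.

```python
import random

random.seed(0)

# Hypothetical starting "risk" scores for two invented applicant groups;
# group B begins only slightly higher than group A.
risk = {"A": 0.30, "B": 0.40}
THRESHOLD = 0.5       # applications scoring above this are refused
LEARNING_RATE = 0.05  # how strongly past refusals feed back into the score

def refused(group: str) -> bool:
    """Decide one application: current group risk plus per-case noise."""
    score = risk[group] + random.uniform(-0.15, 0.15)
    return score > THRESHOLD

for round_no in range(1, 11):
    for group in risk:
        refusals = sum(refused(group) for _ in range(100))
        # The feedback loop: each round's refusal rate nudges the group's
        # risk score upward, making future refusals more likely.
        risk[group] += LEARNING_RATE * (refusals / 100)
    print(round_no, {g: round(r, 2) for g, r in risk.items()})
```

Run over a few rounds, the invented group that starts marginally nearer the refusal threshold accrues ever more refusals and an ever higher score, while the other group's score barely moves: the machine confirming its own biases.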

In August, the Home Office capitulated rather than fight its case in court. It committed to abandoning the algorithm and conducting a review of its practices. It was the first successful judicial review of an algorithmic decision-making system in the UK; such systems are now estimated to be in use by half of all local authorities in Britain.

Such systems are currently used to assess the credibility of benefits claims, predict an individual's likelihood of committing knife crime, and perform countless other tasks once carried out by people alone. What concerns Crider is not any individual system, but the fact that a growing number of government bodies are relying on technology they rarely understand, and that few members of the public are even made aware such technology is in use.

"We absolutely do not want to have to repeatedly sue about this," Crider says. "We just want municipal controls before this tech even gets used."

She cites Helsinki and Amsterdam as exemplars: in September, both announced public-facing artificial intelligence registers outlining how algorithms used by the city governments work, how they are governed, and who is responsible for them.

"These systems have to be democratically accountable to all of us," Crider argues. In leveraging the law to force hidden information into the public eye, she thinks she can trigger confrontations: moments of productive conflict that activate the democratic process. But without transparency, the possibility of conflict is foreclosed. People can't be angry about things that are withheld from them. And transparency, she argues, was one of the first casualties of the pandemic.

Five days after the first national lockdown, the Department of Health outlined plans for a new data store. It would combine disparate data sources from across the NHS and social care to provide an up-to-date picture of Covid-19's spread. It would, the announcement declared grandly, provide a "single source of truth" on the pandemic's progress.

But the government wasn't building the project alone. Microsoft, Google, Amazon, Faculty and Palantir all received contracts, perhaps lured by the honeypot of data at the heart of the world's largest integrated healthcare system. (EY, a management consultancy, estimates the commercial value of NHS health data at several billion pounds a year.)

The fact that Faculty, an artificial intelligence firm previously contracted by Vote Leave and with shareholders including senior Tory politicians, was involved in the project raised eyebrows. But Palantir, a data-mining firm founded by the Trump donor and PayPal co-founder Peter Thiel, rang alarm bells.

"It's not even really a health company," Crider exclaims breathlessly. "It's a security firm!"

In her past life fighting against the war on terror, Crider had watched Palantir develop counterinsurgency technology for the CIA and US military. She had followed news reports detailing its extensive contracts with US police forces that disproportionately targeted black and brown communities. And she watched as it provided technologies that allowed the vast expansion of immigration enforcement against undocumented people across her home country.

"Do we, the public, think that these are fit and proper partners for the NHS?" Crider asks.

When the government refused Foxglove's freedom of information requests for disclosure of the contracts, the organisation partnered with the progressive news site openDemocracy and threatened to sue. "They released the contracts literally hours before we were due in court," Crider says, rolling her eyes. The disclosure forced the Department of Health to state that the intellectual property for the programs built from the data store would remain under NHS control, not be spirited off by big tech and then sold back to the health service. "It meant they couldn't sell us back to ourselves," Crider grins.

The fear, in Crider's mind, is that big tech establishes itself at the heart of the health service. It's "privatisation by stealth", she suggests, and symbolic of a growing co-dependence between big tech and government that makes meaningful regulation of the tech giants a pipe dream.

That's part of the reason Crider doesn't see the solution to big tech's excesses coming from the governments that increasingly depend on their software and services. People power, in Crider's view, is our only hope, which is why the Facebook moderators' fight should concern us all.

To date, Crider argues, we have missed what she sees as the Achilles heel of Silicon Valley's largest players: their relationships with their own workforce. "That's what makes Foxglove different," she muses. "We're intensely focused on building tech-worker power."

"We see so much discussion about the content on social media," she says, reeling off issues from misinformation to hate speech to targeted political advertising, "but almost nothing on the conditions of labour that prop up the entire system, without which there literally is no such thing as a YouTube or a Facebook. You think it's a shit show now? You would never set foot in there without the work that these people do! They are not an aside to the work; they are the work."

"Tech workers are beginning to understand their power," Crider notes. Google workers are in the process of unionising under the banner of the Alphabet Workers Union. This month, some 5,000 Amazon employees in Alabama will vote on whether to become the trillion-dollar company's first formal union. Just last year, the Communications Workers of America began its first big union drive among tech workers, called Code.

The problem, as Crider sees it, stems from an idea propagated by the tech giants themselves: that they are merely a news feed, a helpful search engine, or a grid of pristine images, and not concrete entities with exploitative factory floors to rival any of the industrial titans of the 20th century. "These companies have disrupted their way out of worker protections that people have fought for decades to win," she concludes.

Crider is unequivocal: Facebook moderators, and tech workers at large, need unions. But that's a long path. She hopes the legal case, the Varadkar hearing, and Foxglove's work connecting disparate moderators across the world will trigger a kind of class consciousness that could fuel a tech-worker uprising.

But another barrier looms large: the non-disclosure agreements that ensure the silence of Facebook's workforce.

"The single greatest impediment to these workers coming together seems to me to be the fear of speaking. You can't achieve collective power if you don't break that wall down," she declares.

After 18 months working with Facebook moderators, Crider still doesn't have a copy of the contract, which moderators allege they have to sign but are not allowed to keep. "Is that even lawful? I don't think that's lawful!" she says. And their testimony suggests cripplingly stringent terms: they are forbidden from speaking about their work to anyone, including their spouses. "It's like the goddamn CIA," Crider shrieks.

These problems affect us all. Facebook has effectively become the public square, influencing what news we read, the arguments we have, what digital worlds we inhabit. People inside the system have the ability to change that, Crider argues, and stop the pollution of the information flows that democracy depends on. If only they had the power to act.

Crider tells me she is at home in conflict. But behind the love of a scrap is perhaps what makes Crider most dangerous: a primordial care for people in trouble, whether that's a 15-year-old boy unlawfully detained in Guantanamo Bay, or the Facebook moderator whose work has poisoned their ability to forge fulfilling human relationships.

Facebook, and whichever entity is next in the firing line, should expect a fight. Crider is not out to settle. She does not believe entrenched power can simply be persuaded into changing course. And she has no faith in the tech founders to save us from the monsters they have birthed. Foxglove wants to make it costly, both reputationally and financially, so that business as usual is unviable, whether for governments outsourcing core public services to opaque algorithmic machines, or for the tech billionaires profiting from democracy's decline.

"This is not about persuading them to do the right thing: it's about increasing the cost to them of persisting in their shitty behaviour," she summarises. "We don't need to win every time," she smirks, "we just need to score enough wins that, eventually, the political calculus tips."
