It is very, very easy for a well-intentioned AI practitioner to inadvertently do harm when they set out to do good: AI has the power to amplify unfair biases, making innate biases exponentially more harmful. Because AI often interacts with complex social systems, where correlation and causation are not always easily discernible, AI practitioners need to build partnerships with community members, stakeholders, and experts to help them better understand the world they're interacting with and the implications of making mistakes. Community-based system dynamics (CBSD) is a promising participatory approach to understanding complex social systems that does just that.
Artificial Intelligence (AI) has become one of the biggest drivers of technological change, impacting industries and creating entirely new opportunities. From an engineering standpoint, AI is just a more advanced form of data engineering. Most good AI projects function more like muddy pickup trucks than spotless race cars: they are a workhorse technology that humbly makes a production line 5% safer or movie recommendations a little more on point. However, more so than with many other technologies, it is very, very easy for a well-intentioned AI practitioner to inadvertently do harm when they set out to do good. AI has the power to amplify unfair biases, making innate biases exponentially more harmful.
As Google AI practitioners, we understand that how AI technology is developed and used will have a significant impact on society for many years to come. As such, it's crucial to formulate best practices. This starts with the responsible development of the technology and mitigating any potential unfair bias which may exist, both of which require technologists to look more than one step ahead: not "Will this delivery automation save 15% on the delivery cost?" but "How will this change affect the cities where we operate and the people, at-risk populations in particular, who live there?"
This has to be done the old-fashioned way: by human data scientists understanding the process that generates the variables that end up in datasets and models. What's more, that understanding can only be achieved in partnership with the people represented by and impacted by these variables: community members and stakeholders, such as experts who understand the complex systems that AI will ultimately interact with.
How do we actually implement this goal of building fairness into these new technologies, especially when they often work in ways we might not expect? As a first step, computer scientists need to do more to understand the contexts in which their technologies are being developed and deployed.
Despite our advances in measuring and detecting unfair bias, causation mistakes can still lead to harmful outcomes for marginalized communities. What's a causation mistake? Take, for example, the observation during the Middle Ages that sick people attracted fewer lice, which led to the assumption that lice were good for you. In actual fact, lice don't like living on people with fevers. Causation mistakes like this, where a correlation is wrongly thought to signal cause and effect, can be extremely harmful in high-stakes domains such as health care and criminal justice. AI system developers, who usually do not have social science backgrounds, typically do not understand the underlying societal systems and structures that generate the problems their systems are intended to solve. This lack of understanding can lead to designs based on oversimplified, incorrect causal assumptions that exclude critical societal factors and produce unintended, harmful outcomes.
For instance, the researchers who discovered that a medical algorithm widely used in U.S. health care was racially biased against Black patients identified the root cause as a mistaken causal assumption, made by the algorithm's designers, that people with more complex health needs will have spent more money on health care. This assumption ignores critical factors, such as lack of trust in the health care system and lack of access to affordable care, that tend to decrease health care spending by Black patients regardless of the complexity of their needs.
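This failure mode is easy to reproduce synthetically. The sketch below is our own illustration, not the algorithm from the study: every name and number is hypothetical. It ranks patients by a cost proxy that is systematically suppressed for one group, and shows how a selection rule built on that proxy under-serves the group with higher barriers to spending.

```python
import random

random.seed(0)

# Hypothetical proxy-label setup: "need" is the outcome we care about, but
# the ranking uses "cost", a proxy that is suppressed for group B (e.g. by
# barriers to accessing care). All parameters are illustrative assumptions.
def make_patient(group):
    need = random.uniform(0, 10)             # true health need
    access = 1.0 if group == "A" else 0.5    # group B spends less per unit of need
    cost = need * access + random.gauss(0, 0.5)
    return {"group": group, "need": need, "cost": cost}

patients = [make_patient(g) for g in ("A", "B") for _ in range(500)]

# A "model" that ranks patients by predicted cost and selects the top 20%
# for a care-management program.
patients.sort(key=lambda p: p["cost"], reverse=True)
selected = patients[: len(patients) // 5]

share_b = sum(p["group"] == "B" for p in selected) / len(selected)
print(f"Group B share of selected patients: {share_b:.2f}")  # far below the 0.5 an equitable selection would give
```

Note that the model here is not "wrong" about cost; the harm comes entirely from the causal assumption that spending is a faithful proxy for need.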
Researchers make this kind of causation/correlation mistake all the time. But things are worse for a deep learning system, which searches billions of possible correlations in order to find the most accurate way to predict data, and thus has billions of opportunities to make causal mistakes. Complicating the issue further, it is very hard, even with modern tools such as Shapley analysis, to understand why such a mistake was made: a human data scientist sitting in a lab with their supercomputer can never deduce from the data itself what the causation mistakes may be. This is why, among scientists, it is never acceptable to claim to have found a causal relationship in nature just by passively looking at data. You must formulate a hypothesis and then conduct an experiment in order to tease out the causation.
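For readers unfamiliar with Shapley analysis: it distributes a model's output fairly across its input features, but it only describes the correlations the model has learned, never whether they are causal. A minimal exact computation over a toy value function of our own (not from any cited paper) shows what the method does and does not tell you:

```python
from itertools import combinations
from math import factorial

def shapley(players, v):
    """Exact Shapley values for a value function v over frozensets of players."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                S = frozenset(subset)
                # Standard Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(S | {i}) - v(S))  # marginal contribution of i
        phi[i] = total
    return phi

# Toy "model": each feature alone contributes 1, and the pair carries an
# extra interaction credit of 2, which the axioms split evenly.
def v(S):
    out = len(S)
    if {"x1", "x2"} <= S:
        out += 2
    return out

print(shapley(["x1", "x2"], v))  # → {'x1': 2.0, 'x2': 2.0}
```

The attribution is exact, yet it says nothing about why `x1` and `x2` predict the outcome; a confounded feature gets credit just as readily as a causal one.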
Addressing these causal mistakes requires taking a step back. Computer scientists need to do more to understand and account for the underlying societal contexts in which these technologies are developed and deployed.
Here at Google, we have started to lay the foundations for what this approach might look like. In a recent paper co-written by DeepMind, Google AI, and our Trust & Safety team, we argue that considering these societal contexts requires embracing the fact that they are dynamic, complex, non-linear, adaptive systems governed by hard-to-see feedback mechanisms. We all participate in these systems, but no individual person or algorithm can see them in their entirety or fully understand them. So, to account for these inevitable blind spots and innovate responsibly, technologists must collaborate with stakeholders, including representatives from sociology, behavioral science, and the humanities, as well as from vulnerable communities, to form a shared hypothesis of how those systems work. This process should happen at the earliest stages of product development, even before product design starts, and be done in full partnership with the communities most vulnerable to algorithmic bias.
This participatory approach to understanding complex social systems, called community-based system dynamics (CBSD), requires building new networks to bring these stakeholders into the process. CBSD is grounded in systems thinking and incorporates rigorous qualitative and quantitative methods for collaboratively describing and understanding complex problem domains, and we've identified it as a promising practice in our research. Building the capacity to partner with communities in fair and ethical ways that benefit all participants needs to be a top priority. It won't be easy. But the societal insights gained from a deep understanding of the problems that matter most to the most vulnerable in society can lead to technological innovations that are safer and more beneficial for everyone.
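A minimal system-dynamics sketch can make such a feedback mechanism concrete. The toy model below is our own illustration with hypothetical parameters, not CBSD methodology itself: an algorithm allocates resources in proportion to recorded incidents, while recording rates rise with allocated resources, so a small initial measurement bias compounds into a large disparity even when underlying incidence is identical.

```python
# Reinforcing feedback loop (hypothetical parameters): resource allocation
# follows recorded incidents, and recording rises with resources.
def simulate(initial_records, steps=20, detection_gain=0.3):
    records = list(initial_records)
    for _ in range(steps):
        total = sum(records)
        # Share of resources the "algorithm" allocates to each region.
        shares = [r / total for r in records]
        # More resources -> more incidents detected and recorded, even if
        # true incidence is the same everywhere.
        records = [r * (1 + detection_gain * s) for r, s in zip(records, shares)]
    return [r / sum(records) for r in records]

# Two regions with identical true incidence; region 0 starts with slightly
# more recorded incidents due to historical measurement bias.
final_shares = simulate([1.05, 1.00])
print(final_shares)  # the initial 5% gap compounds into a much larger one
```

No single step of this loop looks unreasonable, which is exactly why such dynamics are hard to see without mapping the whole system together with the people inside it.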
When communities are underrepresented in the product development and design process, they are underserved by the products that result. Right now, we're designing what the future of AI will look like. Will it be inclusive and equitable? Or will it reflect the most unfair and unjust elements of our society? The more just option isn't a foregone conclusion; we have to work towards it. Our vision for the technology is one where a full range of perspectives, experiences, and structural inequities are accounted for. We work to seek out and include these perspectives in a range of ways, including human rights diligence processes, research sprints, direct input from vulnerable communities, and organizations focused on inclusion, diversity, and equity, such as WiML (Women in ML) and Latinx in AI; many of these organizations, such as Black in AI and Queer in AI, were also co-founded and are co-led by Google researchers.
If we, as a field, want this technology to live up to our ideals, then we need to change how we think about what we're building, shifting our mindset from building because we can to building what we should. This means fundamentally shifting our focus to understanding deep problems and working to ethically partner and collaborate with marginalized communities. This will give us a more reliable view of both the data that fuels our algorithms and the problems we seek to solve. This deeper understanding could allow organizations in every sector to unlock new possibilities of what they have to offer while being inclusive, equitable, and socially beneficial.