IP-AI • FEBRUARY 26, 2019
A very personal look at the future of AI
Joel Davison is a Gadigal and Dunghutti man from Sydney, Australia. He attended the March 2019 Indigenous Protocol and Artificial Intelligence workshops in Hawai’i. Here he explores the future of AI.
AI today is bound by practicality. Talented developers, cutting-edge research, specialised hardware and top-of-the-line cyber security are all ingredients required to advance simple AI beyond current offerings. This means that the entities with the power to advance AI, those with access to pools of talent and academic connections as well as the funding for hardware and security, are those which already have far more money to invest than is required to operate as a business. These entities, be they government or private, expect a return on investment, so AI advances will always be pushed in a direction that is either profitable or marketable. Because of this, AI is entwined with automation in our cultural lexicon, and it is this connection that often dominates conversation.
If Artificial Intelligence is to replicate human intelligence, then the most direct way to profit from that intelligence is to exploit its labor value. In this way conversations are often steered towards analysis of the labor value of existing occupations. For example, advances by large tech companies in self-driving cars have every in-tune truck driver eyeing other industries at this point, and we can’t stop talking about it.1,2,3,4,5
The vast majority of these industry-shaping moves are opportunities presented only to the wealthiest organisations on the planet, because the benefit is only realised at huge scale thanks to the costs outlined above: talent, research, hardware and security. It simply isn’t feasible for small organisations, potentially social ventures, NGOs or co-ops, to stake a claim to a portion of the market without the network and capability to take advantage of the wider market. If the benefit of Artificial Intelligence in this liberal-capitalist frame is the profit earned by extracting more labor value, by reducing the overhead of hiring humans to manually perform tasks, then by the time you have paid the up-front costs for the research, development and specialised manufacturing to begin providing self-driving vehicles as a service, you start to realise that you need to roll out your service on a massive scale before the benefits appear. In this environment Artificial Intelligence becomes a winner-takes-all venture, where the only participants are those already winning.
However, we have been seeing a shift in this landscape, a move by some of the largest organisations that changes the climate entirely. Having developed their AI and taken their time to scale and implement it before seeing their benefit, these large organisations have started to look for alternate revenue sources for their AI solutions. Most notable of these alternate revenue sources are the AI-as-a-service platforms, such as IBM’s Watson or Google’s TensorFlow. Suddenly, small organisations can provide the benefit of AI (or at least market that they do) without the tremendous up-front cost of research and specialised hardware.
In this we are now seeing many small businesses and startups getting into the game of exploiting the difference in labor value between human intelligence and Artificial Intelligence, this time opening up smaller niches, the nooks and crannies of the marketplace, to be explored.
In all of these conversations we are only exploring the capital value of simple Artificial Intelligence: it’s the capitalist equivalent of only talking about the ‘why?’ of AI (the answer to which is almost always ‘money’). We rarely explore the impact of simple Artificial Intelligence, we never really ask ‘how?’, and when we do it’s always too late.
In November 2017, The Guardian broke the story of a secret police blacklist employed by the New South Wales Police,6 a “Suspect Targeting Management Plan”, which the NSW Police Commissioner called a “predictive style of policing”. This is kind of low-hanging fruit, isn’t it? My intention was to share a couple of cases where organisations hadn’t stopped to ask ‘how?’, or what their impact is, but surely no one on this program even stopped to ask ‘why?’. It doesn’t take a genius to figure out how this goes terribly wrong. Hell, you don’t even have to look much further than Marvel, who ran a (fantastic, by the way) crossover event by the title of “Civil War II” which featured at its center the arguments for and against ‘predictive policing’. It’s actually kind of prophetic and I love it so.
The event comes to a boiling point when a new Spider-Man, Miles Morales7 (a young African American, Puerto Rican man), is accused of a murder he hasn’t yet committed: killing Steve Rogers, Captain America, in the future. After all of the superheroes have shared their perspectives and opinions and had their brawls, the takeaway is the question, ‘is it ever okay to judge someone for something they haven’t done but could do?’, to which the answer is no, you shouldn’t, especially if the criminal justice system isn’t suited to it and especially if you don’t think very carefully about it. Unfortunately the Australian criminal justice system isn’t suited to it and very clearly the NSW Police did not think very carefully about it.
‘Okay Joel, so you have some comic-book-based opinions on predictive justice, but seriously how bad could it be?’ It gets pretty bad. According to the NSW Police Commissioner Mick Fuller, “there were about 1,800 people subject to an STMP across the state. About 55% of them were Aboriginal”, the youngest of whom is a nine-year-old. Currently Indigenous Australians make up only 3% of the national population, so how is it that we represent such a large portion of this database? Are we really that talented at crime? I mean, do we really commit 17 times more crime than any other Australian ethnicity? Of course not, that’s ridiculous, so how did this AI come up with this list of suspects? The truth is, we don’t know, and if you asked the police they wouldn’t know either. The company that they contracted to develop the solution likely doesn’t know either, and doesn’t care how; they’ve already answered their ‘why?’ (read: money). Most likely the people developing the solution don’t understand how the AI’s learning algorithms work and didn’t think about the kind of training data the AI was trained on before it started working on production data.
‘But Joel, they’d have to have thought pretty hard to make the AI racist, it’s a machine so it’s impartial to race and ethnicity.’ Turns out that’s not the case,8 AI more or less comes out of the box racist. This is due to how AI is configured in these projects: to perform better than humans it needs to learn more than humans do in the narrow field it is being developed for, which is one of its strengths: it can take a huge set of training data and learn from it very quickly. The data is important, however, and as it so happens the most easily accessible large datasets are user-generated and contain all of their respective prejudices. So it’s important to ask ‘what data set was it trained on?’, in this case almost certainly existing data on previous arrests and criminal convictions by the Australian Federal Police. ‘Hold on, the data on previous arrests and criminal convictions by the Australian Federal Police reveals a strong recurring prejudice toward the Indigenous population of Australia?’ Imagine my shock.
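The mechanism is easy to demonstrate. Here is a minimal, purely hypothetical sketch (the group labels, rates and the naive “model” are all invented for illustration, not drawn from any real policing data or from the STMP itself): two groups offend at exactly the same rate, but one is patrolled far more heavily, so the historical arrest records are skewed, and anything that learns “risk” from those records inherits the skew.

```python
import random

random.seed(0)

# Hypothetical setup: groups A and B have IDENTICAL true offending rates,
# but biased policing means group A is patrolled (and so arrested) far
# more often. Arrests are what end up in the historical data.
TRUE_OFFEND_RATE = 0.05             # the same for both groups
PATROL_RATE = {"A": 0.9, "B": 0.1}  # an offence is only recorded if patrolled

def make_history(n=100_000):
    """Generate a biased historical dataset of (group, arrested) records."""
    records = []
    for _ in range(n):
        group = random.choice("AB")
        offends = random.random() < TRUE_OFFEND_RATE
        arrested = offends and (random.random() < PATROL_RATE[group])
        records.append((group, arrested))
    return records

def learned_risk(records, group):
    """A naive 'predictive' model: each group's historical arrest rate."""
    arrests = [arrested for g, arrested in records if g == group]
    return sum(arrests) / len(arrests)

history = make_history()
risk_a = learned_risk(history, "A")
risk_b = learned_risk(history, "B")

# Despite identical behaviour, the model scores group A roughly 9x riskier,
# simply because the training data reflects where the police were looking.
print(f"learned risk for A: {risk_a:.4f}, for B: {risk_b:.4f}")
```

The model never sees race or ethnicity as an input here; the prejudice arrives entirely through which arrests made it into the training data, which is exactly why “it’s a machine so it’s impartial” doesn’t hold.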
So now the police have a racist AI that’s populating a confidential list of suspects who are majority Indigenous, who the police are now legally able to arrest before they commit a crime or do anything suspicious. Yeah, the police in 2017 criminalised being Aboriginal. That’s how bad it gets.
I’d love to say this proves the point I was making earlier about the impacts AI can have if we don’t ask ‘how?’, but it’s even worse than that. The fact of the matter is that unless we are very careful, AI-as-a-service can be used to intentionally obfuscate the ‘how?’. We don’t know how the NSW Police’s AI became racist; we can make very good educated guesses about training data and configuration, but we don’t know: the AI obfuscates the process by which it came up with its database through its sheer complexity alone. The biggest problem is that in spite of this, the results are still being used with authority. Because it is an AI, a machine that ‘just runs analysis’, all it is doing is giving authority to existing and past prejudices and perpetuating those prejudices, rather than having the ability to challenge them like a human might.
We haven’t been asking ourselves ‘how?’, and when we don’t, we don’t move forward, we don’t challenge and we don’t change. We just become more efficient, and I don’t think that’s the vision anyone who is passionate about AI and Computer Science imagines. If we are to use AI to move our society forward, to make real change instead of just making profit, we need to ask ‘how?’.
1. Walker Orenstein, “Automated ‘platoons’ of trucks might soon be driving on Minnesota roads,” MinnPost, February 1, 2019 <minnpost.com/good-jobs/2019/02/automated-platoons-of-trucks-might-soon-be-driving-on-minnesota-roads/>.
2. Seth Clevenger, “Self-driving truck startups TuSimple, Ike attract more investment to fuel development,” Transport Topics, February 13, 2019 <ttnews.com/articles/self-driving-truck-startups-tusimple-ike-attract-more-investment-fuel-development>.
3. Adam Rowe, “The trucking industry’s future: go high tech or go home,” Tech.Co, August 30, 2018 <tech.co/news/trucking-industry-future-autonomous-drivers-vr-2018-08>.
4. David Welch, Gabrielle Coppola, & Chester Dawson, “Young CEO of electric vehicle startup Rivian has Amazon riding shotgun,” Seattle Times, February 24, 2019 <seattletimes.com/business/young-ceo-of-electric-vehicle-startup-rivian-has-amazon-riding-shotgun/>.
5. Finn Murphy, “Truck drivers like me will soon be replaced by automation. You’re next,” The Guardian, November 17, 2017 <theguardian.com/commentisfree/2017/nov/17/truck-drivers-automation-tesla-elon-musk>.
6. Michael McGowan, “More than 50% of those on secretive NSW police blacklist are Aboriginal,” The Guardian, November 10, 2017 <theguardian.com/australia-news/2017/nov/11/more-than-50-of-those-on-secretive-nsw-police-blacklist-are-aboriginal>.
7. “Miles Morales (Earth-1610),” Marvel Database Fandom Wiki, n.d. <marvel.fandom.com/wiki/Miles_Morales_(Earth-1610)>.
8. Robyn Speer, “How to make a racist AI without really trying,” July 13, 2017 <blog.conceptnet.io/posts/2017/how-to-make-a-racist-ai-without-really-trying>.
Joel Davison is a Gadigal and Dunghutti man from Sydney, Australia. He lives culture through an active role in revitalising the Gadigal language, and is also an avid technologist who works at the Commonwealth Bank of Australia as a Robotics Analyst.
The Indigenous Protocols and Artificial Intelligence (IP-AI) workshops are founded by Old Ways, New, and the Initiative for Indigenous Futures. This work is funded by the Canadian Institute for Advanced Research (CIFAR), Old Ways, New, the Social Sciences and Humanities Research Council (SSHRC) and the Concordia University Research Chair in Computational Media and the Indigenous Future Imaginary.
How do we Indigenously Interact with AI?