AI’s black swans: Unforeseen consequences looming

Capitalizing on our insecurities

AI is calling out our imperfections

An image showing the Qoves facial analysis tool calling out the imperfections in my uploaded picture and recommending products I can buy to fix them.
I uploaded my portrait into this tool, and it flagged flaws in my face while recommending products I could buy to fix them. Source: Qoves AI-powered facial analysis

Qoves is an AI-powered facial analysis tool that detects superficial flaws on your face and then recommends products and cosmetics to fix them. They even have a YouTube channel where they analyze the faces of celebrities to understand the “science” behind what makes them “hot.” This is a prime example of how companies can use AI to exploit our insecurities for profit.

A screenshot of Qoves’s YouTube channel showing explainer videos of what makes different celebrities look hot.
Source: Qoves YouTube channel

Supercharged plagiarism

Steal like an a̶r̶t̶i̶s̶t̶ AI

AI-generated art is blurring the lines of content ownership. In 2022, a Genshin Impact fan artist was doing a live painting session on Twitch. Before they could finish the fanart and post it to Twitter, one of their viewers fed the work-in-progress into an AI generator and “completed” it first. After the artist posted their finished piece, the art thief then demanded credit from the original artist.

AI-powered plagiarism is a heated topic in schools; people are split between those who argue that generative tools like GPT should be banned from classrooms and those who think schools should teach with them. The latter camp argues that this would expose students to tools they will likely end up using anyway. The analogy of how we came to allow calculators in the classroom comes up often as well. The College Essay Is Dead by Stephen Marche captures this topic wonderfully.

Ownership in the age of AI

Who owns the content generated by AI?

The work of artists is being used to train AI and generate new content in their style. Some generated images even display remnants of the artists’ signatures; critics call this intellectual property theft on an industrial scale.

An image showing AI’s ownership ecosystem having data collectors, technologists, curators and the general crowd.
Source: Who gets credit for AI-generated art?

As the graphic above shows, there are multiple stakeholders in the generative AI ecosystem: individual data owners, the people who assemble the database, the developer who writes the training algorithm, the artist/technologist implementing that algorithm, and the curator/artist filtering the output. So the question is, who gets paid? Below is a series of lawsuits filed by artists and others struggling to answer this question:

AI colonialism

AI is whitewashing the cultures of the world

Since these AI models are primarily made in western countries by western researchers, the data used to train them over-indexes on western cultures, traditions and values. The data that informs the cultures of the rest of the world, around 95% of the population, does not make it into the training set; it is ignored, intentionally or not. When such models get deployed in globally used tools like search engines and social media platforms, the rest of the world, especially developing nations, has no choice but to adopt western cultural norms to use the technology. The result is a new form of colonialism, called AI colonialism, in which entire cultures can be erased. Karen Hao has been reporting on this topic through multiple examples at MIT Technology Review: