The influencers aren't even real humans, and the web pages are covered in surreptitious design. We need a better internet

Two crossposts from The Conversation UK this week. They land at a news moment when revelations about Facebook show the company has been fully aware of the negative and harmful effects of its social media strategies on users, particularly young girls posting to and reading Instagram. So here are two analyses that show how mentally aware and vigilant we have to be in our online and virtual lives.

The CGI influencer Lil Miquela, from Variety

When the people we follow on social media aren’t human

By Francesca Sobande, Lecturer in Digital Media Studies, Cardiff University

Social media influencers – people famous primarily for posting content online – are often accused of presenting artificial versions of their lives. But one group in particular is blurring the line between real and fake.

Created by tech-savvy teams using computer-generated imagery, CGI or virtual influencers look and act like real people, but are in fact merely digital images with a curated online presence.

Virtual influencers like Miquela Sousa (known as Lil Miquela) have become increasingly attractive to brands. They can be altered to look, act, and speak however brands desire, and don’t have to physically travel to photo shoots – a particular draw during the pandemic.

But the frequent lack of transparency about who creates and profits from CGI influencers comes with its own set of problems.

CGI influencers mirror their human counterparts, with well-followed social media profiles, high-definition selfies, and an awareness of trending topics. And like human influencers, they appear in different body types, ages, genders and ethnicities. A closer look at the diversity among CGI influencers – and who is responsible for it – raises questions about colonialism, cultural appropriation, and exploitation.

Human influencers often have teams of publicists and agents behind them, but ultimately they have control over their own work and personality. What happens, then, when an influencer is created by someone with a different life experience, or a different ethnicity?

For centuries, black people – especially women – have been objectified and exoticised by white people in pursuit of profit. While this is evident across many sectors, the fashion industry is particularly known for appropriating and commodifying black culture in ways that elevate the work and status of white creators. The creation of racialised CGI influencers to make a profit for largely white creators and white-owned businesses is a modern example of this.

Questions of authenticity

The sheen of CGI influencers’ surface-level image does not mask what they really symbolise – demand for marketable, lifelike, “diverse” characters that can be easily altered to suit the whims of brands.

I recently gave evidence to a UK parliamentary inquiry into influencer culture, where I argued that it reflects and reinforces structural inequalities, including racism and sexism. This is evident in reports of racial pay gaps in the industry, and the relentless online abuse and harassment directed at black women.

CGI influencers are not exempt from such issues – and their existence raises even more complex and interesting questions about digital representation, power, and profit. My research on CGI influencer culture has explored the relationship between racialisation, racial capitalism and black CGI influencers. I argue that black CGI influencers symbolise the deeply oppressive fixation on, objectification of, and disregard for black people at the core of consumer culture.

Critiques of influencers often focus on transparency and their alleged “authenticity”. But despite their growing popularity, CGI influencers – and the creative teams behind them – have largely escaped this scrutiny.

As more brands align themselves with activism, working with supposedly “activist” CGI influencers could improve their optics without doing anything of substance to address structural inequalities. These partnerships may trivialise and distort actual activist work.

When brands engage with CGI influencers in ways distinctly tied to their alleged social justice credentials, it promotes the false notion that CGI influencers are activists. This deflects from the reality that they are not agents of change but a byproduct of digital technology and consumer culture.

Keeping it real

The Diigitals has been described as the world’s first modelling agency for virtual celebrities. Its website currently showcases seven digital models, four of whom are constructed to appear as black through their skin colour, hair texture, and physical features.

The roster of models includes Shudu (@shudu.gram) who was developed to resemble a dark-skinned black woman. But it has been argued that Shudu, like many other CGI models, was created through the white male gaze – reflecting the power of white and patriarchal perspectives in society.

Shudu’s kaleidoscope of Instagram posts includes an image of her wearing earrings in the shape of the continent of Africa.

One photo caption reads: “The most beautiful thing about the ocean is the diversity within it.” This language suggests Shudu is used to show how The Diigitals “values” racial diversity – but I argue the existence of such models shows a disrespect for and distortion of black women.

Creations like Shudu and Koffi (@koffi.gram), another Diigitals model, show, I would argue, how the objectification of black people and the commodification of blackness underpin elements of CGI influencer culture. Marketable mimicry of black aesthetics and the styles of black people is apparent in other industries too.

CGI influencers are another example of the colonialist ways that black people and their cultures can be treated as commodities – mined to aid the commercial activities of powerful white people in western societies.

Since I began researching this topic in 2018, the public-facing image of The Diigitals has notably changed. Its once sparse website now includes the names of real-life muses and indicates its ongoing work with black women. This gesture may be meaningful and temper some critiques of the swelling number of black CGI influencers across the industry, many of which are apparently not created by black people.

A more pessimistic view might see such activity as projecting an illusion of racial diversity. There may conceivably be times when a brand’s use of a CGI influencer prevents a real black influencer from accessing substantial work. The Diigitals working with actual black people as “muses” is not the same as black people creating and directing the influencer from its inception. However, it is important to recognise the work of such real black people who may be changing the industry in impactful ways that are not fully captured by the term “muse”.

To me, many black CGI influencers and their origin stories represent pervasive marketplace demand for impersonations of black people that cater to what may be warped ideas about black life, cultures, and embodiment. Still, I appreciate the work of black people seeking to change the industry and I am interested in how the future of black CGI influencers may be shaped by black people who are both creators and “muses”.

The Conversation approached The Diigitals for comment, and founder Cameron-James Wilson said: “This article feels very one-sided.” He added: “I don’t see any reference to the amazing real women involved in my work and not having them mentioned disregards their contributions to the industry”. The Diigitals did not provide further comment. The article was expanded to make a more substantial reference to the real women The Diigitals works with.

The rise of dark web design – how sites manipulate you into clicking

By Daniel Fitton, Reader in User Experience Design, University of Central Lancashire

The vast majority of websites you visit now greet you with a pop-up. This annoying impediment to your seamless web browsing is called the "cookie banner", and it's there to secure your consent, as per online privacy laws, for websites to retain information about you between browsing sessions.

The cookie banner purports to offer you a choice: consent to only the essential cookies that help maintain your browsing functionality, or accept them all—including cookies that track your browsing history to sell on to targeted advertising firms. Because those additional cookies generate extra revenue for the websites we visit, cookie banners are often designed to trick you into clicking "accept all."

The UK's information commissioner recently urged G7 countries to address this problem, highlighting how fatigued web users are agreeing to share more personal data than they'd like. But in truth, manipulative cookie banners are just one example of what's called "dark design"—the practice of creating user interfaces that are intentionally designed to trick or deceive the user.

Dark design has proven to be an incredibly effective way of encouraging web users to part with their time, money and privacy. This in turn has established "dark patterns", or sets of practices designers know they can use to manipulate web users. They're difficult to spot, but they're increasingly prevalent in the websites and apps we use every day, creating products that are manipulative by design, much like the persistent, ever-present pop-ups we're forced to close when we visit a new website.

Cookie banners remain the most obvious form of dark design. You'll notice how the "accept all" button is large and cheerfully highlighted, attracting your cursor within a split second of your arrival on a website. Meanwhile, the dowdy, less prominent "confirm choices" or "manage settings" buttons—the ones through which we can protect our privacy—scare us away with more time-consuming clicks.
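
To make that asymmetry concrete, here is a minimal sketch of how such a banner stacks the deck, assuming a plain browser DOM; the helper names setConsent and openSettingsDialog are invented for illustration, not taken from any real site.

```typescript
// A minimal sketch of dark-pattern choice architecture in a cookie banner.
// Helper names (setConsent, openSettingsDialog) are hypothetical.
function renderCookieBanner(root: HTMLElement): void {
  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all";
  // Large, brightly coloured, pre-focused: the path of least resistance.
  acceptAll.style.cssText =
    "font-size:1.2em; padding:12px 32px; background:#2e7d32; color:#fff;";
  acceptAll.autofocus = true;
  // One click consents to everything, tracking cookies included.
  acceptAll.onclick = () => setConsent({ essential: true, tracking: true });

  const manage = document.createElement("button");
  manage.textContent = "Manage settings";
  // Small, grey, styled like an afterthought: easy to overlook.
  manage.style.cssText =
    "font-size:0.8em; background:none; border:none; color:#888; text-decoration:underline;";
  // Refusing takes extra steps: this opens a second dialog rather than
  // offering a one-click "reject all".
  manage.onclick = () => openSettingsDialog();

  root.append(acceptAll, manage);
}

// Hypothetical helpers, assumed for the sketch.
function setConsent(c: { essential: boolean; tracking: boolean }): void {
  document.cookie = `consent=${JSON.stringify(c)}; max-age=31536000; path=/`;
}
function openSettingsDialog(): void {
  // Would render a second, more laborious dialog with per-cookie toggles.
}
```

An honest version would give "Reject all" the same size and prominence as "Accept all"; the dark pattern lives entirely in the styling and the extra clicks.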

You'll know from experience which one you tend to click. Or you can try the Cookie Consent Speed-Run, an online game that exposes how difficult it is to click the right button in the face of dark design.

E-commerce websites also frequently use dark patterns. Say you've found a competitively priced product you'd like to buy. You dutifully create an account, select your product specifications, input delivery details, click through to the payment page—and discover the final cost, including delivery, is mysteriously higher than you'd originally thought. These "hidden costs" aren't accidental: the designer is hoping you'll just hit "order" rather than spending even more time repeating the same process on another website.
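
As a toy illustration – all fee values invented – the pattern boils down to showing one price on the listing and computing the real total only at the final step:

```typescript
// A sketch of the "hidden costs" pattern: the advertised price omits
// fees that only appear on the payment page. All numbers are invented.
interface Basket {
  itemPrice: number;
}

const DELIVERY_FEE = 4.99; // hypothetical
const SERVICE_FEE = 1.5;   // hypothetical

// What the product listing shows.
function advertisedPrice(basket: Basket): number {
  return basket.itemPrice;
}

// What the payment page finally charges, after the user has already
// invested time creating an account and entering delivery details.
function checkoutTotal(basket: Basket): number {
  return basket.itemPrice + DELIVERY_FEE + SERVICE_FEE;
}

const basket: Basket = { itemPrice: 19.99 };
console.log(advertisedPrice(basket)); // 19.99 on the listing
console.log(checkoutTotal(basket));   // 26.48 at the last click
```

The sunk time is the lever: by the payment page, repeating the whole process on another site feels costlier than just absorbing the surcharge.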

Other elements of dark design are less obvious. Free services such as Facebook and YouTube monetise your attention by placing advertisements in front of you as you scroll, browse or watch. In this "attention economy", the more you scroll or watch, the more money the companies make. So these platforms are intentionally optimised to command and retain your attention, even if you'd rather close the app and get on with your day. For example, the expertly crafted algorithm behind YouTube's "Up Next" video suggestions can keep us watching for hours if we let them.
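
In deliberately simplified form – the field names and numbers below are invented, and real recommender systems are far more complex – the underlying logic is a ranking that scores candidate videos by predicted watch time rather than by relevance to what you actually searched for:

```typescript
// A deliberately simplified sketch of engagement-optimised ranking.
// Field names and values are invented for illustration.
interface Candidate {
  videoId: string;
  predictedWatchMinutes: number; // model's guess at how long you'll watch
  relevanceToQuery: number;      // how well it matches what you asked for
}

// Rank by expected attention captured, ignoring relevance: a video that
// keeps you watching outranks one that answers your query and lets you go.
function rankForEngagement(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort(
    (a, b) => b.predictedWatchMinutes - a.predictedWatchMinutes
  );
}

const upNext = rankForEngagement([
  { videoId: "a", predictedWatchMinutes: 2, relevanceToQuery: 0.9 },
  { videoId: "b", predictedWatchMinutes: 25, relevanceToQuery: 0.3 },
]);
console.log(upNext[0].videoId); // "b": the one most likely to hold you
```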

App design

Manipulating users for commercial gain isn't confined to websites. Currently, more than 95% of Android apps on the Google Play store are free to download and use. Creating these apps is an expensive business, requiring teams of designers, developers, artists, and testers. But designers know that they'll recoup this investment once we're hooked on their "free" apps – and they do it using dark design.

In recent research analysing free app-based games that are popular with today's teenagers, my colleague and I identified dozens of examples of dark design. Users are forced to watch adverts and frequently encounter disguised adverts that look like part of the game. They're prompted to share posts on social media and, as their friends join the game, are prompted to make in-app purchases to differentiate their character from those of their peers.

Some of this psychological manipulation seems inappropriate for younger users. Teenage girls' susceptibility to peer influence is exploited to encourage them to buy clothes for in-game avatars. Some games promote unhealthy body imagery while others actively demonstrate and encourage bullying through indirect aggression between characters.

There are mechanisms to protect young users from psychological manipulation, such as age rating systems, codes of practice, and guidance that specifically prohibits the use of dark design. But these rely on developers understanding and interpreting the guidance correctly; in the case of the Google Play store, developers vet their own work, and it's up to users to report any issues. My research indicates that these measures are not yet proving entirely effective.

Shedding light

The problem with dark design is that it's difficult to spot. And dark patterns, once established in a developer's toolbox, spread fast. They're hard for designers to resist when free apps and websites are competing for our attention, judged on metrics like "time on page" and "user conversion rate".

So while cookie banners are annoying and often dishonest, we need to consider the broader implications of an online ecosystem that is increasingly manipulative by design. Dark design is used to influence our decisions about our time, our money, our personal data and our consent. But a critical understanding of how dark patterns work, and what they're hoping to achieve, can help us detect and overcome their trickery.

Look at our “A Better Media” category on A/UK for solutions and positive proposals. These articles are republished from The Conversation under a Creative Commons license.