Back in February, the much-anticipated “Truth Social” app was launched. Intended to be a social network for “Truth”, it’s essentially a carbon copy of Twitter, except instead of Tweets, you write and share “Truths”.
When it launched, a lot of people smirked at reports of people having issues signing up or getting in in the first place. It certainly wasn’t a smooth launch, and there was a sense that something launching this broken probably wouldn’t amount to much.
However, quietly and without much noise, it’s been gaining hundreds of thousands of active users and has become a hive for right-wing evangelical extremism, an echo chamber in which Trump rules and anyone against him is attacked. Indeed, it has at times been the number 1 downloaded app in the US and still sits comfortably within the top 10.
Back in August, it briefly returned to headlines for getting banned from the Play Store due to concerns with the moderation of content. So I decided to check it out for myself and see how it had been designed and exactly what was going on.
Spoiler — as you can tell by the title, I think this app isn’t just an example of deceptive UX patterns. I think this app is dangerous. It has been designed to funnel people into a vacuum of evangelical radicalization, and I believe it’s very successful in this.
It is not an app to casually download and check out. I am not doing this lightly or “out of curiosity”. I am looking because I am concerned that the initial issues it faced at launch mean it’s not taken seriously.
“Big Tent” means encompassing a range of views from across the political spectrum. Surprise! This app does not mean that at all. Right from the top of the funnel, Trump’s voice is clear.
There’s a lot here that I could say that would be stating the obvious. This is not an app for anyone across the spectrum; this is an app for the extreme right. There’s nothing open about it. If you express any views that run counter to its politics, you are free (to f*ck off and be banned from the platform, that is). Honest? Global? Without discriminating?
For any extreme right-wing individual feeling attacked by the very fact that Twitter banned their lord and savior, this probably reads as sincere, addressing their pain point directly. The issue, of course, is that their pain point is based on lies, deception, and manipulation at the hands of the person who is now providing them with this place in which they can finally express their racism, xenophobia, misogyny, and homophobia freely.
From the top, the messaging legitimizes extreme views. And a very minor point — it’s America’s platform, but it’s for global conversations?
I knew there wouldn’t be many features, but I wasn’t quite expecting this.
The app comprises people with profiles and a feed of “Truths” from all those you follow. And that’s it.
I’ve never seen any social media app in which negative social actions are displayed so prominently. From the side menu, you can easily access your block and mute list (yes, as two distinct areas).
Instantly, this puts me in a combative frame of mind as a user. I know that the features I use most often in a lot of apps are displayed in these menus, so I have the immediate expectation that I’ll be doing a lot of blocking and muting. In fact, from a UX perspective, placing these actions here would inevitably encourage users to do these actions.
I still have the messaging from before ringing in my ears: honest, open, and free. So, I now understand that’s not to do with expression as such — I’m free to block and mute people, open when it comes to seeing everyone I’ve blocked whenever I want, and the app is honest about this being a core action it wants to encourage.
Making lists of users I’ve blocked and muted also makes it incredibly easy for me to access them and see their content. I don’t know about you, but on my Twitter, I’ve got two groups of people blocked: those who are offensive, and my exes. I don’t want to see content from either; that’s why I block them. However, I also get that human urge from time to time to go and see if they’re still being offensive, or to look up a tweet that prompted me to block them in the first place as an example of hate online (or, in the case of exes, to see if they’ve tweeted about me even though it’s been 5 years and we’ve both moved on and oh my goodness the level of ego, but you get it).
Fortunately, on most apps these lists are buried, so I can’t be bothered to actually go and look in the first place. Here, where the lists are so easily accessible, there’s barely any friction whenever I get that urge. And on an app all about political conversation, that means I can easily reach a group of people guaranteed to get me riled up, so any interaction with them that angers me is in effect prolonged and continued.
Bizarrely, the option to Direct Message is *still* missing. This is a very basic functionality of any social network, and not terribly hard to implement.
Could it be that it actually will come soon? I’m doubtful. I think this is a conscious decision to ensure that all communication and interactions stay out in the open. All arguments have to happen on public threads, where everyone can read them.
What that often means is that anyone expressing a view that isn’t extreme right-wing, or any kind of criticism of Trump, quickly becomes the subject of a public pile-on. I think removing the option to DM encourages users to gang up together; it shows them that they are part of a group of people who think the same and share a common enemy.
So no, I don’t expect this feature will come soon. I think it’s a conscious and strategic decision to deepen the feeling of community among the extremists.
Additionally, wherever you see a list of suggested users to follow, you see the same 50 users. No surprise who sits prominently at the top.
When you first set up your account, you are required to follow at least 2 of the top 50 accounts. All of them are right-wing, even the innocent-sounding accounts like “Dogs of Truth” and “Truth Travel”, which share images of dogs and destinations with extremist captions and swipes at other social networks. So you are quite literally forced to be exposed to extreme right-wing content from the start.
Even if you then promptly unfollow, these same 50 accounts are shown everywhere. Empty screen? Here are accounts you might want to follow. Searching for something? Oh, here are the same accounts again.
And even when you do follow these accounts, the list doesn’t surface anyone new. It just shows the same accounts, now with the button saying “Following”.
That means that Donald Trump is at the top of every single page, pretty much. No matter what interests you enter, you’re always shown accounts based on their popularity and follower count, so it’s Trump Trump Trump Trump Trump Trump Trump. You can’t avoid him on the platform.
Can you imagine if Twitter just showed the same suggested accounts to every single user? If Facebook just showed the accounts for the same 50 people with the most friends and suggested you add them too? It amplifies the most popular users already and means that everything else is in effect invisible.
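To make the mechanics concrete, here is a minimal sketch of why a popularity-only ranking produces identical suggestions for every user. The account data and function names are hypothetical, assuming the ranking works as described above:

```python
# Sketch: why popularity-only suggestions show everyone the same list.
# Account data is hypothetical; the point is that the ranking ignores the user.

accounts = [
    {"handle": "@biggest", "followers": 4_000_000},
    {"handle": "@large", "followers": 900_000},
    {"handle": "@niche_dogs", "followers": 1_200},
    {"handle": "@local_news", "followers": 800},
]

def suggest_by_popularity(accounts, n=2):
    """Every user gets the same top-n list, regardless of their interests."""
    ranked = sorted(accounts, key=lambda a: a["followers"], reverse=True)
    return [a["handle"] for a in ranked[:n]]

def suggest_by_interest(accounts, interests, n=2):
    """A personalized list can surface smaller accounts matching the user."""
    matches = [a for a in accounts if any(i in a["handle"] for i in interests)]
    return [a["handle"] for a in matches[:n]]

# Two users with totally different interests still get the identical
# popularity-based list; only a personalized ranking differs per user:
print(suggest_by_popularity(accounts))          # ['@biggest', '@large']
print(suggest_by_interest(accounts, ["dogs"]))  # ['@niche_dogs']
```

The first function never looks at the user at all, which is exactly the amplification problem: whoever already has the most followers is the only account anyone is ever shown.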
That’s how the app has been designed not to subtly nudge users to follow Trump and be exposed to his words, but to push and shove them continually until they do.
It’s an example of the Baader-Meinhof phenomenon, also known as frequency bias, and a common tool in marketing. Right from sign-up, Trump’s account is shown to you immediately. And then you notice it everywhere, because it’s shown at every opportunity. It creates the illusion that he is the most engaged-with account on the platform, which he is, but of course, that’s the whole point of the app.
In the mental model of most of us who regularly use social media, a trend is something that’s popular. Not so on Truth Social.
Actually, seeing this screenshot again, I’ve made a mistake. This isn’t actually labeled as “trends” in the app. It’s labeled as hashtags. Nowhere here does it say that these are the most engaged-with hashtags, or the most commonly used.
So I’ve fallen for deception here. By showing these meaningless lines (which don’t change) and the number of people talking about each hashtag, I just assumed these were the most popular topics. But of course, on a platform with hundreds of thousands of things posted every day, if only 20 people are using the hashtag #FBIRaidTrump, then it’s definitely not a trending topic.
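The gap between “used by 20 people” and “trending” can be sketched with a simple volume check. The counts and threshold below are assumptions for illustration, not the platform’s actual numbers:

```python
# Sketch: a naive "trending" check. The figures are hypothetical; the point
# is that a hashtag 20 people use isn't trending on a platform with
# hundreds of thousands of daily posts.

DAILY_POSTS = 300_000   # assumed platform-wide daily post volume
MIN_SHARE = 0.001       # assumed threshold: 0.1% of daily posts

def is_trending(uses_today, daily_posts=DAILY_POSTS, min_share=MIN_SHARE):
    """A hashtag only 'trends' if it makes up a meaningful share of posts."""
    return uses_today / daily_posts >= min_share

print(is_trending(20))     # False: 20 uses is noise, not a trend
print(is_trending(5_000))  # True: a genuine spike in activity
```

A list that skips any check like this and simply displays whatever hashtags its curators choose, dressed up with trend lines and counts, is borrowing the visual language of popularity without the substance.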
So this builds the perception that these are topics I should be engaging with too, helping to ensure the platform becomes an echo chamber for any topic which the app developers decide to list here.
And yes, every single hashtag is reflective of nationalistic and extremist topics.
We’ve considered the design of the app from a UX perspective but skipped the biggest part of the platform: the content itself. The “Truths”, as it calls them.
I don’t need to tell you how utterly nightmarish the content is. Lies, labeled as “Truths” to legitimize themselves.
Threads are tricky to navigate, with replies largely hidden, preventing any potential discussion from being read with ease and keeping the focus on the original content posted.
There are guidelines from both Apple and Google on filtering objectionable content. How the majority of content on this app passes these (admittedly vague and flimsy) guidelines, I have no clue.
In combination with the other elements of the app, all accounts are funneled towards Trump, and so ALL content is about Trump and funnels back to one of his rambling messages.
Fair question. I spend a lot of time thinking about the cognitive design that goes into UX, and the ethics behind nudging users to take certain actions within products. There are many examples of apps employing these techniques deceptively to take advantage of users, such as the Kardashian App when it comes to making money from dedicated players.
Truth Social has been expertly designed to funnel users into an echo chamber of hatred and radicalization. Common patterns and best practices related to recommendations, navigation, and feature set (combined with false messaging) have been used to push users toward one thing: Trump.
Even if the launch was a disaster and there were technical issues, we need to take this app seriously and not dismiss it as a gimmick.
And beyond just passing app release guidelines on the surface, it’s high time that the true intent of an app and how its UX has been designed impacted its availability. We know how powerful behavioral design can be, and when it’s used for radicalization in the context of the current global political situation? Well, we’re seeing the fruit that bears.
In the meantime, I want to leave you with the App Store Review Guidelines for “Objectionable Content” (point 1 on that page) and ask whether this conceivably stretches beyond the content itself and should apply to UX designed to promote such content in the first place.