Let's start with a little background. There are two ways to follow Twitter: the traditional one, in which we see all the tweets from the accounts we follow as they are published, and the one the platform calls "home", in which its algorithm also shows us popular tweets that are supposed to interest us because they are related to content we like and to accounts we follow. To switch from one view to the other, you click the star icon at the top of the website or the app.

Twitter prefers that we use the "home" version. That's why it calls it that, so we think of it as the normal place to start. And that's why, when we go a few days without opening the website or the app, it switches back to this version.

And now the news, which fortunately never came to pass: a few days ago Twitter announced a change to its app: featured tweets ("home") and chronologically ordered tweets would live in two separate tabs, with the featured ones shown first by default.

The change was not well received: responses to Twitter's announcement ranged from comments bordering on despair ("not yet") to outright insults. On Monday, Twitter announced it was abandoning the idea after confirming that many of its users prefer the chronological order, the one the platform has always had, the one that lets you, for example, follow live events and breaking news without tweets from 17 hours ago slipping in.

Twitter backed down quickly. It could have taken six years, as Instagram did: that platform announced in January that, over the first half of 2022, it will once again allow a chronological feed, a possibility it had refused to offer since 2016.

Transparency and control

And why does Twitter want us to see the "featured" tweets first? For the same reason that Instagram, Facebook and TikTok show algorithmically selected content by default: because it works, keeping people on the platform longer.

These algorithms are powerful, as anyone who has opened TikTok's "For You" tab knows. They are also dangerous: staying with TikTok, after analyzing an internal document from the platform, an algorithms expert told the New York Times that "in a few hours" it can identify all sorts of personal information, from musical tastes to depression to a possible drug addiction. The goal: to serve more videos in the "For You" tab and make the app even harder to close.

The platforms want to capture our attention without seeming to mind that it comes at the cost of giving visibility to controversial content, fake news and conspiracy theories. And I say "seeming" because one of the main problems with these algorithms is their opacity: we know almost nothing about how they decide what we will and won't see, beyond the handful of obvious factors already mentioned and the advice the networks give influencers, companies and media outlets so their content doesn't get buried.

The other issue is the lack of control, as Jay Sullivan, Twitter's head of product, acknowledged in a thread explaining the decision to walk back the new design: "People want - they deserve - transparency about why they see the content they see, and more control over that content." That control could come from being able to turn the algorithm off whenever we want, but also from being able to decide explicitly whether we want to see more or fewer posts from media outlets or from our friends, so that those decisions don't depend solely on the company's interests.