This feature apparently influences and "funnels" what information you regularly see on Twitter (mysteriously)...

Site Admin
Posts: 101
Joined: Fri Aug 18, 2017 9:03 pm


Post by circuitbored » Fri Jan 29, 2021 1:21 am

I did some experimentation over the past few years with Twitter, as it's been undeniably one of the most prolific platforms for news for a lot of people for many years.

I noticed that my posts really required hashtags to receive any engagement, and even then, my view counts rarely approached reasonable numbers given my follower count. Though I have hundreds of followers, posts without any tags often received only 1-2 views, no matter what time of day I posted them. Twitter is accessible worldwide, so it was never clear to me why readership should depend on the time of day at all.

Since a lot of Twitter's most prolific posters (ahem... POTUS45) have now been banned, the dynamic on Twitter is changing... Who knows exactly how it is changing; we just know that it is.

I've observed, purely by inspecting trending topics over time, that my personal timeline is trending away from a focus on politics and toward entertainment.

A funny thing, though: I'm now getting the strangest mix of posts. I get political headlines mixed in with music ads and posts about UK soccer teams, when I'm really not a fan of much of what I see on a regular basis. I also see posts concerning QAnon and conservative protest materials, as well as wild posts concerning Antifa protests and much more that I have no connection to whatsoever. The only way I can explain it all is that an "information bubble" exists for accounts on Twitter, probably based on an algorithm that tries to infer demographic data about my personal and political interests (without my knowing it is happening).
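To make the guess concrete: as a developer, I'd expect an "information bubble" to work something like the sketch below. This is purely illustrative, not Twitter's actual code; the function names, thresholds, and weights are all invented. The point it demonstrates is that weak engagement signals (a couple of views or likes) can be promoted into "interests" that then dominate ranking, which would explain the UK soccer posts I never asked for.

```python
# Hypothetical sketch of interest-based timeline filtering.
# Nothing here is Twitter's real code; all names and weights are invented.

from collections import Counter

def infer_interests(engagement_history, min_signals=2):
    """Guess a user's interests from topics they've merely viewed or liked."""
    counts = Counter(topic for post in engagement_history
                     for topic in post["topics"])
    # Even weak signals become "interests" once they cross a low threshold.
    return {topic for topic, n in counts.items() if n >= min_signals}

def rank_timeline(posts, inferred_interests):
    """Heavily boost posts matching inferred interests; bury everything else."""
    def score(post):
        overlap = len(set(post["topics"]) & inferred_interests)
        return overlap * 100 + post["likes"] * 0.1
    return sorted(posts, key=score, reverse=True)

# A user who glanced at a few soccer and politics posts...
history = [
    {"topics": ["uk-soccer"]}, {"topics": ["uk-soccer"]},
    {"topics": ["politics"]}, {"topics": ["politics", "music"]},
]
# ...now sees a low-engagement soccer post outrank a popular unrelated one.
posts = [
    {"id": 1, "topics": ["gardening"], "likes": 500},
    {"id": 2, "topics": ["uk-soccer"], "likes": 3},
]
interests = infer_interests(history)
ranked = rank_timeline(posts, interests)
print(interests)                      # inferred without the user opting in
print([p["id"] for p in ranked])     # the "interest" match wins the feed
```

Notice that the user never declared any interest: two glances at a topic were enough for the toy model to start funneling the feed, which is exactly the opaque behavior I'm complaining about.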

I looked into my post settings, and boy, they are a maze of confusion hidden under unintuitive titles!

I gave up after a while, searched an independent service for insight, and found this video in the results:

My settings showed a really long list of checkboxes for random topics I had no idea were being used as information filters on my account... Lord knows how they are applied, but when I saw "UK soccer" and many of the political categories I mentioned earlier, it began to click. Notice how the maker of the video also mentions that these topics re-populate frequently, so a user has to keep coming back to uncheck them? That suggests something deeper is at play in how these settings are applied to accounts. Twitter's account settings are a constantly changing, confusing "bowl of spaghetti" if you ask me...

There is no "Uncheck All" option at the top of this settings screen either, which means a user like me is that much more likely to give up in frustration while pruning this freaking huge (yet wildly inaccurate) list of interest filters by hand. Add this to my regular book of frustrations with technology: it is a bad way to help users see what is actually being posted on Twitter in real time. Users should be able to declare their own interests and disable information filters/bubbles entirely, so they can see everything and make their own decisions.

Settings like this generate mistrust in platforms, in my opinion. The minute a story breaks on an issue like this, the settings can easily disappear or be hidden by the platform, without any real amends or apology.

If information is secretly "filtered, limited, and tailored" behind the scenes on Twitter, it can lead to wide-scale misinformation, manipulation, and yes -- "fake news" becoming prevalent on the platform. We observed the very real impact of fake news during the US Capitol insurrection of '21, in case you've already forgotten that historic event.

Information bubbles and funneled news can create an echo chamber, where platform users only see information that confirms one point of view and never see the opposition. This is why, now more than ever, people can be motivated to do very negative things by mass persuasion that drives opportunistic profit for social media platforms. It is also why people can be elevated to "cult status": messages from a popular account are delivered the most, and they take precedence over small accounts that may have equally valid counterpoints. Populist leaders with harmful and wrongful ideals get more attention than ever, while some of the most valuable voices for creating unity and solving problems get discouraged and ignored by the algorithms, an effect that is amplified during a crisis. Why? Because views and engagement generate more profit for platforms than truth and solving large-scale problems do. That is a serious tragedy; it complicates and threatens our future every time a like button is clicked on a bad post.
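The amplification dynamic above can be shown in a few lines. Again, this is a toy model, not any platform's real ranker; the accounts and numbers are made up. It just shows that once a feed is sorted purely by reach and engagement, a huge account's hot take will always surface above a small account's careful counterpoint, no matter the relative merit of the content.

```python
# Illustrative only: a toy ranker that optimizes for engagement,
# the dynamic described above. All account data below is invented.

def engagement_score(post):
    # Reach compounds: big accounts earn more impressions, which earn
    # more likes, which earn more reach on the next post.
    return post["followers"] * 0.001 + post["likes"] * 2

posts = [
    {"author": "celebrity", "followers": 5_000_000, "likes": 40_000,
     "text": "hot take"},
    {"author": "expert", "followers": 800, "likes": 12,
     "text": "careful counter-argument"},
]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["author"] for p in feed])  # the celebrity always surfaces first
```

Nothing in `engagement_score` looks at accuracy or merit, which is the whole problem: a ranker like this structurally rewards whoever already has the audience.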

Don't get me wrong: negative and violent voices often get demoted and banned, rightfully so, but a great disservice happens on the less-visible side, creating inequality between user profiles even as rules exist to create equality in real life. It's no joke that we've found racial and gender discrimination to be very real within most social media platforms over the years since their development. Social media platforms cannot continue to go unchecked for discriminatory practices of this kind; otherwise, serious flaws get created and reinforced within our societies through social media, corrupting everything as a result. Many government and business entities, for example, use social media. If a platform already has user demographics, different ads can be delivered separately, carrying either toxic or beneficial information, to users of a specific demographic. For example, a store could issue a 10% discount coupon to platform users who are Black and a 20% coupon to users who are white, and no one would ever know it occurred, because of information filtering... Much worse things could happen if this is left unchecked.

Even if the First Amendment "doesn't apply" to corporate-owned platforms like Facebook and Twitter, they still hold a very important responsibility to host accurate data and act responsibly toward their user community, and that transcends staff limitations. Responsibility, fairness, accountability, and transparency are sorely missing from the operations of many businesses worldwide these days. If that continues, users will simply bail out, and tight government regulation will likely sweep in and make platform innovation a thing of the past, with developers tied up satisfying regulations. I warn you, platform developers: clean up your act and keep it that way before regulation makes doing so mandatory.

We all need to do a better job of understanding the consequences of the development and business decisions we make. In a responsible world, we don't run tests in production environments.

The lack of real "algorithm transparency" among social platforms dramatically reduces the trust we can place in service-related sites, and in software in general. All we have at our disposal are a few YouTube tutorials and educated guesses about how these sites and apps operate, made by developers like me who understand how such sites are built. Those guesses may be wrong, but at times they are terrifyingly right, and a forewarning of things to come. News manipulation is real; we should not take it lightly. This is why I now post most of my important points on non-social-media sites, so they don't get buried by algorithms. As you can see, they do far better for readership here anyway, and I don't have to jump through weekly hoops to be read, which saves me tons of time.

Would you buy a sandwich if the maker refused to tell you what ingredients it was made of? What if you were allergic to a few things and needed to make sure they weren't in it? That is modern-day social media... You never know how the platforms operate, and they may have harmful effects, precisely because they don't tell you how they work.

If apps and platforms continue to operate as secretly as they currently do, how can we trust the information and statistics they hide from us, and even more importantly, how can we trust them with our privacy?

We're always interested to hear your opinions on these discussion posts...

Send them to us at and we'll post your best responses...
