Following the takeover of TikTok’s U.S. operations by American investors, users reported content censorship, particularly around sensitive topics. TikTok attributed these issues to a system-wide failure caused by a power outage, but questions remain about intentional censorship and the platform’s capabilities. Tech journalist Jacob Ward points to the platform’s sophisticated built-in censorship mechanisms, originally developed in China, and observes that while the current issues may not be intentional, the potential for future manipulation by the new ownership is significant. Separately, TikTok settled a social media addiction lawsuit, a case that revealed the company had been aware of harms to children for years.
The landscape of online discourse appears to be shifting dramatically, with a noticeable downturn in the visibility of anti-Trump content on TikTok following significant algorithmic changes, reportedly influenced by Larry Ellison’s involvement. This development has sparked considerable concern, as it suggests a deliberate redirection of what users see, potentially shaping public opinion in ways that favor specific narratives. The very idea of algorithms controlling visibility is, for many, more unsettling than the content itself. Historically, oppressive regimes have leveraged censorship to suppress truth and manipulate populations, a tactic that seems to be resurfacing in the digital age through what is being termed “algorithmic indoctrination.”
The implications of this shift are profound. TikTok, like other social media spaces, had previously served as a platform for robust anti-Trump sentiment, but it is now exhibiting a different dynamic. While not every user will abandon the platform, those who remain risk increasing exposure to disinformation and narratives aligned with MAGA ideology. This mirrors concerns raised about platforms like Twitter and Facebook, which, for all their problems, remain powerful channels for disseminating information, misinformation included. The prospect of TikTok following a similar trajectory is viewed as a deeply troubling development.
The manipulation of what users encounter online goes beyond simply amplifying certain viewpoints; it actively silences others. The power wielded by those who control the algorithms is immense, as they can dictate what people see and, consequently, influence their beliefs. This raises questions about the legal classification of algorithms, with some arguing they should not be treated the same as editors at mainstream media outlets who make conscious editorial choices. The notion that algorithms are neutral is increasingly being challenged, especially when billionaires with significant influence are involved in their operation.
The potential for this algorithmic control to shape public perception is a recurring theme. It’s a concern that extends beyond just TikTok, with similar influences being observed on platforms like YouTube and Twitch. The consolidation of power in the hands of a few wealthy individuals and corporations controlling these digital spaces is seen as a dangerous extension of their influence, mirroring the impact they’ve historically had on traditional media like newspapers and television. The depth of data these entities possess on users makes them uniquely positioned for manipulation, potentially leading to the “real election tampering” that some fear.
Furthermore, the structure of these platforms themselves is implicated in the problem. Apps like TikTok are criticized for their tendency to erode attention spans and condense communication, a phenomenon reminiscent of Orwellian control. This simplification of interaction can make it harder for nuanced discussions to take root, as users may lack the sustained attention required to engage with complex issues. The fear is that this diminishment of cognitive capacity makes individuals more susceptible to manipulation, fostering a dependence on easily digestible, often superficial, content.
The shift on TikTok is not occurring in isolation. Similar patterns of content suppression and framing are being reported on platforms like Instagram and Threads. This suggests a coordinated effort or a shared trend across various social media spaces. The concern is that this algorithmic steering is not an organic development but a deliberate intervention. The idea that TikTok’s algorithm is now being controlled under a US-led deal, reportedly involving Larry Ellison, raises questions about the initial justifications for such a takeover, which often centered on concerns about Chinese censorship.
The current situation is seen by many as a betrayal of the principles of free speech. The argument that TikTok needed to be seized because of Chinese ownership, followed by this shift in content visibility, strikes critics as a projection of the very censorship concerns used to justify the takeover. That anti-Trump content is nosediving while other narratives gain prominence suggests the platform is now serving a different agenda. This has led to calls for boycotts of TikTok and other social media platforms, with users seeking calmer, less manipulative online environments.
The concern is that this represents a move towards a more controlled and potentially authoritarian digital landscape, with billionaires and corporations wielding unprecedented power over public discourse. The historical precedent of media conglomerates shaping public opinion serves as a cautionary tale, and the current technological advancements only amplify these dangers. The lack of effective antitrust enforcement is also cited as a contributing factor, allowing for the unchecked consolidation of influence. The current trajectory, where algorithms are subtly nudged by those in power, is viewed as a significant threat to democratic discourse and individual autonomy.
