#platform-accountability — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #platform-accountability, aggregated by home.social.
-
Facebook Algorithm Manipulation and the Dangerous Corporate Control of Speech and Reality
Facebook Algorithm Manipulation exposes how Meta’s opaque systems shape public opinion, suppress visibility, and monetize outrage for profit.
https://thedemocracyadvocate.com/news-to-know/tech-news/facebook-algorithm-manipulation/
-
YouTube’s Risk Assessments Are Not Publicly Testable
By Cliff Potts, CSO and Editor-in-Chief of WPS News
Baybay City, Leyte, Philippines — April 26, 2026
Reporting
Under the Digital Services Act (DSA), very large online platforms are required to conduct regular risk assessments addressing systemic harms, including the amplification of illegal content, threats to civic discourse, and impacts on fundamental rights. YouTube has stated that it complies with these obligations through internal evaluations and mitigation plans submitted to EU authorities.
What remains unavailable is the evidence needed to independently test those claims.
Public disclosures summarize conclusions but not methods. They describe risks in general terms without detailing assumptions, metrics, or counterfactuals. External researchers, journalists, and civil-society groups are asked to trust that assessments are rigorous while being denied access to the data that would allow verification.
In effect, YouTube reports that it has assessed risk—without showing how.
Analysis
A risk assessment that cannot be tested is a corporate assertion, not an accountability mechanism.
Meaningful oversight requires more than assurances. It requires visibility into the indicators used, the thresholds applied, and the trade-offs accepted. Without this information, regulators cannot determine whether mitigation measures address root causes or merely manage appearances.
This opacity reflects incentives shaped at the parent level. Google has long resisted external auditing of its core systems, citing security and proprietary concerns. While some confidentiality is legitimate, blanket opacity prevents independent scrutiny of claims that directly affect public life.
The result is a one-sided process: platforms define risk, evaluate themselves, and report outcomes in summary form. EU oversight is left to review conclusions rather than interrogate evidence.
What Remains Unclear
YouTube does not disclose the specific metrics used to assess systemic risk within EU member states, nor how those metrics vary by language, topic, or election cycle. It also does not publish the results of stress tests showing how changes to recommendations or monetization would alter risk profiles.
Without access to these details, neither regulators nor the public can judge whether risk mitigation is proportionate or effective.
Why This Matters
The DSA was designed to move beyond trust-based governance. Its purpose is to replace assurances with evidence. When platforms provide only summaries, that purpose is undermined.
If risk assessments remain shielded from independent evaluation, enforcement becomes reactive rather than preventive. Harm is identified after it spreads, not before it is amplified.
For EU regulators, the question is straightforward: can a system built on self-assessment deliver public accountability? Until YouTube’s risk evaluations are open to meaningful testing, that question remains unanswered.
References (APA)
European Commission. (2024). Digital Services Act: Systemic risk assessment and mitigation obligations.
European Digital Rights (EDRi). (2023). Platform risk assessments and the limits of self-reporting.
Pasquale, F. (2020). New laws of robotics: Defending human expertise in the age of AI. Harvard University Press.
#algorithms #DigitalServicesAct #Google #platformAccountability #riskAssessment #YouTube
-
YouTube Denies Downranking While Practicing It
By Cliff Potts, CSO and Editor-in-Chief of WPS News
Baybay City, Leyte, Philippines — April 12, 2026
Reporting
For years, YouTube has rejected claims that it “shadow bans” content or creators. In public statements and responses to European regulators, the platform maintains that videos are either available or removed, and that reduced reach reflects user choice rather than platform intervention.
EU creators and researchers describe a different pattern.
Videos that remain publicly accessible frequently experience sudden and sustained drops in impressions, recommendations, and search visibility without notice or policy citation. These declines often coincide with topical sensitivity, political relevance, or advertiser concern. Creators receive no formal enforcement notice, no appeal option, and no explanation.
Because the content is not removed, these actions fall outside the procedural safeguards that apply to takedowns. From the user’s perspective, the video exists. From the platform’s perspective, it effectively disappears.
Analysis
Downranking is enforcement without accountability.
By reducing visibility rather than removing content, YouTube avoids triggering disclosure and redress obligations while still shaping information flows. The company’s insistence that recommendation systems merely reflect audience interest obscures the reality that distribution is an editorial decision embedded in code.
This approach is consistent with incentives set at the parent-company level. Google derives revenue from advertiser confidence and risk minimization. Downranking allows the platform to limit exposure to controversial or inconvenient material without attracting public scrutiny.
From a regulatory standpoint, this creates a blind spot. EU frameworks focus heavily on content removal, yet visibility controls can have equal or greater impact on public discourse. A video that cannot be found, recommended, or surfaced may as well not exist.
What Remains Unclear
YouTube does not disclose when or why content is downranked within the EU. It does not provide creators with visibility metrics tied to policy triggers, nor does it allow independent auditors to assess how recommendation changes affect reach over time.
Without transparency, it is impossible to distinguish between organic audience behavior and deliberate suppression.
Why This Matters
If platforms can quietly reduce the reach of lawful content without notice, explanation, or appeal, then formal safeguards offer limited protection. Enforcement shifts from visible actions to invisible controls.
For EU regulators, the question is not whether YouTube uses the term “shadow banning.” It is whether undisclosed visibility restrictions are compatible with the Union’s goals of transparency, accountability, and equal treatment.
As long as downranking remains unacknowledged and unregulated, a significant portion of platform power operates outside effective oversight.
References (APA)
European Commission. (2024). Digital Services Act: Systemic risk mitigation and recommender systems.
AlgorithmWatch. (2023). Auditing platform recommendation and ranking practices.
Gillespie, T. (2020). Content moderation, AI, and hidden governance. Social Media + Society.
#algorithms #DigitalServicesAct #downranking #Google #platformAccountability #YouTube
-
Insta tried #platformaccountability by lifting the PG-13™️ — a rating with nearly 60 years of built-in trust — without authorisation. No wonder governments are banning social media when the industry won't operate legally. www.reuters.com/business/met... #copyright #audiovisual #law #film #television
Meta to limit PG-13 rating use...
-
Meta says it removed millions of scam ads after lawmakers called for an investigation into whether it profits from fraudulent content
#Meta #Scams #PlatformAccountability #BigTech -
YouTube’s Latest Insult: Locking Me Out of My Own Channels by Deleting My Manager Accounts
Well, if you thought this situation couldn’t get any worse, YouTube proved me wrong. At first, I thought they deleted both of my channels — jaimedavid327 (author) and luffymonkey0327 (meme/mashup) — but it’s even worse than that. No, my content channels aren’t gone. They’re still up. But YouTube did something even more frustrating: they deleted my manager accounts, effectively locking me out of both channels. Let me clarify — my content is still on YouTube. My channels are […] -
The Digital Leviathan: Inside the BJP IT Cell’s Architecture of Consent, Coercion, and Control
https://onceinabluemoon2021.in/2025/12/19/the-digital-leviathan-inside-the-bjp-it-cells-architecture-of-consent-coercion-and-control/
#DigitalLeviathan #Misinformation #Disinformation #Deepfakes #AIPropaganda #InformationWarfare #SurveillanceState #FreeSpeech #MediaLiteracy #ProtectDemocracy #PlatformAccountability #HumanRights_Infringed_India #Seize_Cronies_Fairplay_for_DHFL_Victims #Alleged_Dawood_Mirchi_Rkw_Dhfl_Bjp_Collusion
-
Futurism found that Elon Musk’s Grok chatbot is actively doxxing everyday people, handing over real, current home addresses, phone numbers, emails, even family members’ info with almost no prompting.
https://futurism.com/artificial-intelligence/grok-doxxing
#privacy #doxxing #aiharms #platformaccountability #surveillance
1/3 -
It’s a fascinating, uneasy read: internal watchdogs inside a 2,000+ person, $300B-ish AI company, with limited visibility into how their models actually shape real-world behavior.
#platformaccountability #surveillance #aigovernance #ethicalai
2/2 -
OpenAI's credit expires after a year, regardless of any notification or proper acknowledgment. It's displayed as "balance" like your bank account balance, but it's actually not, and it expires faster than some items in your kitchen.
People who shared the same frustration:
https://community.openai.com/t/credit-expired-without-notification/1256598
#OpenAI #AICommunity #ConsumerRights #PlatformAccountability
-
What’s framed as a “professional network” is often just an online extension of our worst professional realities…white supremacy and anti-Blackness, coded to scale.
#AntiRacism #WorkplaceEquity #WhiteSupremacyCulture #PlatformAccountability #RacialEquity #DigitalSpaces #Professionalism #BlackWomenAtWork #EquityInAction #TechAccountability #BiasInTech #SystemicRacism #JusticeOverComfort #ProfitWithoutOppression #KimCrayton
-
In a bellwether trial, Uber was cleared in a sexual assault case, though the jury acknowledged company negligence. This ruling has sparked debate among legal experts and victim advocates about how tech companies are held accountable for safety. With more than 558,000 reports of sexual assault/misconduct on Uber trips from 2017 to 2024, is the 'passenger uses service at their own risk' stance sustainable?
#UberSafety #TechEthics #Lawsuit #PlatformAccountability #GigWorker
https://www.engadget.com/transportation/uber-found-not-guilty-in-first-of-many-sexual-assault-lawsuits-133046712.html?src=rss -
#DSA #DMA #PlatformAccountability
The European Commission is investigating Pornhub, Stripchat, XNXX, and XVideos for potential child safety DSA violations as a priority. https://www.wsj.com/tech/eu-investigates-major-porn-sites-over-child-safety-9158c118?st=WMnhnz&reflink=desktopwebshare_permalink -
Time for action! 💪🏾
Today, we filed a formal #DSA complaint against X in Ireland, together with our Romanian member @apti.
We show how X misleads #TrustedFlaggers to use a wrong, non-functional online form in all EU languages but English.
Reminder: A few weeks ago we asked #X to rectify their faulty online forms, but while they thanked us for the info they didn't actually do anything about it 🤷🏾♀️
Read more here ⤵️ https://edri.org/our-work/edri-files-dsa-legal-complaint-against-x/
-
Yep – as tech leaders gain influence in politics, it's crucial to remember that innovation needs checks & balances. Power in the hands of a few isn’t progress; it’s a step backward.
#platformaccountability #TechAndPower #ResponsibleInnovation #DigitalDemocracy https://www.theguardian.com/commentisfree/2024/nov/11/a-new-era-dawns-americas-tech-bros-now-strut-their-stuff-in-the-corridors-of-power
-
Indeed, show a child a photo of a giraffe, and after three views, they will recognize it. Show AI a photo of a giraffe, and it'll need 10,000 copyrighted images to do so - #PlatformAccountability
---
RT @Techmeme
Italy's privacy regulator temporarily bans ChatGPT and will probe OpenAI, claiming the company lacks a basis for "mass collection and storage of personal data" (@clothildegouj / Politico) https://www.politico.eu/article/italian-…
https://twitter.com/Techmeme/status/1641759960521400320 -
Marking one's homework was never going to be a great idea. Time is nigh for #platformaccountability
---
RT @Moonalice
The @OversightBoard has been PR scam from the start, providing air cover to Meta as it undermined democracy, public health, and public safety. The tweet below is a perfect example. Fact: there is no effective accountability for internet platforms. https://twitter.com/oversightboard/status/1641510271649734673
https://twitter.com/Moonalice/status/1641538880540377090 -
What a horrid battle - gun maker v cat videos.
"Norwegian group Nammo blames ‘storage of cat videos’ for threatening its growth as data centre corners spare electricity" https://www.ft.com/content/f85aa254-d453-4542-a50e-fa1171971ab0?shareType=nongift #tiktok #GreenEnergy #netzero #platformaccountability
-
🔘 "You’re the creator, they’re the economy" @CreativeFuture #FreeIsNotFair #PlatformAccountability
---
RT @CreativeFuture
Another reminder that when Big Tech types use phrases like “creator economy,” this is what they’re referring to. You’re the creator, they’re the economy💸. #TikTok
https://www-hollywoodreporter-com.cdn.ampproject.org/c/s/www.hollywoodreporter.com/business/digital/tiktok-creator-fund-sxsw-1235353771/amp/
https://twitter.com/CreativeFuture/status/1639011005283049473 -
The TikTok wars – why the US and China are feuding over the app (meanwhile UK govt bans app from official devices) https://www.theguardian.com/technology/2023/mar/16/the-tiktok-wars-why-the-us-and-china-are-feuding-over-the-app #tiktok #onlinesafetybill #digitalservicesact #platformlaw #platformaccountability #regulations
-
WhatsApp has agreed to be more transparent about changes to its privacy policy introduced in 2021 #meta #DMA #privacy #eudatap #data #platformaccountability https://ec.europa.eu/commission/presscorner/detail/en/IP_23_1302
-
Full text and video here https://www.netopia.eu/netopia-spotlight-professor-eleonora-rosati/
---
RT @NetopiaEU
"How can law influence the owners of Platforms?"
Netopia Spotlight Part 3 w/ Prof Eleonora Rosati https://www.netopia.eu/netopia-spotlight-professor-eleonora-rosati/ #copyright #article17 #platformaccountability 📺⤵️
https://twitter.com/NetopiaEU/status/1622961880875642883 -
Twitter sued by Crown Estate (aka King Charles III) for unpaid rent at London offices
https://www.bbc.co.uk/news/uk-64381582 #Twitter @[email protected] #platformaccountability #Monarchy #PiccadillyCircus