home.social

#sarashah — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #sarashah, aggregated by home.social.

  1. "It is just much harder for a volunteer-run, distributed system to roll out protections like E2EE than a centralized company."

    #AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023

    cyber.fsi.stanford.edu/io/news

    Explain the logic underlying that conclusion. Counterexample: the Matrix network, a distributed system, much of which is volunteer-run.

    #E2EE #decentralisation

  2. "Mastodon users probably aren’t aware of CSAM on the platform unless it leaks into their federated timelines. This can happen when a fellow user on their instance follows an account posting CSAM. Ways to handle this problem are few. Though users who follow CSAM-disseminating accounts can be suspended from an instance by administrators, they can easily set up a new account on another..."

    #AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023

    cyber.fsi.stanford.edu/io/news

    #CSAM

    (1/2)

  3. "While large platforms with robust trust & safety teams are able to be more discerning in their moderation..."

    #AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023

    cyber.fsi.stanford.edu/io/news

    Are they though?

    Centralised moderation teams often lack the context to know what they're looking at. Fediverse admins each take care of a small, well-defined bit of overall moderation: the bit that affects accounts on their server. They know what's acceptable in their community.

    (1/3)

    #moderation