#sarashah — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #sarashah, aggregated by home.social.
-
"It is just much harder for a volunteer-run, distributed system to roll out protections like E2EE than a centralized company."
#AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023
https://cyber.fsi.stanford.edu/io/news/common-abuses-mastodon-primer
Explain the logic underlying that conclusion. Counterexample: the Matrix network, a distributed system, much of which is volunteer-run, that has nonetheless rolled out E2EE.
-
"Mastodon users probably aren’t aware of CSAM on the platform unless it leaks into their federated timelines. This can happen when a fellow user on their instance follows an account posting CSAM. Ways to handle this problem are few. Though users who follow CSAM-disseminating accounts can be suspended from an instance by administrators, they can easily set up a new account on another..."
#AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023
https://cyber.fsi.stanford.edu/io/news/common-abuses-mastodon-primer
(1/2)
-
"While large platforms with robust trust & safety teams are able to be more discerning in their moderation..."
#AlexStamos, #SaraShah, #StanfordInternetObservatory, 2023
https://cyber.fsi.stanford.edu/io/news/common-abuses-mastodon-primer
Are they though?
Centralised moderation teams often lack the context to know what they're looking at. Fediverse admins each handle a small, well-defined slice of overall moderation: the part that affects accounts on their own server. They know what's acceptable in their community.
(1/3)