home.social

#data-ethics — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #data-ethics, aggregated by home.social.

  1. Public sector AI governance gets serious the moment you ask who explains the decision to a citizen, an auditor, and a judge. If nobody can, the control isn't mature yet. #GovAI #DataEthics #DigitalGovernment

  2. I could have raised money. I said no.

    Not because I'm a martyr. Because I understand the mechanics: an investor in a privacy business creates structural pressure toward monetizing data. Not out of malice. By design of the model.

    So we stay bootstrapped. Our users are our clients, not our product.

    wiggwigg.ca/en/about-us/

    #Bootstrapping #Privacy #DataEthics

  3. Episode 23 tackles a complex issue: the mapping capabilities of smart vacuums.

    These devices aren’t just cleaning; they’re creating detailed spatial data about your home.

    We’re examining the potential for misuse, the data being collected, and the lack of transparency. It raises serious questions about surveillance and control within the home. Dive in: impracticalprivacy.com

    What regulations (if any) do you think are needed for this technology?
    #privacy #smarthome #dataethics #technology

  4. the tech world is noisy right now. the real story is in the intersection.

    data, customer experience, AI, and impact on the next generation aren't separate conversations. they're one.

    → AI scales whatever it learns. including the flaws.
    → ethics and cultural context aren't add-ons.
    → thoughtful beats fast. fast usually means redoing it.

    what questions are you asking at the crossroads?

    #DataEthics #ResponsibleAI #AIEthics #AILiteracy

  5. AI procurement question number one: what exactly is the human override when the model is wrong, biased, or legally awkward? If the answer is fuzzy, the governance isn't real yet. #GovAI #DataEthics #PublicSector

  6. AI in public services needs more than a model and a waiver. Ask who explains the outcome to a citizen, an auditor, and a judge. If the answer is fuzzy, the control is too. #GovAI #DataEthics #PublicSector

  7. the noise in these tech streets is LOUD right now.

    but here's the thing, data, CX, AI, and the next generation? that's all one conversation.

    data without ethics is surveillance.
    CX without caution is manipulation.
    AI without cultural and historical context is dangerous.

    none of that matters if we're not holding space for both the promise AND the risk.
    not trying to chase every hot take. trying to ask better questions.

    thoughtful beats fast.

    y'all got thoughts? 🫖
    #DataEthics #ResponsibleAI

  8. There’s a difference between being known… and being reduced to what can be known.

    Profiles don’t just describe us. They stabilise us. They turn movement into pattern, possibility into probability.

    And once that version exists, systems begin to trust it more than the person.

    It is consistent. Predictable. Actionable.

    You are not.

    associationredefine.substack.c

    #DigitalRights #DataProtection #HumanDignity #DataEthics #TechAndSociety #AIethics #PrivacyMatters #SystemsThinking #Democracy

  9. You might consent to your data being used to prevent societal harm, but who decides where that line is drawn? 🤔⚖️

    Aram Sinnreich & Jesse Gilbert explore the hidden ethics of data collection, facial recognition, and algorithmic decision making in THE SECRET LIFE OF DATA on the Future Knowledge #podcast, with Laura DeNardis. 🔍

    🎧 Listen & subscribe ⬇️
    futureknowledge.transistor.fm/

    #Consent #DataEthics #Privacy #AI @aram @jesse #Bookstodon

  10. 🥳🎉🎈 Oh wow, a groundbreaking revelation: sending an #email to opt-out of being spied on by a service you voluntarily signed up for! 🕵️‍♂️🤦‍♂️ Next time, maybe just avoid surveillance-friendly apps altogether, genius. 📱💡
    honeypot.net/2026/04/14/i-wrot #optout #surveillance #privacy #technews #dataethics #HackerNews #ngated

  11. EOSC just released recommendations on long-term data retention:
    doi.org/10.5281/zenodo.19232470

    Glad to see our work (doi.org/10.1038/s41597-025-047) on extending CARE principles cited in the context of ethical data governance for vulnerable populations.

    #OpenScience #DataEthics #EOSC #ResearchData

  12. Who gets to decide how your data is used, especially when you never gave informed consent?

    Aram Sinnreich & Jesse Gilbert explore the ethical gray areas of data use, from facial recognition to unseen algorithmic decisions, in THE SECRET LIFE OF DATA on the Future Knowledge #podcast, in conversation with Laura DeNardis.

    🎧 Listen & subscribe ⬇️
    futureknowledge.transistor.fm/

    #Consent #DataEthics #Privacy #AI @aram @jesse #Bookstodon
