#themarkup — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #themarkup, aggregated by home.social.
-
An Essential Guide For Mindful Posting by #TheMarkUp
https://themarkup.org/hello-world/2024/02/17/an-essential-guide-for-mindful-posting
References the 2023 book “Share Better and Stress Less: A Guide to Thinking Ecologically About Social Media” by Whitney Phillips and Ryan Milner:
https://www.amazon.com/Share-Better-Stress-Less-Ecologically-ebook/dp/B0B96RH4X9
#MediaLit #MediaLiteracy #edtechSR #edtech #SocialMedia #Instagram #TechCorrection
-
Fascinating article on a collaboration between #AmazonRing cameras and police and fire departments across the U.S. This is a grant-funded collaborative research piece between #TheMarkup, #AfroLA, and #LATimes.
My personal takeaway is that it really highlights the complexity of digital #privacy, #security, and #infosec. As an amateur #infosec hobbyist, the negative impacts of Ring on communities and society are glaringly obvious: widening the divide between more and less privileged classes in local communities, promoting fear, and further skewing the role of police toward protecting only privileged classes. This is also deeply intertwined with #cambridgeanalytica and related current scandals, the role of #bigdata in #uspol, and so much more. Plus, throw in all the challenges with #AI.
We can't bury our heads in the sand on this, but it's such a complicated, insidious, and nefarious topic ... it's really hard to communicate to people. Just like climate change.
-
> ... large-scale AI models are indeed big water consumers. For example, training GPT‑3 in Microsoft’s state-of-the-art U.S. data centers can directly consume 700,000 liters of clean freshwater (enough to produce 370 BMW cars or 320 Tesla electric vehicles), and the water consumption would have been tripled if training were done in Microsoft’s data centers in Asia. These numbers do not include the off-site water footprint associated with electricity generation.
> ChatGPT needs a 500-ml bottle of water for a short conversation of roughly 20 to 50 questions and answers, depending on when and where the model is deployed. Given ChatGPT’s huge user base, the total water footprint for inference can be enormous.
> ... if we only consider carbon footprint reduction (say, by scheduling more AI training around noon), we’ll likely end up with higher water consumption, which is not truly sustainable for AI.
> ... the vast majority of data centers still use potable water and cooling towers. For example, even tech giants such as Google heavily rely on cooling towers and consume billions of liters of potable water each year. Such huge water consumption has produced a stress on the local water infrastructure; Google’s data center used more than a quarter of all the water in The Dalles, Ore.
> ... some AI conferences have requested that authors declare their AI models’ carbon footprint in their papers; we believe that with transparency and awareness, authors can also declare their AI models’ water footprint as part of the environmental impact.
- The Markup: Water Footprint of AI Technology
- A conversation with Shaolei Ren and Nabiha Syed
#TheMarkup #NabihaSyed #ShaoleiRen #AISalami #ChatGPT #CarbonFootprint #WaterFootprint #California #Oregon #DallesOregon #Virginia #DataCenterCapital #VirginiaLoudon #LoudonCounty
-
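The figures quoted in the interview above can be sanity-checked with simple arithmetic. A minimal sketch, using only the numbers quoted in the post (500 ml per 20-50 question/answer exchanges, 700,000 liters for one GPT-3 training run):

```python
# Back-of-the-envelope water-footprint arithmetic from the quoted figures.
# All inputs come from the interview excerpt above; nothing else is assumed.
BOTTLE_ML = 500            # ml of water per short conversation
EXCHANGES_LOW = 20         # fewest Q&A exchanges per bottle
EXCHANGES_HIGH = 50        # most Q&A exchanges per bottle
TRAINING_LITERS = 700_000  # direct freshwater use for one GPT-3 training run

# Water per single question/answer exchange, as a range in ml
per_exchange_high_ml = BOTTLE_ML / EXCHANGES_LOW   # fewer exchanges -> more water each
per_exchange_low_ml = BOTTLE_ML / EXCHANGES_HIGH

# Express the training run in 500-ml bottles for scale
training_bottles = TRAINING_LITERS * 1000 / BOTTLE_ML

print(f"per exchange: {per_exchange_low_ml:.0f}-{per_exchange_high_ml:.0f} ml")
print(f"training run: {training_bottles:,.0f} bottles")
```

So each exchange costs roughly 10-25 ml of water, and the quoted training run is equivalent to 1.4 million half-liter bottles.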
Ousted founder Julia Angwin returns to The Markup - The yet-to-launch tech journalism site The Markup has had a bumpy 2019 — co-founder and editor-in-chief ... more: http://feedproxy.google.com/~r/Techcrunch/~3/jE1kchRWW0Y/ #juliaangwin #themarkup #startups #media