#lensa — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #lensa, aggregated by home.social.
-
"#AI #art is leaking into the mainstream in the form of #stablediffusion and #Lensa, but there are serious #ethical concerns with this unregulated tech. I'm NOT anti-AI; in fact, I believe AI can be of immense benefit to us in the future. But the ethics of AI in its current state MUST be talked about, in order to steer this tech in the right direction."
-
portrait drawing made by #lensa which I'm impressed and happy with :coolmsn: even though my hair is tied back and you can't see my dreads, I think this is a nice vision of me in my male form :msn:
#lensa #ai #aiart #artificialintelligence #cyberpunk #transhumanism #facetattoo #foreheadtattoo #bodymod #cyborg #shamanism #psychedelic
-
Tried the #Lensa App for the first time, and I can see it struggles with certain facial features. I'm not sure if it's because the algorithm is trained on certain types of eyes and noses etc. (which is often the case), but half the pictures couldn't figure out my eye placement and looked strange in the end. Especially the fairy ones. #LensaApp #AI #AIArt
-
The #Lensa #AI app creates “magic avatars” that turn a #user’s #selfies into stylized works of #art. But for many women, the final results are highly #sexualized.
https://www.theguardian.com/us-news/2022/dec/09/lensa-ai-portraits-misogyny
#misogyny #artwork #ArtificialIntelligence #generativeAI #generativeArt via @TheGuardian
-
Stolen styles, plunging necklines, nudity: criticism of Lensa's AI avatars
For the past few days, the photo app Lensa has been generating "artistic" avatar photos from uploaded selfies. Criticism is mounting from several quarters.
#Avatar #KIBildgenerator #KünstlicheIntelligenz #Lensa #Sexismus #StableDiffusion #News
-
-
So I spoke to an NBC News reporter about AI art yesterday and gave my opinions on the whole thing, including the recent Lensa app, which uses Stable Diffusion. You can read the whole piece here:
https://www.nbcnews.com/tech/internet/lensa-ai-artist-controversy-ethics-privacy-rcna60242
#AIart #lensa #lensaai #lensaapp
-
CW: Thoughts about Lensa - nuanced
On the topic of art and artists
These AI tools do not work without the data, and that data is nonconsensually scraped from artists' work posted online. The artists have had their labor stolen to train an AI system that will make the company a lot of money, and they receive no compensation for it.
Will the AI replace artists? In some cases, yes. For example, some companies may not want to pay an artist to make advertising materials and will use these systems to generate images instead. But overall I do not think it will significantly impact the financial condition of artists; it is simply that a company is profiting off of theft. The company is a data broker, and the images they create are secondary to the data they have produced - stolen and coerced.
On the topic of face data
Is your face data a risk to you? Possibly. But the threat is more on the societal level. This data aggregation will result in tools that are better at creating deep fakes. These are fabricated visuals that make it seem like something happened when it didn't. Even though these images are dressed up in fantasy (drawn from the art that was scraped), the data itself can be used to produce more refined fakes of reality. It is also possible to fake art materials; for example, there could be new forms of revenge porn.
Does this make you bad for buying some pretty faces?
No. It is unusual that a company has managed to get real money from one of their data grifts. But ultimately the responsibility for the potential harms lies with these companies and with the state, which will use these systems for future disinformation and carceral projects.
This has been an opportunity for public education in data and AI literacy.
Another unexpected benefit of creating these images and playing with these tools is that it might help the average user get better at detecting deep fakes. There is a particular gestalt to these images that people may become better at picking up on in the future.
I have worked in 3D art and animation for years, and so I am still able to tell that something is off when I see a deep fake. It is possible that the more people play with AI tools, the more people will be able to notice that something is off and be skeptical of the media they are watching.
*Do not be a pedant about the difference between AI and machine learning; I am trying to do public outreach here.
#LENSA #MachineLearning #AIArt #FaceData #DataBrokerage #DataJustice #TechJustice #DeepFake