AI Art Is Powered by ISIS Executions and Non-Consensual Porn
To create the powerful AI art generators, engineers are training programs to soak up as much data as possible without questioning where it’s coming from.
AI art has gotten wildly popular over the past year. Programs like Midjourney and DALL-E are generating incredible images and incredible controversy. But these programs don't exist in a vacuum. AIs require billions of images to learn how and what to draw. Where are they getting those pictures? They're hoovering them up from the internet, a place full of child porn, ISIS execution videos, and non-consensual adult images. With AI, it's all garbage in, garbage out. So who controls this data, and is there anything we can do about it?
On this episode of Cyber, Motherboard writer Chloe Xiang walks us through the ins and outs of the AI trained on ISIS execution images.