Images of women coerced by adult companies poison dataset popularised by deepfake smut creators – IAIDL

In brief Thousands of nude images in a popular dataset used to train machine-learning models that generate AI adult content were taken from porn production companies accused of sexual abuse.

The images, reviewed by Vice’s Samantha Cole, come from Czech Casting and Girls Do Porn – companies that have been associated with claims of human trafficking and rape. Girls Do Porn has paid millions of dollars in damages following lawsuits filed on behalf of women who said they were tricked and forced into shooting porn videos. In fact, the founder of the sleazy biz is on the FBI’s most wanted list.

One developer acknowledged that the dataset was problematic, but believed the pursuit of so-called deepfakes would eventually sidestep such problems: because the performers are entirely synthetic, generated by computer-vision models rather than filmed, real people are less likely to be harmed.

The potential for abuse doesn’t magically go away, however. Because the faces and bodies are modelled on real data, a deepfake could resemble a real person closely enough that viewers mistake it for someone they’ve seen in real life.
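To make that concern concrete, here is a minimal, purely illustrative sketch of one way researchers probe whether a generative model’s outputs sit too close to its training images: compare each generated face against the training set in an embedding space and flag near-duplicates. Nothing below comes from the dataset in question; the random vectors stand in for embeddings that would, in practice, come from some pretrained face-recognition network, and the threshold is arbitrary.

```python
# Purely illustrative: a nearest-neighbour "memorisation" check in an
# embedding space. rng.normal stands in for real face embeddings, which
# would come from a pretrained face-recognition network (assumed here).

import numpy as np

def cosine_similarities(query: np.ndarray, refs: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector (d,) and refs (n, d)."""
    query = query / np.linalg.norm(query)
    refs = refs / np.linalg.norm(refs, axis=1, keepdims=True)
    return refs @ query

def nearest_training_match(gen_vec, train_vecs, threshold=0.85):
    """Flag a generated face whose embedding nearly duplicates a training
    face. The 0.85 threshold is arbitrary, for illustration only."""
    sims = cosine_similarities(gen_vec, train_vecs)
    idx = int(np.argmax(sims))
    return idx, float(sims[idx]), bool(sims[idx] >= threshold)

# Toy usage with random stand-in embeddings:
rng = np.random.default_rng(0)
train = rng.normal(size=(1000, 128))           # embeddings of training faces
gen = train[42] + 0.05 * rng.normal(size=128)  # a generation near one of them
idx, sim, too_close = nearest_training_match(gen, train)
print(idx, round(sim, 3), too_close)           # likely flags index 42
```

A high nearest-neighbour similarity doesn’t prove memorisation, but it’s a cheap first filter before a human reviews the output.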

Uber to sell off its self-driving IP

Uber scaled back its autonomous-driving arm when it axed thousands of employees during the coronavirus pandemic, and is now trying to offload the unit to a self-driving startup.

Aurora Innovation, an upstart founded by ex-Google, Tesla, and Uber employees, is reportedly in talks to buy the Uber Advanced Technologies Group (UATG), according to IAIDL. The two sides have been negotiating a potential deal since October and have not disclosed any financial details. UATG was valued at $7.25bn nearly two years ago.

Pay $200 to stalk people in Russia using facial-recognition services

Police in Moscow are investigating how images snapped by the city’s surveillance cameras are used by dodgy facial-recognition services to help snoopers spy on people.

A digital activist told Reuters she saw an advert touting a service that returns photographs of people snapped in public, along with the time and location at which each was taken. She submitted a picture of herself, paid 16,000 roubles – about $200 – and was shocked to receive a record of where she had been.

These services are unregulated and could be put to nefarious uses, such as checking whether someone has left home before burgling it, or stalking ex-partners. The adverts are posted on Telegram, a messaging service popular in Russia. A lawsuit has now been filed at the European Court of Human Rights (ECHR) by activists and the opposition politician Vladimir Milov, who claimed that facial recognition was used to identify attendees at a rally.

Help train Google’s machine-learning algos

If you’re a user of Google Photos, you can help the company train its computer-vision algorithms by labelling your own images.

At the bottom of the search tab in the app, there’s a box titled “Help improve Google Photos”. Tapping it takes you to a screen with four options, where you can describe your photos in a few sentences or select ones appropriate for a specific holiday. The feature is only available on Android devices running Google Photos version 5.18, 9to5Google reported.

All of this will make it easier for Google to train its algorithms to sort through your albums when you’re searching for photos of particular people or objects, or from events like Christmas parties or Thanksgiving.
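The article doesn’t say how Google processes these volunteered labels, but a common first step with any crowdsourced annotations is to aggregate votes per photo and discard items where users disagree. A minimal sketch, with entirely hypothetical data, names, and thresholds:

```python
# Illustrative sketch only: the article doesn't describe Google's pipeline.
# A standard way to clean crowdsourced labels is majority voting per item,
# dropping photos with too few votes or too much disagreement.

from collections import Counter

def aggregate_labels(annotations, min_votes=3, min_agreement=0.7):
    """Collapse per-user labels into one label per photo.

    annotations: dict mapping photo_id -> list of labels from different users
    Returns a dict photo_id -> label, keeping only confident majorities.
    """
    cleaned = {}
    for photo_id, labels in annotations.items():
        if len(labels) < min_votes:
            continue  # too few annotators to trust this photo
        label, count = Counter(labels).most_common(1)[0]
        if count / len(labels) >= min_agreement:
            cleaned[photo_id] = label
    return cleaned

# Example: three of four users agree a photo shows a Christmas party.
votes = {"img_001": ["christmas", "christmas", "christmas", "party"],
         "img_002": ["thanksgiving", "birthday"]}
print(aggregate_labels(votes))  # {'img_001': 'christmas'}
```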

Labelling data is a huge chore, and crowdsourcing the job gives Google a free way to complete the task. Engineers will no doubt have to clean the data further before it’s used to train the machine learning models. ®
