The camera lies

If you think we’ve got problems with fake news now, wait until deepfakes go mainstream.

The Guardian:

Show a neural network enough examples of faces from two celebrities and it’ll develop its own mental model of what they look like, capable of generating new faces with specific expressions.

Ask it to generate a set of expressions on one face that are mapped onto a second face, and you have the beginnings of a convincing, automatically generated, utterly fake video. And so, naturally, the internet created a lot of porn.
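(For the technically curious: the trick the article is describing is, roughly, a pair of autoencoders that share a single encoder, with one decoder per face. Because both people’s faces pass through the same encoder, it ends up capturing pose and expression rather than identity, so feeding person A’s frames through person B’s decoder produces B’s face wearing A’s expression. The sketch below is illustrative PyTorch with made-up layer sizes, not any particular tool’s code.)

    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        """Compresses a 64x64 face crop into a shared, person-agnostic code."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
                nn.Flatten(),
                nn.Linear(128 * 8 * 8, latent_dim),
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        """Rebuilds one specific person's face from the shared code."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
            )

        def forward(self, z):
            return self.net(self.fc(z).view(-1, 128, 8, 8))

    encoder = Encoder()
    decoder_a = Decoder()  # trained only to reconstruct person A
    decoder_b = Decoder()  # trained only to reconstruct person B

    # Training step (sketch): each decoder reconstructs its own person,
    # but both share the encoder, which therefore has to learn pose and
    # expression rather than identity.
    faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real aligned face crops of A
    loss_a = nn.functional.l1_loss(decoder_a(encoder(faces_a)), faces_a)

    # The swap: encode A's expression, decode it with B's decoder.
    # Result: person B's face wearing person A's expression.
    swapped_frames = decoder_b(encoder(faces_a))

In practice there is a lot more on top of this (face detection, alignment, and blending the swapped face back into each video frame), but the core of it really is that simple.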

I haven’t seen the porn – I have no interest in seeing videos created without people’s consent – but I have seen what the technology can do in the hands of ethical people.

This is absolutely stunning: Sven Charleer replaces actors with his wife.

Beyond just pure fun, I can only imagine how people will start turning this tech into business ideas. Fashion will be huge (what would I look like with this kind of hair, this kind of dress…), fitness could be interesting (do I look good with muscles, will I really look better skinny), travel (“this is you standing on a beach” is going to be quite convincing). It’ll bring advertising to a whole new level. No need to imagine “what if”, they’ll tell you what your “better” life will look like! And it’ll be hard to get that picture out of your head…

This technology is in its infancy, but it’s getting smarter by the day. And the potential ramifications for everything from revenge porn to political propaganda are enormous and disturbing.

Back to The Guardian:

It’s grim. But it’s not going to go away. The technology is publicly available, extensively documented, and the subject of research around the globe. This is our world now. As Lucas warned MPs: “Please don’t spend too much time looking in the mirror at what Russia did to us; look through the windscreen at what’s coming down the road. That’s much more dangerous.”

