Deepfakes in Realtime with a Five-Day Trained Deep Neural Network

August 19, 2019

Deepfakes are fake images or videos generated by Artificial Intelligence. The media produced by deep neural networks can seem deceptively real. In the most common case, faces are swapped, creating the illusion of seeing a person in a video who was not originally there.

Software developers at TNG Technology Consulting GmbH, an IT consulting company based in Unterföhring near Munich, have developed software that generates deepfakes in realtime. This makes it possible to alter live camera footage in such a way that the face in front of the camera is swapped with that of another person.

The facial expressions of a person standing in front of the camera are recognized and transferred onto the faces of well-known people such as Angela Merkel, Barack Obama or Elon Musk. In the final video stream, the software first removes the input face completely and then replaces it with the deepfake. The prototype presented by TNG is a kind of mirror in which you recognize your own facial expressions, but not your own face.

The basis for this is Artificial Intelligence. Using various computer vision and neural network techniques, faces are detected in the video input, transformed and integrated back into the video output. This makes it possible to project deceptively real imitations onto other people.

Auto-encoder networks were trained by means of so-called GANs (Generative Adversarial Networks). In addition, the researchers used various other neural networks for facial recognition and facial segmentation.
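To make the underlying idea more concrete, here is a minimal, simplified sketch (not TNG's actual code) of the shared-encoder/two-decoder auto-encoder approach commonly used for face swapping. It assumes PyTorch and 64x64 RGB face crops; the network sizes, training loop and dummy inputs are purely illustrative, and the GAN, face detection and segmentation components mentioned above are omitted.

```python
# Minimal sketch of the shared-encoder / two-decoder auto-encoder idea behind
# face swapping, assuming PyTorch and 64x64 RGB face crops (sizes are illustrative).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a face crop into a small latent vector (pose + expression)."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A (e.g. the camera user)
decoder_b = Decoder()  # trained only on faces of person B (e.g. the target celebrity)

# Training minimizes per-identity reconstruction loss (GAN losses can be added on top).
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(batch_a, batch_b):
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(batch_a)), batch_a) + \
           loss_fn(decoder_b(encoder(batch_b)), batch_b)
    loss.backward()
    opt.step()
    return loss.item()

# One training step with dummy batches of face crops.
print("reconstruction loss:", train_step(torch.rand(8, 3, 64, 64), torch.rand(8, 3, 64, 64)))

# The swap: encode person A's expression, decode it with person B's decoder.
with torch.no_grad():
    fake_b = decoder_b(encoder(torch.rand(1, 3, 64, 64)))  # dummy input crop
```

Because both identities share one encoder, the latent code captures pose and expression, while each decoder renders them in its own identity, which is what makes the swap possible.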


Now in Realtime – Deepfakes 2.0 published by TNG

June 19, 2019

It has been quiet on our blog for a while. The reason is that we have been working hard on awesome new showcases. We defined a mission for our team:

“Make deepfakes work in realtime on video streams”

Deepfake is a technique for human face synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network (GAN).

Long story short – We did it!

The hardware hacking team of TNG Technology Consulting GmbH, an IT consulting company based in Unterföhring near Munich, wrote new software based on DeepFaceLab.

The result is software that creates deepfakes in realtime, instead of post-processing recorded video sequences as in the original understanding of deepfakes.

A person is filmed by an RGB webcam and their face is replaced by someone else's face, e.g. Barack Obama's, with the corresponding facial expression.
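To illustrate how such a per-frame pipeline can be structured, here is a minimal sketch. It is not the DeepFaceLab-based implementation described above; it assumes OpenCV for webcam capture and Haar-cascade face detection, and swap_face() is a hypothetical placeholder standing in for the trained face-swap model.

```python
# Minimal sketch of a realtime per-frame loop (not the DeepFaceLab-based code),
# assuming OpenCV for capture/detection and a placeholder swap_face() that would
# wrap a trained face-swap model such as the auto-encoder sketched above.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def swap_face(face_crop):
    # Placeholder: run the trained model on the crop and return the swapped face.
    return face_crop

cap = cv2.VideoCapture(0)  # RGB webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64))
        swapped = cv2.resize(swap_face(crop), (w, h))
        frame[y:y + h, x:x + w] = swapped  # paste the generated face back in
    cv2.imshow("Deepfakes 2.0 (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```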

At this point in time we only want to show you a short sequence of our Deepfakes 2.0 implementation, as our neural networks are still training! Stay tuned!

A new conference talk was also written for Deepfakes 2.0. It premiered at Big Techday 2019 in Munich, Germany.

If you are interested in Deepfakes 2.0 and want to invite us to give a talk on it, don’t hesitate to contact us!


Parrots On Devoxx Belgium with Realtime Style Transfer AI

December 4, 2018


Parrots On JavaZone – again!

September 25, 2018

If you want to enjoy our talk about Augmented Reality and an app that enables you to have “Star Wars”-like telephone calls, check out the video footage made by the great guys at JavaZone Oslo! JavaZone is the largest community-driven conference and is always worth a visit!


Stereoscopic Realtime Style Transfer – A Deep Learning Showcase

August 20, 2018

We are working hard on new showcases and talks about Deep Learning and neural networks. This is the reason why there aren’t many new posts on this page.

Please enjoy our “elevator pitch” about our new Stereoscopic Realtime Style Transfer showcase based on the Dell Visor Mixed Reality headset. We were able to train our neural network on the style of A-ha’s music video “Take On Me”.

Furthermore, we wrote a technical article for JavaPRO. You can order an issue for free at https://magazin.java-pro.de/
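As a rough illustration of the stereoscopic setup, the sketch below runs each eye’s camera frame through a feed-forward style-transfer network. The style_net placeholder, the dummy frames and the frame sizes are assumptions for illustration; in the actual showcase a network trained on the “Take On Me” style would take their place.

```python
# Minimal sketch of stereoscopic realtime style transfer inference, assuming a
# feed-forward style network already trained on the target look. style_net, the
# dummy frames and their sizes are placeholders, not the showcase's actual code.
import numpy as np
import torch
import torch.nn as nn

# Placeholder for the trained feed-forward style-transfer network.
style_net = nn.Identity()

def stylize(frame_bgr: np.ndarray) -> np.ndarray:
    """Run one camera frame (H x W x 3, uint8) through the style network."""
    x = torch.from_numpy(frame_bgr).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        y = style_net(x).clamp(0, 1)
    return (y.squeeze(0).permute(1, 2, 0).numpy() * 255).astype(np.uint8)

# For a stereoscopic headset, each eye gets its own stylized view per frame.
left_eye = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for the left camera
right_eye = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for the right camera
stereo_frame = (stylize(left_eye), stylize(right_eye))
```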


Parrots on goto Night

April 27, 2018

Our talk about the Avatar telepresence robotics system, which uses the Softbank Nao and the Microsoft Kinect, at the goto night in Amsterdam is now available on YouTube. It features a demonstration of the system as well as live coding of the robot and some chocolate. Enjoy!


HoloCom – An Augmented Reality Showcase for Microsoft HoloLens

March 2, 2018

The hardware hacking team of TNG Technology Consulting GmbH just went on another so-called Winter Retreat in the Alps to work on a new prototype. This time we had access to a Microsoft HoloLens.

What if Augmented Reality glasses allowed you to communicate with friends far away as if they were in the same room? The hardware hacking team of TNG Technology Consulting has implemented a software prototype enabling such holographic telepresence. Using this software, it is possible to project the 3D shape of another person into your own field of view. This resembles the holograms in the “Star Wars” movies.

The concept of Augmented Reality has existed since the late 1960s. The Microsoft HoloLens is one of the first autonomous devices that can enrich the real world with computer-generated elements, so-called holograms. In this talk we will show and explain some ideas and concepts from the field of Mixed Reality. We will present some technical details of the device and demonstrate them in live showcases. Furthermore, we will show some simple programming examples for the HoloLens using Unity 3D.

For this project, a brand-new conference talk with a live demo and live coding is available.

Enjoy!