Deep fakes are fake images or videos generated by Artificial Intelligence. The media produced by deep neural networks can seem deceptively real. In the most common case, faces are swapped, creating the illusion of seeing a person in a video who was never originally there.
Software developers at the IT consulting company TNG Technology Consulting GmbH, based in Unterföhring near Munich, have succeeded in developing software that generates deep fakes in real time. This makes it possible to manipulate live camera images in such a way that the face in front of the camera is swapped with that of another person.
The facial expressions of a person standing in front of the camera are recognized and transferred onto the face of a well-known person, such as Angela Merkel, Barack Obama or Elon Musk. In the output video stream, the software first removes the input face completely and then replaces it with the deep fake. The prototype presented by TNG is a kind of mirror in which you recognize your own facial expressions, but not your face.
The basis for this is Artificial Intelligence. Through the use of various computer vision and neural network techniques, faces are detected in the video input, transformed, and re-integrated into the video output. With this technique it is possible to project deceptively real imitations onto other people.
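Such a detect-transform-reintegrate loop can be sketched as follows. This is a minimal illustration only, not TNG's actual implementation: `detect_face` and `swap_face` are hypothetical stand-ins (a real system would use a trained face detector and the neural network described below), and the frame is a plain NumPy array.

```python
import numpy as np

def detect_face(frame):
    """Hypothetical detector returning a bounding box (x, y, w, h).
    A real pipeline would use a trained face-detection model here;
    this dummy simply returns the center region of the frame."""
    h, w = frame.shape[:2]
    return (w // 4, h // 4, w // 2, h // 2)

def swap_face(face_patch):
    """Hypothetical stand-in for the neural network that maps the
    input face to the target identity; here it just returns a copy."""
    return face_patch.copy()

def process_frame(frame):
    """One step of the real-time loop: detect, transform, blend back."""
    x, y, w, h = detect_face(frame)
    swapped = swap_face(frame[y:y + h, x:x + w])
    out = frame.copy()
    out[y:y + h, x:x + w] = swapped  # re-integrate into the video output
    return out

# A dummy 480x640 RGB camera frame
frame = np.zeros((480, 640, 3), dtype=np.uint8)
result = process_frame(frame)
```

Each camera frame passes through this loop once, which is why every stage has to run fast enough for real-time video.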
Auto-encoder networks were trained by means of GANs (Generative Adversarial Networks). In addition, the developers used several other neural networks for face detection and face segmentation.
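A common way such auto-encoder face swaps work, and a plausible reading of the setup above, is a shared encoder with one decoder per identity: encode a face of person A, then decode with person B's decoder. The following toy sketch (untrained random weights, flattened grayscale faces) illustrates only the swap mechanism, not the GAN training itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 64x64 face and a 128-dim latent code.
D, Z = 64 * 64, 128
W_enc = rng.normal(0, 0.01, (Z, D))    # shared encoder weights
W_dec_a = rng.normal(0, 0.01, (D, Z))  # decoder for person A
W_dec_b = rng.normal(0, 0.01, (D, Z))  # decoder for person B

def encode(face):
    """Shared encoder: compresses any face into a latent code that
    captures expression and pose rather than identity."""
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    """Person-specific decoder: reconstructs the latent code as a
    face in that person's identity."""
    return W_dec @ latent

face_a = rng.random(D)            # a face of person A
latent = encode(face_a)
swapped = decode(latent, W_dec_b) # A's expression, rendered as person B
```

In training, each decoder only ever sees faces of its own person, while the encoder sees both; the swap falls out of mixing encoder and decoder at inference time.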
We are working hard on new showcases and talks about Deep Learning and neural networks, which is why there haven't been many new posts on this page recently.
Please enjoy our “elevator pitch” about our new Stereoscopic Realtime Style Transfer showcase based on the Dell Visor Mixed Reality headset. We were able to train our neural network on the style of the music video for A-ha's “Take on Me”.
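For readers curious how a network captures a “style” at all: in the classic neural style transfer approach, style is represented by the Gram matrix of a convolutional feature map, i.e. the correlations between feature channels. The sketch below is a generic illustration of that idea, not the showcase's actual code, and uses a random array in place of real network features.

```python
import numpy as np

def gram_matrix(features):
    """Style representation used in neural style transfer:
    channel-by-channel correlations of a feature map.
    features: array of shape (channels, height, width)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (h * w)  # normalize by spatial size

# Stand-in for a conv-layer feature map (8 channels, 16x16 spatial)
feat = np.random.default_rng(1).random((8, 16, 16))
g = gram_matrix(feat)
```

During training, the style loss penalizes the difference between the Gram matrices of the generated frame and of the style source, here the “Take on Me” footage.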
The hardware hacking team of TNG Technology Consulting GmbH recently went on another so-called Winter Retreat in the Alps to work on a new prototype. This time we had access to a Microsoft HoloLens.
What if Augmented Reality glasses allowed you to communicate with friends far away as if they were in the same room? The hardware hacking team of TNG Technology Consulting has implemented a software prototype enabling such holographic telepresence. Using this software it is possible to project the 3D shape of another person into your own field of view, resembling the holograms in the “Star Wars” movies.
The concept of Augmented Reality has existed since the late 1960s. The Microsoft HoloLens is one of the first self-contained devices that can enrich the real world with computer-generated elements, so-called holograms. In this talk we show and explain ideas and concepts from the field of Mixed Reality, present some technical details of the device, and demonstrate them in live showcases. Furthermore, we show some simple programming examples for the HoloLens using Unity 3D.
For this project, a brand-new conference talk with a live demo and live coding is available.
On the 4th of February we gave a keynote (in German) on the topic “See like a Terminator – Augmented Reality with Oculus Rift” at the OOP conference in the International Congress Center in Munich.
In the evening, Martin Heider (one of the organizers of the OOP conference) staged a Pecha Kucha night, where speakers each gave a talk with 20 slides and 20 seconds per slide. Here you can watch our Pecha Kucha (in English) about a self-built Augmented Reality wearable based on Intel RealSense and the Oculus Rift, with which you can see the world through the eyes of a Terminator. Of course, this is a very, very short version of our original talk, which has already been given in many different versions.
If you would like our talk “See like a Terminator – Augmented Reality with Oculus Rift” at your conference as a keynote or standard session (or in another format), don't hesitate to contact us. The talk is available in many time formats and is continuously updated with the newest bleeding-edge technology out there, e.g. Atheer Air.
The Pecha Kucha session was staged by Martin Heider and is definitely worth a visit when you are at the OOP conference.
As the keynote is in German, you can visit the “Pecha Kucha all night long” session at the OOP conference instead. It takes place on the same day, but in a different time slot, from 6:30 PM to 8:00 PM. For more information, please visit this link.
From Tuesday, the 2nd of February to Thursday, the 4th of February, you can find us at the Intel booth showing bleeding-edge showcases with Augmented Reality, 3D cameras, gesture control, IoT and more. During lunch and the afternoon coffee breaks, you can find us next to the OOP Internet café, where we are showing gesture-controlled drones. Visitors are also allowed to fly the drones with their bare hands using Intel RealSense technology.