Talk about “Realtime Deepfakes” is now on YouTube

September 11, 2019

Ever wondered how we made Deepfakes work in realtime? Watch our talk on YouTube.

At this point in time the video is only available in German. We hope to serve you an English recording as soon as J-Fall is over. Bert van Schrijver, who helps organize J-Fall, told us that the presentations will be recorded and uploaded to YouTube. As we will give our Deepfake talk there, it shouldn't take long to become available. 😉


Realtime Deepfakes in the media

September 3, 2019

Within the last week, several news articles were published about our realtime deepfake implementation. Here are some of them, linked to their original sources.

Furthermore, Daniel Fritzler, a visitor to our booth at the TDWI conference, tested our realtime deepfake showcase and wrote a science fiction short story about it.

Upcoming talks:

  • Herbstcampus, Nuremberg, Germany, 4th of September, 10am
  • Nerd Nite Munich Special, Munich, Germany, Hochschule München, 5th of September, 8pm
  • JavaZone, Oslo, Norway, 12th of September, 3:40pm

Now in Realtime – Deepfakes 2.0 published by TNG

June 19, 2019

It has been quiet on our blog for a while. The reason is that we have been working hard on awesome new showcases. We defined a mission for our team:

“Make deepfakes work in realtime on video streams”

Deepfake is a technique for human face synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as a generative adversarial network (GAN).

Long story short – We did it!

The hardware hacking team of TNG Technology Consulting GmbH, an IT consulting company based in Unterföhring near Munich, wrote new software based on DeepFaceLab.

The result is software that produces Deepfakes in realtime, rather than postprocessing recorded video sequences as in the original understanding of Deepfakes.

A person is filmed by an RGB webcam, and their face is replaced by someone else's face, e.g. Barack Obama's, with the corresponding facial expression.
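As a rough illustration of such a per-frame pipeline (this is a minimal sketch, not TNG's actual DeepFaceLab-based implementation; all function names and the fixed bounding box are hypothetical stubs), the realtime loop boils down to: grab a frame, detect the face, run the crop through the trained network, and paste the result back:

```python
import numpy as np

def detect_face(frame):
    """Hypothetical stub face detector: returns a fixed bounding box
    (x, y, w, h). A real pipeline would use a CNN-based detector."""
    h, w = frame.shape[:2]
    return (w // 4, h // 4, w // 2, h // 2)

def swap_face(face_crop):
    """Hypothetical stub for the trained generator network. A real
    implementation would feed the crop through a network that outputs
    the target person's face with the same expression; here the crop
    is simply returned unchanged."""
    return face_crop

def process_frame(frame):
    """One iteration of the realtime loop: detect, swap, paste back."""
    x, y, w, h = detect_face(frame)
    out = frame.copy()
    out[y:y + h, x:x + w] = swap_face(out[y:y + h, x:x + w])
    return out

# A real application would run this per webcam frame, e.g. reading
# frames from an OpenCV VideoCapture inside a while loop.
dummy = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a webcam frame
out = process_frame(dummy)
print(out.shape)  # (480, 640, 3)
```

The key point of the realtime approach is that everything inside `process_frame` must finish within one frame interval, instead of being applied offline to a recorded clip.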

At this point in time we only want to show you a short sequence of our Deepfakes 2.0 implementation, as our neural networks are still in training. Stay tuned!

For Deepfakes 2.0, a new conference talk was written as well. It premiered at Big Techday 2019 in Munich, Germany.

If you are interested in Deepfakes 2.0 and would like to invite us to give a talk on it, don't hesitate to contact us!


Parrots On JavaZone – again!

September 25, 2018

If you want to enjoy our talk about Augmented Reality and an app that enables you to make “Star Wars”-like telephone calls, check out the video footage made by the great folks at JavaZone Oslo! JavaZone is the largest community-driven conference and is always worth a visit!


Parrots on goto Night

April 27, 2018

Our talk about the Avatar telepresence robotics system using the Softbank Nao and the Microsoft Kinect at the goto Night in Amsterdam is now available on YouTube. It features a demonstration of the system as well as live coding the robot and some chocolate. Enjoy!


Parrots On JavaZone

September 14, 2017

Our presentation about the Avatar telepresence robotics system using the Softbank Nao and the Microsoft Kinect at JavaZone 2017 is now available on Vimeo. It features a demonstration of the system as well as live coding the robot and a stage dive by our Nao. Enjoy!


Interview with Intel Developer Zone

August 12, 2017

Thomas Endres and Martin Förtsch, both Intel Black Belt software developers, were interviewed for the Intel Developer Zone. You can read the complete interview about gesture control, AR/VR, robotics and Artificial Intelligence on the Intel Developer Zone.

Intel Developer Zone Interview
