Pecha Kucha video of “See like a Terminator”

March 4, 2016

On the 4th of February we gave a keynote on the topic “See like a Terminator – Augmented Reality with Oculus Rift” (in German) at the OOP conference in the International Congress Center in Munich.

In the evening Martin Heider (one of the organizers of the OOP conference) staged a Pecha Kucha night, where speakers each gave a talk of 20 slides with 20 seconds per slide. Here you can watch our Pecha Kucha (in English) about a self-built Augmented Reality wearable device based on Intel RealSense and the Oculus Rift, with which you can see the world through the eyes of a Terminator. Of course, this is a very short version of our original talk, which has already been given in many different variants.

The talk in its original length (60 minutes) received the Oracle JavaOne Rock Star award when we gave it in San Francisco in 2015.

If you would like to have our talk “See like a Terminator – Augmented Reality with Oculus Rift” at your conference as a keynote, a standard session, or something else, don’t hesitate to contact us. The talk is available in many time formats and is continuously updated with the newest bleeding-edge technology out there, e.g. the Atheer AiR.

The Pecha Kucha night is definitely worth a visit when you are at the OOP conference.

Don’t miss our upcoming talks and events:

09 Mar 2016 03:00 PM – JavaLand in Brühl near Cologne
15 Mar 2016 04:40 PM – CeBIT Developer World / Hall 11, Stand D03 in Hannover
24 Mar 2016 07:00 PM – Nerd Nite München in Munich


TNG Augmented Rift presented at Augmented World Expo and Big Techday 8

June 15, 2015

The challenge

The hardware hacking team of TNG Technology Consulting GmbH started working on a device capable of full field of view Augmented Reality in March 2015, during the company’s winter retreat.

The mobile "TNG Augmented Rift" is a full field of view Augmented Reality device capable to do e.g. face tracking, face identification, emotion detection, heart rate detection, distance measurements, etc. Speech recognition for command and control is implemented as well.


The goal

The goal was to build a device from off-the-shelf hardware within one working day. In this short period of time, eleven TNG software consultants implemented a prototype based on an Oculus Rift DK2 and an Intel RealSense F200 3D camera. Over a few additional days, a core team of about six TNG software developers added some more features to the Augmented Rift.

The TNG Augmented Rift consists of off-the-shelf hardware such as the Oculus Rift and the Intel RealSense F200 3D camera.


The features

The TNG Augmented Rift has the following features:

  • “See-through” Augmented Reality head-mounted display based on the Oculus Rift DK2 (in contrast to the HoloLens, the whole field of view is augmented)
  • Many real-time elements rendered inside the Augmented Rift in 3D

The following visible elements are supported:

  • Real environment mixed with augmented information and 3D objects
  • Face detection elements
  • Face tracking information
  • Face landmarks
  • Emotions
  • Heart rate as text
  • (in progress) Heart rate as a graph (“Electrocardiography mode”)
  • Speech recognition for command and control (“Okay Rift, this is Martin.” – “Is this really Martin?” – “Yes!” – “From now on I will remember that user 100 is Martin.”)
  • “Terminator mode” (world augmented with red-shaded textures; see the sketch below)
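
The Terminator mode shader itself is part of our unpublished code base. As a rough illustration of the idea, here is a minimal Python/OpenCV sketch (not our actual implementation) that shades a webcam stream red:

    import cv2
    import numpy as np

    def terminator_tint(frame, strength=0.7):
        """Shade a BGR frame red: map the luminance onto the red
        channel and blend it with the original image."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        red = np.zeros_like(frame)
        red[:, :, 2] = gray                     # BGR layout: index 2 is red
        return cv2.addWeighted(red, strength, frame, 1.0 - strength, 0)

    cap = cv2.VideoCapture(0)                   # any webcam will do for a test
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Terminator mode", terminator_tint(frame))
        if cv2.waitKey(1) & 0xFF == 27:         # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()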

The setup

We need two cameras to record the real-world environment. The video streams of the two cameras are merged into a stereo (3D) capture. The webcam processor itself consists of several parts. We use the OpenCV library for long-distance measuring and for face recognition of people who are not in close range.
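
Our full processing pipeline is not published here, but the stereo merge step is easy to illustrate. A minimal Python/OpenCV sketch (the device indices 0 and 1 are assumptions; they depend on how the cameras are enumerated) that combines both webcams into a side-by-side frame, roughly the layout used for the two eyes (lens distortion correction omitted):

    import cv2
    import numpy as np

    left_cam = cv2.VideoCapture(0)    # left webcam (index is an assumption)
    right_cam = cv2.VideoCapture(1)   # right webcam

    while True:
        ok_left, left = left_cam.read()
        ok_right, right = right_cam.read()
        if not (ok_left and ok_right):
            break
        # Side-by-side stereo: the left half is shown to the left eye,
        # the right half to the right eye.
        stereo = np.hstack((left, right))
        cv2.imshow("stereo capture", stereo)
        if cv2.waitKey(1) & 0xFF == 27:
            break

    left_cam.release()
    right_cam.release()
    cv2.destroyAllWindows()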

The Augmented Rift consists of two cameras for recording the real world. Their video is streamed into the Oculus Rift DK2. The Intel RealSense F200 is used to add features like face tracking, face identification, pulse detection and much more.


The Intel RealSense detector uses the F200 3D camera to realize the features mentioned above (near-field face tracking, face identification, face landmark detection and tracking, emotion detection, and so on). All the Augmented Reality elements we gather are overlaid onto our stereo webcam capture to enrich the view of the real-world environment with useful information.
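
The face data on the real device comes from the Intel RealSense SDK; as a simplified stand-in, the same overlay idea can be sketched with OpenCV’s bundled Haar cascade detector in Python:

    import cv2

    # Stand-in for the RealSense face module: OpenCV's bundled Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def annotate(frame):
        """Draw a box and a label onto the frame for every detected face."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
            cv2.putText(frame, "face", (x, y - 8),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
        return frame

Applied to each half of the stereo capture, this yields the kind of augmented view described above.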

Wait but why?

Of course, this device based on an Oculus Rift DK2 is a little awkward. But imagine what will happen when the computers and cameras used get smaller and smaller over the next few years. You don’t need an Oculus Rift DK2 to realize such projects. What we need are glasses with an integrated full field of view transparent display that seamlessly enrich the world with Augmented Reality elements.

Today’s devices only augment a small fraction of the user’s field of view (see the Epson BT-200 or the Microsoft HoloLens). You need to look in a specific direction to see the Augmented Reality elements on the display. With the Augmented Rift this is not a problem, but the device is bulky. Once glasses with transparent displays become available, we could transfer our solution to these new devices.

In the front view of the Augmented Rift you can see the two webcams that stream a stereo capture into the device. This would not be necessary with normal glasses that have transparent displays built in and an R200 camera attached.


Finally, you could argue that the 3D camera used is far too big for this field of application. But we could instead use the brand-new Intel RealSense R200. It is such a tiny yet powerful device that you can easily attach it to glasses, and its weight is negligible.

Intel RealSense R200


We want to make a video of the TNG Augmented Rift as soon as possible. See the world like a real T-800! The future is here!

The talks and demonstrations

The Augmented Rift was presented at the Intel booth of the Augmented World Expo in Santa Clara, California, USA from the 8th to the 10th of June 2015. On the 12th of June we gave a talk at Big Techday in Munich in front of more than 100 developers. After the talk, hundreds of people wanted to try the Augmented Rift and its Terminator mode for themselves!

Disclaimer

The TNG Augmented Rift is developed by software consultants of TNG Technology Consulting GmbH. The software parts using the Intel RealSense were developed by ParrotsOnJava.com.

Hasta la vista!


First pictures of the Intel RealSense R200 development kit

May 29, 2015

This week we received an interesting package from Intel USA containing the very new Intel RealSense R200 camera. We are really surprised at how small the camera is. It is a longer-range peripheral 3D camera, perfect for sensing the environment (for Windows and Android tablets, 2-in-1s, and more). The “R” in the camera’s model name stands for “Rear”, since it is best used facing away from you (unlike the F200 camera, which faces the user).

The camera has a range of 3-4 metres indoors and a larger range outdoors. The key features of the camera are:

  • 3D recording (faces, people, environment)
  • Depth camera
  • Face tracking and face recognition
  • Distance measuring in general

We are now waiting for the SDK to start our first projects with this tiny piece of high-tech.

Intel RealSense R200



ParrotsOn JavaLand Conference

April 1, 2015

Last week we attended the JavaLand conference in the Phantasialand theme park in Brühl near Cologne. We demonstrated our showcases at the “Java Innovation Lab”, ranging from flying drones with bare hands to our brand-new web-based Intel RealSense HTML5 game “Parrots On Target”.

TNG Technology Consulting GmbH – the company we work for as software consultants – supported us in this. We also showed another really cool showcase, built during the so-called “TNG Winterretreat”: an Oculus Rift DK2 enhanced with two cameras to form an “Augmented Rift”. The idea is to view the real world through the Oculus Rift DK2, enriched with Augmented Reality elements. Terminator vision is real!

ParrotsOn JavaLand 2015


Microsoft STC 2014 Keynote talk on YouTube

August 8, 2014

This goes out to all German-speaking readers: you now have the chance to see one of our conference talks on YouTube. This Microsoft keynote was given at the Student Technology Conference on the 4th of April 2014. Please note that we update our talk frequently, since the number of devices for gesture control keeps growing.

Over time, our library has been extended with much more functionality. We can now control robots like the Thymio II, small i-Racer cars, or the Sphero 2.0 ball. For this we had to extend the library with Bluetooth support, among other things.
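
Our library itself is not published here. To sketch the idea in Python: once such a toy is paired and bound to an RFCOMM device, it looks like an ordinary serial port, and driving it comes down to writing small command packets. The port name and the one-byte command encoding below are hypothetical, not the actual i-Racer or Sphero protocol:

    import serial  # pyserial

    # After pairing the car and binding it, e.g. with
    #   sudo rfcomm bind 0 <bluetooth-address-of-the-car>
    # the robot shows up as an ordinary serial port.
    PORT = "/dev/rfcomm0"  # assumption: depends on your pairing setup

    def drive(link, direction, speed):
        """Send a hypothetical one-byte command:
        high nibble = direction, low nibble = speed (0-15)."""
        link.write(bytes([(direction << 4) | (speed & 0x0F)]))

    with serial.Serial(PORT, 9600, timeout=1) as link:
        drive(link, direction=1, speed=6)  # e.g. forward at medium speed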

Furthermore, ParrotsOnJava.com will take part in the Intel RealSense Challenge as so-called “Ambassadors”, a special challenge track for teams that competed successfully in the last Perceptual Computing World Championship. Stay tuned for our new ideas! Believe us, they are really, really crazy!


ParrotsOnJava.com at Java Forum Stuttgart

July 23, 2014

On the 17th of July we gave our talk about gesture control using different 3D camera systems such as Intel RealSense (also known as the Creative Gesture Camera or Senz3D), Leap Motion, and Kinect v2. Our talk was very well attended, and people were really impressed by what you can do with gesture cameras. As always, we let people try the Parrot AR.Drone 2.0 demo showcase with all the different camera technologies.

We have just ordered some Oculus Rift version 2 (DK2) headsets, since we have ideas for some really cool and impressive real-world showcases.

Stay tuned for more information on this!



Microsoft presented ParrotsOnJava.com at the Maker Faire Hannover 2014

July 7, 2014

Microsoft attended the Maker Faire Hannover 2014 and invited us to present some showcases with their very new Kinect v2. Microsoft itself showed some demos with the Intel Galileo board, which is meant as an alternative to the well-known Raspberry Pi. They were even able to install Windows 8 on these Intel Galileo boards.

ParrotsOnJava.com presented brand-new demo showcases: we can now control robots and cars using different 3D camera systems. This makes it possible to let people (and especially children) control little cars and robots with their bodies and hands. This is really useful when there is not enough space for our showcase in which drones are controlled with bare hands.

We are still extending our portfolio of showcases. After more than a year of developing our gesture control software, we want to start developing some new real-world video games controlled with bare hands. Stay tuned!


Kinect v2 controls Thymio II robot

July 2, 2014

For the “Deutsches Museum” exhibition project, the ParrotsOnJava.com team is still experimenting with different technologies to find the best mini RC car that visitors of the museum can control by gesture. At this point in time, the Thymio II robot seems to be a very good solution.

For demonstration purposes you can see a Kinect v2 connected to a Windows laptop. The laptop sends the gesture data over the internet via an Android smartphone tethering connection; the Raspberry Pi is connected to this wireless hotspot as well. The gesture data is transferred to the Raspberry Pi, where a Python script translates the incoming data and invokes Aseba functions against the Thymio II’s Aseba interface.
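
The actual script is not published here; the following is a minimal sketch of the Raspberry Pi side in Python, assuming the gesture commands arrive as plain-text UDP datagrams (the port and the command names are assumptions) and that asebamedulla exposes the Thymio II on the D-Bus session bus:

    import socket
    import dbus

    # asebamedulla bridges the Thymio II's Aseba network onto D-Bus.
    bus = dbus.SessionBus()
    network = dbus.Interface(
        bus.get_object("ch.epfl.mobots.Aseba", "/"),
        dbus_interface="ch.epfl.mobots.AsebaNetwork")

    def set_motors(left, right):
        """Write the Thymio II's motor target variables."""
        network.SetVariable("thymio-II", "motor.left.target", [left])
        network.SetVariable("thymio-II", "motor.right.target", [right])

    # Hypothetical wire format: one short text command per UDP datagram.
    COMMANDS = {
        b"forward": (300, 300),
        b"back": (-300, -300),
        b"left": (-150, 150),
        b"right": (150, -150),
        b"stop": (0, 0),
    }

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9999))  # port is an assumption

    while True:
        data, _ = sock.recvfrom(64)
        left, right = COMMANDS.get(data.strip(), (0, 0))  # unknown -> stop
        set_motors(left, right)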


Demonstration of an Intel RealSense gesture-controlled Thymio II robot for the Deutsches Museum exhibition

June 23, 2014

The ParrotsOnJava.com team is building an exhibition project for the largest technical museum in the world – the Deutsches Museum in Munich.

The Intel RealSense exhibition project will allow every visitor to try gesture control. In this case, we decided to let visitors control a small Thymio II robot with their bare hands. This short video demonstrates what we are currently working on and the current state of the project.


ParrotsOnJava at JavaOne San Francisco, Maker Faire Hannover and Java Forum Stuttgart

June 12, 2014

After several successful invited talks – at the JavaLand conference in Brühl near Cologne, the keynote at the Microsoft Student Technology Conference in Berlin, Nerd Nite in Munich, and more – we can now announce some new events.

We are still working on the exhibition project for the Deutsches Museum München. We have many new ideas involving Intel RealSense and other gesture control technologies based on 3D depth camera systems. Video updates will follow when we have some spare time.