December 7, 2013
We will also show this game at the JavaLand conference in March 2014 – where we have been invited to do a talk.
November 8, 2013
Thomas and Martin have qualified for the Microsoft developer program and will receive a development unit of the new Kinect 2.
You know what we want to implement with it? Yes! You’re right!
With this new natural interface device we want to implement metaphors that are completely different from the Leap Motion and Intel Perceptual Computing gesture camera metaphors. We will receive our development unit at the end of November 2013 and want to start coding immediately.
Intel just published another video interview, taken at the Intel Developer Forum (IDF 2013) in September 2013.
October 25, 2013
For some time now, the Parroteer drone control source code has been open source under the GPL v3 license. The code is also available on GitHub. Since the application is becoming more and more stable, we thought it would be a good idea to write about it now.
The application consists of two products:
To get started, you’ll need:
- The Oracle JDK or OpenJDK
- Apache Maven
- A Leap Motion controller with the latest drivers installed
- Alternatively, a Senz3D camera and the Intel Perceptual Computing SDK
Have fun using it.
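Parroteer's internals are not reproduced here, but the underlying protocol is simple: the AR.Drone 2.0 is steered with plain-text AT commands sent over UDP, where each float argument is transmitted as the decimal value of its IEEE-754 bit pattern interpreted as a signed 32-bit integer. A minimal sketch of encoding the progressive movement command (AT*PCMD) — class and method names are ours, not Parroteer's:

```java
/** Minimal sketch of AR.Drone 2.0 AT command encoding (names are ours, not Parroteer's). */
public class DroneCommandSketch {

    /**
     * Encodes an AT*PCMD progressive movement command. Each float argument
     * (roll, pitch, vertical speed "gaz", yaw, all in [-1, 1]) is sent as the
     * signed 32-bit integer holding its IEEE-754 bit pattern. The flag value 1
     * enables progressive commands; sequence numbers must strictly increase.
     */
    static String encodePcmd(int seq, float roll, float pitch, float gaz, float yaw) {
        return String.format("AT*PCMD=%d,1,%d,%d,%d,%d\r",
                seq,
                Float.floatToIntBits(roll),
                Float.floatToIntBits(pitch),
                Float.floatToIntBits(gaz),
                Float.floatToIntBits(yaw));
    }

    public static void main(String[] args) {
        // Roll left at half speed while keeping the other axes neutral.
        String command = encodePcmd(1, -0.5f, 0f, 0f, 0f);
        System.out.println(command.trim());
        // In a real client this string would be sent via java.net.DatagramSocket
        // to the drone's UDP command port (192.168.1.1:5556 by default).
    }
}
```

The unusual-looking integers (e.g. -1090519040 for -0.5f) are exactly what the drone firmware expects on the wire.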
October 25, 2013
Yesterday, we were invited to show the Parroteer gesture-based drone control library at the Munich Startup Demo Night. We had a lot of fun and met lots of great people sharing our passion for great products and natural user interfaces.
October 21, 2013
ParrotsOnJava.com was invited to the Evobis StartUp Demo-Night in Munich. The event takes place at Werk.1 (Kultfabrik, Tonhalle) near Ostbahnhof, where we implemented our Parroteer software during the Intel Perceptual Computing hackathon. If you want to attend the demo night, have a look at the following Evobis website: http://www.evobis.de/evobis/terminuebersicht/termindetails/?termin_id=177
In mid-November, Intel plans to announce the winners of the Intel Perceptual Computing Worldwide Challenge.
September 20, 2013
Thomas and Martin were invited by Intel to the IDF 2013 (Intel Developer Forum) in San Francisco to demonstrate their drone control software named “Parroteer”. We had wonderful days at the IDF conference and met hundreds of people interested in our software, which makes it possible to control a Parrot AR.Drone 2.0 with bare hands.
The press article and tech video interview produced by Intel are now accessible at http://www.intelfreepress.com/news/drones-fly-hands-free-with-gestural-technology/6877
Only interested in the tech video interview? Here it is!
August 20, 2013
We qualified for the Intel Perceptual Computing SDK worldwide challenge with our virtual flight control for drones. Our software interprets the signals coming from the Creative Interactive Gesture Camera (which may be released under the name “Senz3D”) and sends messages to the drone to control it in all directions.
With this new, more professional video we demonstrate how the software works under different conditions. You will also be able to see how the camera and the drone control work.
Want to get in touch with us? Visit our demo showcase at the Intel Developer Forum 2013 in San Francisco from September 10 to 12, 2013!
June 28, 2013
The drone steering video can now also be seen on the Intel Germany channel:
Intel in Germany
Please like the video! After the world challenge is over, we will release the app to the public.
June 24, 2013
This weekend, my colleague Martin and I attended the Intel Perceptual Computing Hacknight in Munich. We implemented a control mechanism for steering the AR.Drone using the Creative Gesture Camera. While the camera is similar to the Leap Motion, we implemented a completely different control metaphor that resembles an old airplane steering wheel.
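The camera SDK itself isn't reproduced here, but the steering-wheel idea boils down to simple geometry: treat the line between the two tracked palms as the wheel's rim and turn its tilt into a roll command. A minimal sketch under that assumption — all names and the 45-degree full-deflection choice are ours:

```java
/** Sketch of a two-hand "steering wheel" metaphor: hand tilt -> roll command. */
public class SteeringWheelSketch {

    /**
     * Maps the vertical offset between the left and right palm (millimetres,
     * as a depth camera would report) to a normalized roll value in [-1, 1].
     * Raising the right hand above the left yields a negative value, which we
     * interpret as rolling left, like turning a wheel counter-clockwise.
     */
    static double rollFromHands(double leftY, double rightY, double handDistance) {
        double angle = Math.atan2(leftY - rightY, handDistance); // wheel tilt in radians
        double roll = angle / (Math.PI / 4);                     // full deflection at 45 degrees
        return Math.max(-1.0, Math.min(1.0, roll));              // clamp to command range
    }

    public static void main(String[] args) {
        // Right hand 100 mm above the left, hands 400 mm apart -> gentle roll left.
        System.out.printf("roll = %.2f%n", rollFromHands(0, 100, 400));
    }
}
```

The dead-simple mapping is what made the metaphor feel natural at the hacknight: no calibration, just the relative height of your two hands.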
With our app, we won the prize for the most innovative app. This is great news. We intend to publish the library soon!
Have a look at the videos and enjoy:
April 13, 2013
New footage: Now, we can finally use the Leap Motion controller to control all directions (roll, pitch, yaw and height). Have a look at it:
Yesterday, after a long time, we started a small project combining the AR.Drone and the Leap Motion controller. This has been the most interesting controller of all so far. Flying the drone by moving your hands only is, I think, the most intuitive way to do it. We had a lot of fun:
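Controlling all four axes comes down to mapping the hand pose the tracker reports (roll, pitch, and yaw angles plus palm height) onto normalized drone commands, with a dead zone so small tremors don't twitch the drone. A self-contained sketch of that mapping — the constants and names are our assumptions, not the Leap Motion API:

```java
/** Sketch: mapping a tracked hand pose to the four drone axes (roll, pitch, height, yaw). */
public class HandPoseMappingSketch {

    static final double DEAD_ZONE = 0.1;    // ignore tiny tremors around neutral
    static final double HOVER_HEIGHT = 200; // palm height (mm) treated as "hold altitude"

    /** Scales one axis into [-1, 1] and applies the dead zone. */
    static double axis(double value, double fullDeflection) {
        double v = Math.max(-1.0, Math.min(1.0, value / fullDeflection));
        return Math.abs(v) < DEAD_ZONE ? 0.0 : v;
    }

    /**
     * Converts hand roll/pitch/yaw (radians) and palm height (mm) into four
     * normalized commands: roll, pitch, vertical speed, and yaw.
     */
    static double[] toDroneCommand(double roll, double pitch, double yaw, double palmHeightMm) {
        return new double[] {
            axis(roll, Math.PI / 4),                // tilt hand sideways  -> roll
            axis(pitch, Math.PI / 4),               // tilt forward/back   -> pitch
            axis(palmHeightMm - HOVER_HEIGHT, 150), // raise/lower hand    -> climb/descend
            axis(yaw, Math.PI / 3)                  // twist hand          -> yaw
        };
    }

    public static void main(String[] args) {
        // A roughly level hand near hover height: all four axes stay neutral.
        double[] neutral = toDroneCommand(0.02, -0.03, 0.0, 205);
        System.out.println(java.util.Arrays.toString(neutral));
    }
}
```

In practice the output of such a mapping would feed straight into the drone's movement command each control tick.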