Congratulations to Hash Salehi at the Random Workshop blog, who modified some code from Nicolas Saugnier to parse the LIDAR data in real time!
Movie follows:
That didn't take long!
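For the curious, parsing this kind of sensor mostly boils down to hunting for a sync byte on the serial port and decoding fixed-length packets. Here is a purely illustrative sketch in Python; the sync byte, packet length, and field offsets are my assumptions, not the actual XV-11 format:

    # Illustrative rotating-LIDAR packet parser. The sync byte, packet
    # length, and field offsets are assumptions, not the real XV-11 framing.
    import serial  # pyserial

    SYNC = 0xFA         # assumed start-of-packet marker
    PACKET_LEN = 22     # assumed total packet length in bytes

    def parse_packet(pkt):
        """Decode one packet into (angle_degrees, distance_mm) pairs."""
        index = pkt[1] - 0xA0                      # assumed angle-index encoding
        readings = []
        for i in range(4):                         # assume 4 readings per packet
            off = 4 + 4 * i
            dist = pkt[off] | (pkt[off + 1] << 8)  # little-endian distance
            readings.append((index * 4 + i, dist))
        return readings

    port = serial.Serial('/dev/ttyUSB0', 115200)
    while True:
        if bytearray(port.read(1))[0] != SYNC:     # resynchronise on the marker
            continue
        pkt = bytearray([SYNC]) + bytearray(port.read(PACKET_LEN - 1))
        for angle, dist in parse_packet(pkt):
            print('%3d deg: %5d mm' % (angle, dist))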
Friday, November 26, 2010
Thursday, November 25, 2010
GPGPU catchup
Another month, another catch-up post on the world of GPUs. A few interesting things have happened:
- nVidia have released the posters from the 2010 Research Summit on GPU Technology; there are a number of absolutely fantastic posters in this collection, too many to list individually!
- CUDA 3.2 has been released, with H.264, random-number, and sparse-matrix support, along with the usual smattering of improvements.
- Intel OpenCL has finally been publicly released as an alpha. Of course, it only targets the Streaming SIMD Extensions (SSE) on Intel CPUs. Still no word from IBM or ARM on their OpenCL implementations.
- Amazon EC2 provides GPGPU capabilities now, and early benchmarks show that it performs relatively well.
- General Electric have released an embedded GPU platform. This looks great; I don't know why more people haven't mentioned it. GPUs are perfect for many of the processing problems we deal with for autonomous vehicles in interpreting vision, RADAR, and LIDAR data.
- MATLAB's Parallel Computing Toolbox has gained strong GPGPU support.
- Vincent Hindriksen runs an interesting blog on stream computing trends.
- Not sure if I've already mentioned this, but Aras Pranckevičius implemented an interesting GLSL optimizer.
- Finally, BehaveRT is a GPU-based behavior (i.e. flocking) system that can control more than 30,000 agents at once. Check out the video below:
Wednesday, November 24, 2010
Microsoft Kinect and other devices
The Microsoft Kinect is a very cheap 3D vision system that provides performance somewhere between stereo cameras and more expensive time-of-flight (e.g. PMD) cameras. It was actually something I was hoping would come out before the end of the MAGIC 2010 competition, but unfortunately it was scheduled for release after the competition ended.
The device was "hacked" almost immediately after its release and has already been widely adopted; I think this will do wonders for the robotics, computer vision, and stereo vision communities. So far all we have seen are relatively simple applications, but I'm sure people will start pushing the interpretation of the data to its limits soon, especially now that game developers are going to be playing with this kind of data. The Wii has already done fantastic things to push the knowledge of filtering and dealing with noisy sensor data to a wider audience, and I expect the Kinect will have a similar effect.
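As a toy example of the kind of filtering involved, even simple exponential smoothing goes a long way on a noisy accelerometer or depth stream. The sketch below is just that, a minimal illustration rather than anything Kinect-specific:

    # Minimal exponential smoothing for a noisy sensor stream.
    # alpha trades responsiveness (high) against smoothness (low).
    def smooth(samples, alpha=0.2):
        estimate = None
        for x in samples:
            estimate = x if estimate is None else alpha * x + (1.0 - alpha) * estimate
            yield estimate

    # Example: the spike at the third sample is damped in the output.
    print(list(smooth([10.0, 10.5, 30.0, 10.2])))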
It all started with Adafruit offering a $2000 bounty for the Kinect hack, which was promptly won by Hector Martin on the 10th of November.
One of the first videos (now viral!) to do the rounds was from Oliver Kreylos, showing the 3D data fused with image data and rendered in OpenGL.
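The basic trick behind a demo like this is reprojecting every depth pixel through a pinhole-camera model and then colouring the resulting points from the RGB image. A rough sketch in NumPy, using ballpark intrinsics that are my assumption rather than calibrated values:

    import numpy as np

    # Ballpark Kinect-style intrinsics (assumptions, not calibrated values).
    FX = FY = 594.0        # focal length in pixels
    CX, CY = 320.0, 240.0  # principal point of a 640x480 depth image

    def depth_to_points(depth_m):
        """Reproject an HxW depth image (metres) into an Nx3 point cloud."""
        v, u = np.indices(depth_m.shape)           # pixel coordinates
        z = depth_m
        x = (u - CX) * z / FX                      # pinhole model: X = (u-cx)Z/fx
        y = (v - CY) * z / FY
        return np.dstack((x, y, z)).reshape(-1, 3)

Colouring each point from the RGB camera then needs the extrinsic transform between the two sensors, which is exactly what the calibration work mentioned below provides.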
Of course, it didn't stop there; other people are trying to hack the rest of the Kinect USB protocol for the inclinometer and motor controls, and trying to figure out how the Kinect IR works.
Willow Garage has already begun work on integrating the Kinect into ROS, including calibration functions, and Philip Robbel managed to integrate the Kinect with a Hokuyo to do visual SLAM (using MRPT) only 7 days after the hack was released.
Hopefully we will see more practical uses for the Kinect soon! Wired published a nice story on the Kinect's development path at Microsoft.
The best thing about this "hack" is that it has inspired others to do the same, most notably the $800 reward for hacking the Neato Robotics XV-11 vacuum cleaner LIDAR (whitepaper here!). At only $300 (or even $30!) for a 360-degree, 6 m range, 10 Hz scanning LIDAR, this would do great things for the robotics community as well. SparkFun have already made some interesting progress with their LIDAR tear-down.
Exciting times!
Monday, November 22, 2010
Batch import OBJ with Blender
The OBJ importer in Blender actually supports batch import. To enable it, hold SHIFT when you select the importer from the File->Import->Wavefront (obj) menu; you will then be prompted to select a directory. Unfortunately, the importer will import each OBJ file into a separate (and incorrectly constructed) scene. To change this, edit the "import_obj.py" script and comment out lines 1164 and 1165:
    for f in files:
        #scn= bpy.data.scenes.new( stripExt(f) )
        #scn.makeCurrent()

This saves you from having to write a batch-import script yourself! A great utility for converting various formats to OBJ (with a batch-convert mode!) is PolyTrans.
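If you would rather script the whole thing than patch the importer, a short loop does the same job. A sketch assuming Blender 2.5's bpy.ops.import_scene.obj operator (the 2.4x API differs) and a hypothetical directory path:

    import os
    import bpy

    OBJ_DIR = '/path/to/objs'  # hypothetical directory of OBJ files

    # Import every OBJ into the current scene rather than one scene per file.
    for name in sorted(os.listdir(OBJ_DIR)):
        if name.lower().endswith('.obj'):
            bpy.ops.import_scene.obj(filepath=os.path.join(OBJ_DIR, name))

Run it from Blender's text editor, or from the command line with blender -P.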
Thursday, November 18, 2010
Team MAGICian/WAMbot takes 4th place!
All of our team's hard work has paid off: we claimed 4th place in the MAGIC 2010 competition! Our system was actually very close to what the top two teams implemented, and with a bit of luck we might have been able to snatch a prize. Maybe next time!
Above is a photo composition by Rob Reid of all the robots at the challenge stand, ordered from 1st to last (left to right). Team Michigan came 1st, followed by the University of Pennsylvania, with RASR taking 3rd place. It was a great opportunity to talk to all the other teams, to see how they solved their problems, and to discover how many of the same headaches we all shared. The LWC was a great place to meet and greet a number of industry people, as well as a nice crowd from DSTO who showed a lot of interest in what we have done. It was great to get so much positive feedback from everyone on our system.
Friday, November 12, 2010
Ram Shed Challenge
The Ram Shed Challenge was a media event for the MAGIC 2010 competition. It was our first chance to see all the other teams' robots and to watch them at work. Our robots performed very well on the day and navigated the maze better than any other team: the entire open section of the maze was explored fully autonomously, requiring no human interaction at any point. All this despite the fact that our WiFi gear, which was mounted outside, failed during the rain; after a few minutes we were able to replace it and continue our mapping operations!
The photos show our robots at the starting line, then moving off in the starting arena and navigating the maze, and finally the prize.
A fabulous effort by our team, and I'm extremely proud of our achievements on this day!
Sunday, November 07, 2010
Team MAGICian
Our MAGIC 2010 test day is over, so it's time to catch up on some sleep!
Looking forward to seeing how the other teams will perform.