Valen
Experienced Roboteer
Joined: 07 Jul 2004
Posts: 4436
Location: Sydney
Processing 8 MB/s is nothing; you might even be able to push that in pure Python.
If you throw some numpy at it to handle the array maths, you could easily do 30 FPS of depth processing.
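As a rough sketch of why numpy makes the frame rate easy, here is a minimal floor-subtraction segmenter. The function name, frame shape, and depth values are all illustrative assumptions, not a real Kinect API; the point is just that the per-frame work is a couple of vectorised comparisons:

```python
import numpy as np

def segment_bots(depth_frame, floor_depth_mm, min_height_mm=30):
    """Mask pixels that rise above the arena floor.

    depth_frame: (H, W) uint16 array of depth readings in mm
    (hypothetical Kinect-style frame).
    """
    # Anything sufficiently closer to the camera than the floor is a bot.
    mask = depth_frame < (floor_depth_mm - min_height_mm)
    # Ignore the sensor's "no reading" value of 0.
    mask &= depth_frame > 0
    return mask

# One 640x480 frame at 2 bytes/pixel is ~0.6 MB, so 30 of them per second
# sits comfortably inside the 8 MB/s budget, and the masking is pure numpy.
frame = np.full((480, 640), 2000, dtype=np.uint16)  # synthetic flat floor at 2 m
frame[100:150, 200:260] = 1800                      # a 200 mm tall "bot"
mask = segment_bots(frame, floor_depth_mm=2000)
print(mask.sum())  # number of bot pixels
```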
A simple bounding-box fit would be sufficient, I'd imagine. Assume the bots are generally going to sit "flat" to the floor, hand one of the game engines the point cloud for the blob, then do point-inside-box tests at a number of rotations; the orientation with the most points inside the bounding box wins.
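The rotate-and-count idea above can be sketched in a few lines of numpy. Everything here is a hypothetical illustration (function name, angle resolution, the assumption that the bot's footprint `box_w` x `box_h` is known), not a tuned tracker:

```python
import numpy as np

def best_orientation(points_xy, box_w, box_h, n_angles=90):
    """Brute-force a bot's yaw by counting blob points inside a rotated box.

    points_xy: (N, 2) array of blob points projected onto the floor plane.
    box_w, box_h: the bot's known footprint, in the same units as the points.
    Returns the candidate angle (radians, in [0, pi)) with the most points
    inside the axis-aligned footprint after rotation.
    """
    # Centre the blob so the box only has to be fitted for rotation.
    pts = points_xy - points_xy.mean(axis=0)
    best_angle, best_count = 0.0, -1
    for angle in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        c, s = np.cos(angle), np.sin(angle)
        # Rotate the points into the candidate box frame...
        x = pts[:, 0] * c + pts[:, 1] * s
        y = -pts[:, 0] * s + pts[:, 1] * c
        # ...and count how many land inside the footprint.
        inside = (np.abs(x) <= box_w / 2) & (np.abs(y) <= box_h / 2)
        if inside.sum() > best_count:
            best_angle, best_count = angle, int(inside.sum())
    return best_angle
```

Note the half-turn search range: a rectangle is symmetric under 180° rotation, so front-vs-back would have to come from tracking history or an asymmetric feature on the bot.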
If you really want to push the image processing, use CUDA; there are Python bindings for it, such as PyCUDA (though that project seems to have died).
I'm unsure whether multiple Kinects will play nicely when they can see each other's IR dot patterns.
chris... You might want to attend a NSW event before you go too far with putting computers and fancy sensors in your bot. The last people who tried that chickened out when they saw Plan-B, and it's a wuss by today's standards.
Having a sensor overview of the arena and decent computing power issuing commands to the bot is an eminently sensible approach. The only addition I'd think about making would be streaming high-rate IMU data back to the controller; alternatively, keep that inside the bot and add some smarts to it, which should make that whole "drive in a straight line" thing easier.
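The on-bot "smarts" for straight-line driving could be as small as a PD heading hold fed by the IMU's yaw. This is a sketch only: the function name, gain values, and sign conventions are assumptions, and real gains would need tuning on the bot:

```python
def heading_hold(target_deg, yaw_deg, yaw_rate_dps, kp=2.0, kd=0.1):
    """PD steering correction to hold a heading while driving.

    yaw_deg: current heading from the IMU (gyro-integrated, degrees).
    yaw_rate_dps: current turn rate from the gyro (degrees/second).
    Returns a differential-drive correction: add it to one track's
    throttle and subtract it from the other. Gains kp/kd are placeholders.
    """
    # Wrap the heading error into (-180, 180] so the bot turns the short way.
    error = (target_deg - yaw_deg + 180.0) % 360.0 - 180.0
    # Proportional on heading error, derivative damping on turn rate.
    return kp * error - kd * yaw_rate_dps
```

Keeping this loop on the bot sidesteps the radio-link latency you would eat if the overhead computer closed the heading loop itself.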
Talking with a bunch of American rocket people the other day (it being Thanksgiving), the topic of flying turkeys came up. It was a short step from there to "assuming a spherical turkey of uniform density".
_________________
Mechanical engineers build weapons, civil engineers build targets
Tue Nov 30, 2010 6:17 pm