Tuesday, May 8, 2012

Project files

Here's a link to all the files pertaining to our project!

KiNaomatics

This project was a lot of fun, and perhaps we'll update this blog some more in the future...

Friday, April 27, 2012

Shadow Function?!

[Embedded video]

Some Slides

[Embedded slides]
KiNaomatics: An Abstract


KiNaomatics
Spencer Lee and William McDermid
Group Pichu

Abstract:

Humanoid robots have many applications, ranging from the fun to the incredibly useful. They already play soccer against each other autonomously, and much work is being done to find other, more practical uses for them. Humanoids are ideal automatons for navigating a world built for humans, and a great deal of research is going into developing artificial intelligence to control them. However, there are many applications of humanoids entirely separate from the realm of artificial intelligence.

Introducing KiNaomatics: Taking small humanoids where they haven't gone before--into YOUR shoes! Using data from a Microsoft Kinect, we compute the joint angles of the tracked human's upper torso, then map these angles onto the corresponding servos of an Aldebaran Nao. This effectively allows the human to control the Nao's arms simply by moving. We also track the user's position in free space to command a ZMP-based walk engine running onboard the robot.
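
As a rough sketch of the angle computation (not our exact code), an elbow angle can be recovered from three tracked joint positions with a dot product; the Vec3 type here is a hypothetical stand-in for OpenNI's XnVector3D:

```cpp
#include <cmath>

// Minimal 3-D vector (hypothetical; OpenNI's XnVector3D has the same fields).
struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3 &a, const Vec3 &b) {
    Vec3 r; r.x = a.x - b.x; r.y = a.y - b.y; r.z = a.z - b.z; return r;
}

static float dot(const Vec3 &a, const Vec3 &b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Angle at the elbow, in radians: the angle between the upper arm
// (elbow -> shoulder) and the forearm (elbow -> hand).
float elbowAngle(const Vec3 &shoulder, const Vec3 &elbow, const Vec3 &hand) {
    Vec3 upper = sub(shoulder, elbow);
    Vec3 fore  = sub(hand, elbow);
    float c = dot(upper, fore) /
              (std::sqrt(dot(upper, upper)) * std::sqrt(dot(fore, fore)));
    if (c >  1.0f) c =  1.0f;   // clamp rounding error outside [-1, 1]
    if (c < -1.0f) c = -1.0f;
    return std::acos(c);
}
```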

The End is Nigh!

[Embedded media]
Monday, April 23, 2012

Behind Schedule

Well, unfortunately, we did not reach our baseline by Wednesday night as we expected. Furthermore, having spent Thursday through Sunday at the RoboCup US Open, we haven't been able to get much done. Tonight, however, we began looking into interpreting the joint orientations that OpenNI reports as 3x3 rotation matrices. The three columns of the matrix are the directions of the joint's local x, y, and z axes, expressed in the sensor's coordinate frame. We plan on using this orientation data to control the Nao's head movements, to improve our control over the arms, and to control the robot's walking direction.
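
Here's a minimal sketch of pulling that matrix out of OpenNI, assuming a user generator that's already tracking the user (error checking omitted; if we're reading the docs right, the matrix is stored row-major, so the axes sit in the columns):

```cpp
#include <cstdio>
#include <XnCppWrapper.h>

// Reads the head orientation of a tracked user and extracts the joint's
// local axes from the 3x3 rotation matrix.
void printHeadAxes(xn::UserGenerator &userGen, XnUserID user) {
    XnSkeletonJointOrientation orient;
    userGen.GetSkeletonCap().GetSkeletonJointOrientation(
        user, XN_SKEL_HEAD, orient);

    const XnFloat *m = orient.orientation.elements;  // 9 floats, row-major
    // Columns of the matrix are the joint's local x, y, z axes
    // expressed in the sensor frame.
    printf("x-axis: (%f, %f, %f)\n", m[0], m[3], m[6]);
    printf("y-axis: (%f, %f, %f)\n", m[1], m[4], m[7]);
    printf("z-axis: (%f, %f, %f)\n", m[2], m[5], m[8]);
    printf("confidence: %f\n", orient.fConfidence);
}
```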

Furthermore, we plan to stream the video from the Nao's cameras wirelessly. We want to enhance the "virtual reality" aspects of our project, and being able to see what the robot sees would help accomplish this.
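
We haven't built this yet, but NAOqi's ALVideoDeviceProxy looks like the natural route. A minimal frame-grab sketch, with a placeholder robot IP and subscriber name:

```cpp
#include <iostream>
#include <alproxies/alvideodeviceproxy.h>
#include <alvision/alvisiondefinitions.h>
#include <alvalue/alvalue.h>

// Grabs one frame from the Nao's camera over the network.
int main() {
    AL::ALVideoDeviceProxy camProxy("192.168.1.10", 9559);  // placeholder IP

    // Subscribe at QVGA / 30 fps in BGR; the proxy returns the actual
    // subscriber name to use for subsequent calls.
    std::string client =
        camProxy.subscribe("kinaomatics", AL::kQVGA, AL::kBGRColorSpace, 30);

    AL::ALValue img = camProxy.getImageRemote(client);
    // img[0] = width, img[1] = height, img[6] = raw pixel buffer.
    std::cout << "Got a " << (int) img[0] << "x" << (int) img[1]
              << " frame" << std::endl;

    camProxy.releaseImage(client);
    camProxy.unsubscribe(client);
    return 0;
}
```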

Bad news: the BeagleBoard still isn't playing nice with the Kinect. We think there may be a problem with the ARM-compiled build, so we're going to try altering the x86 release build to compile for ARM, as was done in the past, before an official ARM build was available.

Wednesday, April 18, 2012

Let's ignore the BeagleBoard for now.

Well, installing an older Ubuntu distribution ended up being a bust. The older distro would not boot, so we reinstalled the latest revision (r7) of 11.10 for ARM. We've set the BeagleBoard up again, compiled everything, and are ready to test the Kinect. However, we've been too busy actually making progress to do so.

We define "actually making progress" like this:
Since our last post, we've begun development on a laptop running 32-bit Ubuntu 10.04 with the latest version of OpenNI (unstable), manctl's SensorKinect (unstable), and the NITE middleware (unstable). This configuration worked perfectly on the first try. Using the NiSimpleSkeleton sample as a launchpad, we were able to use the joint positions detected by OpenNI to calculate the joint angles we wanted. We can also serialize an array containing our calculated joint angles, then wirelessly transmit it to the robot, where the data is deserialized.
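
The serialization is nothing fancy. Here's a sketch of the idea over a raw UDP socket; the port and angle count are placeholders, and it assumes both machines agree on float size and endianness:

```cpp
#include <cstring>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

const int NUM_ANGLES = 8;   // placeholder: one float per tracked joint
const int PORT = 5005;      // placeholder port

// Sender side (laptop): pack the angle array into one datagram and ship it.
void sendAngles(const float angles[NUM_ANGLES], const char *robotIp) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(PORT);
    inet_pton(AF_INET, robotIp, &addr.sin_addr);
    sendto(sock, angles, NUM_ANGLES * sizeof(float), 0,
           (sockaddr *) &addr, sizeof(addr));
    close(sock);
}

// Receiver side (robot): block for one datagram and unpack it in place.
void recvAngles(float angles[NUM_ANGLES]) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(PORT);
    addr.sin_addr.s_addr = INADDR_ANY;
    bind(sock, (sockaddr *) &addr, sizeof(addr));
    recvfrom(sock, angles, NUM_ANGLES * sizeof(float), 0, NULL, NULL);
    close(sock);
}
```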

Our plan is to reach our baseline goal of basic mimicry (shadowing arm/shoulder movement) and basic walking control by TONIGHT. We leave tomorrow at 5:30 am for the RoboCup US Open in Portland, Maine.