Tuesday, October 12, 2010

Get it Done

We met this past Friday to figure out exactly how we're going to create ARSpace's 360-degree motion control scheme for our project. We had gotten our hopes up at the prospect of getting our hands on a Kinect before they hit the market in late November; someone in class claimed to have seen one in action at the GDD Lab. Unfortunately, when I emailed Prof. Jacobs asking him to confirm this, I got "not true". So there went that idea.
We decided to go with the Wii Remote as our input device; it's cheap, relatively powerful, and there's already been a ton of tinkering and hacking done to these things, so the technology behind the device is easy to grasp and a lot of the features we'd like to incorporate are well-documented. I knew that a Wiimote Flash API already existed; in fact, I had played around with it before. But I was surprised to learn that there's a C# library available as well. Which is good, because I effing hate Flash.

We also ran some tests to determine how reliably the Wiimote could connect to Nick's laptop's built-in Bluetooth receiver and send it input. After a few sync-button fails we actually got the thing working pretty quickly, and it's a surprisingly simple setup. We just paired the Wiimote with the laptop, and the program we were running to detect device input started receiving data from the accelerometer, the pointer, the gyroscope, and the buttons. What we found most interesting was the option for up to four IR sources to be used with the test program. This means we could theoretically use four separate sensor bars in our design, one for each quadrant of the wrap-around display. Even better, we could also opt to rig up our own custom-made LEDs in place of Nintendo's official sensor bar, to better blend in with the screens.
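As a rough illustration of how pointer data falls out of those IR readings (this is a Python sketch with made-up coordinates, not code from the test program we ran): the IR camera reports each dot as a normalized (x, y) pair, a standard sensor bar shows up as two dots, and the midpoint and angle between them give the pointing position and the remote's roll.

```python
import math

# Sketch: derive a pointing position and roll angle from a pair of IR dots,
# as a Wiimote would see a single sensor bar. Dot coordinates are normalized
# to [0, 1] and the example values below are invented for illustration.

def pointer_from_dots(dot1, dot2):
    """Return ((mid_x, mid_y), roll_degrees) for two IR dot positions."""
    mid_x = (dot1[0] + dot2[0]) / 2
    mid_y = (dot1[1] + dot2[1]) / 2
    # The tilt of the line between the two dots is the remote's roll.
    roll = math.degrees(math.atan2(dot2[1] - dot1[1], dot2[0] - dot1[0]))
    return (mid_x, mid_y), roll

mid, roll = pointer_from_dots((0.25, 0.50), (0.75, 0.50))
print(mid)   # (0.5, 0.5) -- pointing dead center
print(roll)  # 0.0        -- remote held level
```

With four IR sources, the same idea extends to telling which of our sensor bars the camera currently sees, which is what makes the four-quadrant layout plausible.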

We also did some more research into the options available to us for warping the display from each projector to match the curvature of the screens. I've looked at both software and hardware solutions, and several look promising, if somewhat cost-prohibitive. In the meantime, we'll see if we can't MacGyver ourselves a homemade solution instead.
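If we do go homemade, the core of a software pre-warp is simple enough to sketch. This is an illustrative Python toy, assuming the projector sits at the center of a cylindrical screen section; the field-of-view value is a made-up placeholder, not a measurement from any rig we've built.

```python
import math

# Sketch: pre-distort the horizontal image coordinate so that equal steps
# across the source image land as equal arc lengths on a cylindrical screen.
# Assumes the projector is at the cylinder's center; the half field of view
# is a hypothetical number for illustration.

def prewarp_x(x, half_fov_deg=30.0):
    """Map a normalized image x in [-1, 1] to the x the projector should emit."""
    half_fov = math.radians(half_fov_deg)
    # Spread the input linearly in *angle* from the projector, then project
    # that angle back onto the flat image plane the projector actually draws.
    return math.tan(x * half_fov) / math.tan(half_fov)

print(round(prewarp_x(0.0), 3))  # 0.0  -- center stays put
print(round(prewarp_x(1.0), 3))  # 1.0  -- edges stay pinned
print(prewarp_x(0.5) < 0.5)      # True -- interior pulled toward center
```

A real solution would warp vertically too and calibrate against the actual screen geometry; this only shows the flavor of the math involved.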

We've now got our final goals for the prototype due next Friday set in stone: we want to show the Wiimote paired with a laptop, receiving input from two separate sensor bars at once, with a corresponding image being projected onto a flat wall. After we figure out how to get that going, we've got about three weeks until our next milestone, when we're going to try to get as much of the tech side of things working as possible.

With any luck, we'll have come up with a setup that's functional enough to warrant diving headfirst into programming a kick-ass game demo to take advantage of our new platform, and show it off at ImagineRIT next spring. Till then, we'll just have to see where this project takes us.
