Saturday, January 1, 2011

Final Presentation

Here's the link to our Final Presentation. We didn't get as far as we would have liked, but we're still happy to have fleshed out a conceptually strong idea as much as we did. (Note: the video on the second slide won't play unless you download the .pptx file; sorry for the trouble.)

Thursday, October 21, 2010

Abstract

RIT Student Research and Innovation Symposium, Fall 2010

 

Enhancing Immersion In Interactive Media

 

DeVine, Mike. mod1594@rit.edu. IGM New Media, GCCIS, RIT.

Brown, Nick. theepicwizard@gmail.com. Game Design & Development, GCCIS, RIT.

Albanese, Stephen. stephenalbanese@gmail.com. IGM New Media, GCCIS, RIT.

 

Mentor/Supervisors:

Lundgren, Carl. Professor. calite@rit.edu.

Schull, Jon. Professor. jschull@gmail.com.

 

Video games and movies are traditionally designed to be viewed on a flat screen. But what if the screen encompassed the viewer's entire 360-degree field of view? This could result in a fundamental shift in how we design interactive media with respect to the viewer's experience and interaction, and it could also redefine how we employ motion control technology in interactive media.

 

The ultimate goal of this project is to design a gaming and multimedia platform that takes advantage of a 360-degree display, coupled with a motion control scheme that can function in all directions.

 

As part of this process, the first iteration of this approach will use Google Maps' Street View to simulate navigating streets in real time, in all directions. The system will use multiple sensors and screens to create the illusion of immersive motion.

 

In the future, this platform will be expandable to include other applications such as gaming, real-world simulations, and augmented reality.


Monday I met with Profs. Lundgren and Schull and we hammered out an abstract for the RIT Student Research & Innovation Symposium this fall. You can check it out on Google Docs here.

Tuesday, October 12, 2010

Get it Done

We met this past Friday to figure out exactly how we were going to create ARSpace's 360-degree motion control scheme. We had gotten our hopes up at the prospect of getting our hands on a Kinect before they hit the market in late November; someone in class claimed to have seen one in action at the GDD Lab. Unfortunately, when I emailed Prof. Jacobs asking him to confirm this, I got "not true". So there went that idea.
We decided to go with the Wii Remote as our input device; it's cheap, relatively powerful, and there's already been a ton of tinkering and hacking done to these things, so the technology behind the device is easy to grasp and a lot of the features we'd like to incorporate are well-documented. I knew that a Wiimote Flash API already existed; in fact, I had played around with it before. But I was surprised to learn that there's a C# library available as well. Which is good, because I effing hate Flash.
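
For the curious, here's roughly what reading Wiimote data from C# might look like. This is only a sketch of how a WiimoteLib-style library is typically used, not code we've actually run yet, and the exact class and property names may differ from whatever version we end up with.

using System;
using WiimoteLib; // the C# Wiimote library we found (namespace name assumed)

class WiimoteSketch
{
    static void Main()
    {
        // Pair the Wiimote with the laptop over Bluetooth first (sync button),
        // then connect and ask for accelerometer + IR reports.
        Wiimote wm = new Wiimote();
        wm.WiimoteChanged += OnWiimoteChanged;
        wm.Connect();
        wm.SetReportType(InputReport.IRAccel, true);

        Console.WriteLine("Connected. Press Enter to quit.");
        Console.ReadLine();
        wm.Disconnect();
    }

    static void OnWiimoteChanged(object sender, WiimoteChangedEventArgs e)
    {
        WiimoteState s = e.WiimoteState;

        // Accelerometer values; the exact property path may vary between library versions.
        Console.WriteLine("Accel: {0:F2} {1:F2} {2:F2}",
            s.AccelState.Values.X, s.AccelState.Values.Y, s.AccelState.Values.Z);

        // Button presses come through the same state object.
        if (s.ButtonState.A)
            Console.WriteLine("A pressed");
    }
}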

We also ran some tests to determine how reliably the Wiimote could connect to Nick's laptop's built-in Bluetooth receiver and send it input. After a few sync-button fails we actually got the thing working pretty quickly, and it's a surprisingly simple setup. We just paired the Wiimote with the laptop, and the program we were running to detect device input started receiving data from the accelerometer, the pointer, the gyroscope, and the buttons. What we found most interesting was the option for up to four IR sources to be used with the test program. This basically means we could theoretically use four separate sensor bars in our design, one for each quadrant of the wrap-around display. Even better, we could also opt to rig up our own custom-made LEDs to use in place of Nintendo's official sensor bar, to better blend in with the screens.
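
To make the quadrant idea a little more concrete, here's a quick hypothetical sketch of how "which IR source can the Wiimote see right now" could map to a quadrant of the wrap-around display. None of this is hooked up to real IR data yet; the visibility flags would eventually come from the library's IR tracking state, and the slot numbering is just an assumption for now.

using System;

// Hypothetical mapping from "which sensor bar the Wiimote currently sees"
// to a quadrant of the wrap-around display. Assumes each of the four sensor
// bars (or DIY LED pairs) gets a fixed slot, 0 through 3.
enum Quadrant { Front, Right, Back, Left }

static class QuadrantMapper
{
    // visible[i] is true when IR source i is currently tracked by the Wiimote's camera.
    public static Quadrant? FromVisibleSources(bool[] visible)
    {
        if (visible == null || visible.Length < 4)
            throw new ArgumentException("Expected one flag per sensor bar (4 total).");

        // Naive rule: the first visible source wins. A real version would need to
        // handle two bars being visible at a seam, or none being visible at all.
        for (int i = 0; i < 4; i++)
            if (visible[i])
                return (Quadrant)i;

        return null; // not pointing at any screen right now
    }
}

So, for example, seeing only source 1 (QuadrantMapper.FromVisibleSources(new[] { false, true, false, false })) would put the player facing the right-hand screen. The interesting problems start at the seams, where two bars can be visible at once.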

We also did some more research into the options available to us for warping the image from each projector to match the curvature of the screens. I've looked at software and hardware solutions, and several look promising, if somewhat cost-prohibitive. In the meantime, we'll see if we can't MacGyver ourselves a homemade solution instead.
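
If we do go the homemade route, the core of the problem is just a coordinate remap. Here's a toy version for a cylindrical screen section; the geometry is idealized (projector assumed to sit at the center of the screen's curvature) and the angles are made-up numbers, not measurements of our actual setup.

using System;

// Toy pre-warp for projecting a flat image onto a section of a cylindrical screen.
// With the projector at the center of the curvature, pixels near the edges of the
// frame cover less arc than pixels at the center, so a regular image looks squeezed
// toward the edges; we compensate by choosing where in the flat source image each
// projector pixel samples from.
static class CylinderPreWarp
{
    // u: normalized horizontal position in the projector frame, in [-1, 1]
    //    (0 = center of the projected image, +/-1 = edges).
    // halfAngle: half the horizontal angle this projector covers on the cylinder, in radians.
    // Returns the normalized horizontal position to sample from the flat source image.
    public static double WarpU(double u, double halfAngle)
    {
        // The projector spaces its pixels by tan(angle) on a flat image plane,
        // but on the cylinder equal angles mean equal arc lengths.
        // So: flat position -> ray angle -> uniform position along the arc.
        double angle = Math.Atan(u * Math.Tan(halfAngle));
        return angle / halfAngle;
    }

    static void Main()
    {
        double halfAngle = 45.0 * Math.PI / 180.0; // assume each projector covers 90 degrees
        for (double u = -1.0; u <= 1.0001; u += 0.25)
            Console.WriteLine("projector u = {0,5:F2}  ->  source u = {1,5:F2}", u, WarpU(u, halfAngle));
    }
}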

We've now got our final goals for the prototype due next Friday set in stone: we want to show the Wiimote being used with a laptop, receiving input from two separate sensor bars that the Wiimote can track, with a corresponding image being projected onto a flat wall. After we figure out how to get that going, we've got about three weeks until our next milestone, when we're going to try to get as much of the tech side of things working as possible.

With any luck, we'll have come up with a setup that's functional enough to warrant diving headfirst into programming a kick-ass game demo to take advantage of our new platform, and show it off at ImagineRIT next spring. Till then, we'll just have to see where this project takes us.

Sunday, October 10, 2010

Getting Off The Ground

This is the first of what will hopefully be many, many posts chronicling the ARSpace team's progress as we work toward what we hope will be a step forward in interactive game design. If we're not too lazy, we'll be updating this site at least twice weekly.

Check back often for more updates!