CocAR is a cooperative racing game. The first player drives in virtual reality while a second player is tasked with helping the driver to the finish line by altering the virtual world in real time. Players can take part in the gameplay from three different perspectives, detailed in the following sections: Driver, Builder and Spectator. Each perspective makes use of different hardware.
The game was created as a six-week student project in Advanced Graphics and Interaction at the Royal Institute of Technology (KTH), Stockholm, Sweden. The project incorporates the Oculus Rift HMD, a Thrustmaster steering wheel, Microsoft PixelSense, 3D printing and Vuforia AR.
Some of the features of CocAR are:
Watch our concept video below!
Imagine, as a kid, being able to have a genuinely convincing driving experience. Throughout the autumn of 2016, at a number of events, this became reality, and the immersive experience of sitting in the driver's seat playing CocAR proved a success.
One of the biggest factors in CocAR's success is the Thrustmaster. Driving a car in VR demands as much credibility as possible, and a physical steering wheel and pedals provide exactly that. We also used the Oculus Rift Development Kit 2 (DK2) throughout the project, and it worked better than expected.
The driver is assisted by a world builder who alters the virtual environment in real time. This is done by placing 3D-printed road models on a touch table that utilizes Microsoft's PixelSense technology. The table recognizes the type of model placed, as well as its position and orientation, through high-contrast fiducials called byte tags, which are attached to the bottom of each model. A corresponding road module then manifests in the virtual world. Real-world models are thus brought into the virtual environment, a form of augmented virtuality.
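The table-to-world flow above can be sketched as follows. This is an illustrative Python sketch, not the project's actual Unity code: the names (TagEvent, TAG_TO_MODULE, place_module) and the tag-ID values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class TagEvent:
    tag_id: int   # byte-tag value read by the table (IDs here are made up)
    x: float      # model position on the table, normalized 0..1
    y: float
    angle: float  # model orientation in degrees

# Hypothetical mapping from byte-tag IDs to road module types
TAG_TO_MODULE = {
    0x01: "straight",
    0x02: "curve",
    0x03: "intersection",
}

def place_module(event: TagEvent, world_size: float = 100.0):
    """Translate a table-space tag reading into a world-space road module."""
    module = TAG_TO_MODULE.get(event.tag_id, "unknown")
    return {
        "type": module,
        "pos": (event.x * world_size, event.y * world_size),
        "rotation": event.angle,
    }
```

Each recognized tag thus carries everything needed to spawn and orient the matching road module in the virtual world.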
The builder is also tasked with guiding the driver to the finish line. This means identifying and constructing a suitable route as well as giving directions to the driver. Since this can be a lot for one person to take on, the role of builder is typically shared among several users working in tandem on the PixelSense.
The third and final perspective allows passive spectating of the game through an augmented reality Android app. The app uses the phone's camera to track a high-contrast marker, onto which the virtual world is then overlaid. This allows the spectator to move around and get unique camera angles on the racing action.
The tracker can be printed on a piece of paper, but the builder's perspective is in itself a tracker. This means the spectator can point the camera at the PixelSense table and get the sensation that the world pops out of the table.
September 30 2016
November 4-6 2016
10:00 - 19:00 every day
Visualization Studio VIC, KTH
December 16 2016
15:00 - 19:00
The purpose of this project was to extend our knowledge and practical experience within computer graphics and interaction. We also strove to create a fun game and gave the gameplay quite a bit of thought, even though this was out of scope for the course. The idea of a car game arose as a solution for how to move the player in VR while sitting down without breaking immersion. We were set on creating a multiplayer experience, and a lack of hardware meant that it had to be asymmetrical. We chose the PixelSense for its physical interaction capabilities, and the rest of the project grew from there.
Most of the project group were entirely new to Unity, and those with experience had not worked collaboratively in Unity before. Using Git for version control in a Unity project turned out to be rather problematic, due to large binary files that could not be merged. We eventually overcame this problem by turning on plain-text meta files and text-based asset serialization in Unity.
We quickly realised that cyber sickness was a very real problem, and it affected many design choices in the game. First, we made the world completely flat in order to avoid inclines that we could not reproduce in reality. Second, we locked the roll of the camera for the same reason. Third, we chose a simplistic graphical style rather than a realistic one.
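Locking the camera roll can be done by rebuilding the camera's axes from its forward direction and the world's up vector, so any roll in the tracked orientation is discarded. The following is a minimal Python sketch of that idea, under our own naming; the project itself runs in Unity, so this is illustrative rather than the actual implementation:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def roll_free_basis(forward, world_up=(0.0, 1.0, 0.0)):
    """Rebuild camera right/up/forward axes from the forward vector and
    world up, eliminating any roll present in the tracked orientation."""
    f = normalize(forward)
    right = normalize(cross(world_up, f))
    up = cross(f, right)
    return right, up, f
```

Because the up axis is always derived from world up, the horizon stays level no matter how the tracked orientation rolls.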
The PixelSense table is rather old and is of course not powerful enough to drive the VR experience in addition to the builder's perspective. This meant that we needed a separate computer to run the Oculus Rift and to synchronize the gameplay over the network. When we later added the AR spectating app, this meant a third client. All clients run the same code on very different hardware. Simply getting this rather complex setup to work was a big challenge, and one beyond the scope of the course.
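One simple way to keep heterogeneous clients in sync is to serialize an authoritative game state and broadcast it to every client, which then applies it locally. The sketch below is a hedged illustration in Python using JSON; it is not the project's actual protocol, and the message layout and names are assumptions:

```python
import json

def encode_state(car_pos, car_heading, modules):
    """Serialize the shared game state into a message that each client
    (VR driver, PixelSense builder, AR spectator) can apply locally."""
    return json.dumps({
        "car": {"pos": list(car_pos), "heading": car_heading},
        # e.g. [{"type": "curve", "pos": [x, y], "rot": degrees}, ...]
        "modules": modules,
    })

def decode_state(message):
    """Parse a received state message back into plain Python data."""
    return json.loads(message)
```

Keeping all clients on the same code base, as the project did, means the same encode/decode path runs everywhere regardless of hardware.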
Developing the AR spectating app went fine up until the point when it was to be built for an Android device, a Sony Xperia Z5. The GPU of this particular device, the Adreno 430, has problems with Unity's standard shader. Finding a shader that worked on the device and identifying suitable build settings took a lot of time away from developing the AR experience further.
The project is the result of a joint effort by the group, but some features are the result of a single individual's hard work.