Revolution Events: Innovative Mini-Games With Projection And Depth Cameras

The Future Of Event Entertainment With Projection Technology

Computer vision | OpenCV | Projection mapping

The pandemic forced us to rethink parts of our business. For our part, the use of VR headsets at events looked increasingly compromised, so I started exploring alternatives involving less contact with hardware. In this R&D project, VR headsets are replaced by video projection, with depth cameras handling the interaction. The aim is to test the mechanics of 1-to-2-minute mini-games that a large number of competing players can play in quick succession, with highly visual animations for trade fairs, team building and event entertainment. The first step was to build a framework for rapidly developing mini-games with these technologies. Thanks to this framework, adapting a new mechanic or applying a custom theme for an event takes just 2 to 4 weeks. Here are the first two games we developed on top of it:

Shoot the Chook: Chickens have invaded the village. The player has to get rid of as many as possible in the allotted time by hitting them with a foam ball.

Fusion: Two players battle it out in a cyberpunk atmosphere on an endless runner. By moving their bodies, they each control a bar on their side of the screen that lets them collect items and earn points, while avoiding the items that would cost them points.

Scores

There’s nothing like a score to give players a challenge. Competition builds as each player tries to climb the leaderboard and claim the best spot in the ranking. The leaderboard is displayed between games (as sketched below), and scores update live during play, so players always know where they stand.
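As a rough sketch of that loop (the names and structure here are our illustration, not the project's actual code), the leaderboard logic can be as simple as a list of scores that is updated live during a round and rendered between games:

```python
from dataclasses import dataclass, field

@dataclass
class Leaderboard:
    top_n: int = 10
    entries: list = field(default_factory=list)  # (score, player) pairs

    def update(self, player: str, score: int) -> None:
        # Called during the game whenever a player's score changes.
        self.entries = [(s, p) for s, p in self.entries if p != player]
        self.entries.append((score, player))
        self.entries.sort(reverse=True)

    def render(self) -> str:
        # Displayed between games: rank, player and score.
        return "\n".join(f"{i + 1}. {p} - {s}"
                         for i, (s, p) in enumerate(self.entries[:self.top_n]))
```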

Tracking & Calibration

The first game uses a ball thrown at the wall to interact with the game. We use a combined depth and colour camera, the Azure Kinect, to detect the ball. Calibrating the Kinect to the projector is the biggest difficulty: it involves several changes of coordinate space, from the depth camera to the colour camera and then to the projector. The Kinect SDK already provides the transformation matrices between the colour and depth cameras, so all that remains is to calibrate the projector. To solve this, when the game starts, a chessboard with ChArUco markers is projected full screen; the Kinect colour camera sees this board and detects the positions of the markers.
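As a sketch of what this detection step can look like in OpenCV (Python, using the ChArUco API from OpenCV 4.7+), here is a minimal example; the board dimensions, dictionary and file names are illustrative assumptions, not the values from the actual project:

```python
import cv2

# Build a ChArUco board; dictionary and dimensions are illustrative choices.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
board = cv2.aruco.CharucoBoard((7, 5), 0.12, 0.09, dictionary)

# Render the board to an image that the game projects full screen at startup.
board_img = board.generateImage((1920, 1080), marginSize=40)
cv2.imwrite("board_to_project.png", board_img)

# Grab a frame from the Kinect colour camera while the board is on the wall
# (placeholder: a saved frame stands in for a live capture).
colour_frame = cv2.imread("kinect_colour_frame.png")

# Detect the board's inner corners and their IDs in the camera view.
detector = cv2.aruco.CharucoDetector(board)
charuco_corners, charuco_ids, marker_corners, marker_ids = detector.detectBoard(colour_frame)
```

Each detected corner ID maps to a known pixel position in the projected image, which provides the 2D side of the 3D-2D correspondences used in the next step.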

From there, we use OpenCV to compute the space-transformation matrix from the 3D-2D point correspondences (a PnP solver). Finally, the computed matrices transform the 3D position of the ball, expressed relative to the depth camera, into on-screen pixel positions. The result is a fully automatic calibration, wherever the Kinect is placed relative to the projector, as long as it can see part of the projected image.
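A minimal sketch of that calculation, assuming the projector intrinsics K are known from a prior calibration; the correspondences are synthesised here so the example runs as-is, whereas in the real pipeline the 3D points come from deprojecting the detected corners through the Kinect depth map, and the 2D points are their known pixel positions in the projected board image:

```python
import cv2
import numpy as np

# Hypothetical projector intrinsics (focal lengths and principal point, in pixels).
K = np.array([[1900.0,    0.0, 960.0],
              [   0.0, 1900.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # assume negligible projector lens distortion

# Example corner positions on the wall plane, in depth-camera space (metres).
object_points = np.array([[-0.6, -0.4, 2.0], [0.0, -0.4, 2.0], [0.6, -0.4, 2.0],
                          [-0.6,  0.4, 2.0], [0.0,  0.4, 2.0], [0.6,  0.4, 2.0]],
                         np.float32)
# Matching projector-pixel positions; synthesised so the sketch is self-contained.
image_points, _ = cv2.projectPoints(object_points, np.zeros(3), np.zeros(3), K, dist)
image_points = image_points.astype(np.float32)

# Solve for the rigid transform from depth-camera space to projector space.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)

def ball_to_screen(ball_3d):
    """Map a 3D ball position (depth-camera space) to projector pixels."""
    px, _ = cv2.projectPoints(ball_3d.reshape(1, 1, 3), rvec, tvec, K, dist)
    return px.reshape(2)

print(ball_to_screen(np.array([0.1, 0.0, 2.5], np.float32)))
```

Because rvec and tvec are recovered purely from what the colour camera sees of the projected board, the calibration re-runs itself automatically whenever the setup is moved.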

That’s it for our latest experiments! If you’re interested in this kind of project or event entertainment, these games are ready to use, and we can customise them to your taste or your brand, or even adapt the interactions and projection setup.

Do you have a project, or need more information?

Don't hesitate to contact us!

   contact@torrusvr.com

   Bordeaux - Paris (FRANCE)
