MetaVR’s Virtual Reality Scene Generator fulfils image generation requirements for the US Army’s Synthetic Training Environment: immersive virtual reality, augmented reality, sensor, and conventional out-the-window capabilities

MetaVR’s Virtual Reality Scene Generator (VRSG) is providing image generation for the Synthetic Training Environment (STE) Reconfigurable Virtual Cockpit Trainers (RVCT), which have been delivered to the US Army.

The MetaVR VRSG-based simulators include multiple versions of mixed-reality rotorcraft reconfigurable cockpit systems, as well as six virtual reality door gunner trainers. VRSG has been provided as part of an Other Transaction Authority (OTA 1) programme.

Twenty-five VRSG licences were purchased for the RVCTs, which have been delivered to US Army PEO-STRI in Orlando by prime contractor Bugeye Technologies, where they are being evaluated as part of the Army’s STE programme. VRSG fulfils the image generation requirements for immersive virtual reality, augmented reality, sensor, and conventional out-the-window capabilities for both the side door gunnery trainer and the Apache sensor software.

Mixed-reality approach allows trainees to benefit from muscle-memory training

Integrated by ZedaSoft, the side door gunnery trainer takes advantage of VRSG’s built-in support for the HTC VIVE Pro Head Mounted Display (HMD), providing high-resolution stereo rendering at 90 frames per second. VRSG also supports the SA Photonics SA-92 augmented reality HMD, and can render a cockpit mask model, which allows trainees to see through the HMD into a physical cockpit model. Pixels not affected by the cockpit mask model are filled in with imagery from the virtual scene.
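The per-pixel idea behind the cockpit mask can be sketched as follows. This is illustrative only: VRSG's actual rendering is a GPU pipeline, and the function name and frame representation here are hypothetical, chosen to show the compositing rule described above (mask-covered pixels keep the see-through cockpit view; all other pixels are filled with the virtual scene).

```python
def composite_frame(camera_px, virtual_px, cockpit_mask):
    """Combine see-through camera pixels with the rendered virtual scene.

    cockpit_mask is True where the cockpit mask model covers a pixel
    (keep the see-through view of the physical cockpit); every other
    pixel is filled in with imagery from the virtual scene.
    """
    return [
        [cam if masked else virt
         for cam, virt, masked in zip(cam_row, virt_row, mask_row)]
        for cam_row, virt_row, mask_row in zip(camera_px, virtual_px, cockpit_mask)
    ]

# Tiny 2x2 grayscale demonstration: the cockpit covers the top-left pixel.
camera = [[200, 200], [200, 200]]       # physical cockpit (see-through) view
virtual = [[50, 50], [50, 50]]          # rendered virtual scene
mask = [[True, False], [False, False]]  # True = cockpit mask covers pixel
frame = composite_frame(camera, virtual, mask)
# frame == [[200, 50], [50, 50]]
```

In a real head-mounted display this selection happens per eye at 90 frames per second; the sketch only captures the selection rule, not the stereo rendering or lens distortion correction.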

Garth Smith, President of MetaVR, said: “This mixed-reality approach allows a trainee to benefit from the muscle memory training associated with interacting with actual physical cockpit controls. This approach offers the trainee a significant advantage, as they are training on replicas of the actual operational hardware augmented by a seamless see-through to the virtual reality scene, in order to heighten learning and maximize their training value.”

MetaVR provided ZedaSoft with a VRSG plugin capable of decoding streaming H.264 video from a sensor channel, which is then rendered as an overlay on the out-the-window video in the HMD. This configuration satisfies the sensor video requirements of the actual Apache Integrated Helmet and Display Sight System (IHADSS) HMD. The sensor video displayed in the HMD can be provided by any H.264 stream on the network, such as an image generator channel simulating a sensor attached to the airframe of a remote aircraft or UAS.
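The overlay step, once the H.264 stream has been decoded into frames, amounts to pasting the sensor image into a region of the out-the-window view. A minimal sketch, assuming both frames are simple row-major pixel arrays and the decoded sensor frame fits inside the OTW frame (the function name and placement parameters are hypothetical, not part of the VRSG plugin API):

```python
def overlay_sensor_video(otw_frame, sensor_frame, x, y):
    """Paste a decoded sensor frame onto the out-the-window (OTW) frame.

    Both frames are row-major lists of pixel rows. The sensor frame is
    assumed to be already decoded (e.g. from an H.264 network stream)
    and to fit entirely within the OTW frame at offset (x, y).
    """
    for row_offset, sensor_row in enumerate(sensor_frame):
        otw_row = otw_frame[y + row_offset]
        # Overwrite the slice of the OTW row covered by the sensor video.
        otw_row[x:x + len(sensor_row)] = sensor_row
    return otw_frame

# 4x4 OTW frame of zeros; paste a 2x2 sensor patch of ones at (1, 1).
otw = [[0] * 4 for _ in range(4)]
sensor = [[1, 1], [1, 1]]
overlay_sensor_video(otw, sensor, x=1, y=1)
# otw == [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
```

In the actual system the decoded stream can originate from any H.264 source on the network, which is what lets the same display path serve either a simulated airframe sensor or a remote UAS feed.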

The programme also includes technology supplied by Bihrle Applied Research (Apache aircraft systems model), RT-Dynamics (Blackhawk aircraft systems model), PLEXSYS Interface Products (PLEXComm simulated radio and intercom software), SA-Photonics (SA-92S HMD), and Acme Worldwide Enterprises (M240D Gun System Simulator).
