Recent

2022.01.03 Our new CAVE-type display, an ActiveCube by Virtalis, is now fully installed and working in the Appenzeller Visualization Laboratory.

 

Software Configuration and Description: Vrui

All desktop systems in the teaching laboratory are fully equipped for developing Virtual Environment applications. This includes 3D-capable display systems, in the form of 55-inch plasma TVs or a 65-inch LCD TV, as well as graphics cards that support quad-buffered stereo rendering directly in hardware. Combined with gaming devices for input and different types of tracking systems, these setups provide a well-suited testbed for developing such software applications.

The optical tracking systems available in this laboratory are NaturalPoint OptiTrack and Microsoft's Kinect. Since the Vrui environment runs only on Linux while the tracking software is available only for Windows, all desktop systems support a cluster-type setup in which the main computer runs Vrui on Linux and the tracking software runs on a separate Windows computer. The tracking data is then transferred from the Windows computer to the Vrui environment via the VRPN protocol.
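
No VRPN client code needs to be written by hand in this setup; Vrui's VRDeviceDaemon consumes the stream directly. Purely as an illustration of what travels over that connection, the sketch below uses the standard VRPN client API to print incoming tracker poses. The device name Tracker0 and the host name tracking-pc are placeholders, not the lab's actual configuration.

```cpp
// Minimal VRPN client sketch (illustrative only): prints the pose updates
// that the Windows tracking software streams out. In the lab setup this
// stream is consumed by Vrui's VRDeviceDaemon, not by application code.
// "Tracker0" and "tracking-pc" are placeholder names.
#include <cstdio>
#include <vrpn_Tracker.h>
#include <vrpn_Shared.h>

void VRPN_CALLBACK handleTracker(void*, const vrpn_TRACKERCB t)
{
    // t.pos holds the position and t.quat the orientation quaternion
    // of one tracked sensor (rigid body or skeleton joint).
    std::printf("sensor %d: pos=(%.3f, %.3f, %.3f)\n",
                (int)t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    vrpn_Tracker_Remote tracker("Tracker0@tracking-pc");
    tracker.register_change_handler(0, handleTracker);
    while(true)
    {
        tracker.mainloop();    // poll the connection and dispatch callbacks
        vrpn_SleepMsecs(1.0);  // avoid busy-waiting
    }
    return 0;
}
```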

This networked environment requires a few steps to start the Virtual Environment software. To start the tracking software, you first connect to the tracking computer running Windows via the remote desktop protocol. A bash script called tracking is available that, when executed, identifies the computer it was started on and automatically connects to the associated tracking computer. Once the remote desktop connection is established, you need to start the tracking software. For Kinect-based tracking, double-click the desktop icon for the FAAST tracking software and click the connect button to start up the Kinect; you will eventually see a depth image of what the Kinect sensor detects. For the camera-based OptiTrack system, double-click the TrackingTools icon on the desktop. This starts the software, connects to the cameras, and enables VRPN streaming automatically.

Once the tracking software is running, you need to start the VRDeviceDaemon on the Linux computer. After that, you can immediately start the Vrui-based software, which typically starts in full-screen mode and opens automatically on the large TV. If you want it to open on the monitor that supports the passive stereo glasses, use the command-line parameter -rootSection Stereo.

You can base your Vrui software on a basic framework that incorporates OpenSceneGraph and Delta3D to support a variety of file formats for loading models, including COLLADA and OpenInventor. This framework is based on Vrui version 4.2. Compiled versions (including source) of the framework's dependencies, such as OpenSceneGraph and OpenCollada, can be found here as well. Using this framework with Delta3D requires a modified application class; the package contains a library that includes this modification.
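
To give an idea of the structure such programs share, here is a minimal, generic skeleton written against the plain Vrui 4.x Application interface. It is not the framework's modified application class, and the class name MyViewer is made up for illustration.

```cpp
// Generic Vrui application skeleton (not the lab framework's modified class).
#include <GL/gl.h>
#include <GL/GLContextData.h>
#include <Vrui/Application.h>

class MyViewer : public Vrui::Application
{
public:
    MyViewer(int& argc, char**& argv)
        : Vrui::Application(argc, argv)
    {
        // Load models here, e.g. via the scene graph's file readers.
    }

    virtual void frame(void)
    {
        // Per-frame updates (animation, simulation state) go here.
    }

    virtual void display(GLContextData& contextData) const
    {
        // Vrui calls this once per window/eye; draw the scene with OpenGL
        // or hand off to the OpenSceneGraph/Delta3D scene graph here.
        glBegin(GL_LINES);
        glVertex3f(0.0f, 0.0f, 0.0f);
        glVertex3f(1.0f, 0.0f, 0.0f);
        glEnd();
    }
};

int main(int argc, char* argv[])
{
    MyViewer app(argc, argv);
    app.run();
    return 0;
}
```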

Similarly, you can use the framework based on Vrui and VTK for a variety of visualization applications. The framework supports most geometry-based VTK rendering capabilities as well as volume rendering. It has moved to GitHub to make it easier to support and maintain. This version also includes support for touch-enabled devices, such as touchpads and touchscreens. The GitHub repository includes additional instructions for using the framework.
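
For reference, "geometry-based VTK rendering" refers to standard VTK pipelines of the kind sketched below (source, mapper, actor). The helper function buildDemoActor is a made-up name and not part of the framework's API; how actors are handed to the framework is described in the GitHub repository.

```cpp
// Standard VTK geometry pipeline (source -> mapper -> actor); a Vrui/VTK
// framework renders actors like this inside the virtual environment.
// buildDemoActor() is a made-up helper name, not part of the framework's API.
#include <vtkActor.h>
#include <vtkPolyDataMapper.h>
#include <vtkSmartPointer.h>
#include <vtkSphereSource.h>

vtkSmartPointer<vtkActor> buildDemoActor(void)
{
    // Generate a sphere as example geometry.
    vtkSmartPointer<vtkSphereSource> sphere =
        vtkSmartPointer<vtkSphereSource>::New();
    sphere->SetRadius(1.0);
    sphere->SetThetaResolution(32);
    sphere->SetPhiResolution(32);

    // Map the polygonal data to renderable primitives.
    vtkSmartPointer<vtkPolyDataMapper> mapper =
        vtkSmartPointer<vtkPolyDataMapper>::New();
    mapper->SetInputConnection(sphere->GetOutputPort());

    // The actor bundles geometry and appearance for rendering.
    vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
    actor->SetMapper(mapper);
    return actor;
}
```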

The Vrui environment requires additional configuration, which has already been taken care of for you. Specifically, there are two configuration files: VRDevices.cfg for the VRDeviceDaemon and Vrui.cfg for the Vrui environment itself. These configuration files define the devices used by the Vrui environment and how they are used.

By default, the Vrui environment uses the Logitech gamepad attached to each computer to control view settings. The joysticks on the gamepad can be used for navigation; for example, the left joystick moves forward or backward and turns left and right. When using the camera-based tracking system, these motion controls depend on the gamepad's position and orientation, e.g. the forward direction is defined by the direction in which the gamepad points. Since the Kinect cannot provide this directional information reliably, the forward direction is always orthogonal to the display when using Kinect-based tracking. In addition, the button labeled 2 (or the green button labeled B on the newer model) can be used to grab the entire model, so that it moves and rotates in accordance with the user's movement of the gamepad.
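
The following sketch only illustrates the difference between the two forward-direction conventions described above, using plain C++ and placeholder vectors; it is not Vrui code, and the axis choices are arbitrary assumptions.

```cpp
// Illustration of the two forward-direction conventions described above
// (plain C++, not Vrui code): with OptiTrack tracking, "forward" follows the
// gamepad's orientation; with Kinect tracking, it is a fixed display normal.
#include <array>

using Vec3 = std::array<float, 3>;
using Quat = std::array<float, 4>; // (x, y, z, w), assumed unit length

// Rotate a vector by a unit quaternion: v' = v + w*t + cross(q.xyz, t),
// where t = 2 * cross(q.xyz, v).
Vec3 rotate(const Quat& q, const Vec3& v)
{
    float x = q[0], y = q[1], z = q[2], w = q[3];
    Vec3 t = {2.0f * (y * v[2] - z * v[1]),
              2.0f * (z * v[0] - x * v[2]),
              2.0f * (x * v[1] - y * v[0])};
    return {v[0] + w * t[0] + (y * t[2] - z * t[1]),
            v[1] + w * t[1] + (z * t[0] - x * t[2]),
            v[2] + w * t[2] + (x * t[1] - y * t[0])};
}

Vec3 forwardDirection(bool opticalTracking, const Quat& gamepadOrientation)
{
    const Vec3 deviceForward = {0.0f, 1.0f, 0.0f}; // reference "forward" axis (assumed)
    const Vec3 displayNormal = {0.0f, 1.0f, 0.0f}; // fixed, orthogonal to the TV (assumed)
    if(opticalTracking)
        return rotate(gamepadOrientation, deviceForward); // follows the gamepad
    else
        return displayNormal; // Kinect: constant, regardless of gamepad orientation
}
```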