2017.03.17 The AViDA lab and Wright State are hosting the workshop HPC Enabled Data Analytics for DoD Materials and Biological Sciences organized by WPAFB.
2017.03.04 Thomas Wischgoll is chairing the Visualization and Data Analysis conference again. The call for papers will come out soon. You can find an archive of past conferences via vda.cs.wright.edu.
2017.01.12 Dayton is ranked 17 among the best cities for STEM jobs according to WalletHub.
2017.01.09 We added writing resources to the resources page.
2016.10.27 The director of the AViDA group, Thomas Wischgoll, is co-chairing the Visualization Contest for next year's IEEE VIS conference.
2016.05.18 The director of the AViDA group, Thomas Wischgoll, is chairing the Visualization and Data Analysis conference next year together with David Kao and Song Zhang. The deadline for submissions is July 25, and papers can be submitted here.
2015.11.17 The AViDA lab joined the Data Science and Security Cluster (DSSC).
2015.09.23 Wright State will host the first presidential debate next year.
2015.04.16 The department took delivery of a 21.16 TFlop/s high-performance computer with 2048 cores, which ties directly into the capabilities of the Appenzeller Visualization Laboratory. It was ranked 310th on the Top500 list of supercomputers.
2015.03.16 Dayton is also ranked among the best metro areas for STEM professionals (ranked 16th) according to WalletHub.
2015.02.26 Dayton is among the top 10 cities in America for engineers according to Forbes. See here for more information.
2015.02.25 VTK now supports rendering into external OpenGL contexts with its upcoming release, version 6.2. Further details can be found in this blog post. This eliminates the need for the multipass rendering hack previously required to integrate VTK with Vrui.
2014.04.12 The Appenzeller Visualization Laboratory keeps getting upgraded; this time touch capabilities were added to make some of the displays even more interactive.
Software Configuration and Description
Some computers available in the research and teaching laboratories use dual-boot setups that allow them to boot into either Windows 7 or Linux (Mageia 2015), whereas others have only Linux installed.
The Linux installation ties into a large-scale server-client infrastructure with over 10 TB of storage space. It provides common compilers and language environments, including the GNU compilers for C, C++, Fortran, and Java, as well as Python. A wide range of software-development libraries is available, such as OpenGL, Boost, the Visualization Toolkit (VTK), and the Insight Toolkit (ITK). Additional graphics-related packages are available as well, such as the Delta3D gaming engine and the Vrui VR toolkit.
All desktop systems in the teaching laboratory are fully equipped for developing Virtual Environment applications. This includes 3D-capable display systems, in the form of 55-inch plasma TVs or a 65-inch LCD TV, as well as graphics cards that support quad-buffered stereo rendering directly in hardware. Combined with gaming devices for input and different types of tracking systems, these setups provide an ideal testbed for developing such software applications.
The optical tracking systems available in this laboratory are the NaturalPoint OptiTrack and Microsoft's Kinect. Since the Vrui environment runs only on Linux while the tracking software is available only for Windows, all desktop systems support a cluster-type setup in which the main computer runs Vrui under Linux and the tracking software runs on a separate Windows computer. The tracking data is then transferred from the Windows computer to the Vrui environment via the VRPN protocol.
This networked environment requires a few steps to start the Virtual Environment software. To start the tracking software, you need to connect to the Windows tracking computer via the remote desktop protocol. A bash script called tracking is available that, when executed, identifies the computer it was started on and automatically connects to the associated tracking computer. Once the remote desktop connection is established, you need to start the tracking software. For Kinect-based tracking, double-click the desktop icon for the FAAST tracking software and click the connect button to start up the Kinect; you will eventually see a depth image of what the Kinect sensor detects. For the camera-based OptiTrack system, double-click the TrackingTools icon on the desktop. This starts the software, connects to the cameras, and enables VRPN streaming automatically.
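To illustrate the first step, a helper script like tracking could be structured roughly as below. This is a sketch, not the lab's actual script: the hostnames and the use of the rdesktop client are illustrative assumptions.

```shell
#!/bin/sh
# Sketch of a "tracking"-style helper script: map the workstation's
# hostname to its paired Windows tracking computer, then connect via RDP.
# Hostnames and the rdesktop client are illustrative assumptions.

lookup_tracker() {
    # Hypothetical hostname mapping; the real script would use the
    # lab's actual workstation/tracking-computer pairs.
    case "$1" in
        avida-01) echo "track-01" ;;
        avida-02) echo "track-02" ;;
        *)        echo "unknown" ;;
    esac
}

tracker=$(lookup_tracker avida-01)
echo "would connect to: $tracker"
# rdesktop -f "$tracker"   # the actual remote-desktop connection step
```

In the real script, the hostname would come from hostname -s rather than being hard-coded, and the connection line would run unconditionally.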
Once the tracking software is running, you need to start the VRDeviceDaemon on the Linux computer. After that, you can immediately start the Vrui-based software, which typically starts in full-screen mode and opens automatically on the large TV. If you want it to open on the monitor that supports the passive stereo glasses, use the command-line parameter -rootSection Stereo.
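The launch order above can be sketched as a small shell fragment. VRDeviceDaemon and the -rootSection Stereo flag come from the description above; the application name MyVruiApp is a placeholder, and the helper function merely selects the extra display arguments.

```shell
#!/bin/sh
# Sketch of the Vrui launch sequence; MyVruiApp is a hypothetical
# application name used only for illustration.

vrui_display_args() {
    # "stereo": render on the monitor supporting the passive stereo glasses.
    # Anything else: default full-screen output on the large TV (no args).
    case "$1" in
        stereo) echo "-rootSection Stereo" ;;
        *)      echo "" ;;
    esac
}

# Typical order: start the device daemon first, then the application.
# VRDeviceDaemon &
# ./MyVruiApp $(vrui_display_args stereo)
echo "./MyVruiApp $(vrui_display_args stereo)"
```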
You can base your Vrui software on a basic framework, which incorporates OpenSceneGraph and Delta3D to support various file formats for loading models, including OpenCollada and OpenInventor. This framework is based on Vrui version 2.2. Compiled versions (including source) of dependencies, such as OpenSceneGraph and OpenCollada, can be found here as well. Using this framework with Delta3D requires a modified application class; this package contains a library that includes the modification.
Similarly, you can use the framework based on Vrui and VTK for various visualization applications. The framework supports most geometry-based VTK rendering capabilities as well as volume rendering.
The Vrui environment requires additional configuration, which is already taken care of for you. Specifically, there are two configuration files: VRDevices.cfg for the VRDeviceDaemon and Vrui.cfg for the Vrui environment itself. These files define the devices used by the Vrui environment and how they are used. By default, the Vrui environment uses the Logitech gamepad attached to each computer to control view settings. The joysticks on the gamepad can be used for navigation; for example, the left joystick moves forward or backward and turns left and right. When using the camera-based tracking system, these motion controls depend on the gamepad's position and orientation, e.g., the forward direction is defined by the direction the gamepad points in. Since the Kinect cannot provide this directional information reliably, the forward direction is always orthogonal to the display when using Kinect-based tracking. In addition, the button labeled 2 (or the green button labeled B on the newer model) can be used to grab the entire model so that it moves and rotates in accordance with the user's movement of the gamepad.
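As an illustration of the VRDevices.cfg side of this setup, a VRPN client device section might look roughly like the fragment below. The section/endsection layout follows Vrui's configuration-file syntax, but the device name, server address, port, and sender names are illustrative assumptions; exact setting names vary between Vrui versions, so consult the VRDevices.cfg installed on the lab machines for the actual values.

```
section DeviceManager
    deviceNames (VRPNClient1)

    section VRPNClient1
        deviceType VRPNClient
        # Address of the Windows tracking computer streaming via VRPN
        # (hostname and port below are placeholders)
        serverName track-01.example.edu
        serverPort 3883
        senderNames (Tracker0)
    endsection
endsection
```

The corresponding Vrui.cfg would then reference the daemon as an input device adapter so that applications receive the streamed tracker and gamepad data.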