At a recent visual analytics event this author attended, a number of advanced prototypes were unveiled in the fields of data mining, visualization, ontology, and the interrelationships between structured and unstructured, spatial and non-spatial information. The overall intent is to develop systems that detect the “signal in the noise” associated with national and homeland security. Clearly, as was evidenced by university, government, and private-enterprise participants, GIS systems have become core components of these applications, expressing the spatial component of information within the overall visualization applications. Global projections, including Google Earth, NASA’s World Wind, ESRI’s ArcExplorer client, and custom applications, were prevalent as well.

However, of all the applications this author observed and interacted with, the most impressive involved the University of Washington’s exploration into the convergence of GIS and VR (Virtual Reality). Seventeen years ago, the concept of utilizing a spatially enabled database as a data engine for VR constructs was a significant idea, but the computing challenges and VR HMD (Head-Mounted Display) requirements were daunting. Entire rooms of computers were required to generate, sustain, and allow for successful interaction within a Virtual World (VW). Yet sitting at a simple table with one computer, a modified HMD, and the determination to achieve GIS integration with VR, the UW researchers demonstrated the ability to be immersed in the spatial information. In one instance, it was possible to engage with a high-resolution satellite image of a major urban area, visualize identified threat features, and experience near-real-time sensor messaging and other components through the VR interface. This is achieved through a novel application of VR HMD tracking and the projection and calibration of the geospatial scene onto a small whiteboard held by the user.
The whiteboard has predefined focal points that tie the spatial image projection into a coordinate space for VR perception. This is no small achievement. Moving beyond plasma screens, touch-tables, and stereoscopic screens and glasses for 3D, the GIS-VR integration developed by the University of Washington team, in association with other partners, represents a next-generation level of geospatial experience. You, the observer, are literally placed into the geospatial data construct in real time, and experience both 3D and 4D capabilities through the integration of the sensor message broadcasts from the urban environment. Equally impressive is the progress made in VR technology: this geospatial construct can now be experienced without a massive computing center and without the need to generate the geospatial features as simulacra. The experience was with actual spatial data derived from remote sensing, GIS, and spatial sensor messaging (XML-based, delivered via SOA). Seventeen years after this author discussed the integration of VR and GIS, significant achievements have been made. The spatial database has evolved into a viable data source for VR worlds, ushering in a new day in advanced visual analytics and the cognitive processes associated with geospatial perception.
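To make the whiteboard registration concrete: the standard way to tie known reference points on a planar surface to their detected positions in a camera frame is to fit a homography. The sketch below is purely illustrative and not a description of the UW implementation — the board layout, coordinates, and Direct Linear Transform (DLT) approach are this author's assumptions. Given four focal points in board coordinates and their detected image positions, it solves for the 3x3 projective transform; once that transform is known, any pixel of the georeferenced scene can be drawn at the correct screen location as the user moves the board.

```python
import numpy as np

def estimate_homography(board_pts, image_pts):
    """Direct Linear Transform: solve for the 3x3 homography H that
    maps board-plane coordinates to camera-image coordinates."""
    A = []
    for (x, y), (u, v) in zip(board_pts, image_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H is the null-space vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def project(H, pt):
    """Apply homography H to a 2-D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Four hypothetical focal points at the corners of the whiteboard,
# in board units ...
board_pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
# ... and where the HMD camera detects them in one video frame.
image_pts = [(50, 20), (150, 20), (150, 120), (50, 120)]

H = estimate_homography(board_pts, image_pts)
# Any point of the georeferenced scene (in board coordinates) can now
# be rendered at its corresponding image position.
center = project(H, (0.5, 0.5))
```

In a real tracking loop this fit would be repeated every frame as the detected focal-point positions change, which is what keeps the projected geospatial scene locked to the moving board.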
-- Posted by - Alex Philp, GCS Research