Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization
elshehaly_et_al_2015.pdf (7.075Mb)
Publication date
2015-06
Rights
© 2015 Wiley. This is the peer-reviewed version of the following article: Elshehaly M, Gračanin D, Gad M et al (2015) Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization. Computer Graphics Forum. 34(3): 251-260, which has been published in final form at https://doi.org/10.1111/cgf.12637. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
Peer-Reviewed
Yes
Abstract
Scientific data acquired through sensors which monitor natural phenomena, as well as simulation data that imitate time‐identified events, have fueled the need for interactive techniques to successfully analyze and understand trends and patterns across space and time. We present a novel interactive visualization technique that fuses ground truth measurements with simulation results in real‐time to support the continuous tracking and analysis of spatiotemporal patterns. We start by constructing a reference model which densely represents the expected temporal behavior, and then use GPU parallelism to advect measurements on the model and track their location at any given point in time. Our results show that users can interactively fill the spatio‐temporal gaps in real world observations, and generate animations that accurately describe physical phenomena.
Version
Accepted manuscript
Citation
Elshehaly M, Gračanin D, Gad M et al (2015) Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization. Computer Graphics Forum. 34(3): 251-260.
Link to Version of Record
https://doi.org/10.1111/cgf.12637
Type
Article
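The abstract describes advecting measurements over a dense reference model to track where each observation lies at any point in time. The paper performs this step with GPU parallelism; as a rough illustration only, the following CPU sketch shows the underlying idea with forward Euler integration through a velocity field. The function names, the `velocity_at` callback, and the toy uniform flow are all hypothetical, not taken from the paper.

```python
import numpy as np

def advect(points, velocity_at, t0, t1, dt=0.1):
    """Move each (x, y) point along the flow from time t0 to t1.

    velocity_at(points, t) returns an (n, 2) array of velocities,
    standing in for the dense reference model of expected behavior.
    """
    points = np.asarray(points, dtype=float).copy()
    t = t0
    while t < t1:
        step = min(dt, t1 - t)        # do not overshoot the end time
        points += step * velocity_at(points, t)
        t += step
    return points

# Toy reference model: uniform rightward flow of speed 1.
uniform_flow = lambda pts, t: np.tile([1.0, 0.0], (len(pts), 1))

# Track two measurement locations from t=0 to t=1.
tracked = advect([[0.0, 0.0], [2.0, 1.0]], uniform_flow, t0=0.0, t1=1.0)
# Under this flow, each point moves one unit in x.
```

In the paper's setting each thread would advance one measurement in parallel against the simulated field, which is what makes the fusion interactive at real-time rates.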