Publication

Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization

Elshehaly, Mai
Gračanin, D.
Gad, M.
Elmongui, H.G.
Matković, K.
Publication Date
2015-06
Rights
© 2015 Wiley. This is the peer-reviewed version of the following article: Elshehaly M, Gračanin D, Gad M et al (2015) Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization. Computer Graphics Forum. 34(3): 251-260, which has been published in final form at https://doi.org/10.1111/cgf.12637. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
Peer-Reviewed
Yes
Open Access status
Accepted for publication
Abstract
Scientific data acquired through sensors which monitor natural phenomena, as well as simulation data that imitate time‐identified events, have fueled the need for interactive techniques to successfully analyze and understand trends and patterns across space and time. We present a novel interactive visualization technique that fuses ground truth measurements with simulation results in real‐time to support the continuous tracking and analysis of spatio‐temporal patterns. We start by constructing a reference model which densely represents the expected temporal behavior, and then use GPU parallelism to advect measurements on the model and track their location at any given point in time. Our results show that users can interactively fill the spatio‐temporal gaps in real world observations, and generate animations that accurately describe physical phenomena.
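The abstract describes advecting measurements along a reference model to track their position over time. The paper performs this step with GPU parallelism; the following is only a minimal CPU sketch of the underlying idea, using explicit Euler integration over a hypothetical velocity field (the function names, the `drift` field, and the step size are illustrative assumptions, not the authors' implementation).

```python
import numpy as np

def advect_measurements(points, velocity_fn, t0, t1, dt=0.1):
    """Advect 2-D measurement points along a reference velocity field
    from time t0 to t1 using explicit Euler steps (illustrative sketch)."""
    pts = np.asarray(points, dtype=float).copy()
    t = t0
    while t < t1:
        step = min(dt, t1 - t)
        # Each point moves independently, so this update is embarrassingly
        # parallel -- the property the paper exploits on the GPU.
        pts += step * velocity_fn(pts, t)
        t += step
    return pts

# Example: a uniform eastward drift of 1 spatial unit per time unit.
drift = lambda pts, t: np.tile([1.0, 0.0], (len(pts), 1))
moved = advect_measurements([[0.0, 0.0], [2.0, 5.0]], drift, t0=0.0, t1=3.0)
```

With this constant field, each point simply translates 3 units eastward; a real reference model would supply a spatially and temporally varying field derived from the simulation.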
Version
Accepted manuscript
Citation
Elshehaly M, Gračanin D, Gad M et al (2015) Interactive Fusion and Tracking For Multi‐Modal Spatial Data Visualization. Computer Graphics Forum. 34(3): 251-260.
Type
Article