
SpatialTouch: Exploring Spatial Data Visualizations in Cross-reality

Description:

 

We propose and study a novel cross-reality environment that seamlessly integrates a monoscopic 2D surface (an interactive screen with touch and pen input) with a stereoscopic 3D space (an augmented-reality HMD) to jointly host spatial data visualizations. This approach combines the best of two conventional methods of displaying and manipulating spatial 3D data, enabling users to fluidly explore diverse visual forms using tailored interaction techniques. Effective 3D data exploration techniques are pivotal for conveying the intricate spatial structures of such data, often at multiple spatial or semantic scales, which arise in various application domains and require diverse visual representations. To understand user reactions to our new environment, we began with an elicitation user study, in which we captured participants' responses and interactions. We observed that users adapted their interaction approaches based on the perceived visual representations, with natural transitions in spatial awareness and actions as they navigated across the physical surface. Our findings then informed the development of a design space for spatial data exploration in cross-reality. On this basis, we developed cross-reality environments tailored to three distinct domains: 3D molecular structure data, 3D point cloud data, and 3D anatomical data. In particular, we designed interaction techniques that account for the inherent features of interactions in both spaces, facilitating various forms of interaction, including mid-air gestures, touch interactions, pen interactions, and combinations thereof, to enhance the users' sense of presence and engagement. We assessed the usability of our environment with biologists, focusing on its use for domain research. In addition, we evaluated our interaction-transition designs with virtual- and mixed-reality experts to gather further insights.
As a result, we provide design suggestions for cross-reality environments, emphasizing interaction with diverse visual representations and seamless interaction transitions between 2D and 3D spaces.

Paper download:  (28.8 MB)

 

Additional material:

We make several items of additional material available in the following OSF repository: osf.io/avxr9.

 

Software:

The tool source code is also available at github.com/LixiangZhao98/Cross-Reality-Environment-SpatialTouch.

Video:

Paper video:

Get the video:

Pictures:

(these images as well as others from the paper that are our own are available under a CC-BY 4.0 license, see the license statement at the end of the paper)

Cross-references:

This paper relates to several of our previous publications:

Reference:

Lixiang Zhao, Tobias Isenberg, Fuqi Xie, Hai-Ning Liang, and Lingyun Yu (2025) SpatialTouch: Exploring Spatial Data Visualizations in Cross-reality. IEEE Transactions on Visualization and Computer Graphics, 31, 2025. To appear.

BibTeX entry:


@ARTICLE{Zhao:2025:SES,
  author      = {Lixiang Zhao and Tobias Isenberg and Fuqi Xie and Hai-Ning Liang and Lingyun Yu},
  title       = {{SpatialTouch}: Exploring Spatial Data Visualizations in Cross-reality},
  journal     = {IEEE Transactions on Visualization and Computer Graphics},
  year        = {2025},
  volume      = {31},
  oa_hal_url  = {https://hal.science/hal-04665449},
  preprint    = {https://doi.org/10.48550/arXiv.2407.14833},
  osf_url     = {https://osf.io/avxr9/},
  github_url  = {https://github.com/LixiangZhao98/Cross-Reality-Environment-SpatialTouch},
  github_url2 = {https://github.com/LixiangZhao98/PointCloud-Visualization-Tool},
  url         = {https://tobias.isenberg.cc/p/Zhao2025SES},
  pdf         = {https://tobias.isenberg.cc/personal/papers/Zhao_2025_SES.pdf},
}

This work was done at and in collaboration with the Department of Computing of Xi'an Jiaotong-Liverpool University, China.