Efficient Structure-Aware Selection Techniques for 3D Point Cloud Visualizations with 2DOF Input


Data selection is a fundamental task in visualization because it is a prerequisite to many follow-up interactions. Efficient spatial selection in 3D point cloud datasets comprising thousands or millions of particles can be particularly challenging. We present two new techniques, TeddySelection and CloudLasso, that support the selection of subsets in large 3D particle datasets in an interactive and visually intuitive manner. Specifically, we describe how to spatially select a subset of a 3D particle cloud by simply encircling the target particles on screen using either mouse or direct-touch input. Based on the drawn lasso, our techniques automatically determine a bounding selection surface around the encircled particles based on their density. This kind of selection technique can be applied to particle datasets in several application domains. TeddySelection and CloudLasso reduce, and in some cases even eliminate, the need for complex multi-step selection processes involving Boolean operations. This was confirmed in a formal, controlled user study in which we compared the more flexible CloudLasso technique to the standard cylinder-based selection technique. The study showed that the former is consistently more efficient than the latter; in several cases the CloudLasso selection time was half that of the corresponding cylinder-based selection.
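The two-step principle behind this kind of selection (lasso in screen space, then restrict to dense regions) can be sketched as follows. This is only an illustrative simplification under assumed names (`project`, `cloud_lasso_select`, a crude grid-count density stand-in), not the paper's actual implementation, which derives a smooth selection surface from a kernel density estimate:

```python
from collections import Counter

def point_in_polygon(x, y, poly):
    """Ray-casting test: is screen point (x, y) inside the lasso polygon?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge crosses the horizontal ray from (x, y)?
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def cloud_lasso_select(points, project, lasso, cell=1.0, min_density=2):
    """Hypothetical two-step selection in the spirit of CloudLasso:
    1. keep particles whose screen-space projection lies inside the lasso;
    2. of those, keep only particles in 3D grid cells holding at least
       min_density particles (a stand-in for a proper density estimate)."""
    inside = [p for p in points if point_in_polygon(*project(p), lasso)]
    key = lambda p: (int(p[0] // cell), int(p[1] // cell), int(p[2] // cell))
    counts = Counter(key(p) for p in inside)
    return [p for p in inside if counts[key(p)] >= min_density]
```

With an orthographic projection such as `lambda p: (p[0], p[1])`, a dense clump encircled by the lasso is selected while isolated stray particles behind it are discarded, which is the behavior that distinguishes this approach from a simple cylinder selection.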

Paper download:  (4.9 MB)


In the paper on page 7 (page 2251 in the journal), the formula for the F1 score needs to be corrected to F1 = 2·P·R/(P + R); i.e., we had missed the factor of 2. The values in Table 2, however, were computed correctly; only the formula was reported incorrectly.
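The corrected formula is the standard harmonic mean of precision P and recall R; a minimal sketch (illustrative only, not the paper's evaluation code):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 = 2·P·R / (P + R), the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0  # convention: F1 is 0 when both P and R are 0
    return 2 * precision * recall / (precision + recall)
```

For example, a selection with precision 0.8 and recall 0.6 yields F1 = 0.96/1.4 ≈ 0.686; omitting the factor of 2 would have halved that value.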


You can download a demo of the CloudLasso selection (for Win32, including example datasets, 21 MB) to try it out for yourself. To be fully functional, however, the demo requires a TUIO-based touch surface.


Get the video:



This paper was later extended to the CAST family of selection techniques, see the page on that paper as well. Also, our CloudLasso technique has since been incorporated into the open-source software SlicerAstro by Punzo et al. [2017].

Additional material:

Main Reference:

Lingyun Yu, Konstantinos Efstathiou, Petra Isenberg, and Tobias Isenberg (2012) Efficient Structure-Aware Selection Techniques for 3D Point Cloud Visualizations with 2DOF Input. IEEE Transactions on Visualization and Computer Graphics, 18(12):2245–2254, December 2012. Best Paper Honorable Mention Award at IEEE Scientific Visualization 2012.

BibTeX entry:

@ARTICLE{Yu:2012:ESA, author = {Lingyun Yu and Konstantinos Efstathiou and Petra Isenberg and Tobias Isenberg}, title = {Efficient Structure-Aware Selection Techniques for {3D} Point Cloud Visualizations with {2DOF} Input}, journal = {IEEE Transactions on Visualization and Computer Graphics}, year = {2012}, volume = {18}, number = {12}, month = dec, pages = {2245--2254}, doi = {10.1109/TVCG.2012.217}, doi_url = {https://doi.org/10.1109/TVCG.2012.217}, oa_hal_url = {https://hal.science/hal-00718310}, url = {https://tobias.isenberg.cc/p/Yu2012ESA}, pdf = {https://tobias.isenberg.cc/personal/papers/Yu_2012_ESA.pdf}, }

Other Reference:

Lingyun Yu (2013) Touching 3D Data: Interactive Visualization of Cosmological Simulations. PhD thesis, University of Groningen, The Netherlands, June 2013.

BibTeX entry:

@PHDTHESIS{Yu:2013:T3D, author = {Yu, Lingyun}, title = {Touching {3D} Data: Interactive Visualization of Cosmological Simulations}, year = {2013}, school = {University of Groningen}, month = jun, address = {The Netherlands}, url = {https://research.rug.nl/en/publications/touching-3d-data-interactive-visualization-of-cosmological-simula}, url2 = {https://tobias.isenberg.cc/VideosAndDemos/Yu2012ESA}, pdf = {https://tobias.isenberg.cc/personal/papers_students/Yu.2013.T3D.pdf}, }

This work was done at the Scientific Visualization and Computer Graphics Lab of the University of Groningen, the Netherlands. Also see Lingyun Yu's page on this project.