Boundary-structure-aware transfer functions for volume classification

Lina Yu, Hongfeng Yu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

We present novel transfer functions that advance the classification of volume data by combining the advantages of existing boundary-based and structure-based methods. We introduce the standard deviation of ambient occlusion to quantify the variation of both boundary and structure information across voxels, and name our method boundary-structure-aware transfer functions. Our method provides concrete guidelines for revealing the interior and exterior structures of features, especially for occluded objects without perfectly homogeneous intensities. Furthermore, our method separates these patterns from other materials that may have similar average intensities but different intensity variations. The proposed method extends the expressiveness and utility of volume rendering in extracting continuously changing patterns and achieving more robust volume classification.
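The abstract states the key recipe (per-voxel ambient occlusion plus its local standard deviation as a classification axis) but includes no implementation. The following is a minimal sketch in Python, not the authors' code: it assumes ambient occlusion can be approximated by the mean opacity in a local neighborhood ("vicinity occlusion"), and the neighborhood radii, the linear opacity ramp, and the threshold are illustrative placeholders.

    # Sketch only: approximates the paper's per-voxel features under the
    # assumptions stated above; radii and thresholds are placeholders.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def boundary_structure_features(volume, ao_radius=4, stat_radius=2):
        """Return per-voxel (ambient occlusion, AO standard deviation)."""
        # Map intensities to opacities with a simple linear ramp (placeholder).
        opacity = (volume - volume.min()) / (np.ptp(volume) + 1e-8)

        # Approximate ambient occlusion as mean opacity in a local
        # neighborhood: occluded voxels see more opaque surroundings.
        ao = uniform_filter(opacity, size=2 * ao_radius + 1)

        # Standard deviation of AO over a smaller neighborhood quantifies how
        # strongly boundary/structure information varies around each voxel.
        ao_mean = uniform_filter(ao, size=2 * stat_radius + 1)
        ao_sq_mean = uniform_filter(ao * ao, size=2 * stat_radius + 1)
        ao_std = np.sqrt(np.maximum(ao_sq_mean - ao_mean ** 2, 0.0))
        return ao, ao_std

    # Usage: a 2D transfer function over (intensity, AO std) can then separate
    # materials with similar mean intensity but different intensity variation.
    volume = np.random.rand(64, 64, 64).astype(np.float32)  # stand-in data
    ao, ao_std = boundary_structure_features(volume)
    homogeneous = ao_std < 0.02     # flat interiors (illustrative threshold)
    boundary_like = ao_std >= 0.02  # boundary-like transitions

Low AO standard deviation marks homogeneous interiors, while high values flag boundary-like transitions; that contrast is the distinction a boundary-structure-aware transfer function would exploit.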

Original language: English (US)
Title of host publication: SIGGRAPH Asia 2017 Symposium on Visualization, SA 2017
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450354110
DOIs
State: Published - Nov 27 2017
Event: SIGGRAPH Asia 2017 Symposium on Visualization, SA 2017 - Bangkok, Thailand
Duration: Nov 27 2017 – Nov 30 2017

Publication series

Name: SIGGRAPH Asia 2017 Symposium on Visualization, SA 2017

Other

Other: SIGGRAPH Asia 2017 Symposium on Visualization, SA 2017
Country/Territory: Thailand
City: Bangkok
Period: 11/27/17 – 11/30/17

Keywords

  • Classification
  • Transfer functions
  • Volume rendering

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
  • Software
