Name: GUNJEE ZORIG
Nationality: Mongolia
Type of Degree: Doctor of Engineering    Degree Certificate Number: No. 136 (Engineering)
Date of Conferral: March 23, 2007    Requirements for Conferral: Article 5, Paragraph 1 of the Degree Regulations (course doctorate)
Graduate School and Department: Graduate School of Engineering, Department of Electronic and Information Engineering
Thesis Title: Particle Rendering Based on Translucent Shadow Mapping and Hierarchical Bucket Sorting
(A Particle-Based Rendering Method Based on Translucent Shadow Mapping and Hierarchical Bucket Sorting)
Summary of the Thesis

 In this thesis we present our solution for shadow generation, visibility sorting, and GPU-based shading of translucent point data for particle-based rendering, where a particle is a point with additional attributes. The particle rendering system is designed for interactive visualization of particle models, point clouds, polygon data, and volume data.

 Modern programmable GPUs (Graphics Processing Units) allow graphics programmers to experiment with new algorithms and techniques that would not have been practical on a slower, single-threaded serial processor such as a CPU.

 In the last five years, point-based surface representations have proven to be a flexible and efficient alternative to mesh-based representations. Current point primitives store only limited information about their immediate locality, such as their position in 3D space, the normal vector, the bounding ball, and the tangent plane disk. However, our particle rendering system uses many additional attributes to render different kinds of models.

 In computer graphics, particles have long been used to simulate a variety of phenomena such as clouds, explosions, smoke, and fluids. Most simulations use volume and point rendering techniques to render particles. However, the term "particle rendering" is not precisely defined. Our definition is as follows: "Particle rendering is a set of special rendering techniques that render particles with rendering-oriented attributes, such as the 3D geometry of the particles, the material properties of each particle, and other specific shading information on the particles." At first glance, rendering-oriented particles closely resemble point-based geometry such as surfels. Viewed from another angle, renderable particles represent voxels. In simple terms, rendering-oriented particles correspond to translucent point geometry with extended physical attributes.

 Rendering-oriented particles are uniquely flexible in that they can form both translucent and opaque solids and surfaces (surfels). They can also represent volume data, point clouds, and micro-objects with specific material properties. We chose the particle (point with extended attributes) representation as the most suitable common representation for polygonal, volumetric, and point-based models. Shading calculations are performed on the GPU, using the rendering-oriented attributes of the particles. The data structure of rendering-oriented particles meets several criteria that are important for GPU-based rendering: a parallel nature, a vertex- or point-type geometric representation, and attributes that allow comprehensive shading of each point separately. Due to its conceptual simplicity and greater flexibility, particle rendering can be used as an effective rendering method for mixed 3D data sets.
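 To make the notion of rendering-oriented attributes concrete, the following is a minimal Python sketch of a particle record. The attribute set, field names, and default values are illustrative assumptions, not the exact data structure used in the thesis.

from dataclasses import dataclass
import numpy as np

@dataclass
class Particle:
    """Hypothetical rendering-oriented particle: a point with extended attributes."""
    position: np.ndarray     # 3D position in world coordinates
    normal: np.ndarray       # normal vector, used for surfel-style shading
    radius: float            # bounding-ball / splat radius
    color: np.ndarray        # base material color (RGB)
    opacity: float           # 0.0 = fully transparent, 1.0 = opaque
    shininess: float = 16.0  # example of an extra per-particle material property

def make_particle(pos, normal, radius=0.01, rgb=(1.0, 1.0, 1.0), opacity=1.0):
    """Convenience constructor that stores float32 arrays and a unit normal."""
    n = np.asarray(normal, dtype=np.float32)
    return Particle(position=np.asarray(pos, dtype=np.float32),
                    normal=n / np.linalg.norm(n),
                    radius=radius,
                    color=np.asarray(rgb, dtype=np.float32),
                    opacity=opacity)

 Because every particle carries its own material attributes, the shading of each particle can be evaluated independently and in parallel, which is what makes this representation well suited to GPU processing.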

 The main steps of our particle rendering system are as follows (a minimal sketch of the pipeline is given after the list):
Step 1. The first step in the rendering engine is preprocessing, which may include conversion to a common format (particles with rendering-oriented attributes), coordinate transformation to world space, data analysis for later processing, subdivision of polygonal meshes to adapt the point densities, and normal vector computation.
Step 2. The rendering system then builds a translucent shadow mapping table for each light source using the translucent shadow mapping method.
Step 3. In the final step, we carry out simple object-level visibility culling, hierarchical bucket sorting for back-to-front alpha blending, advanced shading of the rendering-oriented points on the GPU, and splatting-related computations.
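 The following Python sketch shows how these three steps might fit together. All stage names (convert_to_particles, build_translucent_shadow_map, cull_invisible_objects, hierarchical_bucket_sort, shade_particle, splat) are hypothetical placeholders supplied by the caller, and in the actual system the shading and splatting run on the GPU rather than in a Python loop.

def render_frame(models, lights, camera, stages):
    """Skeleton of the particle rendering pipeline.

    `stages` is a dict of callables supplied by the caller; the keys used
    here are illustrative names, not the thesis API.
    """
    # Step 1: preprocessing - convert every input model (polygon mesh,
    # point cloud, or volume) into particles with rendering-oriented
    # attributes, expressed in world coordinates.
    particles = []
    for model in models:
        particles.extend(stages["convert_to_particles"](model))

    # Step 2: build one translucent shadow mapping table per light source.
    shadow_maps = [stages["build_translucent_shadow_map"](light, particles)
                   for light in lights]

    # Step 3: cull, sort back to front for alpha blending, shade each
    # particle, and splat it.
    visible = stages["cull_invisible_objects"](particles, camera)
    ordered = stages["hierarchical_bucket_sort"](visible, camera)
    for p in ordered:
        color = stages["shade_particle"](p, lights, shadow_maps, camera)
        stages["splat"](p, color)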

 Our main contributions are as follows:
 We use rendering-oriented attributes for the particles. Each particle is shaded differently, depending on the additional properties of its splat.
 We propose a novel algorithm for visibility sorting based on the hierarchical bucket sorting approach (a simplified sketch is given after the feature list below).
  To solve the translucent shadow mapping problem, we take a different approach from those used in previously published algorithms. Our Translucent Shadow Mapping Algorithm uses a spherical coordinate system and solves the distance-based sorting, transparency calculation, shadow mapping, omni-directional mapping, and light intensity attenuation problems in one step (an illustrative sketch is also given after the feature list below). The proposed algorithm works even in the difficult situation where the light sources are inside the translucent object. The proposed particle rendering system has the following key features that distinguish it from other methods.

  Unlike other particle systems and engines, particle rendering works independently of modeling. Our system accepts various kinds of modeling data and converts them into its own specific format before rendering them as particles.

  It can render translucent objects with the same efficiency as opaque objects.

  It can render polygon data, particle models, point clouds, and volume data. Moreover, it handles mixed data types very well, which is a problematic task for other rendering methods.
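 As a rough illustration of the visibility sorting contribution mentioned above, the sketch below implements a single-level bucket sort by camera distance that yields a back-to-front ordering in linear time. It is a deliberately simplified assumption: the hierarchical refinement used in the thesis is not reproduced, and the bucket count is an arbitrary choice.

import numpy as np

def bucket_sort_back_to_front(particles, camera_pos, num_buckets=1024):
    """Approximate back-to-front ordering by bucketing on camera distance.

    Single-level sketch only; the thesis uses a hierarchical variant.
    Scattering particles into buckets and reading them back is O(n) in
    the number of particles (plus the fixed bucket count).
    """
    if not particles:
        return []
    positions = np.stack([p.position for p in particles])
    dist = np.linalg.norm(positions - np.asarray(camera_pos), axis=1)
    d_min, d_max = float(dist.min()), float(dist.max())
    scale = (num_buckets - 1) / max(d_max - d_min, 1e-9)
    buckets = [[] for _ in range(num_buckets)]
    for p, d in zip(particles, dist):
        buckets[int((d - d_min) * scale)].append(p)
    # Farthest bucket first gives the back-to-front order required for
    # correct alpha blending of translucent splats.
    return [p for bucket in reversed(buckets) for p in bucket]

 Because the particles are only scattered into buckets and read out in reverse bucket order, the cost grows linearly with the particle count, which is consistent with the O(n) behaviour reported in the results below.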
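 The next sketch gives one possible reading of the spherical-coordinate idea behind the Translucent Shadow Mapping Algorithm: particles are binned by their direction (theta, phi) as seen from the light, each angular bin is sorted by radial distance, and transmittance together with distance attenuation is accumulated outward from the light. The bin resolution, the attenuation model, and all names are assumptions made for illustration; this is not the thesis algorithm itself.

import math
from collections import defaultdict

def build_translucent_shadow_map(light_pos, particles,
                                 theta_bins=256, phi_bins=512):
    """Illustrative spherical-coordinate translucent shadow map.

    Returns a dict mapping id(particle) -> incident light intensity,
    combining distance-based sorting, transparency accumulation, and
    inverse-square attenuation in a single pass per angular bin.
    """
    bins = defaultdict(list)
    for p in particles:
        dx, dy, dz = (p.position[i] - light_pos[i] for i in range(3))
        r = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
        theta = math.acos(max(-1.0, min(1.0, dz / r)))    # polar angle
        phi = math.atan2(dy, dx)                          # azimuth
        t = int(theta / math.pi * (theta_bins - 1))
        f = int((phi + math.pi) / (2.0 * math.pi) * (phi_bins - 1))
        bins[(t, f)].append((r, p))

    light_at = {}
    for cell in bins.values():
        cell.sort(key=lambda entry: entry[0])   # nearest to the light first
        transmittance = 1.0
        for r, p in cell:
            # light reaching this particle: what earlier (closer) particles
            # have not absorbed, scaled by inverse-square attenuation
            light_at[id(p)] = transmittance / (r * r)
            transmittance *= (1.0 - p.opacity)
    return light_at

 Because the angular bins cover the full sphere of directions around the light, such a map is naturally omni-directional, which is consistent with the claim above that the method works even when a light source lies inside a translucent object.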

 We demonstrate the efficiency and flexibility of our novel approach by showing several rendering results for different types of 3D data. The results show that hierarchical bucket sorting and GPU-based shading perform at interactive rates for a million points. These algorithms run in linear time, O(n), for both opaque and translucent points. In particular, the GPU-based shading algorithm runs almost nine times faster than its CPU counterpart. However, the translucent shadow generation algorithm requires more time to map objects with a high degree of transparency.