RayTracingLensFlare English Edition


November 27, 2023

Slide Summary

■Overview
RayTracingLensFlare is a system that simulates ghosting effects by tracing rays through the group of modeled lenses that make up the camera.

We will discuss the calculations involved, the optimizations we made, and how we handled the case of light sources outside of the rendering screen.

Note: This is the content of the publicly available CAPCOM Open Conference Professional RE:2023 videos, converted to slides, with some minor modifications.

■Prerequisites
Those who have knowledge of ray tracing or implementation experience with it.

I'll show you just a little bit of the content!
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
CAPCOM Open Conference Professional RE:2023
https://www.capcom-games.com/coc/2023/

Check the official Twitter for the latest information on CAPCOM R&D!
https://twitter.com/capcom_randd
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━


This is the official Capcom account run by the technical research division that develops Capcom's proprietary game engine, "RE ENGINE". We publish the materials from lectures given at our past technical conferences.
[CAPCOM Open Conference Professional RE:2023] https://www.capcom-games.com/coc/2023/
[CAPCOM Open Conference RE:2022] https://www.capcom.co.jp/RE2022/
[CAPCOM Open Conference RE:2019] http://www.capcom.co.jp/RE2019/


Text of each page
1.

Ray Tracing Lens Flare

In this presentation, I would like to talk about ray-traced lens flares.

2.

Agenda
[Introduction] • Lens flare • The simplified structure of a camera • Summary of processing
[Ray Tracing] • Expressing lenses with simple shapes • Ray tracing as a simple geometry problem / computation of intensity
[Light Wave Computation] • Diffraction pattern • Reflection effect of dust/scratches on the front element
[Occlusion Processing] • Handling the case where the light source is out of the angle of view
[Rendering Problems and Their Mitigation] • Separation of ghosts
[Techniques for Acceleration] • Removing meaningless rays • Optimization results
[Conclusion and Future Prospects]

This is the agenda. First, I will briefly explain what lens flares are. Next, I will introduce the conventional methods and explain the standard structure of a camera. Then, I will talk about ray tracing inside a camera to display lens flares, processes related to wave optics, and the computation of the wave distribution. After that, we'll cover our occlusion processing, rendering problems and their mitigation, and optimization. To conclude, we'll talk about future prospects.

3.

Introduction: Lens Flare
[Images: starburst (with close-up), ghosts, and a twisted ghost — all produced by our implemented feature]
• An optical phenomenon that often occurs in response to a bright light
• Comprised of a "starburst (diffraction spike)" and "ghosts"
• The ghost shape sometimes changes in complicated ways

First of all, I would like to explain what lens flares are. All pictures are results of our implemented feature. A lens flare is an optical phenomenon that often occurs in response to bright light. It is often used to produce cinematic visuals in games. Lens flares are divided into the starburst, which occurs at the light source, and ghosts, which occur on the line connecting the light source and the center of the screen. We usually perceive a ghost as a polygonal shape, but sometimes it takes a complex shape such as a twisted form.

4.

Introduction: Conventional Methods
[Sprite-Based Method]
• Places images on the line connecting the light source to the center of the screen
• Ghost shape can't be changed
• The quality depends on artist technique
[Image-Based Method]
• Recursive image scaling and compositing
• Difficult to change the shape as the light source moves
• Strongly dependent on artist technique
• Difficult to deform complex ghosts in a natural way

Many methods for expressing lens flares have been used up to now; in this section, we mention a subset of them. First, I introduce the sprite-based method. This method expresses lens flares by placing images on the lines connecting the light sources to the center of the screen. Hence, we can't achieve complicated or deformed ghost shapes, and their appearance is entirely dependent on artists. Next, the image-based method. This method expresses lens flares by scaling an image recursively. Its computational complexity doesn't depend on the number of light sources. However, it is also difficult to deform complex ghosts in a natural way.

5.

Introduction: Physically-Based Method
[Physically-Based Method]
• Constructs a tight-lattice grid and realizes lens flare by ray tracing and interpolation [1]
• Good image quality with complex deformation
• Precomputation: 90 light directions × (64 × 64) rays × 20 zoom factors × 8 f-stops
• (128 × 128) grid per ghost
If we can express lens flare as in this paper, the artists' range of expression will be expanded → let's do ray tracing (without precomputation). We consider the reflectance, brightness, and computational method for the starburst and ghosts to express a plausible lens flare and to accelerate its computation.

A physically-based method that ray traces a dense grid has been proposed. This method produces plausible lens flare behavior with complex deformation of ghosts. Since it uses a dense grid, it requires many precomputation results and has a high computational complexity. However, the lens flare images in the paper are of high quality, so we thought that implementing this feature would expand the artists' range of expression. Therefore, we opted to use this ray-traced method. We then considered the computation needed for a lens flare with a plausible starburst and ghosts, and we addressed the problems that occurred. We also thought of ways to accelerate the computation so that it runs at a practical speed. We will share our ideas about that.

6.

Introduction: Simplified Camera Structure
[Figure: aperture blades, lenses, and sensor along the optical axis]
• A camera is mainly constructed from many lenses, aperture blades, and an image sensor
• Incident light can be adjusted using the aperture blades
• Lens flare is simulated by ray tracing through the lens structure and computing the wave distribution

First of all, in this slide, we introduce the assumed structure of a camera. A camera is mainly constructed from many lenses, aperture blades for adjusting incident light, and an image sensor that receives light. Lens flare occurs when the incident light reaches the sensor, so we reproduce it by ray tracing through this structure and computing the wave distribution.

7.

Introduction: Summary of Processing
[Figure: ① ray tracing (input) → ② diffraction pattern generation (mask → starburst / ghost) → ③ diffraction pattern mapping (output)]
• Place a grid on the front element and emit rays into the lenses → compute the deformed grid on the sensor plane
• While ray tracing, record each ray's intensity and its transit point on the surface of the aperture blades
• Generate the starburst and ghost images, which depend on the shape of the aperture blades
• Map the images onto the grid

Before I explain in detail, I will lay out a rough description. First, as shown in step 1, we generate a uniform grid on the front element side and compute the deformed grid by emitting rays from the grid points into the camera. We also record the intensity of each ray and its transit point on the surface of the aperture blades. Next, as shown in step 2, we generate the starburst and ghost images from the image of the aperture blades. After that, as shown in step 3, we complete rendering by texture mapping onto the deformed grid using the aforementioned positions and intensities. In this presentation, we explain lens flare by focusing on ray tracing, diffraction pattern generation, and diffraction pattern mapping.

8.

Ray Tracing [Here] → Diffraction Pattern Generation → Diffraction Pattern Mapping

I'll start by explaining ray tracing.

9.

Ray Tracing: Expressing Lenses as Simplified Shapes
[Figure: lenses seen from the side and front — biconvex, biconcave, and plano-convex lenses decomposed into planes and spheres with radius 𝑟 and height ℎ along the optical axis]
• Lenses can be expressed as a combination of planes and spheres
• Ray tracing is the recursive evaluation of intersections between lines (rays) and planes or spheres
• This process can be executed without a BVH, since we only need to solve simple equations

Here, let's consider how to express lenses in a simple manner for ray tracing. For example, as shown in the upper left corner of the screen, we can see that a plano-convex lens is constructed from planes and spheres by observing the lens from the side. This is also true for other concave and convex lenses. As shown above, a lens can basically be expressed as a combination of planar and spherical surfaces. Thus, ray tracing for lens flare is fundamentally the recursive evaluation of intersections between a straight line representing a ray and a plane or sphere. This intersection evaluation only requires solving equations, so we can execute it in a compute shader without a BVH.
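To make the geometry concrete, here is a minimal HLSL-style sketch of the two intersection tests involved; the struct and function names are hypothetical illustrations, not the actual RE ENGINE interface.

```hlsl
// Minimal sketch (HLSL): ray vs. the two primitive types a lens surface can be.
struct Ray { float3 o; float3 d; }; // d is assumed normalized

// Plane perpendicular to the optical axis (z-axis) at z = zPlane.
bool IntersectPlane(Ray r, float zPlane, out float t)
{
    t = (zPlane - r.o.z) / r.d.z;        // solve o.z + t * d.z = zPlane
    return t > 0.0;
}

// Sphere with center c and radius rad: solve |o + t*d - c|^2 = rad^2.
bool IntersectSphere(Ray r, float3 c, float rad, bool nearSide, out float t)
{
    float3 oc   = r.o - c;
    float  b    = dot(oc, r.d);
    float  disc = b * b - (dot(oc, oc) - rad * rad);
    if (disc < 0.0) return false;        // ray misses the sphere
    float s = sqrt(disc);
    t = nearSide ? (-b - s) : (-b + s);  // pick the surface facing the ray
    return t > 0.0;
}
```

Because the sequence of surfaces for each ghost is fixed in advance, tracing reduces to looping over tests like these with a refraction or reflection at each hit, which is why no acceleration structure is needed.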

10.

Ray Tracing: Interpretation of Lens Data

No. | radius of curvature | distance to next element
0   | 𝑟₀ = ∞              | 𝑑₀
1   | 𝑟₁ < 0              | 𝑑₁
2   | 𝑟₂ > 0              | 𝑑₂
…   | …                   | …

[Figure: surfaces No.0–No.2 on the 𝑧-axis — No.0 is a plane (𝑟 = ∞) at 𝑧 = 0; No.1 (𝑟₁ < 0) has its center at 𝑧 = 𝑑₀ + 𝑟₁; No.2 (𝑟₂ > 0) has its center at 𝑧 = 𝑑₀ + 𝑑₁ + 𝑟₂]
• We need to express lenses as a combination of planes and spheres for ray tracing
• Lens data include the radius of curvature, the distance to the next element, etc.
• We can compute the orientation from the radius of curvature and the position of the lens from the distances
• For example, the center of No.1 is at 𝑧 = 𝑑₀ + 𝑟₁ (the center is to the left of 𝑧 = 𝑑₀)

There are tables of lens data on the internet. There, you'll find the radius of curvature for each surface element and the distance to the next surface, as shown above. For ray tracing, this information needs to be reinterpreted as a set of planar and spherical surfaces. Let's take this into consideration. First, we assume that rightward, toward the screen, is the positive direction. Then, we can tell which way a surface is convex from the sign of its radius of curvature. In this case, as shown above, if the radius of curvature is positive, the sphere is convex toward the left, and if it is negative, it is convex toward the right.

11.

Ray Tracing: Interpretation of Lens Data (continued)
(Same slide as above.)

Also, in the case of a plane, either nothing or "infinity" is written in the table. We can compute the position of each plane and sphere using the radius of curvature and the distance to the next element. Let 𝑧 = 0 be the position of the first surface element. Element 0 has an infinite radius of curvature, so it is a planar surface at that position. Element 1 has its center at 𝑧 = 𝑑₀ + 𝑟₁, since the vertex of its spherical surface sits at distance 𝑑₀ from element 0; the center lies to the left of 𝑧 = 𝑑₀ because 𝑟₁ is negative. By repeating computations such as these, we can define the lenses as a set of spherical and planar surfaces.
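As an illustration of this reinterpretation, the following sketch converts one row of such a lens table into a plane or sphere on the optical axis. The types and the convention "radius 0 means flat" are assumptions made for this example, not the actual data format.

```hlsl
// Sketch: turn a lens prescription table into planes/spheres on the z-axis.
// LensRow/Surface are hypothetical types for illustration only.
struct LensRow { float radius; float distToNext; }; // radius == 0 stands in for "infinity"
struct Surface { bool isPlane; float zPlane; float3 center; float radius; };

Surface MakeSurface(LensRow row, float zVertex) // zVertex: z of this element's vertex
{
    Surface s;
    s.isPlane = (row.radius == 0.0);  // table stores blank/"infinity" for planes
    s.zPlane  = zVertex;
    // The sphere's vertex sits at zVertex, so its center is at zVertex + radius.
    // A negative radius places the center to the left of the vertex.
    s.center  = float3(0.0, 0.0, zVertex + row.radius);
    s.radius  = abs(row.radius);
    return s;
}
// Accumulate zVertex over rows: zVertex(i) = d0 + d1 + ... + d(i-1), zVertex(0) = 0,
// which reproduces the example: element 1's center lands at z = d0 + r1.
```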

12.

Ray Tracing: The Number of Reflections
[Figure: a ray refracting through the lens stack, reflecting twice, and passing the aperture blades before reaching the sensor]
• A "ghost" is produced by reflections at the interface surfaces; the reflectance is very small
• Only rays reflected an even number of times reach the sensor and become visible
• Considering reflections between every pair of elements would be too expensive → we consider exactly two reflections
• If there are 𝑛 surface elements, there are ₙC₂ ghosts

Now that we can reinterpret the lens system as a set of simple shapes, let's consider which paths to use in ray tracing. Ghosts are produced by reflections at the interface surfaces, and only rays reflected an even number of times reach the sensor. It is impractical to consider all rays that reflect an even number of times, because the calculation would be too expensive. Since the reflectance is small, only two of the surface elements are selected as reflective surfaces, and the other surface elements are treated as refractive. If there are 𝑛 surface elements, there are ₙC₂ ghosts.
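A small sketch of how the ₙC₂ reflection pairs could be enumerated is shown below. The triangular indexing is our own illustration of one possible scheme (e.g., one dispatch per pair), not necessarily how the actual implementation organizes its work.

```hlsl
// Sketch: enumerate the nC2 two-bounce ghost paths.
// A ray travels forward, reflects off surface j, travels backward, reflects
// off an earlier surface i (i < j), then continues forward to the sensor.
uint GhostCount(uint n) { return n * (n - 1) / 2; }

void GhostPairFromIndex(uint ghostIndex, uint n, out uint i, out uint j)
{
    // Walk the triangular enumeration (0,1), (0,2), ..., (n-2, n-1).
    uint k = ghostIndex;
    for (i = 0; i < n - 1; ++i)
    {
        uint rowLen = n - 1 - i;
        if (k < rowLen) { j = i + 1 + k; return; }
        k -= rowLen;
    }
    i = 0; j = 1; // unreachable for valid input
}
```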

13.

Ray Tracing: Recording the Transit Point in the Aperture
[Figure: aperture opening of diameter 𝐷 with transit point 𝑃(𝑥, 𝑦), and the corresponding sample positions (𝑢₀, 𝑣₀), (𝑢₁, 𝑣₁) on the texture]
$(U, V) = \left(\frac{x}{D/2}, \frac{y}{D/2}\right)$, with $U^2 + V^2 > 1 \Leftrightarrow$ shielded, and $(u, v) = \left(\frac{U+1}{2}, \frac{V+1}{2}\right)$
• We need to know where each ray should sample the ghost texture
• We can easily evaluate whether a ray is blocked by the aperture blades
• The recorded transit point of each ray indicates its sample position on the texture

The rays that are not blocked by the aperture blades construct the grid to which the ghost texture is mapped. Therefore, each ray must carry the information of where it samples the texture image. Each ray records the value (𝑈, 𝑉) of where it transits the aperture blades: we divide its coordinates, with the center of the aperture as the origin, by the opening's radius. If the sum of the squares of these two values is larger than 1, the ray is shielded. The values 𝑈 and 𝑉 are then remapped to (𝑢, 𝑣) with a small computation, and these can be used as the coordinates for sampling the ghost image.
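In code, the recording and shielding test amount to only a few lines. This HLSL-style sketch assumes `p` is the ray's hit position on the aperture plane, relative to the aperture center:

```hlsl
// Sketch: record where a ray crosses the aperture plane (opening diameter D).
// Returns true when the ray is shielded; uv is the texture sample coordinate.
bool RecordApertureTransit(float2 p, float D, out float2 uv)
{
    float2 UV = p / (D * 0.5);   // normalize by the opening's radius
    uv = UV * 0.5 + 0.5;         // remap [-1,1] -> [0,1] for texture sampling
    return dot(UV, UV) > 1.0;    // U^2 + V^2 > 1 <=> blocked by the blades
}
```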

14.

Ray Tracing: Anti-Reflection Coating
[Figure: light hitting a coating of refractive index 𝑛 and thickness 𝑙 on glass — ① reflection at the air/coating boundary, ② multiple reflections between coating and glass]
Notation: $t_X, r_X, t'_X, r'_X$: Fresnel coefficients, $E_X$: electric field
Phase difference: $e^{j2\delta}$; optical path difference: $2nl$; related by $2\delta = \frac{2\pi}{\lambda} \cdot 2nl = \frac{4\pi nl}{\lambda}$
Sum of the reflected components: $\frac{E_r}{E_0} = r_0 + t_0 t'_1 r_1 e^{j2\delta} + \cdots$
Reflectance: $R = \left|\frac{E_r}{E_0}\right|^2 \cong \left|r_0 + t_0 t'_1 r_1 e^{j2\delta}\right|^2 = r_0^2 + (t_0 t'_1 r_1)^2 + 2 r_0 t_0 t'_1 r_1 \cos\frac{4\pi nl}{\lambda}$
• The objective is computing a ghost color
• ① reflection (air/coating) + ② multiple reflections (coating/glass) give the electric field of the reflected ray
• The reflectance depends on the wavelength → it represents the color of the light
• The actual computation considers the angle of incidence
• Some lens information is confidential, so we use dummy values

Finally, we have the information needed for texture mapping. Next, I'll explain how we process the ghost color. The surfaces of camera lenses carry films that decrease noise such as ghosts. These films are called anti-reflection coatings, and we attempt to reproduce the plausible color and intensity they give to ghosts. Now, assume we have a film with refractive index 𝑛 and thickness 𝑙. We write the phase difference of a ray inside the film as $e^{j2\delta}$. The optical path difference is $2nl$, and we can express the phase difference using it. Although the phase difference is a self-defined value, this relationship lets us compute it, which is needed for computing the reflectance.

15.

Ray Tracing: Anti-Reflection Coating (continued)
(Same slide as above.)

Next, the reflected light is the sum of the component reflected between air and coating and the multiply-reflected components between coating and glass. Thus, we can compute the reflectance from 𝐸ᵣ and 𝐸₀. The computed reflectance depends on the wavelength, so we get a different reflectance value for each wavelength and thereby obtain a ghost color. This slide covers the case where light enters the lens perpendicularly; in the actual computation, we also consider light entering the lens at an angle. Combining the reflectance with its wavelength stimulus, the resulting lens flare produces ghosts of various brightness and color. In addition, many values, such as the film thickness and the variation of refractive index with wavelength, are confidential, so we use dummy values.
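As a worked example of the formula above, here is a sketch of the reflectance at normal incidence for one wavelength. The Fresnel coefficients at normal incidence are standard textbook forms; the parameter values passed in would be the dummy values mentioned on the slide, since the real coating data is confidential.

```hlsl
// Sketch: single-coating reflectance at normal incidence (slide's formula).
// n0 = air, n1 = coating, n2 = glass; l = coating thickness, lambda in the
// same length units as l.
float CoatingReflectance(float n0, float n1, float n2, float l, float lambda)
{
    float r0  = (n0 - n1) / (n0 + n1); // air -> coat reflection amplitude
    float t0  = 2.0 * n0 / (n0 + n1);  // air -> coat transmission
    float r1  = (n1 - n2) / (n1 + n2); // coat -> glass reflection
    float t1p = 2.0 * n1 / (n1 + n0);  // coat -> air transmission (t1')
    float phase = 4.0 * 3.14159265 * n1 * l / lambda; // 2*delta = 4*pi*n*l/lambda
    float a = t0 * t1p * r1;
    return r0 * r0 + a * a + 2.0 * r0 * a * cos(phase);
}
// Evaluating this per sampled wavelength yields the set of R(lambda) values
// that gives each ghost its color.
```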

16.

Ray Tracing: Local Brightness Evaluation
[Figure: the uniform grid on the front element (bright, dense) deforms on the sensor (dark, sparse)]
Intensity coefficient: $\frac{4S}{S_0 + S_1 + S_2 + S_3}$, where $S$ is the quad area on the front element and $S_0 \ldots S_3$ are the areas of the 4 quads around the ray on the sensor
• Changes in brightness occur within the ghost image → the objective is to reproduce them
• The shape of the grid changes as light rays pass through the lens
• As light converges, its brightness increases, i.e., intensity is inversely proportional to the grid area
• Compute an intensity coefficient indicating local brightness from the areas of the 4 quads near a given ray on the sensor and on the front element

Next, how do we evaluate local brightness within a ghost? Even inside a single ghost, we can observe a distribution of light and shade. Let's think about how to reproduce this phenomenon. As shown at the top left, the uniform grid defined on the front element side is deformed on the sensor side. Therefore, changes in grid density occur on the sensor side.

17.

Ray Tracing: Local Brightness Evaluation (continued)
(Same slide as above.)

Also, when light with constant intensity enters the lens, if its spread at the output surface is small, the light is brightened by convergence, and vice versa. We can reproduce the brightness changes inside a single ghost using this property. As shown at the top center of the screen, we focus on a specific ray and compute the sum of the areas of the 4 quads around it on the sensor plane. The ratio of this sum to the sum of the areas of the 4 surrounding quads on the input plane gives the coefficient that adjusts the intensity for a better representation.
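The coefficient itself is essentially a one-liner; here is a sketch with hypothetical names:

```hlsl
// Sketch: local brightness coefficient from grid areas.
// S: area of one quad of the uniform grid on the front element (all equal).
// s0..s3: areas of the 4 deformed quads sharing this ray's grid point on the sensor.
float IntensityCoefficient(float S, float s0, float s1, float s2, float s3)
{
    // Light converging into a smaller area gets brighter, so intensity is
    // inversely proportional to the deformed area.
    return (4.0 * S) / max(s0 + s1 + s2 + s3, 1e-8); // guard against degenerate quads
}
```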

18.

Ray Tracing → Diffraction Pattern Generation [Here] → Diffraction Pattern Mapping

I'll start explaining diffraction pattern generation.

19.

Light Wave Computation: Starburst
[Figure: aperture blades → mask texture 𝑔(𝑥, 𝑦) → ℱ → intensity calculation → integrate over wavelengths + RGB conversion → starburst; opening plane (𝑥, 𝑦) and destination plane (𝑥′, 𝑦′) separated by distance 𝑑 along 𝑧]
ℱ: Fourier transform
Fraunhofer diffraction: $G\!\left(\frac{2\pi x'}{\lambda d}, \frac{2\pi y'}{\lambda d}\right) = \mathcal{F}[g(x, y)]$
$g(x', y') \cong \frac{A\, e^{j\frac{2\pi}{\lambda}\left(d + \frac{x'^2 + y'^2}{2d}\right)}}{j\lambda d}\, G\!\left(\frac{2\pi x'}{\lambda d}, \frac{2\pi y'}{\lambda d}\right)$
$I(x', y') = |g(x', y')|^2 \cong \frac{A^2}{\lambda^2 d^2} \left|G\!\left(\frac{2\pi x'}{\lambda d}, \frac{2\pi y'}{\lambda d}\right)\right|^2$
Output in the XYZ color system ($M_{XYZ}$: color-matching function, $L$: spectral distribution):
$C_{XYZ}(x', y') \cong \int_{\lambda_{MIN}}^{\lambda_{MAX}} I(x', y')\, M_{XYZ}(\lambda)\, L(\lambda)\, d\lambda$
• No need to compute the distribution of other wavelengths
• Multiply the intensity of the spectrum at 𝜆₀ by (𝜆₀/𝜆)² and the scale by 𝜆/𝜆₀ to get the spectrum at 𝜆; then add them up and convert to an RGB distribution to get the starburst

To produce a better result, it is important to use wave optics when producing the images. Let's start by making the starburst image. Here, we use Fraunhofer diffraction, which is an approximation of diffraction. This method evaluates diffraction very quickly because it uses the fast Fourier transform. Concretely, we prepare a mask texture representing the aperture blades; by applying a Fourier transform to it, we get the frequency spectrum as a diffraction image. For the mask texture, we only need to know whether each texel is open or not, so we only use the R channel. The frequency spectrum consists of real and imaginary parts, so we take the sum of their squares and convert it to an intensity.

20.

Light Wave Computation: Starburst (continued)
(Same slide as above.)

The result of Fraunhofer diffraction is for a single wavelength, whereas components of multiple wavelengths reach a real sensor. Is it then necessary to compute the distribution of the other wavelengths? The answer is no. Looking at the Fraunhofer diffraction box above, the intensity 𝐼 computed with the Fourier transform depends on the wavelength only through a scale factor and an intensity factor. Thus, if we compute the power spectrum once for a wavelength 𝜆₀, we can obtain the result for a wavelength 𝜆 by applying the intensity factor (𝜆₀/𝜆)² and the scale 𝜆/𝜆₀. By applying this scheme for each wavelength and integrating over the visible range, we convert the result to RGB and get our starburst image.
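Here is a sketch of that reuse, assuming the power spectrum at 𝜆₀ is stored centered in a texture; the resource names are hypothetical:

```hlsl
// Sketch: reuse the single FFT result at lambda0 for another wavelength.
// powerSpectrum holds |G|^2 computed once at lambda0, centered at uv = 0.5.
// The pattern scales by lambda/lambda0 and its intensity by (lambda0/lambda)^2.
Texture2D<float> powerSpectrum; // hypothetical resources
SamplerState     linearClamp;

float IntensityAt(float2 uv, float lambda, float lambda0)
{
    float  s        = lambda0 / lambda;            // inverse of the spatial scale
    float2 uvScaled = (uv - 0.5) * s + 0.5;        // magnify/shrink about the center
    float  I0 = powerSpectrum.SampleLevel(linearClamp, uvScaled, 0);
    return I0 * s * s;                             // (lambda0/lambda)^2 falloff
}
// Accumulate IntensityAt(...) * Mxyz(lambda) * L(lambda) over the sampled
// wavelengths, then convert XYZ to RGB to obtain the starburst image.
```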

21.

Light Wave Computation: Limitations When Computing the Starburst
Fourier transform of a real function and its symmetry properties:
Definition: $G(u) = \mathcal{F}[g(x)] \equiv \int_{-\infty}^{+\infty} g(x)\, e^{-j2\pi ux}\, dx$
$g(x)^* = g(x) \;\rightarrow\; G(u)^* = \int_{-\infty}^{+\infty} g(x)\, e^{+j2\pi ux}\, dx = G(-u)$
Origin symmetry: $|G(u, v)|^2 = |G(-u, -v)|^2$ (since $|z^*| = |z|$)
[Images: 8 blades (point symmetric), 7 blades (asymmetric), 7-blade result (point symmetric)]
• For an odd number of aperture blades, the starburst should be asymmetrical
• The amplitude/intensity spectra of real functions are origin symmetric
• It is impossible to get an asymmetrical starburst by computing only the Fraunhofer diffraction
• → From the center of the result image, normalize the length in the direction of (𝑢, 𝑣) by a random length

On the other hand, this method has limitations. The computation this simulation uses inherits the characteristics of the Fourier transform it depends on. Consider a Fourier transform with real-valued input. Since the complex conjugate of a real number is itself, it follows from the definition of the Fourier transform that a real function's amplitude/intensity spectra are origin symmetric. As we can see from the upper right image, a starburst produced by an odd number of aperture blades should not be symmetrical, but slightly asymmetrical.

22.

Light Wave Computation: Limitations When Computing the Starburst (continued)
(Same slide as above.)

However, as mentioned above, even with an odd number of aperture blades, this technique only computes diffraction images as functions with origin symmetry. For an asymmetrical starburst, we first prepare a function that produces a random length from an angle, take the corresponding (𝑢, 𝑣), and normalize its length by that random length.
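One possible shape of that radial warp is sketched below. The specific RandomLength function is entirely hypothetical, standing in for whatever angle-to-length function (e.g., a precomputed lookup table) is actually prepared:

```hlsl
// Hypothetical angle -> length function: a few sine waves with non-harmonic
// frequencies stand in for a precomputed random lookup. It is deliberately
// not pi-periodic, so RandomLength(a) != RandomLength(a + pi) in general.
float RandomLength(float angle)
{
    return 1.0 + 0.15 * sin(3.7 * angle + 1.3) + 0.1 * sin(9.1 * angle + 0.7);
}

// Sketch of the warp that breaks the forced point symmetry of the spectrum.
float2 WarpForAsymmetry(float2 uv) // uv in [0,1]^2, pattern centered at 0.5
{
    float2 p     = uv - 0.5;
    float  angle = atan2(p.y, p.x);
    // Normalize the radius by a per-angle random length; sampling the
    // diffraction image through this warp yields an asymmetric starburst.
    float2 warped = p / RandomLength(angle);
    return warped + 0.5; // use as the sampling coordinate
}
```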

23.

Light Wave Computation: Reflection Effect of Dirt/Scratches on the Front Lens
[Images: starburst without and with dirt; the dirt texture (R channel); low and high effect strength]
• Represents the presence of dirt/scratches on the front lens
• Multiply the mask texture with the dirt texture
• Randomly drawn lines are enough
• The strength of the effect can be controlled

Next, let's think of a way to make the starburst look even more believable. Since the front lens is exposed to the air, let's add dirt and scratches to it; at the very least, we want an effect that influences the diffraction image. For the implementation, we prepare a dirt/scratch texture (which can be done extremely easily), then simply multiply it with the mask texture. Just like the mask texture, the dirt texture also uses only the R channel. It is important to set to 0 the parts of the texture where the dirt should occlude light; we need that to represent occlusion when multiplying with the mask texture. Also, by preparing a binary texture and adding a parameter that dictates the transition, you can easily control the strength of the dirt effect.
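A sketch of the multiplication with a strength parameter follows; the lerp-based control is our assumption of one way to realize the "parameter which dictates transition" mentioned above:

```hlsl
// Sketch: fold the dirt/scratch texture into the aperture mask (R channel only).
// mask, dirt in [0,1]; dirtStrength in [0,1] controls the effect.
float ApplyDirt(float mask, float dirt, float dirtStrength)
{
    // Blend toward the binary dirt value: dirt == 0 fully occludes at full
    // strength, while dirtStrength == 0 leaves the mask untouched.
    float d = lerp(1.0, dirt, dirtStrength);
    return mask * d;
}
```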

24.

Light Wave Computation: Ghosts
[Figure: mask → ℱ → multiplication with the FRF (real and imaginary parts shown) → ℱ⁻¹ → ghost]
Band-limited angular spectrum (Δ𝑢, Δ𝑣: sampling intervals in frequency):
AS: $g(x, y; z_0 + d) = \mathcal{F}^{-1}\!\left[\mathcal{F}[g(x, y; z_0)]\; H_{FRF}(u, v; d)\right]$
$H_{FRF}(u, v; d) = e^{j2\pi w d}$, $w(u, v) = \begin{cases} \sqrt{\lambda^{-2} - u^2 - v^2} & u^2 + v^2 \le \lambda^{-2} \\ 0 & \text{otherwise} \end{cases}$ (a chirp function)
Limits: $u_{limit} = \left[\lambda\sqrt{(2\Delta u\, d)^2 + 1}\right]^{-1}$, $v_{limit} = \left[\lambda\sqrt{(2\Delta v\, d)^2 + 1}\right]^{-1}$
$H_{FRF\_limited}(u, v; d) = H_{FRF}(u, v; d)\; \mathrm{rect}\!\left(\frac{u}{2u_{limit}}\right) \mathrm{rect}\!\left(\frac{v}{2v_{limit}}\right)$
• We map a texture onto the grid obtained by ray tracing → if possible, do a precise computation
• → Angular spectrum method (AS: a convolution form of the Rayleigh–Sommerfeld integral)
• Frequency response function (FRF): as the propagation distance grows, its frequency rises → aliasing errors occur in the FRF
• Use the band-limited angular spectrum method to suppress aliasing errors

Now, let's move on to the next topic: ghosts. Here too, to obtain an even more plausible distribution, we compute optical wave propagation. The Fraunhofer diffraction we used for the starburst was an approximation; since we went to great lengths to produce a mapping texture from the grid obtained by ray tracing, we would like a high-precision computation. We decided to use the angular spectrum method, which allows fast evaluation of the Rayleigh–Sommerfeld integral describing rigorous diffraction. This method performs its calculation using a frequency response function and, just as when computing the starburst, the mask texture prepared from the aperture blades.

25.

Light Wave Computation: Ghosts (continued)
(Same slide as above.)

As shown in the box above, it is a convolution that can be computed using Fourier transforms. In theory, the angular spectrum method gives peak precision; in practice, however, you need to be careful with numerical precision when computing ghosts. Looking at the frequency response function 𝐻_FRF in the box, we can see that a longer propagation distance 𝑑 gives a shorter period. In other words, it is a chirp function whose signal frequency increases as the propagation distance increases. Therefore, as the propagation distance grows, aliasing errors occur in the FRF, which in turn produces severe noise in the discretized numerical result. To prevent this problem, we use the band-limited angular spectrum method, which limits the frequency band of the frequency response function.
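Per frequency texel, the band-limited FRF built from the formulas above could look like this sketch (parameter names are ours; the FFT passes around it are not shown):

```hlsl
// Sketch: one texel of the band-limited frequency response function H_FRF.
// (u, v): spatial frequencies; d: propagation distance; du/dv: frequency
// sampling intervals. Returns (real, imag) of the complex response.
float2 BandLimitedFRF(float u, float v, float d, float lambda, float du, float dv)
{
    // Band limits (Matsushima & Shimobaba [2]); beyond them, aliasing occurs.
    float uLimit = rcp(lambda * sqrt((2.0 * du * d) * (2.0 * du * d) + 1.0));
    float vLimit = rcp(lambda * sqrt((2.0 * dv * d) * (2.0 * dv * d) + 1.0));
    if (abs(u) > uLimit || abs(v) > vLimit) return float2(0.0, 0.0); // rect window

    float w2 = rcp(lambda * lambda) - u * u - v * v;
    if (w2 <= 0.0) return float2(0.0, 0.0);        // evanescent region: w = 0
    float phase = 2.0 * 3.14159265 * sqrt(w2) * d; // 2*pi*w*d
    return float2(cos(phase), sin(phase));         // e^{j*2*pi*w*d}
}
// Multiply this with FFT[g], then inverse-FFT to propagate the field by d.
```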

26.

Light Wave Computation: Comparison With and Without Band Limiting
[Images: FRF real/imaginary parts and the resulting diffraction image for the angular spectrum method (AS) and the band-limited angular spectrum method (BL-AS), with a close-up comparison]
• As the propagation distance becomes longer, noise occurs in the diffraction image
• By using BL-AS, the noise disappears

Here, we compare the results of computing diffraction propagation with the angular spectrum method and the band-limited angular spectrum method. The band-limited angular spectrum method does have a discontinuous frequency response function, with all values set to 0 outside its rectangular band. Looking at the close-up comparison of both methods in the upper left, we can observe noise appearing in the diffraction image of the plain angular spectrum method. With the BL-AS method, that noise is eliminated, there is no visual problem, and we obtain a rigorous diffraction image.

27.

Ray Tracing → Diffraction Pattern Generation → Diffraction Pattern Mapping [Here]

With that done, we have all the data needed to create a lens flare, so let's render it.

28.

Rendering
[Image: rendered ghost showing complex deformation]
• Map the ghost onto the grid; map the starburst at the light source position
• For brightness and color, replicate something believable
• Since the ghost is mapped onto the deformed grid, non-linear deformation is replicated naturally

There is not much to explain here. With all the information gathered so far, we apply texture mapping to the grid and render it into a render target. As for brightness and color, since we replicate something that feels believable and map it onto the deformed grid, we can render non-linearly deformed ghosts that feel natural.

29.

Occlusion Process When the Light Source Is Out of View (1/2)
[Figure: light source outside the screen edge; occlusion evaluated against the light source's shadow map instead of the camera depth buffer]
• The flare cannot be rendered when the light source is outside of screen space, because occlusion relies on the depth buffer
• Occlusion is evaluated from the shadow map and the camera coordinate

From here on, I would like to introduce our occlusion process and our optimizations. First, let's talk about the case where the light source is out of view. In screen space, you fundamentally use the camera depth buffer to apply occlusion, but when the light source is outside of screen space, you can't produce a flare that way. This is where we evaluate occlusion by comparing depths between the camera coordinate and the shadow map produced by the light source.

30.

Occlusion Process When the Light Source Is Out of View (2/2)
[Figure: sample disk of radius 𝑟 around the camera position in the shadow map; 𝑟_MIN/𝑟 is small when occluders are near the camera and large when mostly visible (the nearest occluder determines 𝑟_MIN)]
• Comparing only the camera's depth gives a binary occlusion state → a smooth transition is needed
• Prerecord 128 random sample points within a unit circle [including (0, 0)]
• Offset the samples by the camera's position and evaluate occlusion for each sample within the radius 𝑟
• Record the length 𝑟 when the sample is visible, and the length (≤ 𝑟) between the sample and the camera's position when occluded
• Find the smallest recorded length 𝑟_MIN and compute the occlusion rate as 𝑟_MIN/𝑟 (0: occluded, 1: visible)
• Multiply it with the lens flare parameters for size/brightness

However, by simply comparing with the camera's depth, we get a binary occlusion state, and at the boundary the occlusion state of the lens flare changes abruptly. For example, consider wind blowing the leaves of a tree: the occlusion state would often change radically and make the lens flare flicker on screen. To counter this, we came up with a way to smooth the transition between occlusion states. First, we prepare 128 random sample points within a unit circle, including the origin. Then, as you can see in the middle image, we choose a radius 𝑟 within which to evaluate occlusion, multiply our samples by 𝑟, and offset them by the camera's position. After that, we evaluate occlusion for every sample using the depth from the shadow map.

31.

Occlusion Process When the Light Source Is Out of View (2/2, continued)
(Same slide as above.)

The reason to include the origin is so that the occlusion of the camera position itself is also evaluated. We write this occlusion evaluation as a compute shader and execute it as 1 group of 128 threads. Each thread computes the occlusion of one sample: if the sample is visible, it records the length 𝑟; if not, it records the length between the sample and the camera's position. The results are stored in groupshared memory, from which the smallest recorded length 𝑟_MIN is computed and then divided by 𝑟. With this, we get values close to 0 when samples in the vicinity of the camera are occluded and close to 1 when nothing is occluded. Multiplying this with the lens flare parameters for size and brightness makes it possible to render a smooth occlusion transition.
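Putting the pass together, a minimal compute shader sketch might look as follows. All resource and constant names are hypothetical, and the shadow-map projection is simplified (no depth bias, and the sample plane is assumed axis-aligned for brevity):

```hlsl
// Sketch of the smoothing pass: 1 group of 128 threads, one disk sample per
// thread, min-reduction in groupshared memory. Names are hypothetical.
#define SAMPLE_COUNT 128

cbuffer OcclusionParams
{
    float4x4 gLightViewProj; // world -> light clip space
    float3   gCameraPos;     // camera position in world space
    float    gEvalRadius;    // evaluation radius r
};
Texture2D<float>          gShadowMap;
SamplerState              gPointClamp;
StructuredBuffer<float2>  gUnitDiskSamples; // prerecorded, includes (0,0)
RWStructuredBuffer<float> gOcclusionRate;   // single output value

groupshared float gsLen[SAMPLE_COUNT];

bool IsOccluded(float3 worldPos)
{
    // Project into light space and compare depths (simplified; no bias shown).
    float4 lp = mul(float4(worldPos, 1.0), gLightViewProj);
    float2 uv = lp.xy / lp.w * float2(0.5, -0.5) + 0.5;
    return gShadowMap.SampleLevel(gPointClamp, uv, 0) < lp.z / lp.w;
}

[numthreads(SAMPLE_COUNT, 1, 1)]
void CSMain(uint tid : SV_GroupThreadID)
{
    // Scale the unit-disk sample by r and offset it by the camera position.
    float2 offset = gUnitDiskSamples[tid] * gEvalRadius;
    float3 pos    = gCameraPos + float3(offset, 0.0);

    // Visible samples record r; occluded ones record their distance (<= r).
    gsLen[tid] = IsOccluded(pos) ? length(offset) : gEvalRadius;
    GroupMemoryBarrierWithGroupSync();

    // Parallel min-reduction over the 128 recorded lengths to find rMIN.
    for (uint stride = SAMPLE_COUNT / 2; stride > 0; stride >>= 1)
    {
        if (tid < stride) gsLen[tid] = min(gsLen[tid], gsLen[tid + stride]);
        GroupMemoryBarrierWithGroupSync();
    }

    // Occlusion rate rMIN / r: 0 = occluded near the camera, 1 = visible.
    if (tid == 0) gOcclusionRate[0] = gsLen[0] / gEvalRadius;
}
```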

32.

Rendering Problems and Their Mitigation
[Figure: ghost looking unnatural, with visibly separated wavelength bands; the visible range between UV and IR divided into bands 𝜆₀ … 𝜆₅, sampling a different wavelength per frame]
• We can use at most nine wavelengths because of processing load and memory consumption
• Separate the visible light range into several parts
• Sample a different wavelength from each part every frame
• Increase the number of apparent wavelengths by blending rendered images in the time domain
• When the flare moves, a blur effect occurs

In the next section, we describe how we dealt with problems that occurred. We can use at most nine wavelengths from the visible light range because of processing load and memory consumption. Therefore, depending on the ghost, as shown in the upper left corner of the screen, the components of the individual wavelengths are widely separated and look unnatural. So we divide the wavelength band into several sections and sample a different wavelength from each divided region every frame. We can then increase the number of apparent wavelengths by blending the rendered image with previous results, mitigating the unnaturalness of the ghosts. However, a blur effect occurs when the lens flare moves, so we need to consider a way to mitigate that blur. We won't cover the blur mitigation here.
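One simple way to realize the per-frame rotation is sketched below; the exact wavelength range and step counts are our assumptions, not the presented values:

```hlsl
// Sketch: rotate the sampled wavelength inside each sub-band every frame.
// bandCount wavelengths are rendered per frame (<= 9 on the slide); the
// in-band jitter advances with frameIndex so samples accumulate over time.
float SampleWavelength(uint band, uint bandCount, uint frameIndex, uint stepsPerBand)
{
    const float lambdaMin = 380.0, lambdaMax = 730.0; // assumed visible range [nm]
    float bandWidth = (lambdaMax - lambdaMin) / bandCount;
    float t = (frameIndex % stepsPerBand + 0.5) / stepsPerBand; // jitter in band
    return lambdaMin + (band + t) * bandWidth;
}
// Blending each frame's result with the previous ones lets the distinct
// samples accumulate into an apparently continuous spectrum.
```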

33.

Techniques for Acceleration: Introduction
[Figure: transit point 𝑃(𝑥, 𝑦) in the aperture of diameter 𝐷, and the ghost grid before and after culling]
$(U, V) = \left(\frac{x}{D/2}, \frac{y}{D/2}\right)$, $(u, v) = \left(\frac{U+1}{2}, \frac{V+1}{2}\right)$
• The computational complexity is very high because of overdraw from alpha blending
• We can determine whether a point contributes to the drawing of the lens flare from its (𝑢, 𝑣) coordinates
• We can eliminate vertices that do not contribute to the drawing

We present the last item: how to speed up the drawing process. Fundamentally, this method has a high computational complexity because it uses a lot of alpha blending. During the drawing phase, the (𝑈, 𝑉) recorded during ray tracing is converted to (𝑢, 𝑣) for fetching the texture. If a ray is shielded by the aperture blades, then by definition its (𝑈, 𝑉) value represents a point outside the aperture. Thus, the (𝑢, 𝑣) computed from such (𝑈, 𝑉) may indicate a position outside the defined range of the texture. Therefore, we eliminate the non-contributing vertices. In some cases, only a few vertices of the grid have valid (𝑢, 𝑣), so this elimination process can remove a large number of invalid vertices.

34.

Techniques for Acceleration: Processing
[Figure: per-vertex AABB (𝑢_MIN, 𝑣_MIN)–(𝑢_MAX, 𝑣_MAX) tested against the AABB (0, 0)–(1, 1); vertices whose AABB overlaps it are not culled]
Code: if(isCulled) pos.z = asfloat(0xffffffff);
• Compute the AABB constructed from the vertices surrounding the vertex we want to test for culling
• If the vertex fails the culling test, we set its position to NaN*
* [Citation] "Coordinates coming in to clipping with infinities at x,y,z may or may not result in a discarded primitive. Coordinates with NaN at x,y,z or w coming out of clipping are discarded." (1)

As for the specifics of the process, we focus on a specific ray and generate an AABB from the (𝑢, 𝑣) of the surrounding vertices that share a triangle with it. If this AABB doesn't intersect the AABB constructed from (𝑢, 𝑣) = (0, 0) and (1, 1), we can discard the vertex as unnecessary without losing any vertices that contribute to drawing. In the actual process, we set NaN as the position of each vertex we want to discard, which in turn discards the triangles containing those vertices. In the case of DirectX, for example, this behavior is specified in the DirectX-Specs.
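The overlap test behind that code line, sketched for one vertex (names hypothetical):

```hlsl
// Sketch of the culling test for one vertex.
// uvMin/uvMax: AABB over the uv of this vertex and the surrounding vertices
// that share a triangle with it.
bool IsCulled(float2 uvMin, float2 uvMax)
{
    // Keep the vertex if its neighborhood AABB overlaps the valid texture
    // region [0,1]^2; otherwise no triangle using it can contribute.
    bool overlaps = uvMax.x >= 0.0 && uvMin.x <= 1.0 &&
                    uvMax.y >= 0.0 && uvMin.y <= 1.0;
    return !overlaps;
}

// In the vertex path, a culled vertex gets NaN so clipping discards its triangles:
// if (IsCulled(uvMin, uvMax)) pos.z = asfloat(0xffffffff);
```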

35.

Techniques for Acceleration: Results
We measured execution time on PlayStation 5. The output resolution is 1/4 of 3840 × 2160 pixels (960 × 540). The total vertex count is 16 × 16 vertices × 325 ghosts × 9 wavelengths = 748,800.

Culling OFF [ms]: Occlusion 0.005 | Ray Tracing 0.989 | Drawing Burst 0.007 | Drawing Ghosts 4.332
Culling ON  [ms]: Occlusion 0.005 | Ray Tracing 0.994 | Drawing Burst 0.007 | Drawing Ghosts 0.407 | Culling Process 0.259 (ghosts + culling: 0.666)
→ Drawing ghosts is ×10 faster; ×6.5 faster including the culling process

• During execution, we can take further actions, such as decreasing the number of wavelengths or reducing the draw region, to accelerate execution

We checked the difference in execution time produced by the culling process on PlayStation 5. In this case, approximately 750,000 vertices were used. With culling, drawing the ghosts is about 10 times faster. Even if the culling time is included in the ghost drawing time, execution is still about 6.5 times faster, making this a meaningful optimization. During execution, we can take further actions, such as decreasing the number of wavelengths or the draw region, to accelerate execution.

36.

Conclusion and Future Prospects
[Conclusion]
• We generated plausible wave distribution images using numerical propagation methods
• We obtained ghost images with natural deformation using ray tracing
• We proposed one solution for occlusion when the light source is out of sight
• When dispersion is significant, the ghosts appear separated → we increase the number of apparent wavelengths by blending in the time domain
• When simply using Fraunhofer diffraction to compute the starburst, we cannot achieve asymmetry
• Since we cannot obtain the positions of light sources in screen space, it is difficult to deal with many light sources
• Much information about lenses is confidential
[Future Prospects]
• Rendering with fewer grid divisions by focusing rays toward the periphery of the aperture
• Dealing with many light sources

Let's conclude this presentation. We drew lens flares with plausible visuals using numerical propagation and ray tracing, and handled occlusion when the light source is out of sight. As for problems, first we had the issue that we can't generate an asymmetric starburst from an odd number of aperture blades using the Fraunhofer diffraction method; however, we can achieve asymmetric bursts by deforming the texture. Another problem was that we cannot accommodate a large number of light sources, due to the inability to compute light source positions in screen space.

37.

Conclusion and Future Prospects (continued)
(Same slide as above.)

Also, there was the issue that a lot of information about lenses is confidential. In the future, we want to reduce the rendering time of ghosts by using a grid with fewer divisions and by focusing rays into a tighter region around the periphery of the aperture. In addition, we also have to consider a solution for emitting lens flares from many light sources. That's all for my presentation.

38.

Citation
(1) Direct3D 11.3 Functional Specification, "15.4 Clipping," April 23, 2015. https://microsoft.github.io/DirectX-Specs/d3d/archive/D3D11_3_FunctionalSpec.htm#15.4%20Clipping (accessed 08/29/2023)

References
[1] Hullin, M., Eisemann, E., Seidel, H.-P., Lee, S., "Physically-based real-time lens flare rendering," ACM Trans. Graph. 30(4), 108:1–108:10 (2011). DOI 10.1145/2010324.1965003.
[2] K. Matsushima, T. Shimobaba, "Band-Limited Angular Spectrum Method for Numerical Simulation of Free-Space Propagation in Far and Near Fields," Opt. Express 17(22), 19662–19673 (2009).