Introduction to Editor Functions and Workflows that Support High-Quality Effects


November 27, 2023

Slide Overview

■Overview
Afterimages that appear when you swing a sword, exploding flames that appear when you destroy an oil drum, the Hadoken that manifests when you concentrate your Ki...

Using the Effects Editor in RE ENGINE, we will introduce asset creation, the functions provided, profiling tools, and actual title workflows.

Note: This is the contents of the publicly available CAPCOM Open Conference Professional RE:2023 videos, converted to slideshows, with some minor modifications.

■Prerequisites
For artists interested in creating effects, and programmers interested in interfacing with them.

I'll show you just a little bit of the content!
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
CAPCOM Open Conference Professional RE:2023
https://www.capcom-games.com/coc/2023/

Check the official Twitter for the latest information on CAPCOM R&D!
https://twitter.com/capcom_randd
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━


Text of each page
1.

Introduction to Editor Functions and Workflows that Support High-Quality Effects I will now give an introduction to editor functions and workflows that support high-quality effects. ©CAPCOM 1

2.

Introduction of Effects Functions ・Explanation of Editor Functions ・Explanation of Expression Features and Results ・Title Integration Workflow ・Optimization Support The agenda for the presentation is as follows: - Explanation of editor functions. - Explanation of expression features and results. - Title integration workflow. - Optimization support. I'll be going over these four topics. ©CAPCOM 2

3.

Explanation of Editor Functions This section explains how effect assets are created with the editor functions. ©CAPCOM 3

4.

Explanation of Editor Functions This is the effects editor that runs on RE ENGINE. Effects assets that shape the expressions are created in this editor. - The area in the middle of the screen is the editor that manages the nodes. - The area at the top of the screen has the editor's various commands and settings. - The area on the left side of the screen is a catalog of functions needed to create an expression. - The area at the bottom of the screen is an editor that manages some parameters on a timeline. ©CAPCOM 4

5.

Explanation of Editor Functions Dragging the Emitter node item from the catalog lets us add its functionality. "Item" is how we refer to functionalities used to create an effects expression. ©CAPCOM 5

6.

Explanation of Editor Functions The basic functions that make up an expression are explained here. Transform is an item that controls the position and orientation of the Emitter. We can define local transformations in relation to the global transform, Game Objects, Joints, etc. The coordinates, rotation, and scaling of a Transform can be controlled with the manipulator. Parent Options are used to follow the aforementioned Game Object or Joint's position. ©CAPCOM 6

7.

Explanation of Editor Functions The Spawn item configures particle generation. You can set the particle buffer, number of spawns, spawn interval, and number of loops. The Life item sets the particle survival time. There are three survival time intervals: Appear, Keep, and Vanish. Each of these intervals affects the alpha value. ©CAPCOM 7
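To make the Appear/Keep/Vanish behavior concrete, here is a minimal sketch of how the three intervals could drive a particle's alpha over its lifetime. This is an illustration only; the function and parameter names are assumptions, not RE ENGINE's API.

```csharp
using System;

static class ParticleLife
{
    // Maps a particle's age onto an alpha value using the three intervals
    // described above: Appear (fade in), Keep (hold), Vanish (fade out).
    public static float Alpha(float age, float appear, float keep, float vanish)
    {
        if (age < appear)
            return appear > 0 ? age / appear : 1f;   // fading in
        float t = age - appear - keep;
        if (t < 0)
            return 1f;                               // fully visible
        return vanish > 0
            ? Math.Clamp(1f - t / vanish, 0f, 1f)    // fading out
            : 0f;
    }
}
```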

8.

Explanation of Editor Functions The EmitterShape3D item is used to set the particle generation range. Three shapes are available: Cylinder, Box, and Sphere. For basic particle generation, this is sufficient, but items that generate geometric shapes and ranges are available separately. ©CAPCOM 8

9.

Explanation of Editor Functions The Velocity3D item configures particle movement. It can be set to move particles in a simple direction. You can also set the way particles are generated in conjunction with the EmitterShape3D shape described above. The shape of the emitter and the movement pattern are closely related, and the system is designed so that the two work in tandem. ©CAPCOM 9

10.

Explanation of Editor Functions The TypeBillboard item is used to configure settings related to drawing. There are multiple BlendTypes. They can be switched between depending on the desired result. The image on the left is an additive blend using AddContrast. The image on the right is an EdgeBlend that blends contours with a specific color. EdgeBlend was implemented for a specific expression that artists wanted to achieve. ©CAPCOM 10

11.

Explanation of Editor Functions However, the artist may not be satisfied with only the available BlendType expressions. Therefore, a TypeBillboard3DMaterial item is also implemented to draw with shaders and materials created by the artist. In addition to billboards, there are many other items that can be rendered with materials. ©CAPCOM 11

12.

Explanation of Editor Functions UVSequence items are used to configure settings related to Texture and Pattern Animation. There is a separate asset called UVSequenceAsset that manages texture and UV patterns. ©CAPCOM 12

13.

Explanation of Editor Functions When a fireball falls to the ground, it sparks. The "Action" function exists to realize such expressions. Action is a function that creates an Emitter from a Particle. This is realized by connecting a specific item to the Action node and receiving a notification. PtLife notifies Action when a particle's life transitions or when it disappears. This is used for timed expressions such as fireworks. PtColliderAction notifies Action when a particle collides with a terrain collision mesh. This is used for interactive expressions such as attacks. ©CAPCOM 13
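As a rough illustration of this notification flow, the sketch below models PtLife/PtColliderAction-style notifications with plain C# events, and an Action that responds by spawning a child emitter. All class and member names are invented; the engine's real mechanism is not public.

```csharp
using System;

// Items such as PtLife or PtColliderAction would raise these notifications.
class ParticleEvents
{
    public event Action<int>? LifeExpired;     // particle index whose life ended
    public event Action<int>? CollidedTerrain; // particle index that hit terrain

    public void NotifyLifeExpired(int particle) => LifeExpired?.Invoke(particle);
    public void NotifyCollision(int particle) => CollidedTerrain?.Invoke(particle);
}

// An Action node connected to those items spawns a new Emitter on notification.
class SparkAction
{
    public SparkAction(ParticleEvents events)
    {
        events.LifeExpired += SpawnEmitterAt;     // timed, e.g. fireworks
        events.CollidedTerrain += SpawnEmitterAt; // interactive, e.g. impacts
    }

    void SpawnEmitterAt(int particle)
    {
        Console.WriteLine($"Spawn child emitter at particle {particle}");
    }
}
```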

14.

Explanation of Expression Features and Results That concludes the overview of the basic functions. Next, I will explain some effects actually used in game titles and the functions of the items that realize these effects. ©CAPCOM 14

15.

Please take a look at this first. This is the effect of Ryu's Hadoken in Street Fighter 6. The effect is a combination of buildup, launch, flight, and impact, each of which uses a different effect asset. ©CAPCOM 15

16.

Explanation of Expression Features and Results The Hadoken effect is composed of multiple effect assets. What I have shown you is only a part of the effect expression. I will explain the internal structure of the effect assets. ©CAPCOM 16

17.

Explanation of Expression Features and Results Here is the overall structure of the Hadoken firing and main body effect assets. A large number of emitters are required just to create some of the expressions. The following is a brief description of some of the Emitters. ©CAPCOM 17

18.

Explanation of Expression Features and Results This is the part that handles the effect on the hands. In order to control detailed, non-random movements, we use an item called PtTransform3D. This item allows you to control the Transform values set in the timeline on a per-particle basis. An item called TypeMesh is used to draw the mesh. ©CAPCOM 18

19.

Explanation of Expression Features and Results Some parameters from the material set in TypeMesh are implemented such that they are accessible via reflection. There are many cases where you want to change how something is expressed only when the effect is drawn. With this function, it is possible to adjust certain parameters only when drawing, from the effect side. ©CAPCOM 19
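A minimal sketch of the idea, using standard .NET reflection to set a named material parameter only at draw time. The Material type and the parameter name here are hypothetical stand-ins, not RE ENGINE's actual classes.

```csharp
using System.Reflection;

// Stand-in for a material whose parameters are exposed via reflection.
class Material
{
    public float EmissiveIntensity { get; set; } = 1.0f;
}

static class DrawTimeOverride
{
    // Look up a parameter by name and override it just for this draw.
    public static void Set(object material, string parameter, object value)
    {
        PropertyInfo? p = material.GetType().GetProperty(
            parameter, BindingFlags.Public | BindingFlags.Instance);
        p?.SetValue(material, value);
    }
}

// Usage: boost emissive only while the effect draws this mesh.
// DrawTimeOverride.Set(material, "EmissiveIntensity", 3.0f);
```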

20.

Explanation of Expression Features and Results This is the expression for the front, which is the core part. It uses an item called FadeByAngle that fades depending on the angle. Many expressions look wrong when viewed from certain angles. To solve this problem, it is possible to fade out depending on the angle. The fade reduces the alpha at angles away from the frontal plane. This means that the effects visible in the image are expressions created for the front view. ©CAPCOM 20
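As an illustration of the underlying math, here is a hypothetical FadeByAngle-style falloff: alpha stays full within a small angle of the effect's front axis and fades to zero beyond a wider angle. The threshold values and names are made up for this sketch.

```csharp
using System;
using System.Numerics;

static class FadeByAngleSketch
{
    // Full alpha within fullAngleDeg of the front axis, fading linearly
    // to zero at zeroAngleDeg. Angles are in degrees.
    public static float Alpha(Vector3 viewDir, Vector3 effectForward,
                              float fullAngleDeg = 20f, float zeroAngleDeg = 60f)
    {
        float cos = Vector3.Dot(Vector3.Normalize(viewDir),
                                Vector3.Normalize(effectForward));
        float angle = MathF.Acos(Math.Clamp(cos, -1f, 1f)) * 180f / MathF.PI;
        return Math.Clamp((zeroAngleDeg - angle) / (zeroAngleDeg - fullAngleDeg),
                          0f, 1f);
    }
}
```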

21.

Explanation of Expression Features and Results Here is another representation of the core area. This one is also set up so that some particles fade out at an angle. In addition to fading by angle, there are other things to adjust. For example, you may want to ignore tone mapping, fade polygon edges, etc. Of course, there are functions to do this. You can use the ShaderSettings item to apply Soft Particle and Detonemap. ShaderSettings is an item that adjusts drawing parameters in the drawing pipeline other than materials. ShaderSettings can also be used to adjust settings such as Particle Lighting and Variable Rate Shading, letting artists easily create rich results. This was an example of how the various Emitters and items work together to create an effect. ©CAPCOM 21

22.

Explanation of Expression Features and Results Next, I will introduce a feature called StretchBlur implemented in Street Fighter 6. Please watch this video first. The blur expression preserves the detail of the attack area. This StretchBlur allows for expressions that could not be achieved with normal motion blur. ©CAPCOM 22

23.

Explanation of Expression Features and Results Blur sampling range StretchBlur is an item that lets you specify a blur sampling region in specific areas, as shown in the image, and then map the blur results to billboards and trails. The blur sampling range of the foot is shown in the debug display. This feature allows blur results that carry part of the costume's colors into the distortion and compositing. ©CAPCOM 23

24.

Explanation of Expression Features and Results Here is a clearer view of the debug display. The rectangular area shown surrounding the character is the range of blur sampling. The blur sampling results are mapped to multiple billboards to express afterimages. ©CAPCOM 24

25.

Explanation of Expression Features and Results However, with such a large area, the background color is also sampled. Therefore, stencils are used to mask and sample only specific drawing targets. The green debug display target seen on the right side of the image is drawn with stencil #9, and StretchBlur samples pixels from stencil #9 in the image area. ©CAPCOM 25

26.

Explanation of Expression Features and Results Here, too, only the pixels of the legs are sampled by the stencil. The afterimage of Chun-Li's Lightning Kicks is represented by sampling multiple StretchBlurs at the base of the legs and mapping them to polygons. ©CAPCOM 26

27.

Please take a look at this: This is the expression of an explosion caused by a fireball in RE:4. The large scale explosions and rubble expressions create a powerful effect. However, the scale of assets created for such a high-impact expression is inevitably large. ©CAPCOM 27

28.

Explanation of Expression Features and Results This is the overall composition of the explosion expression. It covers the impact and rubble at the moment of the explosion, plus secondary effects in the form of sparks, flames, smoke, etc. To control the expression on this scale and in such detail, the number of Emitters to be processed is correspondingly large. ©CAPCOM 28

29.

Explanation of Expression Features and Results [Diagram: Emitters from two EffectAssets (Emitter0-7) distributed across Thread0-Thread3] In order to process such a large number of Emitters, RE ENGINE's effects system performs parallel processing on an Emitter-by-Emitter basis. All Emitters, regardless of which effect asset they belong to, are registered in a parallel processing job and consumed in turn. This prevents problems such as processing stalls when the number of Emitters is large. ©CAPCOM 29
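The scheduling idea can be sketched with .NET's task parallelism: every Emitter, regardless of which EffectAsset owns it, goes into one flat list that worker threads consume. This is an analogy, not RE ENGINE's actual job system; the types are invented.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Stand-in for a single emitter's per-frame simulation work.
class Emitter
{
    public void Update(float dt) { /* simulate this emitter's particles */ }
}

static class EffectScheduler
{
    // Workers pull emitters from one flat list, so one asset with many
    // emitters cannot stall the others.
    public static void UpdateAll(IReadOnlyList<Emitter> allEmitters, float dt)
    {
        Parallel.For(0, allEmitters.Count, i => allEmitters[i].Update(dt));
    }
}
```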

30.

Explanation of Expression Features and Results [Images: No Emitter Priority vs. Emitter Priority] Also, when drawing such a large number of Emitters, depending on the expression, the position of the camera may cause an undesirable rendering order. To prevent this, an item called EmitterPriority exists. This item can be set in the Root node to control drawing priority according to the Emitter index order. You can use this item when you want to avoid the inconvenience of automatic sorting by camera distance. ©CAPCOM 30
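A small sketch of the two sorting modes, assuming a draw-call list with per-emitter fields. The field names are invented; only the comparison logic is the point.

```csharp
using System.Collections.Generic;

// Hypothetical per-emitter draw entry.
class EmitterDrawCall
{
    public int EmitterIndex;     // order within the Root node
    public int Priority;         // optional EmitterPriority value
    public float CameraDistance; // used by the default sorting
}

static class DrawSorter
{
    public static void Sort(List<EmitterDrawCall> calls, bool usePriority)
    {
        if (usePriority)
            // Explicit, artist-controlled order: priority, then emitter index.
            calls.Sort((a, b) => a.Priority != b.Priority
                ? a.Priority.CompareTo(b.Priority)
                : a.EmitterIndex.CompareTo(b.EmitterIndex));
        else
            // Default: back-to-front by camera distance.
            calls.Sort((a, b) => b.CameraDistance.CompareTo(a.CameraDistance));
    }
}
```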

31.

Please watch this next. This is the effect expressed when the sisters move in RE VILLAGE. A large number of insects are generated from the sisters' bodies. The insects are always clinging to the sisters' bodies at a certain distance. As they move, a large number of insects follow them. Several items are used for the behavior of these insects. ©CAPCOM 31

32.

Here is a video of it running in the editor for clarity. In order to update and draw the behavior of a large number of insects, GPU Particles and mesh drawing with instancing are used. ©CAPCOM 32

33.

Explanation of Expression Features and Results The item called TypeGpuMesh is a feature that controls and instances a mesh with GPU Particles. For clarity, a white sphere mesh is used. The behavior of the insects is controlled by Velocity3D, Attractor, VectorField, and so on. The Attractor item is a function that converges particles to a specific location. This function makes it possible to move a large number of insects organically. If Attractor is not used, the GPU Particles will be scattered in all directions without convergence, as shown in the image on the right. ©CAPCOM 33
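A minimal sketch of an Attractor-style update, assuming a per-step velocity adjustment toward a convergence point. The parameter names and the simple integration scheme are illustrative, not the engine's implementation.

```csharp
using System.Numerics;

static class AttractorSketch
{
    // Each step, bend the particle's velocity toward the convergence target.
    public static Vector3 Apply(Vector3 position, Vector3 velocity,
                                Vector3 target, float strength, float dt)
    {
        Vector3 toTarget = target - position;
        if (toTarget.LengthSquared() > 1e-6f)
            velocity += Vector3.Normalize(toTarget) * strength * dt;
        return velocity;
    }
}
// Without this pull, integrating Velocity3D alone scatters particles
// outward without convergence, as in the right-hand image.
```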

34.

Next we introduce an item called ScreenSpaceEmitter. Please see here first. This is an effect in a cutscene from VILLAGE. You can see that fine particles are generated from the letters on the gate. ©CAPCOM 34

35.

Explanation of Expression Features and Results "I want to control the range of occurrence from the drawing result." ScreenSpaceEmitter was implemented based on this desire. This item controls particle generation from the pixel information drawn on the screen and reflects its color. 34 It can be used by simply adding it to Emitter as an item. However, since this is a GPU-driven function, it can only be used with GPU Particles. In this example, the TypeGpuBillboard item functions as a GPU Particle. ©CAPCOM 35

36.

Explanation of Expression Features and Results [Diagram: pixel position output → Pixel Buffer → color fetch → GPU Particle generation → GPU Particle Buffer] Let me explain in more detail. First, the pixel position information is output from the Pixel Shader of the mesh drawn in the G-Buffer to a dedicated Pixel Buffer. The shader graph editor, which constitutes the material, has a node dedicated to writing to the Pixel Buffer. The maximum number of particles that can be generated by ScreenSpaceEmitter is limited, so pixels are culled to a certain extent before being exported to the Pixel Buffer. The pixel spacing is adjusted according to the resolution. When GPU Particles are generated, information is fetched from the Pixel Buffer to control the position of the generated particles. The GPU Particle can also take on the screen's color by sampling the screen image that has already been drawn. ©CAPCOM 36
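As a back-of-envelope illustration of that culling step, here is one way a pixel stride could be derived from the resolution and the buffer's capacity. The actual heuristic in the engine is not public; this is purely an assumption.

```csharp
using System;

static class PixelBufferCulling
{
    // Keep roughly every Nth pixel in each axis so that
    // (width/stride) * (height/stride) <= maxEntries.
    public static int Stride(int width, int height, int maxEntries)
    {
        double total = (double)width * height;
        return Math.Max(1, (int)Math.Ceiling(Math.Sqrt(total / maxEntries)));
    }
}

// e.g. 1920x1080 with a 65,536-entry buffer gives stride 6, so only pixels
// on a 6-pixel grid are exported (320 * 180 = 57,600 entries).
```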

37.

Explanation of Expression Features and Results This is how various items and editor functions are utilized to create a variety of effects. RE ENGINE's effects are being enhanced and technological research is being conducted on a daily basis. Next, we will introduce the workflow of how the effect assets are implemented in the title. ©CAPCOM 37

38.

Title Integration Workflow Now, let me introduce you to the title integration workflow. Now that you know how effects are created, let's see how the created effects are placed in the actual game scene. ©CAPCOM 38

39.

Main ways to call effects Effects are used by adding an EffectPlayer component to a Game Object in a scene. There are several ways to register/build them, but the following is an introduction to the methods most commonly used by titles. ©CAPCOM 39

40.

Main ways to call effects Direct Scene Placement The first is to place it directly in the scene as I just showed you. Usually this is how background effects like smoke or fog are placed. In an impressive scene, fire and luminous objects might be set like this too. ©CAPCOM 40

41.

Main ways to call effects Direct Scene Placement C# Script (Resource≒Converted Asset) The next method is to place it directly via a C# script. It is very simple code, doing basically what we just did with the tool: create a Game Object, add an Effect Player, and then set the asset, just as we did before. C# script control is, of course, the most flexible because it can be programmed any way you like. ©CAPCOM 41
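Since RE ENGINE's API is not public, here is a hypothetical sketch of what that "very simple code" could look like. All types below are minimal stand-ins defined just for this snippet; treat every name as an assumption, not the engine's real classes.

```csharp
using System.Collections.Generic;

// Minimal stand-ins so the three-step flow from the slide is visible.
class EffectAsset { }
class EffectPlayer { public EffectAsset? Asset; public void Play() { } }
class GameObject
{
    public readonly List<object> Components = new();
    public T AddComponent<T>() where T : new()
    {
        var component = new T();
        Components.Add(component);
        return component;
    }
}

static class EffectSpawnExample
{
    // Create a Game Object, add an Effect Player, set the converted asset, play.
    public static GameObject Spawn(EffectAsset asset)
    {
        var go = new GameObject();
        var player = go.AddComponent<EffectPlayer>();
        player.Asset = asset;
        player.Play();
        return go;
    }
}
```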

42.

Main ways to call effects Direct Scene Placement C# Script Timeline Linkage Another common method is to set up a timeline linkage so that an effect is triggered at a specific timing. For example, a specific timing in a character's animation, or a specific time in a cutscene. ©CAPCOM 42

43.

Main ways to call effects Scene Direct Scene Placement Game Object Set! Transform Effect Player C# Script Other scripts... Timeline Linkage Game Object Game Object Internally, the same thing happens again: A Game Object is created and an Effect Player is set. The artist or programmer can place it in whatever way or with whatever tool is easiest for them. ©CAPCOM 43

44.

Disadvantages of C# script calls The person in charge goes back and forth depending on the case. Request an effect Create the effect Call the effect Request a length adjustment Adjust frame count Adjust appearance Request appearance adjustment In this context, C# scripting is indeed flexible, but... It is not iterative from the game designer's or artist's point of view, as it is programmer-dependent. For example, a game designer requests an effect and asks the programmer to call it once it has been created. However, when they actually see it in game, they may find that they want to extend the length or shift the timing. In this case, the programmer has to intervene each time. The tasks can only be performed in series, which is inefficient. ©CAPCOM 44

45.

Solution Game Object Direct Scene Placement Transform C# Script Effect Player C# Script EPV Other scripts... (EPV = Effect Provider) Timeline Linkage There is a tool called EPV to solve this problem. On the C# script side, it performs execution with the ID as an argument. The artist can flexibly set the effect asset, timing of occurrence, direction, bone reference settings, etc. for that ID. ©CAPCOM 45
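A sketch of the division of labor EPV enables: gameplay code only fires an ID, while what that ID plays (asset, delay, bone) is data the artist edits in the tool. All classes here are invented to show the shape of the idea, not the real EPV API.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entry an artist edits in the EPV tool for one ID.
class EpvEntry
{
    public string EffectAssetPath = "";
    public float DelaySeconds;     // artist-tunable timing
    public string AttachBone = ""; // e.g. follow a character bone
}

class EffectProvider
{
    readonly Dictionary<int, EpvEntry> table = new(); // filled from EPV data

    public void Register(int id, EpvEntry entry) => table[id] = entry;

    // The only call the gameplay code needs: play by ID.
    public void Play(int id)
    {
        if (table.TryGetValue(id, out var e))
            Console.WriteLine(
                $"Play {e.EffectAssetPath} after {e.DelaySeconds}s on '{e.AttachBone}'");
    }
}
```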

46.

EPV (Effect Provider) The EPV tool looks like this. Let's take a quick look at the EPV for Ryu's Shin Shoryuken from Street Fighter 6. Even for a single technique, the effect settings are divided into multiple elements and stages. The "Jaw hit in front of the camera" is an effect that is adjusted to follow the camera. If you change the angle a little, it looks like this: If you look at the item, you can see the ID that I mentioned earlier. This is the ID issued by the programmer or specified by the artist. Once you make the effect appear at a specific timing, the rest can follow the camera, follow the character's bones, and so on. The result is a highly iterative process in which detailed settings can be made for a number of elements. ©CAPCOM 46

47.

EPV (Effect Provider) I've issued an ID, so I'll set it to be called at the designated time. I'd like to add a little delay. I want the direction to be more that way. Parameter Setting Effect designation It needs a little something at the point where the foot impacts the ground. In this way, game designers, programmers, and artists work in parallel, thus spreading the workload. This allows for a very high degree of iteration. ©CAPCOM 47

48.

Further Expansion Game Object Direct Scene Placement Transform C# Script Effect Player EPV Other scripts... Timeline Linkage EPV Helper Incidentally, EPV was so useful that we've also added EPV Helper, which allows EPV use on the timeline. This provides very flexible and detailed control of effect behavior without the need for any programmer intervention. ©CAPCOM 48

49.

I want finer control! I've focused on how to "create" effects; however, it is often the case that after an effect has been created, you want to control it in a more detailed manner. For example, you may want to change the number of combos, the stage of the attack, or the color or momentum of the effect based on game progress, etc. To achieve this, effects have two features: Extern Parameter and Expression. ©CAPCOM 49

50.

Extern Parameter & Expression The Extern Parameter is added as one of the nodes in the Effects Editor. Just by placing it like this, it can be accessed from C# scripts by specifying a string. 49 In addition to raw values, "expressions" can be entered in the Effects Editor and used there to reflect them in the Effects. Extern Parameters can also be manipulated in various tools other than C# scripts, such as EPV and timeline control, etc., which we introduced earlier. ©CAPCOM 50

51.

I want to control C# scripts from effects! I'd love to put a blur on it, but... I'd need to set it up from a C# script. Earlier we were controlling the effect from a C# script. Now we've come full circle, and there is a desire to control C# scripts from effects. RE ENGINE answers that request as well. Effects have an item called PtBehavior, which can be used from C# scripts by extending a dedicated class. In this example, it is linked to the C# script that controls the radial blur. Just by calling it, you can apply blur around the effect. If necessary, you can also get values such as particle life, so you can control the lifespan on the effect side. ©CAPCOM 51
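The extension point might look something like the sketch below: a user subclass that the effect system calls back into, here driving a radial blur strength from particle life. The base class and its method are invented for illustration; only the "effect drives a C# script" direction comes from the slide.

```csharp
using System;
using System.Numerics;

// Hypothetical base class the effect system would call into.
abstract class PtBehaviorBase
{
    // Called by the effect while the owning particle is alive.
    // particleLife01: normalized life, 0 at spawn, 1 at death.
    public abstract void OnUpdate(float particleLife01, Vector3 position);
}

class RadialBlurBehavior : PtBehaviorBase
{
    public override void OnUpdate(float particleLife01, Vector3 position)
    {
        // Fade the blur out over the particle's lifetime.
        float strength = 1.0f - particleLife01;
        Console.WriteLine($"Radial blur {strength:0.00} centered at {position}");
    }
}
```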

52.

PtBehavior with Vector Field Note the cherry blossoms on the ground Other effects have the ability to reference a Vector Field, to make wind blow on a target effect. If we look at Street Fighter 6 as a reference, a Vector Field is set to interact with petals on the floor. However, this Vector Field must also be set in the scene from a C# script. And setting it in C# script code every time is tedious. So, we set it from PtBehavior instead to achieve the effect. ©CAPCOM 52

53.

Optimization Support Now that we have introduced the process of how effects appear in the game world... The next section discusses the finishing touches that ensure the effects of the game world reach you in the real world, so that you can play the game comfortably. ©CAPCOM 53

54.

Effects prone to high rendering load [Images: frame-time comparison, 32ms vs. 16ms] Creating effects is also a battle against processing and rendering loads. No matter how cool the effect is, if the rendering load makes the frame rate drop significantly, you have no choice but to cut it back. And often, flashy and cool effects tend to be processor hungry. ©CAPCOM 54

55.

Late to the table [Diagram: frame budget already claimed by Background, Cutscene (animation, camera movement), and Character (model, motion) before Effects Creation: "5ms left" / "I'm using 3ms of that" / "2ms of that is mine"] Because of scheduling, asset interdependency, and so on, by the time effects are implemented, it's often the case that there's basically no frame budget left. This is especially true for lighting translucent effects, where shading costs are very high. Adding fog to a load-heavy, complex stage can easily overflow the processing time. In such a situation, of course, the battle for optimization is brutal and endless. ©CAPCOM 55

56.

Effect Profiler We at the RE ENGINE Effects Unit understand this situation, so we have prepared various processing profiling tools to both optimize and predict issues in advance. One such tool is the Effect Profiler. This is a tool that allows you to see a list of effects currently running in the scene. If the frame rate drops at a particular timing, this tool will show you at a glance what is taking up the frame budget! ©CAPCOM 56

57.

Provided as a basic function of the Effects Editor However, it is honestly annoying to work with the profiler open every time. We understand that, so we have simple profiling functions in various locations so that you can work while keeping an eye on your processing load. In particular, in the Effects Editor you can see the CPU load, GPU load, memory load, etc., every time you play back an effect. This allows you to create effects while keeping an eye on the processing load. ©CAPCOM 57

58.

Indication of high-load effect functions Very high load functions Fairly high load functions Functions with higher load depending on input value In addition, the item list and each item's parameters have warning icons regarding load. This makes it easy to review the parameters when you feel that something is slow. It also eases the psychological anxiety of artists who would otherwise be unsure whether a function is safe to use freely. ©CAPCOM 58

59.

Optimization Support Even so, it is not always possible to optimize on the title side alone. In such cases, there is extensive support from each unit of RE ENGINE. We also perform more detailed profiling and optimization at the engine level. Because of the know-how we have cultivated through multiple titles to date, we know how to profile for artists, what situations are load-intensive, and how to deal with them, and we have documented and shared these research methods. Thanks to this, it is possible to control the drawing load at an earlier stage and create higher quality effects. This is the reason why we are able to deliver games that can be played comfortably at stable frame rates. ©CAPCOM 59

60.

Thank you for your attention. This is an introduction to "Editor Functions and Workflow for High-Quality Effects." Thank you for your attention. ©CAPCOM 60