Creating a Real-Time, In-Engine Painting Tool

November 27, 2023

Slide Summary

■Overview
General-purpose texture painting sounds very convenient to have, but in reality it is a tool that requires a sense of balance: there are many areas it can improve, but many of them can also be covered by DCC software.

Using RE ENGINE, we will answer the question, "How does texture painting work within an in-house engine?"

Note: This is the content of the publicly available CAPCOM Open Conference Professional RE:2023 videos, converted to slide form, with some minor modifications.

■Prerequisites
Assumes basic knowledge about shader states.

I'll show you just a little bit of the content!
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
CAPCOM Open Conference Professional RE:2023
https://www.capcom-games.com/coc/2023/

Check the official Twitter for the latest information on CAPCOM R&D!
https://twitter.com/capcom_randd
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━


Text of Each Page
1.

Creating a Real-Time, In-Engine Painting Tool ©CAPCOM 1

2.

Introduction A general-purpose texture painting tool • Visual adjustments can be made much more efficiently The ability to paint textures inside the game engine has a variety of benefits. For example, let's say you have laid out the ground like this. What if we could seamlessly paint over it, even painting different materials? The game engine is the one and only development environment where you can see the final visuals of your game. Being able to paint in it is of great importance, especially when you are putting detailed touches on scenery.

3.

Introduction An essential perspective in creating texture painting with the engine: Textures can be created outside of the engine There are many DCC software options to try out before coming to the conclusion that painting with the engine is the way to go The goal for in-engine painting tools is more than just being able to create high-quality textures However, there is one perspective that is essential in creating such a tool. That is! ...that textures can be created without the engine. It may seem obvious, but it's a fact that there were many DCC software options available before we came to the conclusion that painting with the engine was the way to go. With the plethora of designer tools available today, there is no shortage of options if you just want to create rich textures. Therefore, when creating a paint tool for an engine, the functionality must be focused on the benefits of running it in that environment.

4.

Introduction In this lecture... We will explain • What we have achieved • How we planned for it using RE ENGINE's texture painting tool as an example The star of the show • General-purpose texture painting tool for RE ENGINE • Prototyping period: approx. 2 months In this presentation, I will talk about the features of the texture painting tool in RE ENGINE based on examples of its use, and the implementation of the design to realize these features. I hope to share our knowledge on these two topics. Before that, I would like to give a brief introduction of the star of the show. This is RE ENGINE's general-purpose texture painting tool, the basic design of which was developed in a period of only two months. Of course, we have continued to update it over time, so take that development period with a grain of salt.

5.

What Painting in RE ENGINE Achieves In this section, as a casual topic, we would like to share what in-engine painting can do by introducing the functions provided by this tool.

6.

Painting Straight into the Final Product Versatility to start editing anything, anytime • Live painting can be performed regardless of scene playback • No special setup required First of all, as mentioned earlier, the ability to check the final look is a feature that we are particularly emphasizing in this tool. For example, the video I showed you earlier was actually a video of painting while the main game was playing. The tool is sandboxed so that it can work robustly in a variety of environments. In addition, the tool requires no pre-setup for editing, so you can edit and paint whenever you feel like it, whether the scene is playing or not. Once editing is complete, the painted texture is saved and can be viewed as a static texture in the running scene. This is a very good match with RE ENGINE's Rapid Iteration focus, and is a key point that made it possible to realize a very advanced live painting function.

7.

Painting Straight into the Final Product The more complex a shader is, the more useful it is to be able to iterate on the engine Layer composition by channel color Painting flow maps Blending using noise weights The benefits of being able to paint while checking how the final product will look are not limited to being able to paint during game play. Other feature examples include flow mapping, blending using noise weights, and so on. The more complex a shader becomes, the more areas you will need to check and adjust when debugging or modifying it. Even a very efficient external editing environment is still external. The more you iterate, the more valuable it is to be able to paint on the engine.

8.

Painting Straight into the Final Product Simultaneous editing of multiple textures and materials is also possible Based on these features, the tool has also been enhanced to work in conjunction with the material editor, enabling cross-sectional editing of multiple textures and materials. As mentioned above, the game object being edited is sandboxed, so changes to material information can be immediately reflected on the runtime scene. This creates a non-disruptive, uninterrupted editing sequence that actually adjusts material parameters, modifies textures, and even paints the modified textures.

9.

Painting Outside the Box Support for the wide range of meshes that will be found in production Skinning meshes Secondary UVs Meshlets In addition, if we're going to paint in production, we need to be able to handle the kinds of meshes used there. For example, it would be convenient to paint on the skinned meshes of set pieces and other positioned objects. In addition, if the shaders are complex, there may be a desire to paint on secondary UVs. Also, times have changed, and the current trend is toward meshlets. This tool supports all these mesh formats, and is basically ready to paint without any restrictions.

10.

Painting Outside the Box Switchable brush alignment according to target and purpose UV Space Screen Space Object Space In addition to the mesh format, of course, there is also a wide variety of shapes and sizes in the final product. For this reason, a wide range of brush alignment options are available for different purposes. For example, some can be applied directly to the UVs, others to the screen space and then projected onto the object, and still others can be applied to the object space. All of these can be used with any mesh format.

11.

Painting Outside the Box Supports all compositing of paints in any combination For example, even in the case of Flow Map Blending Planes Complex curved geometry In addition, all of the paint compositing algorithms are independent of these format and alignment variations. This applies even in the case of special blend modes such as flow maps, where the same algorithm can be applied whether painting directly onto a texture coordinate system, or applying UV expansion to a complex surface. The emphasis is on being a general-purpose painting tool that is not limited by format or the existence of brush alignment.

12.

Working with Extensions Special tooling capabilities are required depending on the material to be implemented Hard to paint IDs with a regular palette... Example: Material ID is managed by color Of course, there are areas that cannot be covered by simple generalization of functions. For example, imagine a material that manages material IDs by the colors of a single texture, as shown. This is very convenient because it allows you to create a variety of textures with a single material. But when you think about using a regular palette to apply the texture IDs... it may be a bit of a difficult working environment. Thus, depending on the material created, it may be necessary to provide specialized functionality for the specific purpose.

13.

Working with Extensions Areas that cannot be handled by simple generalization are covered by integrating with macros Macros: Python-based extension functionality, used not just for batch processing but even for developing plugin tools complete with GUI Using macros to implement a palette extension that allows setting colors by ID In such cases, you can use the macro feature to create tools that extend the paint tools. Macros are an extension of RE ENGINE that allow you to create simple tools as well as various batch processes from the title-side editor. Macros can also be used to interface with this tool, so you can add your own functions to suit your purposes. Here, we have created a macro tool that lets the user select the ID color by selecting the material. It is more intuitive to be able to set the color in this way.

14.

Working with Extensions Areas that cannot be handled by simple generalization are covered by integrating with macros Allows entire material workflow to be completed on the engine ▶Titles can do these things on their own initiative Example: Paint extension tool for terrain materials (RE:4) This tool was used in RE:4. I've included some footage of it in action. As you can see, the interactive part uses the common functions of the paint tools, and only the palette is expanded with macros. While it is possible to implement this specification as a tool feature, what is noteworthy is the ability to create a complete material workflow within the engine. In other words, the implementation can be done by the title, which means that changes in specifications can be flexibly handled, and processes can be smoothly shared with collaborating companies via a dedicated tool. It also helps to avoid overcomplicating the engine tools by overloading them with specialized functions.

15.

Integration into Other Editors The function of painting is easily converted to various purposes 1. Streamlines work ▶ Since its use is already familiar it doesn't take long to get up to speed 2. Easy division of work Example: Shell fur grooming tool ▶ Implementation workload shared between tools and core teams I've been talking about the benefits of the paint tool itself to this point. However, if the actual implementation code of the paint tool is available on the engine, it is possible to extend it and incorporate it into other editors. In fact, we developed this tool's internal implementation with that kind of use case in mind. For example, the shell fur grooming tool introduced in the "New Rendering Features Rundown" RE:2023 talk is an example of an editor derived from this tool. There are several advantages to creating an editor by extending an existing paint tool in this way, and the major ones are as follows. The first is that you can start your development from a place where the "normal" things about painting, such as shortcuts, history, and saving, are already implemented. This is a point that is often overlooked, but there are quite a lot of such "obvious" things in painting, and by not needing to revisit them, it becomes easier to focus on the functions necessary for the original purpose, which is more important than simply reducing man-hours.

16.

Integration into Other Editors The function of painting is easily converted to various purposes 1. Streamlines work ▶ Since its use is already familiar it doesn't take long to get up to speed 2. Easy division of work Example: Shell fur grooming tool ▶ Implementation workload shared between tools and core teams Another point is that by using general-purpose texture assets as the editing target, the implementation of the tool and the core are separate, and won't interfere with each other. To put it another way, development won't be held up waiting for each other's implementations, and there'll be sufficient time for considering usability. In this way, the use of paint tools for code diversion is not only a saving in man-hours, but also an effective means to realize more advanced functionality in the editor.

17.

Points to Keep in Mind When Implementing However, there are a few things to keep in mind when creating a paint tool on an engine, so let's talk about some of the issues we encountered when actually creating the tool and how we solved them.

18.

Points to Keep in Mind When Implementing Paint tools are only indirectly involved with assets Texture Mesh Adding features x maintaining flexibility = ballooning complexity ▶ It's important to not let this get out of hand Now, we must share with you one special circumstance that painting tools have to deal with when it comes to implementation: They can only be indirectly involved in all the assets they handle. The assets involved in painting are textures as supports and meshes as frames, neither of which can be controlled at run-time by the paint tool, and both of which will continue to be optimized by external factors. The biggest concern here is the combined complexity of versatility and feature addition. The advantage of being able to adjust the final look is at the same time a disadvantage in that assets must be painted in a state close to the scene of execution, and the cost of that versatility is especially high for painting tools. While we naturally want to increase the functionality of our tools, if we have to maintain versatility across all features, we may lose their original potential. This is why it's important to be able to decouple support costs, which continue to increase due to external factors, from feature additions when implementing. Let's look at the considerations for textures and meshes.

19.

Texture Handling Bypass completely by parsing to binary data Native data Texture asset Convert to lossless binary texture format for general use Conventional Conversion Flow Hard to use in painting to begin with because of BC compression and other lossy problems Let's start by looking at the texture issue. The engine has a conversion flow that reads in textures and converts them to shader resources. However, textures converted to native data in the conventional flow are subjected to lossy processing such as BC compression for optimization. This makes it difficult to paint using only the information available at runtime. We bypassed this problem at the parser stage and loaded the textures into the tool as common-format texture binaries. This is a heavy-handed method, but it eliminates the need to be aware of the original format and optimization of the textures since they will be converted to a custom binary.
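To illustrate why a lossy runtime format is a poor paint source, here is a minimal Python sketch (illustrative only, not RE ENGINE code; 8-bit quantization stands in for BC-style lossy storage): iterative low-opacity dabs on a float working copy are compared against the same dabs pushed through a lossy roundtrip after every edit.

```python
def quantize8(x):
    """Simulate a lossy storage roundtrip: snap to the nearest 8-bit level."""
    return round(x * 255) / 255

def dab(value, target, alpha):
    """One low-opacity brush dab blending the texel toward the paint value."""
    return value + (target - value) * alpha

# Painting on a float working copy: 200 faint dabs keep converging.
v = 0.0
for _ in range(200):
    v = dab(v, 1.0, 0.01)

# Painting through a lossy roundtrip after every dab: once the increment
# drops below one quantization level it is rounded away and the paint stalls.
q = 0.0
for _ in range(200):
    q = quantize8(dab(q, 1.0, 0.01))
```

The quantized copy stalls well short of the float result, which is exactly the kind of accumulated damage a lossless working binary avoids.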

20.

Mesh Handling Mesh dependencies are isolated using the rendering pipeline • Separate processing before and after the geometry shader Common to all meshes from here on: Vertex shader → Geometry shader → Pixel shader → Desired painting functionality processing (Pick, projection, debug drawing, etc.) On the other hand, meshes are a bit different, especially in that they are not directly editable. This has its bad and good sides. The bad side is that you have to deal directly with the mesh data at runtime if you start considering deformation, etc. The good side is that you don't have to bypass the entire input/output process to ensure reversibility like with textures. The tool's strategy is to absorb mesh differences by taking advantage of the generalization provided by the rendering pipeline. This is a basic approach, but the idea is that when you want to support a new mesh format, you first implement the vertex shaders as best you can, and then use common processing for the rest of the mesh to reduce costs. This is not as smart as it may seem, since it includes pre-processing such as collecting necessary information in a compute shader in advance for a skinned mesh. However, this way, we can separate the mesh dependence from the new format when we add new features to the paint.

21.

Paint Processing Flow However, painting in a nutshell is a combination of various shader processes. If these processes are mixed together in an unregulated manner, even a well-developed strategy to separate them will not work well. It's also important to look at the paint processing flow.

22.

Paint Processing Flow Separation of the processing phase into the following three phases Pick Create brush map Composite In the case of this tool, the process of painting is divided into the following three phases. I'll introduce them one by one.

23.

Pick: Select Editing Target First, where is the brush going to go...? Pick: Retrieve information that is under the mouse cursor Pick Create brush map Composite First, let's start with Pick. Let me share a rough understanding of the Pick process, which refers to acquiring information under the mouse cursor or pen tablet pointer. Before painting, we need to start with knowing where to paint.

24.

Pick: Select Editing Target Pick's processing that can be implemented in the rendering pipeline • Use pixel shader to do collision detection When the target coordinates are drawn, write to the GPU buffer (pixel shader) Run shader for Pick Pick completed (brush coordinates fixed) As you can see, the process here is highly dependent on the format of the mesh. Therefore, we would like to do the Pick using the rendering pipeline. The basic idea of the Pick process using the rendering pipeline is as follows. First, run the shader for Pick as shown here. Then, if a pointer such as a mouse or pen tablet exists on the mesh you want to paint, there will be at least one pixel shader running on those screen coordinates. Each pixel shader invocation is compared to the pointer coordinates to see if they match, and if so, the current pixel shader's input attributes are written to the GPU buffer.

25.

Pick: Select Editing Target Pick's processing that can be implemented in the rendering pipeline • As it is, Pick would run multiple times on overlapping surfaces If you Pick here Multiple pixel shaders run on camera coordinates Unfortunately, this idea does not work so easily. If implemented as is, the pixel shader will run multiple times where the faces overlap, and the actual Pick result will be unstable. Since pixel shaders are executed in parallel, a little ingenuity is required to get the foremost plane information as a structure.

26.

Pick: Select Editing Target Pick's processing that can be implemented in the rendering pipeline Early Z Function that performs depth testing before running the pixel shader • As it is, Pick would run multiple times on overlapping surfaces • Depth can be prepared in advance and processing can be simplified with Early Z Get Depth in advance Implement the same processing with Early Z Pixel shaders that were ambiguous can now be culled This tool avoids this problem by acquiring the depth in advance and drawing the shader for Pick with Early Z. Early Z is a GPU feature that performs a depth test before executing pixel shaders and is mainly used to reduce the load on pixel shaders. However, the fact that it allows pixel shaders to be skipped based on a depth test makes it very convenient for solving our problem. This makes it possible to cull all pixel shaders behind the frontmost one and thus stabilize the Pick. The cost of getting the depth first is offset by debug features like texture highlights, brush cursors, etc., that also result in the need for culling.
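The "only the frontmost fragment's data survives" idea can be sketched on the CPU. This hypothetical Python snippet (not the actual shader) emulates what the depth test achieves on the GPU: of all candidate fragments rasterized at the pointer pixel, only the one with the smallest depth gets to write its attributes to the pick buffer.

```python
# Candidate "fragments" rasterized at the pointer pixel: (depth, prim_id, uv).
fragments = [
    (0.72, 5, (0.25, 0.50)),   # back surface
    (0.31, 2, (0.80, 0.10)),   # frontmost surface
    (0.55, 9, (0.40, 0.90)),   # occluded middle surface
]

def pick_frontmost(fragments):
    """Keep only the fragment that would survive the depth test: the one
    with the smallest depth writes its attributes to the pick buffer."""
    best = None
    for frag in fragments:
        if best is None or frag[0] < best[0]:
            best = frag
    return best

depth, prim_id, uv = pick_frontmost(fragments)
```

With the depth prepass and Early Z, the occluded invocations never run at all, so the pick buffer receives exactly this frontmost result without any explicit sorting.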

27.

Create Brush Map Before compositing... Brush map Rasterized brush collision information Pick Create brush map Composite Now, let's continue with the brush map creation. If the UV coordinates were obtained from the Pick in the previous step, the texture is treated as a target and can be painted by rasterizing the brush texture. However, before this tool does this, the brush collision information is burned into a special resource view called a brush map. This phase is the process that creates the brush map.

28.

Create Brush Map To facilitate responding to tool requirements, it is useful to have an intermediate buffer Computationally correct painting behavior Of course, it is possible to skip this step and perform compositing directly, but there is quite a bit of processing that needs to be done here to ensure the actual functionality of the tool. Let's take a concrete example. Shown is an example of painting that is correct in terms of computation. The result of this process is certainly "correct", in that painting is happening, and there are many situations where this is actually required as the end result. However, if you actually look at many paint tools, you will see that they work a bit differently.

29.

Create Brush Map To facilitate responding to tool requirements, it is useful to have an intermediate buffer Computationally correct painting behavior Tool behavior that is actually desired As you can see, you actually need to achieve a behavior that is more even and has better alpha transitions than a simple 0 to 1. Unevenness of paint sometimes provides a good flavor, but it is useless as a tool if you can't also paint accurately. Thus, when viewed as a tool rather than a simulation, the behavior required for painting requires a different level of precision. To achieve these behaviors, several processes are required. Here are some typical examples.

30.

Create Brush Map For even painting... "Point Sequence Information Interpolation" • If you don't do this, the density of the paint will change with the frame rate This is not good enough This is how it should be! Need to fill out what happened between frames First of all, interpolation processing is necessary for even painting. As shown here, with the current one-frame-one-dab method, when the brush opacity is low, the density of the color applied changes with the frame rate. To solve this problem, we divide the point sequence according to the distance traveled, as shown here, and draw the composite results of all stamps done in the frame. In this tool, after interpolating the coordinates with compute shaders, batch rasterization is performed using indirect drawing to speed up the process.
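As a sketch of the point-sequence interpolation (hypothetical Python, not the actual compute shader), the following emits dab centers at a fixed spacing along each frame's pointer segment and carries the remaining distance into the next frame, so dab density is independent of frame rate:

```python
import math

def interpolate_dabs(p0, p1, spacing, leftover=0.0):
    """Emit dab centers every `spacing` units along the pointer segment
    p0 -> p1. `leftover` carries the distance already covered since the
    previous dab, so density stays even across frame boundaries."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return [], leftover
    dabs = []
    t = spacing - leftover
    while t <= length:
        dabs.append((p0[0] + dx * t / length, p0[1] + dy * t / length))
        t += spacing
    return dabs, length - (t - spacing)
```

One call per frame segment; all returned dab centers are then rasterized in one batch, matching the indirect-draw approach described above.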

31.

Create Brush Map "Stroke Map Compositing" for easy paint layering Composite one frame of processing Create one stroke of processing Then, to make painting easy to control, compositing into a stroke map is also important. This process stores a single stroke, or in essence, the result of compositing from when the mouse is pressed until it is released. This makes it possible to set an upper limit for the amount of paint that can be applied in a single stroke, making it easier to apply paint in layers. Since a separate resource is required to hold the information for one stroke, this process is made possible because brush map creation is an independent phase.
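The stroke-map idea can be sketched per texel (hypothetical Python; the real version operates on GPU resources): dabs accumulate into the stroke map with a max, so scrubbing the same spot cannot exceed the brush opacity, and the finished stroke is blended onto the base texture once, capped by a per-stroke opacity.

```python
def composite_dab(stroke_map, texel, dab_alpha):
    """Accumulate a dab into the stroke map with max(), so scrubbing the
    same spot within one stroke never exceeds the brush opacity."""
    stroke_map[texel] = max(stroke_map[texel], dab_alpha)

def apply_stroke(base, stroke_map, color, stroke_opacity):
    """Blend the finished stroke over the base texture; `stroke_opacity`
    caps how much paint a single stroke can apply in total."""
    return [b + (color - b) * a * stroke_opacity
            for b, a in zip(base, stroke_map)]

stroke = [0.0] * 4
for _ in range(10):              # scrub one texel ten times in one stroke
    composite_dab(stroke, 1, 0.6)
result = apply_stroke([0.0] * 4, stroke, 1.0, 0.5)
```

Despite ten overlapping dabs, the texel receives no more paint than a single dab at full stroke opacity would allow.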

32.

Composite on Edit Target Composite the brush map and the source texture to be edited Pick Create brush map Composite Now that we have created a brush map, let's blend it with the object to be edited based on the color compositing rules of multiplication and addition.

33.

Composite on Edit Target Various compositing processes can be realized by simply changing this step • Multiply, add, subtract, normalize and combine as a vector • Composition is 1ch due to separate processing of brush map Multiplicative Blending Flow Map Blending (Composite of brush map and stroke direction vector) This is basically a compute shader that combines the source texture and the brush map. The weak point of this painting process flow is that because the brush map creation is a separate phase, the brush is limited to a single channel. Specifically, it is not possible to add color to the brush texture itself. In return, this process can be completely separated from the other phases, and the number of implementations to be considered when adding a blending process can be reduced to one. Also, even with a single channel, flow map blending such as shown on the right can be performed without problems because it can be combined with other information. It would be a shame to be limited in what we can do, but as I said at the beginning, there are many excellent DCC software options for the purpose of directly applying colors, so we chose this tradeoff in order to design a system that takes advantage of the use case of engine painting without competing with them.
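A sketch of how the composite phase stays the single place where blend modes live (hypothetical Python; the real version is a compute shader): every mode consumes the same 1-channel brush map, and even the flow-map mode only adds the stroke direction vector as extra input.

```python
def blend(mode, src, brush, color):
    """Composite a 1-channel brush map `brush` onto per-texel values `src`.
    For scalar modes `color` is the paint value; for the flow-map mode it
    is the stroke direction vector, lerped in per channel."""
    if mode == "add":
        return [s + color * a for s, a in zip(src, brush)]
    if mode == "multiply":
        # Lerp the multiplier between 1 (no paint) and `color` by coverage.
        return [s * (1.0 - a + color * a) for s, a in zip(src, brush)]
    if mode == "flow":
        # `src` holds 2D flow vectors; steer them toward the stroke direction.
        return [tuple(c + (d - c) * a for c, d in zip(v, color))
                for v, a in zip(src, brush)]
    raise ValueError(mode)
```

Adding a new blend mode touches only this function, never the pick or brush-map phases, which is the decoupling the slide describes.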

34.

Support for UV Unwrapping It is difficult to paint a complex UV unwrapped mesh in this configuration Pick Create brush map Composite We have covered everything from picking to painting, but painting a mesh with complex UV unwrapping does not work so well. In order to actually paint them nicely, one more step is required.

35.

Support for UV Unwrapping Transfer the brush map to the UV unwrapped space It is now possible to paint meshes with complex UVs Pick Create brush map Support for UV unwrapping Composite UV unwrapped brush map So, let's explain the phase of dealing with UV unwrapping. Specifically, the brush map is transferred to the UV space of the mesh during the brush map creation phase. This phase is not independent, but rather an extension of the brush map creation phase.

36.

Support for UV Unwrapping Even without this phase, a paint tool still has value Without this process it's impossible to paint complex UVs, but... This doesn't really cause any problems when painting terrain or other simple UV meshes The reason why we did not add this process from the beginning is simply that the requirements of a paint editor can be met without it. Certainly, the results of painting complex UVs without this process are, as mentioned above, disastrous. However, it should be mentioned that when working with meshes used for layout work, etc., they are usually connected to each other and the seaming problem is less apparent. Remember, the value of live painting in an engine is that it allows for a workflow that DCC cannot provide. In this regard, the ability to simply paint complex UVs has a disproportionate impact compared to the technical interest, and whether or not to cut this step is also a turning point in whether or not mesh-dependent processing is mixed into the brush map creation phase. It makes sense to not support complex UV painting if cost-effectiveness is the goal. The editor did not support this feature at the initial stage, and it was built in later.

37.

Support for UV Unwrapping It depends on the mesh, so... UV transcription that can be implemented in the rendering pipeline The mesh should look like this, but With geometry shaders we swap vertex positions and UVs Composite the brush map with pixel shader On the other hand, the ability to paint complex UVs certainly increases what can be done. For example, the aforementioned fur grooming tools can be used to paint complex curved surfaces, which can be extended to include props and characters. There are many tools and workflows that can only be realized when they can handle complex surfaces. So, I would like to try to explain the specific implementation of UV painting. As I am sure I don't need to explain anymore, this process is mesh-dependent itself. Therefore, let's try to make it happen in the rendering pipeline as a whole. The rough explanation is as follows: the geometry shader swaps the vertex positions and UVs, and the pixel shader performs the final brush map compositing.
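The "swap" can be sketched as a tiny vertex transform (hypothetical; conventions such as the V flip depend on the engine's UV origin and are an assumption here): instead of projecting the vertex with the camera, output its UV coordinate as the clip-space position, so the triangle is rasterized into the UV-space brush map.

```python
def uv_to_clip(u, v):
    """Output the vertex at its UV location instead of its projected
    position: map [0,1] UV space to [-1,1] clip space (w = 1), flipping
    V on the assumption that texture V grows downward."""
    return (u * 2.0 - 1.0, (1.0 - v) * 2.0 - 1.0, 0.0, 1.0)

# The original world position travels along as a plain interpolated
# attribute, so the pixel shader can still intersect the brush in 3D.
```

Rasterization hardware then does the actual transfer: each covered texel of the UV-space target gets a pixel shader invocation carrying the interpolated world position.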

38.

Support for UV Unwrapping The compositing method here changes the behavior slightly Screen space Object space There are multiple compositing methods, and the aforementioned screen space and object space brush alignments diverge here. Both have their strengths and weaknesses, so I will try to explain them briefly based on the implementation method.

39.

Support for UV Unwrapping Screen space composition • Implement with existing brush map + alpha Draw brush map in screen space Transform to UV space Artifacting similar to shadow acne occurs Let's start with screen space compositing. This is a compositing method in which a brush map is rasterized onto an image with the same aspect ratio as the screen as usual, and then transferred to the UVs by taking the intersection of the brush map and the depth map. Since it can be implemented as an addition to the existing brush map creation process, it's easy to add, and the algorithm is almost the same as shadow maps. However, it is prone to overpainting, and since the algorithm is the same as for shadows, the same artifacts as for shadows occur, and processing is required to suppress them.
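The shadow-map analogy can be made concrete with a per-texel sketch (hypothetical Python; `bias` is the usual acne-suppression constant and its value here is an arbitrary assumption): the texel's projected screen position samples the screen-space brush map, and a biased depth comparison rejects occluded texels.

```python
def transfer_screen_dab(texel_depth, screen_xy, brush_map, depth_map, bias=1e-3):
    """For one texel of the UV-space target: look up the screen-space brush
    map at the texel's projected position, and accept the paint only if the
    texel passes a shadow-map-style depth comparison. `bias` suppresses
    acne caused by depth quantization."""
    x, y = screen_xy
    if not (0 <= x < len(brush_map[0]) and 0 <= y < len(brush_map)):
        return 0.0                      # projected outside the screen
    if texel_depth > depth_map[y][x] + bias:
        return 0.0                      # occluded: another surface is in front
    return brush_map[y][x]
```

Too small a bias reproduces acne on grazing surfaces; too large a bias lets paint bleed through onto back faces, the same tuning tradeoff as shadow mapping.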

40.

Support for UV Unwrapping Object space compositing • Replace the part that is rasterized Perform intersection determination in the brush map phase Clean results, even at the very edges The other is object space compositing. The implementation is a direct replacement of the rasterization process: intersection is determined from the distance between the brush's center and the world coordinates obtained from the pixel shader. Compared to screen space, object space compositing is more suitable for painting intricate curved surfaces, and for better or worse, it does not compare depth, so artifacts are less noticeable when painting the seam edges.
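A minimal sketch of the object-space intersection (hypothetical Python; the soft falloff profile is an assumption, real brushes expose their own shapes): the dab alpha comes purely from the 3D distance between the texel's world position and the brush center, so no depth comparison is involved.

```python
import math

def brush_alpha(world_pos, brush_center, radius, hardness=0.5):
    """Object-space dab: alpha from the 3D distance between the texel's
    interpolated world position and the brush center, with a soft edge
    from `hardness` (fraction of the radius painted at full strength)."""
    d = math.dist(world_pos, brush_center)
    if d >= radius:
        return 0.0
    t = d / radius
    if t <= hardness:
        return 1.0
    return 1.0 - (t - hardness) / (1.0 - hardness)
```

Because only distance matters, both sides of a thin surface inside the brush sphere receive paint, which is the "for better or worse, no depth comparison" behavior noted above.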

41.

Support for UV Unwrapping We can't paint the seams (UV boundaries) like this Required Implementation : Spreading the paint from the seam (Edge padding) 1. Overpaint beyond the edges of the UV region 2. However... Take care not to infringe on neighboring regions Can't paint the seam all the way to the edge Now, the UV transfer is finished, and although it would be nice if this were all that was needed, it is not the case. Without a further step, the UV seam cannot be completely painted over, resulting in the gaps shown here. This is mainly caused by texture sampling methods such as mipmapping and bilinear filtering rather than rasterization accuracy, and cannot be solved by UV transfer alone. Specifically, the paint needs to bleed out from the UV regions. Here we will call this edge padding. There are two requirements for edge padding. The first is that the paint must spread outside of the UV region by an arbitrary number of pixels. The second is that the extended padding must not encroach into the area of the adjacent regions.

42.

Support for UV Unwrapping Edge padding process that can be implemented in the rendering pipeline • Padding is applied to each polygon with a geometry shader Generating padding for each triangle Discard by depth Actual execution results Now let's create edge padding. Like UV projection, this one is also mesh-dependent, so let's let the rendering pipeline solve it. I'll share the basic idea of the implementation, which goes like this. First, we use a geometry shader to stretch polygons for all triangles, and then we truncate them by depth to prevent interference between islands. The result of doing this is shown on the right, and you can see that we have succeeded in building the edge padding as we intended.

43.

Support for UV Unwrapping Edge padding process that can be implemented in the rendering pipeline • If several conditions are met, edge padding can be created that makes problems less noticeable All padding must be sunk parallel to each other Constraining the direction of padding elongation to 4 directions The problem is not as simple as it seems. If you extend the padding in equal directions without thinking, the paddings extended from each triangle will penetrate each other and will not work very well. By adding a few conditions, it's possible to create padding with less noticeable artifacts. The tool has two main constraints. First, all padding must be sunk in parallel, and second, the direction of extension of the padding must be constrained to one of the four vectors of slope 1. This not only ensures that all edges from a vertex have a slope of 1 and allows the distance of padding extension to be specified in pixels, but also improves the quality of the padding by eliminating the condition where the padding penetrates the inner polygon due to differences in angle. Since the inner angles of a triangle do not exceed 180 degrees, the calculation is simply taking the sign of the perpendicular vectors of the sides to get the padding direction. The exception is edges aligned with the UV axes, where the sign of one component is zero; in that case, use the sign of the vector from the triangle's center to the vertex.
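The direction selection can be sketched per edge (hypothetical Python; the choice of perpendicular assumes a consistent winding in UV space): take the sign of the edge's perpendicular, which snaps the result to the four slope-1 diagonals, and fall back to the center-to-vertex sign for the zero component of axis-aligned edges.

```python
def sign(x):
    """Return -1, 0, or 1."""
    return (x > 0) - (x < 0)

def padding_direction(edge, center_to_vertex):
    """Direction to extrude padding for an edge in UV space: the signs of
    a perpendicular of the edge give one of the four slope-1 diagonals
    (+-1, +-1). For edges parallel to a UV axis one component of the
    perpendicular is zero; fall back to the sign of the triangle-center-
    to-vertex vector for that component."""
    ex, ey = edge
    px, py = ey, -ex                      # a perpendicular of the edge
    dx = sign(px) or sign(center_to_vertex[0])
    dy = sign(py) or sign(center_to_vertex[1])
    return (dx, dy)
```

Because every direction is a unit diagonal, extruding by n texels along it moves the edge exactly n texels outward in both axes, which is what makes the padding width specifiable in pixels.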

44.

Support for UV Unwrapping This time it's OK! The result of these processes is shown here. You can see that the seams can now be painted without any problems.

45.

Summary The value that a paint tool can provide to a game engine • Improved efficiency in brushing up visuals • Extend workflow by integrating with other tools Challenges in implementation • Balancing complexity of versatility and additional functionality • Division of dependent processes determines the potential of a painting tool A summary of this discussion is as shown. Thank you for your attention.