We’ve spent the last year making some sweeping architectural changes to Desolation’s engine.

With the traditional Source rendering pipeline, lighting is pre-computed and baked into maps via lightmaps during a lengthy compilation process using Valve’s radiosity simulator (VRAD). While the results of this were generally accurate in-game, it was normal for this process to take hours of time. Lights were placed by artists outside of the game, in a separate editor with no previewing capability. Using this method, we faced three major problems:

1. The long compiles and lack of previewing made iteration very time-consuming, and were inconvenient for our art team all around.
2. Source handles lighting for world geometry and models differently, using lightmaps for the former and per-vertex lighting for the latter. This often produces poorer, less consistent results, especially when placing world geometry next to models. The static nature of lightmaps also makes it impossible for lights to move and change with their environment.
3. Source’s surface definition is fairly binary: surfaces are either shiny or not, and conveying differences in material types like cloth, metal and plastic was challenging.

With our engine overhaul, we canned the Source model and rebuilt it from the ground up. Here’s how we addressed these challenges.

We’ve introduced a physically-based renderer that unifies and drives shading of all surfaces in the world. Lights are now completely real-time with per-pixel shadow maps, with no computation done by the compiler. As a result, lights can now be placed and moved around in-game, making it simple for artists to quickly iterate on designs. This also has the benefit of allowing lights to move and change color/intensity in the level, making it possible for us to choreograph more sophisticated, dynamic scenes with lighting.

This is crucial for a game like Desolation. We’re now able to more strongly ground common elements in the world by having them cast light dynamically. All your favorite test elements like lasers, excursion funnels and light bridges now cast light, moving and extending along with the element. Lights also support real-time volumetrics, allowing artists to drive a stronger atmosphere by using lights with a more physical presence in the air. Artists can also equip lights with non-standard shapes using light cookies, which affect both volumetrics and surface lighting.

We’ve also re-thought how lights and reflections interact with the world’s surfaces. In Portal 2, lights did not contribute to the specular term of a surface; instead, a static “cubemap” – a capture of the surroundings – would be generated by artists, and then re-cast onto nearby surfaces without any parallax correction or lighting information. With our new renderer, lights now make a direct, independent contribution to reflections on a surface, modulating their intensity based on proximity and angle relative to the surface.

To extend cubemaps, we’ve developed a new system we call reflection probes. Reflection probes capture the environment, sample the lighting, and recast an approximation onto nearby surfaces. They interact in real time with the material properties of surfaces, such as roughness and metalness, to determine what kind of shine to produce. Reflections are also parallax-corrected, and can have a spherical or rectangular area of influence. Real-time and texture-based ambient occlusion uses this information to drive where softer shadows accumulate, allowing us to preserve some of the subtleties of light bouncing from VRAD.

Standardizing our rendering system has also come with the added benefit of increased consistency in appearance between our engine and tools like the Substance suite and Blender. There used to be a large gap between how an asset would look in one tool versus in-game. With our upgrades, this barrier has effectively been removed.

These changes not only improve the fidelity, character, and flexibility of the game’s art, but also massively speed up our own workflow. While the focus here has been on lighting and material definition, more generally, we’ve significantly reduced the number of barriers that artists have run into while working with the engine. We can now accomplish new technical and artistic feats which wouldn’t have been possible before, and fully realize our goal for what we want Desolation to be.
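The way a light’s contribution is “modulated by proximity and angle relative to the surface” comes down to two standard pieces of math: inverse-square distance falloff and Lambert’s cosine law. A minimal sketch in Python of that idea (the function name and inputs are illustrative, not Desolation’s actual shader code):

```python
def light_contribution(normal, light_dir, distance, intensity):
    """Diffuse contribution of a point light at a surface point.

    Illustrative only: the light's intensity falls off with the square
    of its distance, scaled by the cosine of the angle between the unit
    surface normal and the unit direction toward the light (Lambert's
    law). A light behind the surface contributes nothing.
    """
    cos_angle = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return intensity * cos_angle / (distance * distance)
```

For example, a light of intensity 100 directly above a surface at distance 2 contributes 25.0, while the same light shining from behind the surface contributes 0.0.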
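Parallax correction with a rectangular area of influence is commonly implemented as a box-projected cubemap lookup: the reflection ray is intersected with the probe’s bounding box, and the cubemap is sampled toward that hit point as seen from the probe’s capture position. A rough sketch of the geometry, with hypothetical names rather than the engine’s actual implementation:

```python
def parallax_corrected_dir(pos, refl, box_min, box_max, probe_pos):
    """Box-projected cubemap direction (a common parallax-correction trick).

    pos: surface position; refl: reflection vector (non-zero);
    box_min/box_max: the probe's axis-aligned area of influence;
    probe_pos: where the probe's cubemap was captured from.
    Returns the direction from the probe to the point where the
    reflection ray exits the box, which is used for the cubemap lookup.
    """
    # Smallest positive ray parameter at which the ray reaches a box face.
    t = float("inf")
    for i in range(3):
        if refl[i] > 0:
            t = min(t, (box_max[i] - pos[i]) / refl[i])
        elif refl[i] < 0:
            t = min(t, (box_min[i] - pos[i]) / refl[i])
    hit = tuple(pos[i] + t * refl[i] for i in range(3))
    return tuple(hit[i] - probe_pos[i] for i in range(3))
```

Without this correction, the same cubemap texel would be returned for a given reflection direction no matter where the surface sits inside the room, which is exactly the flat, unanchored look of Portal 2’s static cubemaps.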