Hello everyone! Welcome to this new Devlog, which will be a little more technical than the others: we're going to walk you through how we create shadows and depth in Synergy!

While making the game, we faced a challenge common to top-down 2D games: sorting the elements to render. The go-to strategy is to draw elements from top to bottom based on each texture's baseline, but it breaks down in some cases! Buildings and environmental elements vary in size and don't always have a square base, so larger elements may need to be rendered behind smaller decorations even when they sit lower on the screen. Simple fixes, like dividing long decorations into sections, can resolve these issues.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/c457ed2706993c18f033b5a7c4e95891492c1d7a.gif[/img]

But this approach isn't quite suitable for us, because in Synergy the buildings are animated, and cutting them into sections would complicate production. We opted for a solution similar to what 3D games do: sorting the entire game environment with a Z-buffer.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/79148b8cbf1869f518904facd4a8106c7ffa2feb.gif[/img]

This not only lets us sort buildings and the environment without issues, it also allows us to position characters realistically within the buildings!

[img]https://clan.cloudflare.steamstatic.com/images//42519618/29d3f8de4d051d224be70fadbbd20b08649375de.png[/img]

To have a functional Z-buffer, we needed a basic 3D model of each building or environmental element, and a way to store its depth information in a texture.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/27d305020a5041404f78b239f4b940be9b4ec22b.png[/img]

We use Blender for our 3D models, and an in-house tool produces a standardized file for each of them, ensuring consistent depth textures across all buildings. We then process the texture with a filter in Substance Designer to integrate it smoothly into the engine.
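To illustrate the idea, here is a minimal software sketch of per-pixel Z-buffer sorting, the technique described above. Real engines do this on the GPU with the depth texture; the sprite data and function names here are made up for the example.

```python
# Toy software Z-buffer: a pixel is only written if it is closer
# (smaller depth) than whatever is already stored at that position.
def draw_sprite(color_buf, z_buf, pixels):
    """pixels: iterable of (x, y, depth, color) tuples."""
    for x, y, depth, color in pixels:
        if depth < z_buf[y][x]:
            z_buf[y][x] = depth
            color_buf[y][x] = color

W, H = 4, 1
color_buf = [["." for _ in range(W)] for _ in range(H)]
z_buf = [[float("inf")] * W for _ in range(H)]

# A "building" spanning x=0..3 at depth 5, drawn first...
draw_sprite(color_buf, z_buf, [(x, 0, 5.0, "B") for x in range(4)])
# ...then a "character" at x=1..2 standing in front of it (depth 3).
draw_sprite(color_buf, z_buf, [(1, 0, 3.0, "C"), (2, 0, 3.0, "C")])

print("".join(color_buf[0]))  # BCCB
```

Because each pixel keeps its own depth, the character can stand partly in front of a building that is "lower" on screen, which is exactly what baseline sorting cannot express.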
[img]https://clan.cloudflare.steamstatic.com/images//42519618/2a32421991f230fc55f4be1e2db57562b09b8d93.png[/img]

With a depth representation of the building, we can accurately determine the world position of each pixel in the environment, which opens the door to various visual effects!

[img]https://clan.cloudflare.steamstatic.com/images//42519618/36fcdaa47db74baa199a322963292cf3c812a03a.png[/img]

This allowed us to implement our atmosphere and fog system, enhancing the world's relief and visual appeal.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/64a1573fa8603eba00528d40d9d844ae28b99968.png[/img]

Access to 3D information in the game also lets us cast dynamic shadows across the screen. The process is a bit intricate: in a typical 3D game, shadows are rendered by comparing the camera's Z-buffer against a depth map rendered from the light source's point of view. Our depth textures, however, only cover the game's view, not the light's. So instead, we march rays from each surface toward the light and check whether they hit a building along the way. It may not be the most efficient approach, but it produces interesting results in our 2D world. To keep gameplay smooth, we generate intermediate images that estimate these distances, which keeps performance in check.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/5d42c601b70d3b42ddaeb74c5bc540f27eba638d.gif[/img]

Depth information alone doesn't tell us the thickness of a decorative element. For the most accurate shadow, we have to estimate it; otherwise the shadow appears too large and incorrect. We estimate a building's thickness from the rendered element's width, so a pole or a column reads as less solid than the body of a house.

[img]https://clan.cloudflare.steamstatic.com/images//42519618/51ab62f04c0c7b55e232a984c9bfdd5467a53189.png[/img]

There is still some work left to improve shadow quality and performance, but it's a visual bonus well worth the effort. We hope you enjoyed this 5th Devlog, and thank you for your support throughout this project.
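As a rough sketch of the ray-toward-light shadow test described above, here is a simplified version on a tiny height grid. Everything here (the `height_at` and `in_shadow` names, the sample map, the step counts) is hypothetical and much simpler than what runs in the actual engine.

```python
# March a ray from a surface point toward the light; if the ray dips
# below a building's stored height along the way, the point is shadowed.
def height_at(height_map, x, y):
    """Sample the stored depth/height grid; 0.0 outside the grid."""
    xi, yi = int(round(x)), int(round(y))
    if 0 <= yi < len(height_map) and 0 <= xi < len(height_map[0]):
        return height_map[yi][xi]
    return 0.0

def in_shadow(height_map, px, py, light_dir, light_slope, max_steps=32):
    x, y = float(px), float(py)
    ray_h = height_at(height_map, px, py)
    for _ in range(max_steps):
        x += light_dir[0]
        y += light_dir[1]
        ray_h += light_slope          # the ray climbs toward the light
        if height_at(height_map, x, y) > ray_h:
            return True               # blocked by an occluder
    return False

# A 1x8 strip of ground with a tall "building" at x=5:
strip = [[0, 0, 0, 0, 0, 6, 0, 0]]
# Light comes from the right (+x); the point at x=2 is behind the building.
print(in_shadow(strip, 2, 0, (1.0, 0.0), 1.0))  # True
print(in_shadow(strip, 7, 0, (1.0, 0.0), 1.0))  # False
```

The `light_slope` parameter stands in for the light's elevation: a steeper slope means a higher sun and shorter shadows, which is also where the thickness estimate mentioned below comes into play.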
See you soon for the next one! https://store.steampowered.com/app/1989070/Synergy/