---
title: Sky Rendering (2005)
description: Postmortem of the NeL sky rendering system — skydome, dynamic clouds, celestial objects, weather, and sun occlusion queries
published: true
date: 2026-03-14T18:26:05.993Z
tags: 
editor: markdown
dateCreated: 2026-03-14T18:03:52.995Z
---
Translated from "fr dossier 2002 / Projet NeL - 3D - Rendu ciel REVIEWED". {.is-info}
One of the objectives in representing persistent universes is to do so as convincingly as possible. Sky rendering is an important element in this depiction. A simple way to obtain a realistic sky is simply to photograph it. However, this approach suffers from its static nature. In current MMORPGs (Massively Multiplayer Online Role Playing Games), the trend is towards illustrating dynamic universes. In some more traditional multimedia products, a static sky rendering is suitable most of the time, due to the scripted nature that requires a game scene to take place under very specific conditions. In an MMORPG, the narrative constraints are much looser and freedom of action much greater — the user therefore expects to observe the passage of time, the seasons, and weather variations. Under these conditions, rendering a convincing sky requires the implementation of more sophisticated techniques to account for this dynamic nature.
The basic technique used for sky rendering is known as the "sky dome". It consists of using a hemisphere-shaped mesh to which the sky texture is applied. It is also possible to apply vertex color (color applied to each vertex) to give the illusion of a sky. During rendering, the camera is placed at the center of the sky, and therefore only the view direction is taken into account (one can thus work in a spherical reference frame). This also implies that the sky is approximated as being infinitely far away.
- A textured sky can be extremely realistic if photographs are used. However, there are several disadvantages:
- It is difficult to make it change over time.
- If the texture contains clouds, they must be distant, or the user may notice their immobility when traversing long distances across the terrain.
- A sky using vertex colors can approximate certain simple sky forms and allows for a dynamic sky. This technique is well suited because a skydome generally contains a color gradient from the horizon to the zenith, located directly above the observer. We can illustrate this with a few examples:
- A daytime sky is a simple gradient between light blue and royal blue.
- At dawn and dusk, a band of warmer color is added.
- A sunset typically takes yellow colors at the horizon, then red-orange followed by blue-purple at the zenith.
The use of vertex color allows modeling the progression of the day by simple interpolation between the different gradients.
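As a sketch of this interpolation (the keyframe hours and colors below are illustrative assumptions, not values from NeL):

```python
# Illustrative sketch of modeling the day's progression by interpolating
# between vertex-color gradients; keyframe hours and colors are made up.

def lerp(a, b, t):
    """Component-wise linear interpolation between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# Each keyframe: (hour, [horizon_color, zenith_color]) with RGB in [0, 1].
KEYFRAMES = [
    (6.0,  [(1.0, 0.8, 0.4), (0.5, 0.6, 0.9)]),   # dawn: warm band at horizon
    (12.0, [(0.7, 0.85, 1.0), (0.1, 0.3, 0.8)]),  # day: light blue to royal blue
    (20.0, [(1.0, 0.9, 0.3), (0.3, 0.2, 0.5)]),   # sunset: yellow to blue-purple
]

def sky_gradient(hour):
    """Blend the two keyframe gradients that bracket the given hour."""
    for (h0, g0), (h1, g1) in zip(KEYFRAMES, KEYFRAMES[1:]):
        if h0 <= hour <= h1:
            t = (hour - h0) / (h1 - h0)
            return [lerp(c0, c1, t) for c0, c1 in zip(g0, g1)]
    return KEYFRAMES[-1][1]  # outside the keyframed range, reuse the last gradient
```

A real system would hold more gradient stops per keyframe and wrap around midnight; the mechanism is the same.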
It is possible to determine a color gradient analytically [3].
In the sky, the brightness of certain objects such as the sun can be very high. Current display devices cannot faithfully represent this range. Moreover, the depth of the textures used is generally only 8 bits per component, which limits the possible intensities to only 256 values. By using floating-point textures and applying the appropriate filter to the final image, it is possible to simulate large brightness differences in the final image [2]. However, this remains an expensive technique for real time in games where only a small fraction of machine time is available for sky rendering.
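The idea can be illustrated with a global tone-mapping operator; Reinhard's operator is used below for simplicity (the actual filter in [2] differs in detail):

```python
# Illustration of the HDR idea: keep radiance in floating point, then
# compress it to the displayable 8-bit range with a tone-mapping operator.

def tone_map(radiance):
    """Map an HDR luminance value (>= 0, unbounded) to an 8-bit intensity."""
    compressed = radiance / (1.0 + radiance)  # squeeze [0, inf) into [0, 1)
    return round(compressed * 255)
```

A sun thousands of times brighter than the sky still fits in the output range while the sky gradient remains distinguishable.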
Celestial objects are necessary for a realistic sky, and their visibility depends on weather conditions.
- Sun: The star of our solar system is also the most visible and produces the greatest brightness during the day.
- Stars: It is possible to encode them in a texture, but it must be very fine to remain faithful to reality. The BSC (Bright Star Catalog) is available in the public domain for their placement. Choosing a limiting magnitude (a minimum-brightness cutoff) determines the number of visible stars.
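The magnitude cutoff can be sketched as follows; the catalog entries below are illustrative stand-ins (a real implementation would parse the Bright Star Catalog), and on Pogson's scale lower magnitude means brighter:

```python
# Sketch of star selection by limiting magnitude. Entries are illustrative;
# "FaintStar" is hypothetical. Lower magnitude = brighter (~2.512x per step).

STARS = [
    ("Sirius",   -1.46),
    ("Vega",      0.03),
    ("Polaris",   1.98),
    ("FaintStar", 6.50),  # hypothetical star below naked-eye visibility
]

def visible_stars(limit_magnitude):
    """Keep stars at least as bright as the limiting magnitude."""
    return [name for name, m in STARS if m <= limit_magnitude]

def relative_brightness(magnitude, reference=0.0):
    """Brightness relative to a reference magnitude (Pogson's ratio)."""
    return 10.0 ** (-0.4 * (magnitude - reference))
```

Raising the limit admits exponentially more catalog stars, which is why the texture must be so fine.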
Some weather phenomena are purely optical effects (rainbows). Weather conditions are mainly illustrated through precipitation (generally rendered with particle systems).
- [1] Ralf Stockholm Nielsen, "Real Time Rendering of Atmospheric Scattering for Flight Simulators", 2003
- [2] Jonathan Cohen, Chris Tchou, Tim Hawkins, and Paul Debevec, "Real-time High Dynamic Range Texture Mapping", Eurographics Rendering Workshop 2001, London, England, June 2001
- [3] A. J. Preetham, P. Shirley, and B. Smits, "A Practical Analytic Model for Daylight", SIGGRAPH 1999
- [4] Wang, "Realistic and Fast Cloud Rendering", Microsoft Corporation, 2003
In Ryzom, we seek the representation of a convincing sky rather than a realistic one. The action does not take place on Earth, so we do not have the constraint of rendering a sky faithful to the one we know.
- The system must be capable of illustrating the passage of time and seasons.
- Weather conditions must be represented.
- The fill rate must be minimal so as not to overload the graphics card, because sky rendering, although important, is lower priority than rendering the player's immediate environment (buildings, terrain, other players, ...) as it conveys less information useful to gameplay.
- The system must be able to run on graphics cards such as the GeForce 2, which has only 2 multi-texturing stages. The reference cards for the project are the nVidia GeForce 3/GeForce 4, which have 4 texture units.
- The system must be capable of representing celestial objects such as stars, the sun, and the other celestial bodies of the Atys solar system (the planet where the action of Ryzom takes place).
- We wish to vary the sky representation, particularly weather phenomena, across the continents of planet Atys. The system must therefore be flexible enough to allow several variants of the sky to be created.
- The sky rendering must provide an elegant transition with the fog of the main scene.
The biggest uncertainty concerns the fill rate required for sky rendering. This is the strongest constraint because real time is an absolute necessity in most games or interactive simulations.
We developed two prototypes in our research: a first very simple one that only allowed a day/night transition, to which we later added cloud rendering using the impostor technique. The second prototype, which is the one currently used in Ryzom, allowed us to manage seasons and obtain a more flexible and reusable system.
In a first version of the sky, we did not account for seasons, only the day/night transition. The system was based on a simple skydome whose texture was an interpolation between a daytime texture and a nighttime texture with stars. A single texture therefore contained, in a fixed way, the stars, the horizon-zenith gradient, and the clouds. This approach is inexpensive and runs on the minimum target hardware, since only 2 texture stages are needed (the interpolation between the 2 textures is provided by a constant alpha at the second stage). Using a single texture to represent these different elements means that they are static. Moreover, the texture resolution is dictated by the finest element of the texture, which is the stars. The final result was disappointing on this point despite the use of 1024x1024 DXTC textures (a compressed texture format providing a fixed 1:4 ratio).
The technique consists of computing an animated 3D Perlin noise texture for each cloud, then lighting it. The final cloud is then rendered using the impostor technique, in a manner similar to that used in [4]. A cloud is therefore rendered as a textured rectangle in the end, but its final texture comes from a temporal interpolation (between 2 animation steps) and spatial interpolation (when the cloud's movement requires updating its impostor). Our technique, unlike [4], gives an animated result, but in return cloud size is smaller due to the expensive 3D texture generation, and the fill rate needed for cloud updates is significant. In the end, we abandoned this technique.
The current system uses a scene composed of multiple objects to represent the sky. An animation created in 3ds Max moves the different objects to simulate the passage of time on a large scale.
We use a description file that specifies the color of each object as a function of time of day and weather. We chose a bitmap to represent this two-dimensional function, which allows, for example, celestial bodies to appear progressively during the night. These objects can be quite large on screen and therefore fill-rate consuming, so they are not displayed when the alpha component of the color bitmap is zero for the current time and weather. We also integrated particle systems into the scene to model phenomena such as solar eruptions and shooting stars.
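The bitmap-as-2D-function idea can be sketched with a small color table indexed by (time of day, weather), both normalized to [0, 1]; the table sizes, colors, and nearest-sample lookup below are illustrative simplifications:

```python
# Sketch of a color table standing in for the description-file bitmap.
# TABLE[weather_row][time_column] = (r, g, b, a); a == 0 means "do not draw".

TABLE = [
    # time:  night            dawn                 noon
    [(0, 0, 40, 255), (255, 160, 80, 255), (120, 180, 255, 0)],  # clear
    [(0, 0, 30, 255), (180, 120, 90, 255), (90, 90, 110, 0)],    # overcast
]

def object_color(time01, weather01):
    """Nearest-sample lookup; a real system would filter between texels."""
    row = min(int(weather01 * len(TABLE)), len(TABLE) - 1)
    col = min(int(time01 * len(TABLE[0])), len(TABLE[0]) - 1)
    return TABLE[row][col]

def should_draw(time01, weather01):
    """Skip fill-rate-heavy objects whose color alpha is zero."""
    return object_color(time01, weather01)[3] != 0
```

The alpha channel doubles as a visibility switch, which is how large, fill-rate-heavy objects get culled entirely.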
Furthermore, we chose to use 4-texture materials for rendering. Support for graphics cards with only two texture stages is provided through the use of multi-pass rendering, or an alternative single-pass material. The scene description file associates each complex object with a list of simpler alternative objects intended to be rendered sequentially (fallback versions).
In the end, we did not integrate clouds into the skydome rendering — they are displayed as separate objects.
The skydome material uses 4 textures:
- A 1D horizon-zenith gradient texture. This is recalculated based on the time of day and weather using a column of pixels from a bitmap whose x-axis represents the time of day. There are multiple bitmaps, one per weather condition.
- Two textures for star rendering: The first texture contains a star pattern that repeats multiple times over the skydome to maintain a certain fineness. To prevent the texture repetition from being visible, a macro mask texture is used to hide stars in certain regions of the dome, thus breaking the periodicity of the texture.
- A texture to represent the solar halo. It has an appearance similar to the analytical results obtained in [3], but is created by an artist.
```plantuml
@startuml
skinparam rectangleBorderColor black
skinparam rectangleBackgroundColor white

rectangle "Solar Halo" as E
rectangle "Sun Visibility" as F
rectangle "Star Texture" as A
rectangle "Mask Texture" as B
rectangle " * " as M3
rectangle " * " as M1
rectangle "Star Visibility\nConstant" as C
rectangle " * " as M2
rectangle "Horizon-Zenith\nGradient" as H
rectangle " + " as P1
rectangle " + " as P2
rectangle "Final Result" as L

E -down-> M3
F -down-> M3
A -down-> M1
B -down-> M1
M1 -down-> M2
C -down-> M2
M2 -down-> P1
H -down-> P1
M3 -down-> P2
P1 -down-> P2
P2 -down-> L
@enduml
```
Fixed-function pipeline usage for skydome rendering.
The scene contains multiple cloud layers, one per weather type. At most there are two cloud layers displayed simultaneously (during a weather transition), in order to limit the fill rate.
Each layer uses texture coordinate animation to simulate cloud movement. Two textures combined via alpha blending are generally used, allowing two cloud layers to be displayed simultaneously in a single pass.
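The texture-coordinate animation amounts to offsetting UVs by a wind velocity and wrapping them; the velocity values in this sketch are arbitrary:

```python
# Sketch of texture-coordinate animation for a drifting cloud layer: UVs are
# offset by wind * time and wrapped into [0, 1) for a repeating texture.

def scroll_uv(u, v, wind_u, wind_v, t):
    """Offset texture coordinates by wind * time, wrapping for tiling."""
    return ((u + wind_u * t) % 1.0, (v + wind_v * t) % 1.0)
```

Scrolling two such textures at different velocities and combining them with alpha blending gives the two-layers-in-one-pass effect described above.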
This is achieved using a dome portion anchored at the horizon, displayed in alpha blend after the rest of the sky. The object is textured with a white-to-black gradient, which is multiplied by the fog color used for the main scene.
Most modern GPUs (nVidia GeForce or ATI Radeon) have a hierarchical Z-buffer, which allows fragments to be rejected in blocks during rasterization of a primitive, thus avoiding pixel pipeline evaluation when unnecessary. To take advantage of this optimization, it is necessary to draw the scene from front to back, in order to fill the Z-buffer with foreground occluders, which allows the optimization to benefit more distant hidden objects.
To benefit from this optimization, we used the OpenGL command glDepthRange, which remaps the scene's Z values to the desired interval before comparison with the Z-buffer. It is not possible to simply display the main scene then the sky scene, because they are not modeled at comparable scales. The use of glDepthRange has the drawback of reducing the precision available in the Z-buffer. Since the sky contains many transparent objects (in the sense that they use alpha blending for rendering), Z-buffer precision has no real impact. We therefore chose an interval of [0, 0.99] for the main scene and [0.99, 1] for the sky. The gains obtained are significant, particularly in places where the sky is less visible (in towns). On a GeForce 4, the time needed for rendering goes from 3 ms to 1 ms in certain places thanks to this optimization.
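The depth partition works because glDepthRange remaps NDC depth in [-1, 1] to a window-depth interval [n, f]; with the intervals quoted above, every sky fragment lands at or behind every main-scene fragment:

```python
# How glDepthRange partitions the window-space depth buffer between the main
# scene and the sky scene (OpenGL convention: NDC depth [-1, 1] -> [n, f]).

def window_depth(z_ndc, n, f):
    """glDepthRange remapping: z_w = z_ndc * (f - n) / 2 + (f + n) / 2."""
    return z_ndc * (f - n) / 2.0 + (f + n) / 2.0

MAIN_SCENE = (0.0, 0.99)  # drawn first, fills the near part of the Z-buffer
SKY_SCENE  = (0.99, 1.0)  # everything in the sky lands behind the main scene
```

Sky pixels hidden by buildings or terrain therefore fail the Z-test and are rejected in blocks by the hierarchical Z-buffer.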
Sun rendering is done in two steps:
- The sun halo is integrated into the skydome rendering, and is therefore done simultaneously with the stars and the horizon-zenith gradient.
- To simulate the sun's brightness, a suitable technique would be HDR rendering [2], but this method does not meet our constraints because it is too expensive in fill rate. We therefore limited the glare effect to the sun itself: the sun is drawn after the scene rendering using additive blending. An attenuation factor simulates retinal persistence (temporal interpolation) and partial visibility. Visibility is the ratio of the number of visible pixels to the total number of pixels needed to draw the solar sphere.
One technique for determining this is to read the Z-buffer of the scene after rendering, which was our first approach. However, this technique breaks the parallelism between the CPU and GPU because the end of rendering must be reached to obtain a coherent Z-buffer.
To solve this problem, we used the GL_NV_occlusion_query extension, which determines the number of fragments that pass the Z-test asynchronously.
The algorithm is as follows:
- Create an occlusion query object V; for this object:
  - Enable the Z-test; disable writing to the framebuffer and the Z-buffer
  - Draw a mesh overlapping the solar sphere
- Create an occlusion query object T; for this object:
  - Disable the Z-test
  - Redraw the previous mesh
- Re-enable writing to the framebuffer and Z-buffer, and re-enable the Z-test
- Query the previously created pair of objects to see whether their results are available
- If so, compute the ratio between the pixel counts returned by V and T
- Update the lens flare / sun opacity accordingly
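The final update step can be sketched as follows, assuming the fragment counts of queries V (Z-test enabled) and T (Z-test disabled) have already been read back; the smoothing time constant is an illustrative choice, not NeL's:

```python
import math

# Sketch of the sun visibility and glare update driven by the two
# occlusion-query fragment counts. Time constant is illustrative.

def sun_visibility(passed_with_ztest, passed_without_ztest):
    """Ratio of visible sun pixels to the pixels the solar sphere would cover."""
    if passed_without_ztest == 0:
        return 0.0  # the sun is entirely off-screen or clipped
    return passed_with_ztest / passed_without_ztest

def update_glare(previous, target, dt, time_constant=0.15):
    """Exponential smoothing toward the target, simulating retinal persistence."""
    k = 1.0 - math.exp(-dt / time_constant)
    return previous + (target - previous) * k
```

Because the queries are asynchronous, the counts read back may be one or two frames old; the smoothing hides that latency as well.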
We obtained a performant result that can nevertheless run on older GPUs. The resulting system is flexible and reusable (it has indeed been reused across the different continents of Ryzom). It allows managing seasons and weather conditions through the use of bitmaps as 2D functions to modify the scene's appearance. We also set up a prototype that allows rapid feedback when modifying the scene description file.
Our approach, although simple, is pragmatic in that it allowed us to illustrate a completely dynamic sky, which remains rare in competing products. Furthermore, the use of particle systems for rendering certain celestial bodies allows for dynamic rendering (such as solar eruptions).

Sky rendering debug tool: daytime sky with sun, cloud layers, and weather/time controls.

Weather progression series from the debug tool: dawn (Weather=0.03), dusk (Weather=0.43), and overcast (Weather=0.64).

Sun rendering with additive lens flare and occlusion-query-based visibility.

Night sky with the moons of Atys, and rain weather conditions.