Real-time water caustics simulation
In this post, I present an attempt at generalising real-time caustics computation using WebGL and ThreeJS. I stress that it is only an attempt: finding a solution that works well in all cases and runs at 60 frames per second is difficult, if not impossible. However, as you will see, this technique can produce some really good results.
What are caustics?
Caustics are light patterns that form when light is reflected or refracted by a curved surface, such as an air/water interface.
Water acts as a dynamic magnifying glass: the waves reflect and refract the light, producing the patterns we see.
This post focuses on the caustics caused by the refraction of light, i.e. what happens underwater.
To maintain a steady 60fps we need to compute them on the graphics card (GPU), so we will write our shaders in GLSL.
To compute them, we must do the following (a minimal shader sketch of the first step follows this list):
Compute the refracted rays at the water surface (straightforward in GLSL, as a built-in refract function is provided for exactly that).
Find where each refracted ray intersects the environment, which requires some kind of intersection computation.
Estimate the caustics intensity from how much the rays converge or diverge at those intersection points.
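As an illustration of the first step, here is a minimal vertex shader sketch, assuming a ThreeJS ShaderMaterial (which provides position, normal, projectionMatrix and modelViewMatrix); the light uniform and the varying name are my own:

uniform vec3 light;               // directional light direction, assumed normalized
varying vec3 refractedDirection;  // handed over to the next steps

void main() {
  // refract() is the GLSL built-in mentioned above;
  // 1.0 / 1.33 is the ratio of the refractive indices of air and water.
  refractedDirection = refract(normalize(light), normalize(normal), 1.0 / 1.33);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}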
The well-known WebGL water demo
Evan Wallace's demonstration of visually plausible water caustics using WebGL has always piqued my interest: madebyevan.com/webgl-water
I strongly advise you to read his Medium article on how he computes them in real time using a light-front mesh and GLSL partial-derivative functions. His solution is really fast and visually appealing, but it has a few limitations: it only works with a cubic pool and a sphere inside the pool. You cannot put a shark underwater and expect the demo to work, because the shaders are hard-coded to assume the immersed object is a sphere.
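To give an idea of the partial-derivative trick, here is a rough fragment shader sketch of it (my own simplification, not Evan Wallace's actual code, and the varying names are invented): the caustics intensity is estimated by comparing the area of a small patch of the light front before and after refraction.

// Note: in WebGL1 the OES_standard_derivatives extension is needed for dFdx/dFdy.
varying vec3 oldPosition;  // where the ray would hit the environment without refraction
varying vec3 newPosition;  // where the refracted ray actually hits the environment

void main() {
  float oldArea = length(dFdx(oldPosition)) * length(dFdy(oldPosition));
  float newArea = length(dFdx(newPosition)) * length(dFdy(newPosition));
  // rays converging onto a smaller area means brighter caustics
  float intensity = oldArea / newArea;
  gl_FragColor = vec4(vec3(intensity), 1.0);
}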
The reason for immersing a sphere was that the intersection between a refracted light ray and a sphere is simple to find and requires only very basic math.
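For illustration, such a ray/sphere intersection could look like the following in GLSL (again a sketch of my own, not the demo's actual code):

// Returns the distance along the ray to the nearest hit, or -1.0 on a miss.
// dir is assumed to be normalized.
float intersectSphere(vec3 origin, vec3 dir, vec3 center, float radius) {
  vec3 oc = origin - center;
  float b = dot(oc, dir);
  float c = dot(oc, oc) - radius * radius;
  float h = b * b - c;
  if (h < 0.0) return -1.0;  // the ray misses the sphere
  return -b - sqrt(h);       // distance to the first intersection
}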
All of this is fine for a demo, but I wanted a more general solution for caustics computation, so that any kind of unstructured mesh could float around in the pool, like a shark.
Let us now turn our attention to our strategy. In this tutorial, I'll assume you already know the basics of 3D rendering via rasterization, as well as how the vertex shader and fragment shader work together to draw primitives (triangles) on the screen.
Working within the confines of GLSL
Shaders written in GLSL (OpenGL Shading Language) only have access to a limited amount of information about the scene, such as the following (a small declaration sketch follows this list):
Attributes of the current vertex you are drawing (position: 3D vector, normal: 3D vector, etc.). You can send your own attributes to the GPU, but they must be of a GLSL built-in type.
Uniforms, which are constant for the entire mesh you are drawing during the current frame. A uniform can be a texture, the camera projection matrix, a light direction, or anything else, but it must be of a built-in type: int, float, sampler2D for textures, vec2, vec3, vec4, mat3, mat4.
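As a quick illustration, here is what such declarations could look like in a shader (the custom names are just examples):

attribute vec3 position;        // per-vertex data (ThreeJS declares this one for you)
attribute vec3 normal;          // another per-vertex attribute

uniform mat4 projectionMatrix;  // constant for the whole draw call
uniform vec3 lightDirection;    // e.g. a custom vec3 uniform
uniform sampler2D envMap;       // textures are passed as sampler2D uniforms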
There is no way, however, to access meshes that are present in the scene.
As a result, the webgl-water demo could only be made with a bare-bones 3D environment: the intersection between the refracted ray and a very simple shape that can be represented by uniforms is much easier to compute. A sphere, for example, is fully described by a position (3D vector) and a radius (float); this information can be passed to the shaders as uniforms, and the intersection computation involves only very simple arithmetic that runs easily and quickly in a shader, as sketched below.
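As a sketch (with names of my own choosing), the sphere could be exposed to the shader and intersected like this, reusing the intersectSphere function shown earlier:

uniform vec3 sphereCenter;   // the sphere position
uniform float sphereRadius;  // the sphere radius

vec3 hitEnvironment(vec3 surfacePosition, vec3 refractedDirection) {
  float t = intersectSphere(surfacePosition, refractedDirection, sphereCenter, sphereRadius);
  return surfacePosition + t * refractedDirection;  // point where the refracted ray lands
}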
Some shader ray-tracing techniques pass mesh data to the shaders through textures, but this is not realistic for real-time WebGL rendering in 2020. We have to keep in mind that we need to compute 60 images per second, using a decent number of rays to get a good-looking result. If we compute the caustics with 256x256 = 65,536 rays, that is roughly 65,536 x 60 ≈ 4 million refracted rays per second, each of which needs an intersection test whose cost also depends on the number of meshes in the scene.
We therefore need to find a way to represent the underwater environment as uniforms and perform this intersection computation while keeping decent performance.