I found out on February 13th that GPU Gems 3 describes a somewhat similar technique. Their algorithm, however, requires more passes and more computation during rendering, so mine should be noticeably faster: I need only one prepass to prepare my work buffer, meaning particles are rendered only twice, versus once per layer plus once in the final pass for the GPU Gems 3 algorithm.
Used in CAE Healthcare's LapVR Appendectomy simulator:
In a prepass, an RGBA buffer is rendered to build three depth layers of the particles in the RGB channels, and an accumulation buffer in the alpha channel.
```
color.rgb = float3(depth,
                   depth + step(centerInvDistance, 0.35),
                   depth + step(centerInvDistance, 0.45));
color.w = texture(metaballAlphaMap, uv);
```

The default texture contains the distance from the center of the image. It can be replaced with a texture containing several sub-particles, to replace each particle with several smaller drops.
In the final pass, we render the particles again, at twice their radius. If the distance from the center is lower than or equal to the radius, the particle is always visible. If it is further, we recompute the same particle surface depth, find the closest depth in the prepass depth layers, and discard the fragment if the Z distance is too large.
Then we use the depth differences to generate a normal, and UVs for a surface material texture.
Pseudo-code:
```
float NeighborDepth(float3 depths, float z)
{
    if (abs(depths.x - z) < MaxDiff) return depths.x;
    if (abs(depths.y - z) < MaxDiff) return depths.y;
#if 1
    return depths.z;
#else
    // If storing a 4th depth layer in the W channel.
    if (abs(depths.z - z) < MaxDiff) return depths.z;
    return depths.w;
#endif
}

float WaterFresnel(in float3 I, in float3 N)
{
    // R0 = pow(1.0 - 1.33, 2.0) / pow(1.0 + 1.33, 2.0);
    const float R0 = 0.0204;
    return saturate(R0 + (1.0 - R0) * pow(2.0 * (1.0 + dot(I, N)), 5.0));
}

float z = ...;

// The 4 neighbor UVs can be more than one pixel away,
// resulting in blurrier/smoother normals.
float zLeft  = NeighborDepth(texture(prepass, uvLeft).rgb,  z);
float zRight = NeighborDepth(texture(prepass, uvRight).rgb, z);
float zUp    = NeighborDepth(texture(prepass, uvUp).rgb,    z);
float zDown  = NeighborDepth(texture(prepass, uvDown).rgb,  z);

// If in the periphery outside of the particle core,
// and too far from all neighbors, clip the current pixel.
if (centerInvDistance < 0.5 && (z too different from all neighbors))
    discard;

// Generate a normal.
screenSpaceNormal.x = zRight - zLeft;
screenSpaceNormal.y = zDown - zUp;
screenSpaceNormal.z = ScreenNormalDefaultZ;
screenSpaceNormal = normalize(screenSpaceNormal);
float3 worldNormal = screenToWorldMatrix * screenSpaceNormal;

// Generate an albedo color (triplanar mapping).
float4 albedoXY = tex2D(textureSampler, uvDistorsion + worldCoordinates.xy * MbDiffuseTiling);
float4 albedoYZ = tex2D(textureSampler, uvDistorsion + worldCoordinates.yz * MbDiffuseTiling);
float4 albedoXZ = tex2D(textureSampler, uvDistorsion + worldCoordinates.xz * MbDiffuseTiling);
float4 albedoColor = worldNormal.x * worldNormal.x * albedoYZ
                   + worldNormal.y * worldNormal.y * albedoXZ
                   + worldNormal.z * worldNormal.z * albedoXY;

float4 refractionColor = texCUBE(refractionMap, worldNormal);

// Apply a basic water-like lighting.
```
```
float fresnel = WaterFresnel(-incident, worldNormal);
outcolor.rgb = lerp(color.rgb, envmapColor.rgb, 0.1 + fresnel * ReflectionOpacity);
float mbWaterSpecular = 4.0 * pow(saturate(dot(0.5 * (incident - fakedLightDir), worldNormal)), 64.0);
outcolor.rgb += fresnel * mbWaterSpecular;
```

By reversing the last depth layer in the prepass, a second metaball effect can be seen through a first one, as visible in this image: