2009 - Fast Metaballs

A new Metaball algorithm I invented in 2009. It's much faster than Marching Cubes, and the results are quite comparable. Marching Cubes still looks a bit nicer, as I'm still working on improving the lighting in my algorithm.

I found out on February the 13th that GPU Gems 3 describes a slightly similar technique. Still, their algorithm requires more passes and more computation for the rendering, so mine should be quite a bit faster: I need only one prepass to prepare my work buffer, meaning particles are displayed only twice, versus once per layer plus once in the final rendering in GPU Gems 3's algorithm.

Used in CAE Healthcare's LapVR Appendectomy simulator:

Video demonstrations


Integration of this method in Edouard Poutot's fluid dynamics framework (usual Metaball technique implementation, particle physics, and help on the procedural sand shader by Edouard):
 

How it works

Firstly, the particles are rendered at twice the size of a lone drop. The extended area is not displayed, unless a neighboring particle is so close that their peripheries overlap. The alpha of the particle ranges from 1 in the middle, to 0.5 at the edge of a lone particle, to 0 at the edge of the extended periphery.
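This radial alpha profile can be sketched as follows (plain C; a simple linear falloff is assumed here for illustration, while the actual profile is baked into the alpha texture):

```c
/* Radial alpha of a particle sprite. r is the normalized distance from the
 * center of the extended sprite (0 = center, 1 = edge of the extended
 * periphery). A linear ramp is assumed for illustration:
 *   r = 0.0 -> 1.0 (center)
 *   r = 0.5 -> 0.5 (edge of a lone drop)
 *   r = 1.0 -> 0.0 (edge of the extended periphery) */
float particle_alpha(float r)
{
    if (r >= 1.0f) return 0.0f;
    return 1.0f - r;
}
```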

In a prepass, an RGBA buffer is rendered to build three depth layers of the particles in the RGB channels, and an accumulation buffer in A. To do so, the RGB channels are rendered with D3DBLENDOP_MIN/GL_MIN blending, and the alpha channel with D3DBLENDOP_ADD/GL_FUNC_ADD blending. Each of R, G and B contains a different, but partial, depth layer. To build them in a single pass, each channel stores the particle surface depth, plus a 2D distance test from the center that clips the current particle at a different distance from the center:

color.rgb = float3(scaledDepth, scaledDepth + step(centerInvDistance, 0.35), scaledDepth + step(centerInvDistance, 0.45));
color.w = texture(metaballAlphaMap, uv);
The default metaballAlphaMap texture encodes the distance from the center of the image. It can be replaced with a texture containing several sub-particles, to replace each particle with several smaller drops.

In the final pass, we render the particles again. We recompute the same particle surface depth, and we read the RGB depths of the neighboring pixels. Of these three depth layers, we pick the one closest to the current particle depth. If none is close enough, we do not display the current pixel, so as not to blend it with a particle that is too far away along the depth axis.
Then we use the depth differences to generate a normal, and UVs for a surface material texture.

Pseudo-code:

float NeighborDepth(float3 depths, float z)
{
	if (abs(depths.x - z) < MaxDiff) return depths.x;
	if (abs(depths.y - z) < MaxDiff) return depths.y;
	return depths.z;
}

float WaterFresnel(in float3 I, in float3 N)
{
	// R0 = pow(1.0 - 1.333, 2.0)/pow(1.0 + 1.333, 2.0);
	const float R0 = 0.0204;
	return saturate(R0 + (1.0 - R0)*pow(2.0*(1.0 + dot(I, N)), 5.0));
}

float z = ...;
// The 4 neighbor UVs can be more than one pixel away, resulting in blurrier/smoother normals.
float zLeft = NeighborDepth(texture(prepass, uvLeft).rgb, z);
float zRight = NeighborDepth(texture(prepass, uvRight).rgb, z);
float zUp = NeighborDepth(texture(prepass, uvUp).rgb, z);
float zDown = NeighborDepth(texture(prepass, uvDown).rgb, z);

// If in the periphery outside of the particle core, and too far from all neighbors, clip the current pixel.
if (centerInvDistance < 0.5 && /* z too different from all neighbors */) discard;

// Generate a normal.
screenSpaceNormal.x = (zRight - zLeft);
screenSpaceNormal.y = (zDown - zUp);
screenSpaceNormal.z = ScreenNormalDefaultZ;
screenSpaceNormal = normalize(screenSpaceNormal);

// Transform the screen-space normal to world space.
float3 worldNormal = screenToWorldMatrix * screenSpaceNormal;

// Generate an albedo color by triplanar mapping, weighting the three
// world-axis projections by the squared normal components.
float4 albedoXY = tex2D( textureSampler, uvDistorsion + worldCoordinates.xy*MbDiffuseTiling);
float4 albedoYZ = tex2D( textureSampler, uvDistorsion + worldCoordinates.yz*MbDiffuseTiling);
float4 albedoXZ = tex2D( textureSampler, uvDistorsion + worldCoordinates.xz*MbDiffuseTiling);
float4 albedoColor =	worldNormal.x * worldNormal.x * albedoYZ +
			worldNormal.y * worldNormal.y * albedoXZ +
			worldNormal.z * worldNormal.z * albedoXY;

float4 envmapColor = texCUBE(refractionMap, worldNormal);

// Apply a basic water-like lighting.
float fresnel = WaterFresnel(-incident, worldNormal);
outcolor.rgb = lerp(albedoColor.rgb, envmapColor.rgb, 0.1 + fresnel * ReflectionOpacity);
float mbWaterSpecular = 4.0*pow(saturate(dot(0.5*(incident - fakedLightDir), worldNormal)), 64.0);
outcolor.rgb += fresnel*mbWaterSpecular;
By reversing the last depth layer in the prepass, a second metaball effect can be seen through a first one, as visible in this image.
