December 8, 2017
The 'gooey' SVG filter revisited
Since Lucas Bebber described the "gooey effect" in a post on CSS Tricks, it has gained some popularity. But few people have asked about the underlying filter primitives - presumably because they look so unintelligible. That comes at a price when browsers have bugs.
On Stack Overflow I recently encountered multiple questions dealing with browsers displaying the effect inconsistently. In this answer I tried to detail a few principles behind the effect:
🔗 Blurring the input graphics
The filter starts with
<feGaussianBlur stdDeviation="20" />
to produce a blurred image. While it "smears" different colors, its main purpose here is to produce a wide band of pixels along the borders of the source graphic where the alpha values vary: the further out you go from the original border, the more transparent the pixels become.
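The effect on the alpha channel can be sketched in one dimension (Python used purely for illustration; the helper names are made up): convolving a hard alpha edge with a Gaussian kernel produces a ramp of partial transparency whose width grows with stdDeviation.

```python
import math

def gaussian_kernel(std_dev, radius):
    """Sampled, normalized 1-D Gaussian kernel."""
    weights = [math.exp(-(x * x) / (2 * std_dev ** 2))
               for x in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur(alphas, std_dev):
    """Convolve a row of alpha values with a Gaussian, clamping at the edges."""
    radius = int(3 * std_dev)
    kernel = gaussian_kernel(std_dev, radius)
    out = []
    for i in range(len(alphas)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), len(alphas) - 1)
            acc += w * alphas[j]
        out.append(acc)
    return out

# A hard edge: fully transparent on the left, fully opaque on the right.
edge = [0.0] * 100 + [1.0] * 100
blurred = blur(edge, std_dev=20)

# The band of partial transparency (0 < alpha < 1) is several
# standard deviations wide.
band = [a for a in blurred if 0.01 < a < 0.99]
print(len(band))
```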
🔗 Raising the contrast of the alpha channel
The original idea takes advantage of the feColorMatrix primitive:
<feColorMatrix type="matrix"
               values="1 0 0 0 0
                       0 1 0 0 0
                       0 0 1 0 0
                       0 0 0 19 -9"
               result="cutoff" />
This has the obvious disadvantage that nobody understands what these numbers mean. In theory, there is a mathematically identical version using the feComponentTransfer primitive:
<feComponentTransfer result="cutoff">
<feFuncA type="linear" slope="19" intercept="-9" />
</feComponentTransfer>
They both transform the alpha value of an RGBA color with the function
(alpha) => slope * alpha + intercept
and clamp the result to the interval [0, 1], so that
alpha < (0 - intercept) / slope => 0
alpha > (1 - intercept) / slope => 1
There is a small range of input alphas that get transformed to partial transparency in the output. (For the values 19 and -9 it is between roughly 0.47 and 0.53.) The effect is to reduce the wide blurry borders to a very slim line that now "softens" all corners and produces that "gooey" effect between nearby source objects.
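The transform above can be sketched as a plain function (a Python illustration, not part of the filter):

```python
def linear_transfer(alpha, slope=19, intercept=-9):
    """feFuncA type="linear": slope * alpha + intercept, clamped to [0, 1]."""
    return min(max(slope * alpha + intercept, 0.0), 1.0)

# Everything below (0 - intercept) / slope = 9/19 ≈ 0.474 becomes fully
# transparent, everything above (1 - intercept) / slope = 10/19 ≈ 0.526
# fully opaque; only the narrow band in between keeps partial transparency.
print(linear_transfer(0.4))  # → 0.0
print(linear_transfer(0.5))  # → 0.5
print(linear_transfer(0.6))  # → 1.0
```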
But why leave that slim band of partial transparency in the output? It is possible to define a hard distinction between fully opaque and fully transparent:
<feComponentTransfer result="cutoff">
<feFuncA type="discrete" tableValues="0 1" />
</feComponentTransfer>
The range of input values is simply split in the middle:
alpha < 0.5 => 0
alpha >= 0.5 => 1
This can result in visible antialiasing artefacts. There might be an advantage in applying a very small blur to the result:
<feComponentTransfer>
<feFuncA type="discrete" tableValues="0 1" />
</feComponentTransfer>
<feGaussianBlur stdDeviation="1" result="cutoff" />
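The discrete variant can be sketched the same way, following the SVG definition of the discrete transfer type (again, Python only for illustration):

```python
import math

def discrete_transfer(alpha, table=(0.0, 1.0)):
    """feFuncA type="discrete": pick table entry k = floor(alpha * n),
    with alpha = 1 mapped to the last entry."""
    n = len(table)
    k = min(int(math.floor(alpha * n)), n - 1)
    return table[k]

print(discrete_transfer(0.49))  # → 0.0
print(discrete_transfer(0.5))   # → 1.0
```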
🔗 Superimposing the original graphics
To avoid the "smearing" of different colors inside an object, you can superimpose the original graphic on the "goo". Several filter primitives, some with appropriate parameter values, can achieve that. Basic superimposition can be formulated as
<feBlend in="SourceGraphic" in2="cutoff" />
<!-- or -->
<feMerge>
<feMergeNode in="cutoff" />
<feMergeNode in="SourceGraphic" />
</feMerge>
<!-- or -->
<feComposite operator="over" in="SourceGraphic" in2="cutoff" />
Restricting the superimposition to the "goo" area can be achieved with
<feComposite operator="atop" in="SourceGraphic" in2="cutoff" />
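The difference between "over" and "atop" can be sketched with the Porter-Duff formulas on single premultiplied pixels (a Python illustration, not part of the filter): "atop" keeps the source only where the backdrop - here the "goo" - has coverage.

```python
def composite(src, dst, operator="over"):
    """Porter-Duff compositing of premultiplied (r, g, b, alpha) pixels.

    "over": src covers dst everywhere src is opaque.
    "atop": src is kept only where dst has coverage."""
    sa, da = src[3], dst[3]
    if operator == "over":
        fs, fd = 1.0, 1.0 - sa  # fractions of src and dst that survive
    elif operator == "atop":
        fs, fd = da, 1.0 - sa
    else:
        raise ValueError(operator)
    return tuple(fs * s + fd * d for s, d in zip(src, dst))

# A red source pixel over/atop a fully transparent spot of the goo:
print(composite((1, 0, 0, 1), (0, 0, 0, 0), "over"))  # source kept
print(composite((1, 0, 0, 1), (0, 0, 0, 0), "atop"))  # source clipped away
```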
🔗 Rendering results
In theory, apart from that last distinction, the results should look the same. I've designed a test case where you can play around with the different filters and their settings. The filters can be applied to SVG circles and rectangles, or to HTML divs. You can compare the effect on multi-colored and monochrome objects. This should help to find a filter with optimal cross-browser consistency.
I suspect that some rendering errors depend on whether certain computations are delegated directly to GPU functions. Since only some hardware architectures support this, tests should therefore include comparing different operating systems and processors.
See the Pen goo effect test by ccprog (@ccprog) on CodePen.