Bablakeluke opened this issue 8 years ago
Here's a rough visual overview of a "gradient node" - the filled path is representative of any random path:
The resulting triangulation:
Using multiple gradient nodes:
Their resulting triangulation:
Fascinating examples, Luke.
How closely related are these to the shaders used by WebGL? I suspect we'd have an easier time getting browsers on board if any new features were just an XML interface to rendering code they already implement.
Another thing to think of: You mention that the XML shaders use a node-based processing method, rather than a compositing method. That in itself is not a problem. SVG filter effects use a node-based processing method, but work with rendered pixel data instead of initial vector data.
The original SVG 1.2 vector-effects proposal would have provided a way to create a node-based tree of vector manipulations to control rendering. Work on that proposal stopped because browsers didn't think many of the manipulations would be easy to implement with the underlying 2D graphics libraries that they use. However, if the manipulations could be mapped to another underlying graphics library (i.e., WebGL), that might make it more feasible.
Relationship with WebGL - Triangle meshes
This is fortunately about as simple as it gets; triangles are "the" primitive for 3D rendering, and vertex colours are universal too. In order to draw an SVG path on the GPU at the moment, the path is typically triangulated first anyway _(except where specific path-rendering extensions, such as Nvidia's "NV_path_rendering", are available on the underlying hardware)_, so GPU-based implementers already have the functionality they need: they can drop in additional vertices during the triangulation stage, or directly pipe through a raw triangle mesh.
As a quick example, Skia, the graphics library used in both Chrome and Firefox for SVG rendering, can already draw triangle meshes with both the CPU and the GPU. The implementers who would need to do the most work are other CPU renderers: they'd need to interpolate vertex colours across each triangle themselves, but fortunately that's extensively documented due to its ubiquity in 3D graphics.
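For what it's worth, the interpolation a CPU renderer needs is just the standard linear blend (barycentric interpolation) of the three corner colours; for a point p inside a triangle with corner colours c1, c2, c3:

$$c(p) = \lambda_1 c_1 + \lambda_2 c_2 + \lambda_3 c_3, \qquad \lambda_1 + \lambda_2 + \lambda_3 = 1,\ \lambda_i \ge 0$$

where the λi are the barycentric coordinates of p within the triangle.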
Noise based textures
For some clarity, "shader" here refers to a GPU shader: a user-written program (in WebGL, they're the ones that go in script tags). Noise functions are very commonly implemented in shaders, and as a result hardware is starting to get native implementations too. Some of the underlying noise functions (from the experimental library), rendered as shaders on a sphere, look like this:
Each node has its own shader (Skia would implement these in HLSL and GLSL; mine are in Unity3D's ShaderLab language). Rendering a filter is simply a case of drawing a quad (2 triangles forming a square) for each node using its associated shader. Some nodes, such as a mathematical 'add', require two sources; these need secondary off-screen render contexts (also common in WebGL). Importantly, as the quads are just ordinary triangle arrays, they are also compatible with triangulated SVG paths (as used by GPU implementers). To help with visualising this, here's a quick example:
(Image from http://www.neilblevins.com/cg_education/fractal_noise/fractal_noise.html)
GPU implementation:
This "stacking" approach has some distinctively nice properties:
Thanks for that explanation.
Have you played around with the existing noise functions available as part of SVG filters? You can get some of these effects. Here are some demos I put together of paper/wood textures.
The main limitation right now is performance: because SVG filters are applied as manipulations of the rendered bitmap version of the shape, any change to the underlying shape or its position causes the entire filter (including noise functions) to be re-calculated. So noise filters + animation are a big problem for performance.
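For readers who haven't used them, a wood-grain-ish texture with the existing primitives looks roughly like this (an illustrative sketch only, not one of the demos mentioned above; the feColorMatrix values are just plausible brownish numbers):

<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">
  <filter id="woodGrain">
    <!-- Stretched fractal noise gives elongated streaks -->
    <feTurbulence type="fractalNoise" baseFrequency="0.01 0.1" numOctaves="4" result="noise"/>
    <!-- Remap the greyscale noise into brownish tones (arbitrary values) -->
    <feColorMatrix in="noise" type="matrix"
                   values="0.4 0 0 0 0.25
                           0.2 0 0 0 0.12
                           0.1 0 0 0 0.05
                           0   0 0 0 1"/>
    <!-- Clip the texture to the filtered shape -->
    <feComposite in2="SourceGraphic" operator="in"/>
  </filter>
  <rect width="200" height="200" filter="url(#woodGrain)"/>
</svg>

Anything that invalidates the rendered bitmap (a transform change, an animation step) forces this whole chain, noise included, to be recomputed, which is the performance problem described above.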
As a general comment, this is a direction I would love to go with SVG in the future. I think procedural textures in particular would be very useful and popular.
My company has been looking at Diffusion Curves as a method for rendering advanced gradients. See http://andronikos.id.au/pub/ for some talks, and http://nikosandronikos.github.io/svg-adv-grad/ for a very early draft proposal. Diffusion curves are a nice example of a higher-level abstraction for gradients, but they're expensive to render. The gradient meshes described in SVG 2 can be rendered via triangulation, so they should be possible to accelerate on the GPU, though perhaps with a one-off initial expense. The triangle mesh format described by Boyé et al. in "A vectorial solver for free-form vector gradients" is a nice format that can be GPU-accelerated and provides a lot of power (e.g. a lot of other gradient representations can be converted into that format for rendering).
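For readers who haven't seen them, the gradient meshes in the SVG 2 draft are rows of Coons patches and look roughly like this (sketched from memory of the draft, so treat the exact element and attribute details as approximate):

<meshgradient id="mesh" x="50" y="50" type="bilinear">
  <meshrow>
    <meshpatch>
      <!-- Each stop describes one edge of the patch as a cubic Bezier plus a corner colour -->
      <stop path="c 25,-25 75,25 100,0" stop-color="#e0c080"/>
      <stop path="c 25,25 -25,75 0,100" stop-color="#8b5a2b"/>
      <stop path="c -25,25 -75,-25 -100,0" stop-color="#e0c080"/>
      <stop path="c -25,-25 25,-75 0,-100" stop-color="#8b5a2b"/>
    </meshpatch>
  </meshrow>
</meshgradient>

Each patch is a curved quad, which is why a tessellation/triangulation pass maps naturally onto the GPU.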
There are a lot of cool things we could do, but I think it is important to focus on GPU-implementable algorithms, and we either need to keep things at a high enough level that authors can understand them, or provide a higher level of abstraction in addition (though this could also be achieved via libraries). Filter Effects is an example of something that is not very author-friendly.
Hey guys, apologies for the delay - I've been playing around with a few concepts amongst other projects too; the above comments are awesome! I'm really looking forward to seeing where SVG is taken in the future. @AmeliaBR Yep, I've used SVG filters, but they are very limited in comparison to what Filter Forge (and similar tools) can create, and I also agree that from an author's point of view they're quite difficult to use in their current form. I.e. wood starts looking more like this:
The major node categories are ~16 noise options, gradients, graphs (very useful for "tone mapping"), mathematical operators and repetition. SVG does kind of cover these categories, but it's the current lack of diversity in the noise functions that sticks out the most. Performance is certainly a sticking point (Filter Forge itself is CPU-only, and it's very noticeable), so it's awesome to see there's a potential GPU focus; from my own experiments, images like the above are easily generated in realtime on the GPU.
As far as usability is concerned, I would imagine the major issue is that SVG filters represent a graph with a much more complex flow, instead of the very familiar tree structure used by everything else in SVG. I.e. a path adds to whatever is before it; a blend does not. So maybe it would help to represent the graph of nodes as a tree as much as possible; that would somewhat bring it into alignment with the rest of SVG. Based on the stacking approach above, it would make implementation simpler too. There's an interesting range of options here:
<filter id='Wood'>
  <noise type="perlin" baseFrequency="0.3 0.1" numOctaves="4"/>
  <!-- Perlin noise is now on 'this' stack -->
  <blend operator="add" in2="anotherFilterID"/>
  <!-- Here we've now got noise + whatever anotherFilterID outputs -->
  <toneMap graph="graphID"/>
  <!-- It's now been tonemapped -->
</filter>
To me, the above is nice and easy to follow. A nice bonus is that ordinary SVG nodes can be used too - i.e. dropping in a path node would be understandable. It still uses some ID references, but the flow is much better defined. To avoid ID references entirely, it could make use of nested inputs, essentially flipping the flow in reverse:
<!-- The filter node represents the one and only output, so start from that and work downwards -->
<filter id='Wood'>
  <toneMap>
    <graph>
      <!-- An inline tone mapping graph -->
    </graph>
    <blend operator="add">
      <!-- Adding noise to some other noise -->
      <noise type="perlin" baseFrequency="0.3 0.1" numOctaves="4"/>
      <noise type="voronoi" function="manhattan" baseFrequency="0.3"/>
    </blend>
  </toneMap>
</filter>
There are lots of interesting combinations available with the above. I like this form, but something feels slightly "off" in that its flow is upside down compared to the rest of SVG. For example, introducing multiple path nodes into the mix would be a very odd experience indeed:
<filter id='Wood'>
  <path d=".."/>
  <!-- Who renders first..? -->
  <path d=".."/>
</filter>
However, at least it has a flow, and it can easily represent any number of inputs to each node.
Flipping back to the second part of this - triangle meshes - I can see a lot more advantages now, if the gradient meshes can be GPU-accelerated. I think it would still be quite convenient to introduce raw triangle meshes, possibly defined in the COLLADA format for easy compatibility, as it potentially allows for a polyfill option when introducing new features (i.e. if triangle meshes were available in an earlier iteration of SVG, a JS library could potentially polyfill gradient meshes by triangulating them and dropping the resulting triangles into the SVG's document).
+1 to nesting filters, which makes them much more readable than having to use the in and in2 attributes. Though that's independent from this proposal here, so I've created issue #258 for it.
Sebastian
I'd be interested in some feedback on the following concept:
1) It appears that the SVG 2 momentum has seriously slowed, if not halted.
2) All browsers have now exposed WebGL capabilities for direct rendering through JavaScript. The kinds of renderings using this capability far outstrip what seems possible in SVG.
3) Given the structure of an SVG document, it doesn't seem to require any changes from the browser developers to have an SVG file that integrates a WebGL-based script or an associated self-contained JavaScript "helper" library and script.
Many things open up from here -
1) a purely JavaScript alternative to the declarative SVG language, embedded in an SVG document structure
2) a WebGL-based SVG interpreter/converter that could optionally be carried inside the SVG structure, enabling all kinds of enhancements to declarative SVG without changes to the browsers (a rough sketch follows below)
3) an alternative declarative structure, as discussed above, also with a built-in interpreter and carried in the SVG structure for historical consistency
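To make the "carried inside the SVG structure" part concrete, SVG can already reference and embed scripts, so no new syntax would be needed; a rough sketch (the helper file name and the renderEnhanced function are entirely made up for illustration):

<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="400" height="300">
  <!-- Self-contained helper library carried with the SVG (hypothetical file) -->
  <script xlink:href="svg-webgl-helper.js"/>
  <!-- Ordinary declarative content, doubling as a fallback for non-scripting viewers -->
  <rect width="400" height="300" fill="#cccccc"/>
  <script>
    // Hypothetical entry point exposed by the helper library above,
    // e.g. re-rendering the document's content via WebGL
    if (typeof renderEnhanced === 'function') renderEnhanced(document.documentElement);
  </script>
</svg>

Whether browsers would let the embedded script obtain a WebGL context from inside a standalone SVG is a separate question, but structurally nothing new is required.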
Seeing the following changed my opinion of SVG forever:
https://threejs.org/examples/#webgl_animation_cloth
http://webglplayground.net/share/lsystem-tree-fractal?
The source code in JavaScript is so short...
@sdwarwick I can't comment on point #1 but:
In short, different technologies serving different purposes :) WebGL is for raw graphics rendering; SVG is for an abstract representation of a (scalable) graphic. For example, a texture on that cloth could be an SVG - using both SVG and WebGL together.
Hello! As both a frequent SVG author and an SVG renderer implementor, I find that the mesh gradient element has limited use and an awkward recursive rendering algorithm, which would severely limit shader-based/GPU rendering of SVGs. So, I'd like to open a discussion for alternatives, starting with lots of noise!
Disclaimer: I haven't been following the SVG 2 specs too closely but after some searching I was unable to find any matches, so apologies if this has come up before/ is being discussed elsewhere at the moment.
Procedural texture engines - a (very noisy) source of inspiration
If photoreal gradienting is the name of the game, then I'd like to draw some attention to Filter Forge. It's a program for creating procedural textures, as used in movies and games. Internally, Filter Forge uses an XML format to represent these "filters", and the results are often wonderfully realistic. The following images are entirely represented by a tiny XML file (as a "filter") and are resolution-free:
Each filter (a tiny XML file) has input settings which can be changed by the user, resulting in an enormous variety of images from the one filter. Some of these "variations" are pre-defined in the XML.
They're constructed by combining relatively few noise functions in a node-based editor. This is of course a totally different approach from SVG's compositing process; however, it shows that XML can successfully be used to represent photorealistic, resolution-free imagery, and that having good support for noise functions is vital for achieving that. A bonus is that it can be generated entirely on the GPU using shaders too.
(Another disclaimer! I'm not in any way affiliated with Filter Forge; however, I essentially built an experimental engine for generating noise-based images at runtime using shaders, and created an importer for filters from Filter Forge.)
Triangle meshes
Diving into another alternative, coming from the wonderful world of 3D graphics: triangle meshes are remarkably versatile and trivial to render on the GPU. I would assume this has been discussed a lot already so I won't cover it too much; however, a triangle mesh primitive could allow for much more expressive gradients. For example, from an author's perspective, the creation process could look like so:
The above could be far more intuitive than the current mesh gradient proposal as well as being trivial to implement (as under the hood it's using triangles).
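Purely as a hypothetical illustration of the markup (none of these element or attribute names exist in any spec), a vertex-coloured mesh could be a flat list of vertices plus index triples, mirroring typical GPU mesh formats:

<!-- Hypothetical markup only -->
<trianglemesh id="sky">
  <vertex x="0"   y="0"   color="#2a6fd6"/>
  <vertex x="200" y="0"   color="#7db4ff"/>
  <vertex x="200" y="150" color="#ffffff"/>
  <vertex x="0"   y="150" color="#d0e4ff"/>
  <!-- Two triangles sharing an edge; colours interpolate across each face -->
  <face indices="0 1 2"/>
  <face indices="0 2 3"/>
</trianglemesh>

An implementation could hand this almost directly to the GPU as a vertex/index buffer pair, which is what makes it "trivial" in the sense above.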
SVG itself could declare one of two things (or both):
That's enough for today I think!