strandedcity opened this issue 10 years ago
The variable-stroke-expansion that @microbians shows here is slick, to say the least.
Is this stroke expansion done via geometric path offsetting, or is it just a stroke rendering trick provided by Flash itself?
Do the approaches you guys outlined above cover such functionality (variable stroke expansion)? It could work wonders for Pencil Tools and sketch-like drawings.
@nicholaswmin It's a geometric interpolation ;) In reality it offsets the quadratic bezier for the minimum and the maximum width and interpolates the data in relation to (length of the quadratic bezier segment up to the subdivision point / length of the whole quadratic bezier). ~ Sorry for the off-topic.
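My reading of that interpolation trick, as a rough self-contained sketch (all names here are made up for illustration, none of this is @microbians' actual code): sample the quadratic bezier, offset each sample along its normal, and interpolate the width between a minimum and maximum by the arc-length ratio up to that point.

```javascript
// Evaluate a quadratic bezier B(t) for control points p0, p1, p2 ({x, y}).
function quadPoint(p0, p1, p2, t) {
    var u = 1 - t;
    return {
        x: u * u * p0.x + 2 * u * t * p1.x + t * t * p2.x,
        y: u * u * p0.y + 2 * u * t * p1.y + t * t * p2.y
    };
}

// Derivative B'(t), used to obtain the normal direction.
function quadTangent(p0, p1, p2, t) {
    return {
        x: 2 * (1 - t) * (p1.x - p0.x) + 2 * t * (p2.x - p1.x),
        y: 2 * (1 - t) * (p1.y - p0.y) + 2 * t * (p2.y - p1.y)
    };
}

// Approximate arc length from 0 to t by sampling.
function quadLengthTo(p0, p1, p2, t, steps) {
    var len = 0, prev = quadPoint(p0, p1, p2, 0);
    for (var i = 1; i <= steps; i++) {
        var cur = quadPoint(p0, p1, p2, t * i / steps);
        len += Math.hypot(cur.x - prev.x, cur.y - prev.y);
        prev = cur;
    }
    return len;
}

// Offset point at t, with the width interpolated between wMin and wMax by
// the ratio (arc length up to t) / (total arc length).
function variableOffsetPoint(p0, p1, p2, t, wMin, wMax) {
    var total = quadLengthTo(p0, p1, p2, 1, 64),
        ratio = total > 0 ? quadLengthTo(p0, p1, p2, t, 64) / total : 0,
        w = wMin + (wMax - wMin) * ratio,
        tan = quadTangent(p0, p1, p2, t),
        mag = Math.hypot(tan.x, tan.y),
        pt = quadPoint(p0, p1, p2, t);
    // The normal is the tangent rotated by 90 degrees.
    return { x: pt.x - tan.y / mag * w, y: pt.y + tan.x / mag * w };
}
```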
@nicholaswmin yes, I am planning to support such a feature once we have constant offsetting working. Since we're dealing with cubic beziers instead of quadratic ones in Paper.js, it'll all be a wee bit more tricky, but the interpolation trick mentioned by @microbians should work similarly.
I don't know if this is of interest, but checkOutlinesUFO (https://github.com/adobe-type-tools/afdko/blob/master/FDK/Tools/SharedData/FDKScripts/CheckOutlinesUFO.py), which is used to remove overlaps in fonts, also uses pyClipper (inspired by RoboFont). The flattening of outlines only happens as an intermediate step; the final outlines are of course not flattened.
@iconexperience Is there a simple way to make it work for straight paths as well? For now, it breaks because my path does not contain any curves. Thanks for the help!
Flattening is less efficient than bezier clipping, especially when high precision is required.
@iconexperience I've found some information regarding your dot product of the first and second derivative (what you called "peaks"):
http://math.stackexchange.com/questions/1954845/bezier-curvature-extrema
It looks like this simple calculation will give you something that resembles curvature extrema, but isn't quite that. I can confirm this in my tests. But finding the actual extrema is more expensive; we'd need a solver for quintic polynomials... I am wondering what we should call these peaks.
And while digging deep inside Skia's code, I found out that they do wrongly call those curvature extrema, and base their code on this assumption:
https://github.com/google/skia/blob/master/src/core/SkGeometry.cpp#L784
It's funny that this hasn't caused any issues yet. : )
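For clarity, the "peaks" condition is B'(t) · B''(t) = 0, i.e. the curve times where the speed |B'(t)| is stationary. For a cubic bezier, B'(t) is quadratic and B''(t) is linear, so the dot product is only a cubic polynomial in t, whereas the true curvature extrema of κ(t) = (B' × B'') / |B'|³ lead to the quintic mentioned above. A rough, self-contained sketch (function names are mine, not paper.js or Skia API) that locates the peaks numerically via a sign-change scan plus bisection:

```javascript
// First and second derivatives of a cubic bezier with control points
// p = [p0, p1, p2, p3], each {x, y}.
function cubicDerivatives(p, t) {
    var u = 1 - t;
    return {
        d1: { // B'(t)
            x: 3 * (u * u * (p[1].x - p[0].x) + 2 * u * t * (p[2].x - p[1].x) + t * t * (p[3].x - p[2].x)),
            y: 3 * (u * u * (p[1].y - p[0].y) + 2 * u * t * (p[2].y - p[1].y) + t * t * (p[3].y - p[2].y))
        },
        d2: { // B''(t)
            x: 6 * (u * (p[2].x - 2 * p[1].x + p[0].x) + t * (p[3].x - 2 * p[2].x + p[1].x)),
            y: 6 * (u * (p[2].y - 2 * p[1].y + p[0].y) + t * (p[3].y - 2 * p[2].y + p[1].y))
        }
    };
}

// f(t) = B'(t) · B''(t), a cubic polynomial in t.
function peakFunction(p, t) {
    var d = cubicDerivatives(p, t);
    return d.d1.x * d.d2.x + d.d1.y * d.d2.y;
}

// Find roots of f(t) in [0, 1) by scanning for sign changes and bisecting.
function findPeaks(p, steps) {
    steps = steps || 128;
    var roots = [];
    for (var i = 0; i < steps; i++) {
        var a = i / steps, b = (i + 1) / steps,
            fa = peakFunction(p, a), fb = peakFunction(p, b);
        if (fa === 0) roots.push(a);
        if (fa * fb < 0) {
            for (var k = 0; k < 50; k++) {
                var m = (a + b) / 2;
                if (peakFunction(p, a) * peakFunction(p, m) <= 0) b = m; else a = m;
            }
            roots.push((a + b) / 2);
        }
    }
    return roots;
}
```

For example, a symmetric arch has a single speed minimum at t = 0.5, which the scan finds as its only peak.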
Some more information about curvature extrema vs. "peaks". I created a quick demo that compares the two methods on an interactive curve:
This may help in figuring out which information is more useful for subdividing curves before performing offsetting. Perhaps we do need an internal solution for finding the actual curvature extrema?
I've worked on a script to generate stroke outlines from a shape with line segments (non-bezier) using miter joins. I think it can be improved, but here it is:
Interesting doc related to WebGL + Quadratic Beziers http://wdobbie.com/post/gpu-text-rendering-with-vector-textures/
Looking for further resources on Hoschek’s method after discussing different approaches with @hkrish over Skype, I accidentally came across this really curious document that describes another method for curve offsetting that claims to be much faster and as precise, if not better:
"A New Shape Control and Classification for Cubic Bézier Curves" by Shi-Nine Yang and Ming-Liang Huang: http://link.springer.com/chapter/10.1007%2F978-4-431-68456-5_17
I've extracted the pages from Google Books and turned them into a PDF for easier reading :)
A New Shape Control and Classification for Cubic Bezier Curves.pdf
I have started experimenting with this recently and the results are very promising. I will post my findings here shortly.
Here's a first sketch that allows playing with the mentioned shape control method by dragging the green circle and all the curve handles.
And here, finally, the offsetting sketch, along with code that determines the largest error by projecting along the normals from the original curve, plus logging and visualization of the found largest error.
Note that this does not perform any subdivision yet, so you will naturally encounter large errors with more advanced curves. But this is yielding some really good results for "simple" curves. I will now merge this method into the code that @iconexperience has already been working on based on a different method for the offsetting part, of which much of the subdivision and path treatment can be reused.
I have some good news. The code is working rather beautifully in my local tests. I will put something online for people to experiment with soon, but for now, here's a screenshot.
That is quite a useful algorithm. In my original code I could offset through a point at t=0.5, but this one is better, as it allows for arbitrary points. I think using the point where the curve's tangent is the average of the start and end tangents could give better results than simply splitting at t=0.5.
Here is the code that I use to find points on a curve that have a certain tangent:
/**
 * Returns the t values where the curve has a tangent in the same
 * direction as the specified vector.
 */
getTangentTs: function(v, vTan) {
    var ax = -v[0] + 3 * v[2] - 3 * v[4] + v[6],
        bx = 3 * v[0] - 6 * v[2] + 3 * v[4],
        cx = -3 * v[0] + 3 * v[2],
        ay = -v[1] + 3 * v[3] - 3 * v[5] + v[7],
        by = 3 * v[1] - 6 * v[3] + 3 * v[5],
        cy = -3 * v[1] + 3 * v[3],
        roots = [],
        sTan,
        epsilon = paper.Numerical.CURVETIME_EPSILON;
    // Divide by the larger component to keep the slope <= 1; comparing
    // absolute values also handles negative tangent components correctly.
    if (Math.abs(vTan.y) < Math.abs(vTan.x)) {
        sTan = vTan.y / vTan.x;
        paper.Numerical.solveQuadratic(3 * (ay - sTan * ax),
                2 * (by - sTan * bx), cy - sTan * cx,
                roots, epsilon, 1 - epsilon);
    } else {
        sTan = vTan.x / vTan.y;
        paper.Numerical.solveQuadratic(3 * (ax - sTan * ay),
                2 * (bx - sTan * by), cx - sTan * cy,
                roots, epsilon, 1 - epsilon);
    }
    return roots;
}
(Update: included epsilon in getTangentTs().)
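As a hedged illustration of how one might build the vTan argument for getTangentTs() when looking for the "average tangent" split point mentioned above (this helper is mine, not part of the code above; the curve is assumed to be the same 8-value array v = [p1x, p1y, h1x, h1y, h2x, h2y, p2x, p2y] that getTangentTs() uses):

```javascript
// Returns the average of the unit tangent directions at t = 0 and t = 1.
// Handling of degenerate zero-length handles is omitted for brevity.
function averageTangent(v) {
    var t0 = { x: v[2] - v[0], y: v[3] - v[1] }, // direction at t = 0
        t1 = { x: v[6] - v[4], y: v[7] - v[5] }, // direction at t = 1
        m0 = Math.hypot(t0.x, t0.y),
        m1 = Math.hypot(t1.x, t1.y);
    return {
        x: t0.x / m0 + t1.x / m1,
        y: t0.y / m0 + t1.y / m1
    };
}
```

The result would then be passed as the tangent vector to getTangentTs() to obtain the curve time at which to split.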
@iconexperience I have been comparing to your original code also, and I believe that this algorithm produces the same shapes with t = 0.5! I am not 100% certain yet, but will soon put a little test-case together that allows us to easily compare these approaches. The code I linked to above also only works with t = 0.5, but based on your input I have now made a more general version of it that can work with any values of t, and will now try your suggestion. I'll put all my tests online soon, but things are behaving surprisingly well already. The trick really is to determine the maximum error and keep subdividing if it is too large.
Even more important than the maximum error is checking whether one of the handles' directions flipped during offsetting. If that is the case, it's a clear sign that the result is not only unsatisfying, but quite messed up. So first check for flipping handles, then iterate until the maximum error is below your threshold.
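A minimal sketch of that handle-flip check (an assumption of mine about how it could be written, not @iconexperience's actual code): if the dot product between a handle vector of the original curve and the corresponding handle vector of the offset result is negative, the handle has flipped and the offset is unusable regardless of its measured error. Curves here are plain arrays of four {x, y} points (anchor, handle point, handle point, anchor), not paper.js' internal representation.

```javascript
// Extract the out-handle and in-handle vectors of a cubic bezier given as
// [anchor0, handlePoint0, handlePoint1, anchor1].
function handleVectors(curve) {
    return [
        { x: curve[1].x - curve[0].x, y: curve[1].y - curve[0].y }, // out-handle
        { x: curve[2].x - curve[3].x, y: curve[2].y - curve[3].y }  // in-handle
    ];
}

// True if either handle of the offset curve points opposite to the
// corresponding handle of the original curve.
function handlesFlipped(original, offset) {
    var a = handleVectors(original),
        b = handleVectors(offset);
    for (var i = 0; i < 2; i++) {
        if (a[i].x * b[i].x + a[i].y * b[i].y < 0)
            return true;
    }
    return false;
}
```

The subdivision loop would then read roughly: always subdivide when the handles flipped, otherwise subdivide only while the maximum error exceeds the threshold.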
That's a good point too! I've now implemented their more general shape control method that works with an arbitrary curve-time value, and was using the curve-time with the average tangent for it, but this produced results not as good as with t = 0.5. You are right though: when splitting, using this "average tangent curve-time" is much better!
Here's a useful sketch that visualizes the encountered errors:
And here now the current state of the work in progress:
https://bl.ocks.org/lehni/raw/a665d6f9d95dd055b0ff901f8e313780/
I am very pleased by how reliably it works. The next steps are adding proper support for strokeCap and strokeJoin, and resolving self-intersections.
Despite the silence here, I've been hard at work getting the offsetting code to work reliably. Things have progressed far, to the point where most edge cases seem squashed, self-intersections are dealt with, and strokeJoin is already supported. Here's a screenshot:
I am aware that this issue mainly mentions parametrized / variable-width strokes. I first want to get fixed-width strokes to work well, and then tackle that next. I do have some ideas as to how this algorithm can be extended to support variable-width strokes.
Impressive work @lehni
Here's the latest state of the offsetting code, now with boolean operations in the mix:
http://bl.ocks.org/lehni/raw/9aa7d593235f04a3915ac4cef92def02/
Unfortunately, many edge cases remain. I will compile a selection of tricky paths and give access through a pulldown menu, but please have a go and let me know your thoughts and observations.
Loving it!
This already seems like it would be really useful in an application I'm working on, even if there are still edge cases that don't work. Is there any chance that this could be merged in soon?
I'm sitting on the fence about that, because it still has a lot of edge cases related to boolean operations, an endless topic for us, unfortunately... But yes, it would be great to add it soon.
I've tested some of the code from your demo, and it looks like it will work really well for my application.
There's a slight issue with closed paths and the joinOffsets function... basically the resulting path misses a chunk around the starting point of the input path. I can work around this by "unclosing" my path (making a new open path with an extra copy of the first point at the end.)
@BrianHanechak can you give an example of that problem?
I'm working on it, but I'm having trouble isolating it (I'm running this inside a much larger project), so I'm beginning to think the problem is being caused by other things in my code.
I do get a similar issue if I pass in a compound path, which I am able to show in a fork of your gist:
https://bl.ocks.org/BrianHanechak/raw/98b0a80656c11b2a7a1a6026932298f2/
In this case, I assume it's just that compound paths aren't supported yet. But the result is similar to what I'm seeing in my bug, so I figured I'd share. I'll keep trying to track this down.
@lehni It's really great work!
I have a suggestion about parameter handling. Currently, parameters (like strokeJoin) come from the path's style. I would like to be able to set them via an options object (like the example below) when you publish the function.
offsetPath: function(path, options) {
    if (!options) options = {};
    if (options.offset == null)
        options.offset = path.getStrokeWidth() / 2;
    if (options.strokeJoin == null)
        options.strokeJoin = path.getStrokeJoin();
    if (options.miterLimit == null)
        options.miterLimit = path.getMiterLimit();
    if (options.strokeCap == null)
        options.strokeCap = path.getStrokeCap();
    ...
},
I found one case which looks like it loses some handles. It can be replicated via these console commands in http://bl.ocks.org/lehni/raw/9aa7d593235f04a3915ac4cef92def02/:
path.pathData = "M274.01539,113.5838c-183.54286,28.92795 -238.08075,169.76149 -70.49814,239.58758c167.58261,69.82609 78.53292,249.69441 168.30932,205.80373";
$('#slider-offset').trigger('input');
Thanks @sapics! Yes the suggestion to optionally provide overrides for these values makes sense.
And wow, that's a funny case you found there. I am aware of a few as well, check these out:
path.pathData = "M466,467c0,0 -105,-235 0,0c-376.816,-119.63846 -469.06596,-146.09389 -650.61329,-266.59735c-282.68388,-230.49081 300.86045,-10.26825 452.77726,121.52815z";
path.pathData = "M466,467c-65,-34 136,64 0,0c-391,-270 62,-670 62,-670l-463,370z";
path.pathData = "M466,467c-65,-34 136,64 0,0c-391,-270 520,-471 522,-137c-214,-144 -1489,123 -923,-163z";
The latter two are still linked to boolean operation issues...
So it'll unfortunately still take quite a bit of time to get this all ready for prime time. The question is: do we include it before then?
These 3 cases share a similar special situation: neighboring segment.points have the same position.
I am happy to use this function even if these special cases are not fixed, because they look like rare cases in real use, and it might take a lot of time to fix them.
Personally, I think it's possible to publish this function or put it on the develop branch, because it already has enough practical use.
It's also possible to make a new branch (like offset-path) before publishing; that would bring more reports from the participants and watchers.
path.pathData = "M274.01539,113.5838c-183.54286,28.92795 -235.08075,170.76149 -67.49814,240.58758c167.58261,69.82609 239.53292,-6.30559 329.30932,-50.19627"
This case might occur even without boolean operations being involved.
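Since these failing cases share neighboring segments whose points coincide, one defensive pre-processing idea (a sketch of mine, not part of the actual offsetting code) is to merge consecutive points that sit within a small tolerance of each other before offsetting. Points are plain {x, y} objects here; in paper.js one would also have to merge the segments' handles, which this point-level illustration omits.

```javascript
// Drop any point that lies within `tolerance` of the previously kept point,
// so runs of coincident neighbors collapse into a single point.
function mergeCoincidentPoints(points, tolerance) {
    tolerance = tolerance || 1e-7;
    var result = [];
    for (var i = 0; i < points.length; i++) {
        var prev = result[result.length - 1];
        if (!prev || Math.hypot(points[i].x - prev.x,
                points[i].y - prev.y) > tolerance)
            result.push(points[i]);
    }
    return result;
}
```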
@lehni I've tried to fix the case that appears to lose some handles, which I introduced above: http://bl.ocks.org/sapics/raw/d10f455b8ec1e54a46cf04ed2b386334/
I cannot judge whether this fix is good or bad, but the result looks better. You can check the difference in https://gist.github.com/sapics/d10f455b8ec1e54a46cf04ed2b386334/revisions
Sorry for the semi-off-topic, but I've added variable offsetting to my playground http://microbians.com/?page=code&id=code-bezieroffsetingplayground ;)
@microbians not at all off topic :) Looks great! Unfortunately I haven't had any time lately to work on anything paper.js related, and that won't change for quite a while still : /
Great work! This feature would be very useful in a project that I'm currently working on. Specifically, I've been looking for a way to erode/bloat a complex path. Has work on this stalled for now, or is there any chance of it being merged in the relatively near future? :)
It seems this feature is still officially missing. I was designing an animation editing tool and ran into this problem. I found that with the offset algorithm there were some unwanted self-intersections. I intended to implement the same functionalities as Adobe Illustrator's offset path and expand stroke. I managed to cover most of the cases, and I wrote it as a small extension library for paper.js. I just want to share it here in case someone needs it. The package name is paperjs-offset. It adds two methods to Path & CompoundPath: offset and offsetStroke. Here is the running result of the lib.
@luz-alphacode this looks great! After looking at your code, I am wondering though: Did you consider using the work I posted above as a starting point? You can see it at http://bl.ocks.org/lehni/raw/9aa7d593235f04a3915ac4cef92def02/
The actual code you can see here: http://bl.ocks.org/lehni/raw/9aa7d593235f04a3915ac4cef92def02/offset.js
This includes code that handles joins and ends through Paper's own already existing internal functions, such as Path._addBevelJoin(), Path._addSquareCap(), etc.
The offsetting code itself works really well and is based on A New Shape Control and Classification for Cubic Bezier Curves by Shi-Nine Yang and Ming-Liang Huang.
The only reason I haven't added this to paper.js yet is the remaining issues with self-intersections when expanding the offset to outlines. These issues are in the boolean operations code, and so far I have not managed to find more time to try and resolve them.
Could you explain how you addressed the issues with self-intersection in your code? Can we merge these two efforts into something that can be integrated in the library officially?
I've tested my code again and found that my solution for finding self-intersections performs reasonably for closed paths, but fails on some types of open paths. I've got some ideas about improving it; I'll check them and come back later for discussion. @lehni
... I've tried to improve the solution, but not really successfully, and it's too late in my time zone, so I'll try another idea tomorrow. The issue of self-intersection detection can somehow be related to a clockwise vs. counterclockwise comparison: a self-intersected path will create a loop reversed relative to the direction of the original curve. But complex curves can be partially clockwise and partially counterclockwise; I think maybe I should break the curve apart to handle that. I'll come back later to explain my idea in detail, after I get some sleep.
Great to see progress on this feature! If it's of any help, I've run a few tests using the latest code from @lehni and found a bit of weirdness in certain cases where a point's handles are perfectly aligned (example below):
var pathData = "M288 216H0V44.514c21.206 0 32.848-.841 62.371-16.358C81.174 18.271 107.746 0 143.999 0c36.255 0 62.827 18.271 81.629 28.155C255.15 43.673 266.794 44.514 288 44.514V216z";
Any updates / release times please?
@shagarah I've worked some more on it recently, but am again swamped with work currently, so can't give a timeline. I want to release this soon as part of a final v1.0.0, but it needs more tweaking to be reliable enough.
Hey guys, is there any progress on this?
I can offer a financial bounty to anyone able to iron out the remaining issues with the offset code.
@lehni @glenzli @sapics
Wow, I did not realize how complicated this is. I started a project with paper.js assuming this would be a no-brainer, and now I'm stuck.
What is the current status here?
Are there any workarounds via external libraries?
Thanks
@northamerican I should have time to start working on this again in about 6 weeks. Please get in touch at juerg@lehni.org so we can discuss this further. Thanks!
@northamerican did you try to get in touch? I didn't receive anything :)
@lehni e-mail has been sent. ready to discuss when you are.
I've been following the library for some time, and have started a spare-time project based on it. I've been eagerly interested in the functionality to expand strokes into paths, particularly for bezier curves (where I know the problem is mathematically complex). Has a strategy / timeline for implementation been discussed? I may be able to contribute if new contributors are welcome.
I wasn't sure if the "outline" feature is separate from what I would, in a CAD context, describe as an "offset". Is there some way to get equations for the position of the stroke edge that is not as complex as a full bezier-offset solution? See http://pomax.github.io/bezierinfo/#offsetting for a very detailed explanation of the math involved.
Thanks!
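For what it's worth, the stroke edge at curve time t is simply C(t) + d · n(t), where n(t) is the unit normal and d the offset distance; the catch (as the Pomax page explains) is that this offset curve is not itself a bezier, which is why the more involved solutions in this thread exist. A rough self-contained sketch of the simple route (flattening, mentioned earlier in the thread; all names here are made up for illustration) just samples that expression into a polyline:

```javascript
// Evaluate a cubic bezier C(t) for control points p = [p0, p1, p2, p3].
function cubicPoint(p, t) {
    var u = 1 - t;
    return {
        x: u*u*u * p[0].x + 3*u*u*t * p[1].x + 3*u*t*t * p[2].x + t*t*t * p[3].x,
        y: u*u*u * p[0].y + 3*u*u*t * p[1].y + 3*u*t*t * p[2].y + t*t*t * p[3].y
    };
}

// Derivative C'(t), used to obtain the normal direction.
function cubicTangent(p, t) {
    var u = 1 - t;
    return {
        x: 3 * (u*u * (p[1].x - p[0].x) + 2*u*t * (p[2].x - p[1].x) + t*t * (p[3].x - p[2].x)),
        y: 3 * (u*u * (p[1].y - p[0].y) + 2*u*t * (p[2].y - p[1].y) + t*t * (p[3].y - p[2].y))
    };
}

// Sample the offset edge at distance d into steps + 1 polyline points.
// This breaks down wherever the tangent vanishes (cusps), which is one of
// the reasons the exact approaches above are needed.
function sampleOffset(p, d, steps) {
    var result = [];
    for (var i = 0; i <= steps; i++) {
        var t = i / steps,
            pt = cubicPoint(p, t),
            tan = cubicTangent(p, t),
            mag = Math.hypot(tan.x, tan.y);
        result.push({ x: pt.x - tan.y / mag * d, y: pt.y + tan.x / mag * d });
    }
    return result;
}
```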