processing / p5.js

p5.js is a client-side JS platform that empowers artists, designers, students, and anyone to learn to code and express themselves creatively on the web. It is based on the core principles of Processing. http://twitter.com/p5xjs
http://p5js.org/
GNU Lesser General Public License v2.1

Pass Texture Coordinates to bezierVertex, quadraticVertex, and curveVertex #5722


michaels-account commented 2 years ago

Increasing Access

Unsure

Most appropriate sub-area of p5.js?

Feature enhancement details

#5699

Putting this feature request in as a response to the above issue. It could act in a similar fashion to UV assignment for custom shape vertices, adding two extra parameters at the end (in WEBGL mode) for the UV coordinates.

davepagurek commented 2 years ago

Increasing Access

I can see this as a way to lower the barrier to entry for making more organic-looking 3D shapes that you can render with the same tools p5 gives you for its own 3D primitives. To do this right now, you'd have to do a lot of math yourself, or learn 3D modelling software and export a model, and that has a large time cost (and potentially a monetary cost).

> It could act in a similar fashion to UV assignment for custom shape vertices, adding two extra parameters at the end (in WEBGL mode) for the UV coordinates.

This will also need https://github.com/processing/p5.js/issues/5631 to be fixed in order to not lose the texture coordinate data that the user supplied.

For bezierVertex and quadraticVertex, I think we'll actually need UV coordinates for all of the control points (so 3 sets of UVs for bezierVertex and 2 for quadraticVertex). Then, when we convert those to vertex calls, we'd also mix the UV values with the same weights we use for the position values. Maybe it's the same for curveVertex, but I'd need to read up a bit on Catmull-Rom splines first (are the curves also contained in the convex hull of the control points? If not, do we risk getting weird UVs in the interpolated regions?)
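As a sketch of the weighting idea (plain JavaScript, independent of p5's internals; the `u`/`v` fields stand in for the hypothetical texture coordinates): evaluating a cubic Bézier applies the same Bernstein weights to every component of a control point, so UVs interpolate exactly like positions.

```javascript
// Evaluate a cubic Bézier at parameter t by applying the Bernstein weights
// to every component of each control point, including the hypothetical
// u/v texture coordinates.
function cubicBezier(p0, p1, p2, p3, t) {
  const s = 1 - t;
  const w = [s * s * s, 3 * s * s * t, 3 * s * t * t, t * t * t];
  const out = {};
  for (const key of ['x', 'y', 'z', 'u', 'v']) {
    out[key] = w[0] * p0[key] + w[1] * p1[key] + w[2] * p2[key] + w[3] * p3[key];
  }
  return out;
}

// Because the weights are non-negative and sum to 1 for t in [0, 1], the
// interpolated UVs stay inside the convex hull of the control-point UVs.
const mid = cubicBezier(
  { x: 0,  y: 0,  z: 0, u: 0,    v: 0 },
  { x: 10, y: 20, z: 0, u: 0.25, v: 0 },
  { x: 30, y: 20, z: 0, u: 0.75, v: 0 },
  { x: 40, y: 0,  z: 0, u: 1,    v: 0 },
  0.5
);
```

This is why supplying UVs per control point is the natural representation: the existing position-evaluation code can be reused for the texture coordinates unchanged.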

If we add a UV coordinate to the longest form of the bezierVertex function, its signature becomes this:

bezierVertex(x2, y2, z2, u2, v2, x3, y3, z3, u3, v3, x4, y4, z4, u4, v4)

This is kind of a lot to grok. A bigger API departure, but maybe a clearer one, would be to introduce a bezierControlPoint function, which lets you split the above into three calls:

bezierControlPoint(x2, y2, z2, u2, v2)
bezierControlPoint(x3, y3, z3, u3, v3)
bezierControlPoint(x4, y4, z4, u4, v4)
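To illustrate the shape of that API, here's a hedged sketch in plain JavaScript (not real p5 code; the function and its buffering behaviour are hypothetical): each bezierControlPoint call buffers its arguments, and a complete segment is flushed once three control points have accumulated.

```javascript
// Hypothetical sketch: buffer control points and flush one long-form
// bezierVertex-style segment (15 numbers) per three calls.
const pending = [];
const segments = [];

function bezierControlPoint(x, y, z, u, v) {
  pending.push([x, y, z, u, v]);
  if (pending.length === 3) {
    // Equivalent to one long-form call with 15 parameters.
    segments.push(pending.splice(0, 3).flat());
  }
}

bezierControlPoint(10, 0, 0, 0.25, 0);
bezierControlPoint(30, 0, 0, 0.75, 0);
bezierControlPoint(40, 10, 0, 1.0, 0.25);
```

The appeal of the split form is that each call stays at five arguments no matter how many components a control point carries.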
GregStanton commented 8 months ago

Could we use the starting point and direction of a shape's perimeter to infer the texture map, instead of requiring the user to pass in texture coordinates? [^1] This may improve the user experience without sacrificing much flexibility, and it’d eliminate the need for major API changes.

Having some clarity on this issue will help me to organize work on #6560. I'll explain my thinking based on the example from issue #5699 (shown below). I'm in unfamiliar territory here, so please don't hesitate to correct me :) I'll address direction and distance separately.

Direction

If the user specifies the curved shape between beginShape()/endShape() by starting from the top left and proceeding clockwise, then we could map $a$ to the top left corner of the curved shape, and when we move clockwise in $uv$-space, we could also move clockwise in $xy$-space. If the user draws their shape counterclockwise, a clockwise movement in $uv$-space could correspond to a counterclockwise movement in $xy$-space.
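To make the direction idea concrete, the winding of a user's path could be inferred with the shoelace formula (this helper is a hypothetical sketch, not an existing p5 API):

```javascript
// Signed area of a closed 2D polygon via the shoelace formula.
// In screen coordinates (y pointing down), a positive signed area
// corresponds to a visually clockwise winding.
function signedArea(points) {
  let area = 0;
  for (let i = 0; i < points.length; i++) {
    const [x1, y1] = points[i];
    const [x2, y2] = points[(i + 1) % points.length];
    area += x1 * y2 - x2 * y1;
  }
  return area / 2;
}

// A square traced top-left -> top-right -> bottom-right -> bottom-left.
const clockwiseOnScreen =
  signedArea([[0, 0], [10, 0], [10, 10], [0, 10]]) > 0;
```

For curved segments, the control polygon (or a coarse sampling of the curve) could be fed into the same check.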

Distance

We might have two modes: SEGMENT and PERIMETER. [^2] Maybe we could set this with a textureMap() function that would accompany the existing textureMode() and textureWrap() functions.

In SEGMENT mode, we could map each edge of the texture image to a rendered path segment (the top edge could map to the first segment, the right edge could map to the second segment, and so on). [^3]

In PERIMETER mode, we could ignore edges and map the entire texture perimeter to the entire shape perimeter (i.e. moving along the texture perimeter could correspond to moving along the shape perimeter by a corresponding amount).
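A minimal sketch of the PERIMETER idea, assuming a unit-square texture and a clockwise walk starting from the top-left corner (the function name and conventions here are hypothetical):

```javascript
// Map a fraction t in [0, 1) of the way around the unit-square texture
// boundary to a (u, v) pair, walking clockwise from the top-left:
// top edge, right edge, bottom edge, left edge.
function perimeterUV(t) {
  const d = (t % 1) * 4; // distance along the perimeter, in edge lengths
  if (d < 1) return { u: d, v: 0 };     // top edge: left -> right
  if (d < 2) return { u: 1, v: d - 1 }; // right edge: top -> bottom
  if (d < 3) return { u: 3 - d, v: 1 }; // bottom edge: right -> left
  return { u: 0, v: 4 - d };            // left edge: bottom -> top
}
```

The shape side of the mapping would use each vertex's arc-length fraction along the shape perimeter as the `t` passed in here.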

[^1]: This is at least somewhat similar to a feature of Vectorworks, in which "The starting point and direction the wall is drawn affect how a texture is applied."

[^2]: In the future, we may want to support new kinds of surface primitives (e.g. we could allow the user to create Bézier triangles by using bezierVertex() together with the TRIANGLES shape kind). I haven't yet considered whether the mode names might need to be revised to accommodate those cases.

[^3]: If there are fewer than four path segments, multiple texture edges could map to the last segment. If there are more than four segments, we could maybe fall back to PERIMETER mode.

davepagurek commented 8 months ago

Could we use the starting point and direction of a shape's perimeter to infer the texture map, instead of requiring the user to pass in texture coordinates?

I think the answer is yes we can, but that we can't eliminate it from the API. Some explanation:

Case for manual UVs

A common technique is to "pin" texture coordinates to vertices, but then move the vertices in 3D space without changing their texture coordinates, making the texture stretch to fit the shape as the shape changes. If the texture coordinates are derived from the positions and directions in 3D space, then it becomes hard to change one without changing the other.

The nice thing about many curve formulations used in graphics is that control points are effectively the same as vertices in terms of the data they store, and how you work with them. In most 3D programs, when doing UV mapping, you can see your curves in 3D and also where their texture coordinates lie on a 2D plane. Each vertex corresponds to a point in both views. Each control point on a curve also corresponds to a point on both. This is nice because it is predictable: the same algorithms for evaluating the curve in 3D space also apply to the 2D UV coordinates. Here's a screenshot of some older software using a curved model and showing its curves in 2D space as well:

[Screenshot: a curved model in 3D alongside its curves laid out in 2D UV space]

Lastly, since evaluating the UV along a curve is done by interpolating the control point UVs the same way one interpolates "regular" points, I think if we want to infer UVs, we'd do so by generating UV values per control point. That would mean that under the hood, we would still end up with a representation like this regardless, so I think for that reason alone, it makes sense to start with this representation.

Deriving UVs for curves

I think it's still potentially useful to give users a way of getting UVs without having to define them themselves, for cases when they aren't trying to pin a specific texture image to a specific curve. My worry is that drawing curves is a pretty general tool, and a lot of edge cases come up. Manual mapping eliminates all the edge cases at the cost of more manual work for the programmer; derived UVs could make for an easier API, but it puts a lot more pressure on us to answer the many edge-case questions that arise.

I think something like this would definitely be useful, but I'm not sure I'd want to rely on that as the only API. Another thought: could we design the API in a way that allows p5 libraries to define mapping modes? e.g. if they had a manual UV mapping API available, could a library override the shape drawing behaviour for when you don't specify UVs, and derive them with whatever strategy they like?

Deriving UVs as a more general problem

One of the reasons I think deriving UVs is something that can be done separately from our curve implementation is because I think it's a useful feature for more than just curves.

One example: if you want to build a 3D shape out of spheres, you run into a similar problem to what I was saying about not mapping to the whole texture at once. Each p5 sphere has UVs that map to the whole texture, so if you combine the spheres into one big shape and then apply a texture, you'll see that texture repeated on every sphere. An idea that could help deal with that is to provide some APIs to reassign all the UVs in the whole model, along with a few strategies for doing so.
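One such strategy could be a planar projection, sketched below; planarProjectUVs is a hypothetical name, not an existing p5 API. Each vertex's x/y position is normalized against the model's bounding box and reused as its UV, so one texture spans the combined shape instead of repeating per primitive.

```javascript
// Hypothetical whole-model UV reassignment via planar projection:
// normalize each vertex's x/y against the model's bounding box and
// use the result as that vertex's (u, v).
function planarProjectUVs(vertices) {
  const xs = vertices.map(p => p.x);
  const ys = vertices.map(p => p.y);
  const minX = Math.min(...xs), maxX = Math.max(...xs);
  const minY = Math.min(...ys), maxY = Math.max(...ys);
  return vertices.map(p => ({
    ...p,
    u: (p.x - minX) / (maxX - minX || 1), // guard against zero-width models
    v: (p.y - minY) / (maxY - minY || 1),
  }));
}
```

Other strategies (spherical or cylindrical projection, per-face unwrapping) would slot into the same "remap all UVs at once" API shape.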

Anyway, I'm not advocating that we build the above as part of this curve-drawing API; it hopefully just paints the picture that we're tapping into a fairly complex problem with a lot of directions it can go in, and explains why my inclination is to build something that others can layer their own methods on top of, in addition to some simpler solutions.

GregStanton commented 8 months ago

Thank you so much for your thoughtful reply @davepagurek! There's definitely a lot to consider. I'll start by sharing some initial thoughts about the API, under the assumption that users will manually specify texture coordinates in all cases. I still want to think about this more, but I'm pretty excited about it, since I think the API change alone could be a big improvement.

API options for vertex functions

Here are three options, exemplified by the case of Bézier curves: extending the long-form bezierVertex() signature with UV parameters for every control point (Option 1), introducing a bezierControlPoint() function called once per control point (Option 2), and a third option.

The third option, which I added, is the same as Option 2 but uses "Vertex" instead of "ControlPoint."

Advantages of Option 3 (and for the most part, Option 2)

The new API could improve readability, consistency, and extensibility.

Disadvantages

Other advantages or disadvantages?

If others want to help extend these lists of advantages and disadvantages, I'd be happy to incorporate their comments into the lists above (with links to the original comments). That way we can compile everyone's thoughts in one place.

Implement now?

If we reach a consensus, it seems like we could satisfy the original requirements of this issue now (as part of #6560), rather than waiting for the next major version of p5.js. If we use "Vertex" in all the function names instead of "ControlPoint," we wouldn't need to maintain separate reference pages for deprecated features. Later, in p5.js 2.0, we could remove the deprecated usage altogether, which would also eliminate any performance hit from having to process two types of parameter lists.

capGoblin commented 6 months ago

Just want to save my progress on this issue: this sketch has working examples of passing texture coordinates to bezierVertex, quadraticVertex, and curveVertex by calling them multiple times.

nijatmursali commented 1 month ago

Is there any solution to this? I am also trying to implement it, but it gives an error that bezierVertex expects a maximum of 9 parameters.

davepagurek commented 1 month ago

Not currently, but this is something we aim to enable in our 2.0 release!

nijatmursali commented 1 month ago

Thank you for the reply, @davepagurek. Is there any date for the release? We are currently working on a big project in which we have to place textures on each shape, and most of our shapes include bezier and quadratic vertices.

davepagurek commented 1 month ago

Not yet, so for now your best bet will be to manually convert your Béziers to polylines that you can use with vertex(). To do this, you can use bezierPoint() to calculate positions along your curve. Normally you'd use three calls (for x, y, and z), but for texture coordinates you can add two more calls for u and v.
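A sketch of that workaround in plain JavaScript (bezierAt computes the same cubic Bézier value as p5's bezierPoint, so the snippet runs standalone; the control values are made up for illustration):

```javascript
// Cubic Bézier evaluation for one component; this is the same formula
// p5's bezierPoint(a, b, c, d, t) computes.
function bezierAt(a, b, c, d, t) {
  const s = 1 - t;
  return s * s * s * a + 3 * s * s * t * b + 3 * s * t * t * c + t * t * t * d;
}

// Control values per component: positions plus made-up u/v coordinates.
const X = [0, 10, 30, 40], Y = [0, 40, 40, 0];
const U = [0, 0.25, 0.75, 1], V = [0, 1, 1, 0];

// Sample the curve into a polyline, carrying UVs along with positions.
const STEPS = 20;
const samples = [];
for (let i = 0; i <= STEPS; i++) {
  const t = i / STEPS;
  samples.push({
    x: bezierAt(X[0], X[1], X[2], X[3], t),
    y: bezierAt(Y[0], Y[1], Y[2], Y[3], t),
    u: bezierAt(U[0], U[1], U[2], U[3], t),
    v: bezierAt(V[0], V[1], V[2], V[3], t),
  });
}
// In a WEBGL sketch you would then emit each sample with
// vertex(s.x, s.y, s.u, s.v) between beginShape() and endShape().
```

Inside an actual p5 sketch you can call bezierPoint directly instead of the local helper; the loop structure stays the same.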