davidson16807 / tectonics.js

3d plate tectonics in your web browser
http://davidson16807.github.io/tectonics.js/
Creative Commons Attribution 4.0 International

Save Sphere as 3d object #4


astrographer commented 9 years ago

On the off chance that this is still active: would it be possible to save the generated sphere as a 3d object, uv-mapped to the various equirectangular displays?

davidson16807 commented 9 years ago

Well, it's not necessarily inactive. I left it alone for a while after finding no easy way to improve the supercontinent cycle.

3d models were something I hoped for as a long-term objective, but I hadn't considered how to store the model, nor even whether a uv map should be involved. From some preliminary research, saving a texture to a file is easy enough. Besides that, ease of implementation depends greatly on the model file format. JSON could easily store the model, but it's a format not commonly used by modeling apps. STL could allow for 3d printed globes, but would require something other than a uv map to implement. What sort of model format did you have in mind?

astrographer commented 9 years ago

I use Blender, Wings and... um... Bryce. I can import and, if necessary, translate a lot of formats. I've seen JSON *exporters* for Blender and one script that promised to implement *import* "any time now." JSON tools seemed to barf up UV maps in any case... STL, as I understand it, is pure geometry. The data here is in the surface texture (color, whatever), so that is probably a no-go.

Wavefront OBJ is pretty portable and there's a lot of code floating around for it. I'm not sure how easy it is to implement in javascript, but that would probably be the best go-to choice all else being equal.

As I understand it, tectonics.js implements the globe as a mesh whose surface faces are Voronoi polygons, rather than a Buckminsteresque icosphere with triangular faces or a conventional uv-sphere. Is that accurate?

davidson16807 commented 9 years ago

No, it actually does use icospheres, though it's a bit more complex than that. Each plate has its own icosphere geometry, with each vertex representing a grid cell in the model. Shaders modify the height and color of each vertex/grid cell in an icosphere. If the plate isn't present at a grid cell, the vertex is rendered with its height set to something far below sea level. I've considered switching to a single icosphere and getting the entire model to run through shaders, but there are a few nasty tradeoffs and it would take a major overhaul. That's an aside, anyway.

The Voronoi stuff you see is only used for collision detection. A Voronoi diagram is used to map lat/lon coordinates to grid cell indices. This way we can detect collision between plates with O(1) efficiency.
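To illustrate the idea (a hypothetical sketch, not the actual tectonics.js code), a discrete Voronoi diagram can be precomputed as a raster of nearest-cell ids, so that each later lat/lon lookup is a single array access:

```js
// Hypothetical sketch: precompute a raster over (lat, lon) storing the id of the nearest
// grid cell (a discrete Voronoi diagram). Each later lookup is then O(1).
function buildVoronoiRaster(cells, width, height) {
  var raster = new Int32Array(width * height);
  for (var y = 0; y < height; y++) {
    for (var x = 0; x < width; x++) {
      var lat = (y / (height - 1)) * 180 - 90;
      var lon = (x / (width - 1)) * 360 - 180;
      // naive nearest-neighbor search; acceptable because it only runs once, at setup
      var best = 0, bestDist = Infinity;
      for (var i = 0; i < cells.length; i++) {
        var dLat = cells[i].lat - lat;
        var dLon = cells[i].lon - lon;   // ignores longitude wraparound for brevity
        var d = dLat * dLat + dLon * dLon;
        if (d < bestDist) { bestDist = d; best = i; }
      }
      raster[y * width + x] = best;
    }
  }
  return raster;
}

function cellIdAt(raster, width, height, lat, lon) {
  var x = Math.round(((lon + 180) / 360) * (width - 1));
  var y = Math.round(((lat + 90) / 180) * (height - 1));
  return raster[y * width + x];   // O(1) per query
}
```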

So, how this relates to generating 3d models: I don't expect it would be too difficult to generate textures. It doesn't matter how many icospheres are in play; they all get rendered to the screen in the end. What needs to be done is to take what's normally rendered to the screen and write it to a texture instead. Mesh generation should be trivial since it's just a sphere. The real uncertainty for me is the file format.
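As a rough illustration of that capture step (a sketch only, assuming the three.js renderer is created with preserveDrawingBuffer: true; the function names are placeholders):

```js
// Minimal sketch: grab the current frame from the three.js canvas as a PNG and offer it
// as a download. Assumes the renderer was created with { preserveDrawingBuffer: true },
// otherwise the drawing buffer may be cleared before toDataURL() reads it.
function captureFrame(renderer, scene, camera) {
  renderer.render(scene, camera);                     // draw the frame
  return renderer.domElement.toDataURL('image/png');  // read the canvas back as a PNG data URL
}

function offerDownload(dataURL, filename) {
  var link = document.createElement('a');
  link.href = dataURL;
  link.download = filename;                           // e.g. 'texture.png' (placeholder name)
  link.click();
}
```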

OBJ seems workable given that it's plain text. I see plenty of work done on javascript/three.js importers for OBJ. Not so much for exporters, though.

davidson16807 commented 9 years ago

OK, so there is an OBJ exporter included in the three.js project space: https://github.com/mrdoob/three.js/blob/master/examples/js/exporters/OBJExporter.js
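Assuming it works like the other example exporters, usage would be roughly as follows (a sketch only, not tested against this project; globeMesh stands in for whatever mesh we end up exporting):

```js
// Sketch: THREE.OBJExporter.parse() returns the OBJ file contents as a plain-text string,
// which can then be offered as a download via a Blob.
var exporter = new THREE.OBJExporter();
var objText = exporter.parse(globeMesh);

var blob = new Blob([objText], { type: 'text/plain' });
var link = document.createElement('a');
link.href = URL.createObjectURL(blob);
link.download = 'planet.obj';
link.click();
```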

davidson16807 commented 9 years ago

Another thing: OBJ appears only to store mesh information and texture mapping information - it does not appear to store the texture itself.

A full implementation of this feature would presumably involve a zip file containing the OBJ and texture files. Is this correct? If so, could an intermediate solution work as well? I'm considering the following roadmap:

astrographer commented 9 years ago

Yeah. It doesn't necessarily need to be zipped, but what you need is the OBJ for geometry and (I think) the uv-mapping. You also need an MTL material file, which describes the material(!) and gives the (hopefully relative) path to the texture image file. Presumably, the texture would be treated as the diffuse color. Finally, you need all the associated image textures. I'm not sure if javascript has the kind of filesystem permissions that would allow it to create a new directory and pack everything in that. If it can't, then a zipfile may be the only way to fly.
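To make the relationship concrete, the files end up looking roughly like this (material and file names are placeholders):

```
# planet.mtl -- declares a material and points at the diffuse texture via a (hopefully relative) path
newmtl planet_surface
Kd 1.000 1.000 1.000
map_Kd planet_texture.png

# planet.obj -- references the material library and applies the material to the mesh
mtllib planet.mtl
usemtl planet_surface
v  ...    # vertex positions
vt ...    # uv coordinates
f  ...    # faces indexing v/vt pairs
```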

If the jagged edges on the fully zoomed out image that renders now actually represent an existing behind-the-scenes uv-mapping, then you're golden. In fact, if I knew the actual arrangement of the icosphere used (in Blender terms, the number of subdivisions on an initial icosahedron), I could (probably…) create an equivalent icosphere in Blender and uv-map it to match. Could be difficult, so if the process could be automated, then bully!

If pixels outside of the 90N,180W/90S,180E bounding box can accurately be recalculated back to the appropriate latitudes and longitudes(I'd have to pull out my old geometry texts to re-acquaint myself with the math), then creating a ball model may be superfluous. Any monkey(me) could simply create a ball in their favorite 3d modelling tool and use a simple spherical mapping for the texture. Although, as I understand it after a bit of googling and perusal of Stack Exchange, outputting the mesh ain't terribly hard in itself. Says the guy who couldn't actually do it, regardless of how "easy" it might be... :/

astrographer commented 9 years ago

Try this out for size. First, let's set the coordinate system so that the west edge of the final bounding box of the equirectangular projection is at x=0°, the north edge is at y=0°, the east edge is at x=360°, and the south edge is at y=180°. This is basically a phi/theta map rather than lat/long.

Start by checking the phi-coordinate of the current pixel: if it is less than 0, then phi is reset to abs(phi) and the theta-coordinate is set to (theta + 180°) mod 360°. If phi is greater than 180°, then phi is reset to 360° - phi and theta is set to (theta + 180°) mod 360°. That handles the phi (y) coordinate; next, make sure the theta (x) coordinate is in range.

If theta is less than zero, then theta is reset to theta + 360° and phi is untouched. If theta is greater than 360°, then theta is reset to theta - 360° and phi is unmodified.

This check (and, if necessary, correction) should be made for each pixel before drawing to the screen. Obviously, it should take the actual coordinate system into account, and the angles should be converted to radians if necessary, as is likely the case.
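In code, the check would look something like this (untested; angles in degrees):

```js
// Wrap a (theta, phi) pair back into the 0..360 / 0..180 ranges described above.
// theta is the x-coordinate (0..360 degrees), phi is the y-coordinate (0..180 degrees).
function wrapThetaPhi(theta, phi) {
  if (phi < 0) {                     // ran off the north edge
    phi = Math.abs(phi);
    theta = (theta + 180) % 360;
  } else if (phi > 180) {            // ran off the south edge
    phi = 360 - phi;
    theta = (theta + 180) % 360;
  }
  if (theta < 0)   theta += 360;     // ran off the west edge
  if (theta > 360) theta -= 360;     // ran off the east edge
  return { theta: theta, phi: phi };
}
```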

Does this make sense, and did the math come out correctly?

davidson16807 commented 9 years ago

> I'm not sure if javascript has the kind of filesystem permissions that would allow it to create a new directory and pack everything in that.

Almost certainly not, given the security concerns that go into browsers.

> If the jagged edges on the fully zoomed out image that renders now actually represent an existing behind-the-scenes uv-mapping, then you're golden.

The jagged edges do indirectly correspond to uv mapping, but there are multiple icospheres at play here and they're each rotated in their own unique way, so it's not certain the jagged edges wouldn't propagate to the 3d model. The jagged edges are something I let slide when I first implemented this, but now they simply have to get fixed.

> In fact, if I knew the actual arrangement of the icosphere used (in Blender terms, the number of subdivisions on an initial icosahedron), I could (probably…) create an equivalent icosphere in Blender and uv-map it to match.

That's why I propose the incremental step of downloading the texture file. I'd suspect a simple spherical map should suffice. I plan to use the equirectangular projection as a basis for the texture file. On this projection, u = lon/360 and v = (lat+90)/180.
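In code, that mapping is just (a sketch; lon assumed to run 0..360 degrees and lat -90..90):

```js
// Equirectangular uv mapping: lon in degrees (0..360), lat in degrees (-90..90).
function latLonToUV(lat, lon) {
  return {
    u: lon / 360,
    v: (lat + 90) / 180
  };
}
```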

> Although, as I understand it after a bit of googling and perusal of Stack Exchange, outputting the mesh ain't terribly hard in itself.

Probably true, but it never hurts to release in small increments. Releasing functionality in one large block risks a bottleneck that keeps any of it from seeing use.

I'm not quite getting the explanation of the phi/theta map. What problem is it addressing?

astrographer commented 9 years ago

>> I'm not sure if javascript has the kind of filesystem permissions that would allow it to create a new directory and pack everything in that.

> Almost certainly not, given the security concerns that go into browsers.

No surprise.

>> If the jagged edges on the fully zoomed out image that renders now actually represent an existing behind-the-scenes uv-mapping, then you're golden.

> The jagged edges do indirectly correspond to uv mapping, but there are multiple icospheres at play here and they're each rotated in their own unique way, so it's not certain the jagged edges wouldn't propagate to the 3d model.

To make the existing render work with uv-mapping, you'd probably have to save all of the nested icospheres with their rotation intact and probably mostly textured as transparent. I'm not sure if OBJ saves the rotation... That would still probably be a headache.

> The jagged edges are something I let slide when I first implemented this, but now they simply have to get fixed.

>> In fact, if I knew the actual arrangement of the icosphere used (in Blender terms, the number of subdivisions on an initial icosahedron), I could (probably…) create an equivalent icosphere in Blender and uv-map it to match.

> That's why I propose the incremental step of downloading the texture file. I'd suspect a simple spherical map should suffice. I plan to use the equirectangular projection as a basis for the texture file. On this projection, u = lon/360 and v = (lat+90)/180.

Absolutely. For me, that fixes a lot of problems right there.

> it never hurts to release in small increments. Releasing functionality in one large block risks a bottleneck that keeps any of it from seeing use.

Further support. Especially since the first increment serves my purposes perfectly!

> I'm not quite getting the explanation of the phi/theta map. What problem is it addressing?

Probably because it's a poor explanation, and it's not addressing the problem at hand very well. Basically, it's an attempt to deal with the jagged edges.

Unfortunately, while I was able to play around with some of the simpler problems, like hiding the icecap shading, this problem is completely outside of what I've been able to understand of the code (obviously). So a lot of what I'm doing depends a bit on conjecture as to how things are implemented. For a guy whose interests revolve so much around maps and imagery, my programming skills are very text-oriented. :-o

astrographer commented 9 years ago

Okay, my attempt at quote highlighting failed miserably.

davidson16807 commented 9 years ago

Commit d28d663b01e7ba19d03a101a63a36abaf60c86ce adds the full-screen view needed for texture export. That's part 1 of the roadmap.

astrographer commented 9 years ago

Nice. Still got the edge effects going on, but, because I can rely on the extents, I can make a second screenshot rotated somewhat to the east or west. Then all I have to do is use them as layers in Photoshop, use the Offset filter to realign the layers and delete the edges from the top layer.

That takes care of the side jaggies; for the top and bottom, I just have to rely on the fact that the north edge is all one thing, as is the south edge.

Not quite perfect. Not altogether effortless. But workable. A very promising start.

davidson16807 commented 9 years ago

So I expect fixing the jaggies will have to be an additional step in the roadmap, between items 1 and 2.

There does not seem to be a well-defined solution for this kind of problem within webgl's graphics architecture. We cannot move the coordinates of a pixel fragment, and we can't map a vertex to multiple clipspace coordinates. I believe that exhausts all possible solutions confined to glsl code, so something must be done in javascript. The solutions I envision all revolve around emulating what you do in photoshop: take two screencaps with different offsets, then stitch them together. This entails either:

Option 1 is right out. I could create two icosphere geometries for each plate but then I would have to do everything twice when updating the simulation.

Option 3 is better, but would require setting up a new pipeline, potentially also requiring me to re-flesh out classes for projection and view that I had abandoned in favor of much simpler vertex and fragment shader text. That, plus I'd have to swap offset values twice every frame.

Option 2 seems easiest to establish and most sensible.
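For reference, the stitching half of that idea is straightforward on a 2d canvas. A rough sketch of the general approach, not necessarily how any of the options above would actually be implemented:

```js
// Rough sketch: composite two equirectangular screencaps of the same size, where imageB was
// rendered with a 180-degree longitude offset, so imageA's seam (and its jaggies) at the left
// and right edges can be covered by the clean middle of imageB.
function stitchCaptures(imageA, imageB, width, height, seamWidth) {
  var canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  var ctx = canvas.getContext('2d');

  ctx.drawImage(imageA, 0, 0, width, height);

  // the strip just left of imageB's center covers imageA's right edge
  ctx.drawImage(imageB,
    width / 2 - seamWidth, 0, seamWidth, height,
    width - seamWidth,     0, seamWidth, height);

  // the strip just right of imageB's center covers imageA's left edge
  ctx.drawImage(imageB,
    width / 2, 0, seamWidth, height,
    0,         0, seamWidth, height);

  return canvas;
}
```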

davidson16807 commented 9 years ago

Commit 8e0d3a859839823fa2e72d6032c5bff5988066d4 fixes the jaggies.

astrographer commented 9 years ago

Thank you. Thank you! Thank You!!!

For my own purposes, that's 99-100% of the game. I, speaking only for myself, mostly wanted the model output so I could end-run my way to a jaggy-free full planet map.

The OBJ-file mesh would definitely simplify things, particularly for users less experienced with 3d modeling tools (less experienced, even, than me (oh dear!)). As it is, a simpleton like me could apply the texture to an image in Bryce in just a few minutes.

Just sayin',"Awesome work!"

BTW: Did'ja go with option 2 or 3?

davidson16807 commented 9 years ago

Option 2 worked as expected. It helps that the model and the view are partitioned - the mesh that stores rotation information about the plate is not the same as the meshes that get rendered to the screen.

davidson16807 commented 9 years ago

Well, I clearly don't understand what milestones do :\