davidson16807 / tectonics.js

3d plate tectonics in your web browser
http://davidson16807.github.io/tectonics.js/
Creative Commons Attribution 4.0 International

Permalink for a specific generation #3

Closed: Estevo-Aleixo closed this issue 6 years ago

Estevo-Aleixo commented 9 years ago

Data import and export preferably via URI.

davidson16807 commented 9 years ago

There's actually already a parameter you can pass to generate worlds with a specific seed. Just add "?seed=foo" to the end of the url. Replace "foo" with the seed you want to use, e.g.:

http://davidson16807.github.io/tectonics.js/?seed=bar
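Roughly speaking, reading a parameter like this on the client side looks something like the following sketch (simplified for illustration, not the exact parsing code in the repo):

```javascript
// Minimal sketch of reading a query string parameter in the browser.
// Illustrative only; not the actual parsing code in tectonics.js.
function getSeedFromUrl(defaultSeed) {
    var params = new URLSearchParams(window.location.search);
    // e.g. "?seed=bar" yields "bar"; fall back to a default when absent
    return params.get('seed') || defaultSeed;
}

var seed = getSeedFromUrl('random');
```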

I've noticed simulations still tend to diverge over long periods of time, even if they have the same seed. AFAIK, this is not something that can be prevented. I think a lot of it has to do with how the supercontinent cycle is currently conducted, but that's something to be addressed in issue #1

Estevo-Aleixo commented 9 years ago

I would like to export the data wholesale, close the browser, and import it in a separate browser to get to a similar state as when it was exported (with an optional manual modification step in between).

davidson16807 commented 9 years ago

Ah. So this is more a "save to cloud" feature?

One problem I see with this is that there's currently no hosting for the app besides Github. Github AFAIK only hosts static content for free. It doesn't allow hosting server side apps, and it doesn't have any means to store files or records from the user.

That limits the ability to retrieve saves by URI, though it doesn't stop saves in general from being implemented. The user can still download a save file and restore from that save at a later date. If he wants to share it, he'd have to upload it to Dropbox or some other file hosting solution.

So the easiest way to implement this is to encapsulate model state in a single JSON file. Start with the basics: World, Plate, Cell, RockColumn. View state is not to be considered until later. Saving to the cloud is an optional later step beyond that.
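Roughly, the file would be shaped something like this (field names and values here are illustrative only, not a committed format):

```javascript
// Illustrative shape of a single JSON save file - not a final format.
// Only the World/Plate/RockColumn breakdown comes from the plan above; everything else is made up.
var exampleSave = {
    version: 1,
    seed: 'bar',
    world: {
        age: 711,                    // million years simulated
        plates: [
            {
                id: 0,
                rockColumns: {
                    // keyed by grid cell id: [thickness, density]
                    '42': [7100, 2890]
                }
            }
        ]
    }
};
```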

Estevo-Aleixo commented 9 years ago

I was thinking of something like serializing to base64 and placing it in url parameters, similar to the way the seed is done. https://davidson16807.github.io/tectonics.js/?seed=bar&World=somedata&Plate=somedata&Cell=somedata&RockColumn=somedata

There is a good discussion on stackoverflow about how much data can be put into an http get request: http://stackoverflow.com/questions/2659952/maximum-length-of-http-get-request
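Something along these lines, just to sketch the idea (the "state" parameter and function names are made up):

```javascript
// Rough sketch of the idea: serialize state to JSON, base64 it, and place it in the url.
// The "state" parameter name and these functions are made up for illustration.
function stateToUrl(worldState) {
    var json = JSON.stringify(worldState);
    var encoded = btoa(json); // base64 encode (assumes the JSON is ASCII-safe)
    return location.origin + location.pathname + '?state=' + encodeURIComponent(encoded);
}

function stateFromUrl() {
    var params = new URLSearchParams(location.search);
    var encoded = params.get('state');
    return encoded ? JSON.parse(atob(encoded)) : undefined;
}
```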

davidson16807 commented 9 years ago

That's a clever idea. It shifts the problem, though - now it comes down to whether model state can be stored in the url. Even the best browsers seem limited to about 8KB. A typical simulation uses 10K rock columns, and each rock column has to store at least 2 floats for density and thickness - 8 bytes. That's 80KB of raw state, so even in the best of situations we're talking compression ratios of 10:1 to fit the url. That's a lot to ask of lossless compression.

It could be done by POST request, but POST data is handled server side, and again github only serves static content.

On the bright side, nothing I mention restricts the use of a JSON download. 10K rock columns with 8B per rock column = 80KB. Very manageable. It even allows some room for growth and error.

davidson16807 commented 9 years ago

Work on the save branch is being done towards this objective. Commit e11c23a is the first merge to production. This includes save and load functionality. Drag-and-drop is also implemented for loads. It's out there online now.

Current file size is pretty consistently 170KB. I'm reasonably confident this number can go down as far as 40KB, uncompressed, without sacrificing model resolution. I'm doubtful, though, that compression would be able to bring this into the range needed to use a query string parameter.

At present, the save file format offers no guarantees on forward compatibility. Don't get too attached to the worlds you save.

davidson16807 commented 9 years ago

A thought occurred to me. We could allow users to pass a url to the application. That url would link to their save file. If users want to share a link to a world build, they download the file from the application and upload it to a file host of their choosing, like pastebin or dropbox, then create a link that points tectonics.js to the file they uploaded. This would allow very terse urls to express very large save files, thereby extending browser compatibility and allowing future growth of the simulation.

My first concern is security - whether a url can be crafted to do something malicious when opened. It's definitely possible to crash the browser. All you need to do is send a json file for a world with an incredibly large number of grid cells. This threat also exists for the original proposal of this feature. This kind of DoS can be prevented by adding checks on grid resolution and plate count, though there's always the risk of something slipping through.

I don't think it is possible to do worse, though. Javascript is never evaluated from strings, and I never intend for it to be.
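To make the idea concrete, loading from a user-supplied url could look roughly like this - a sketch only, with made-up limits and field names, not code from the repo:

```javascript
// Sketch of loading a save file from a user supplied url, with basic sanity checks.
// MAX_GRID_CELLS, MAX_PLATES, and the field names are made up for illustration.
var MAX_GRID_CELLS = 100000;
var MAX_PLATES = 20;

function loadFromUrl(saveUrl, onLoaded) {
    fetch(saveUrl)
        .then(function (response) { return response.json(); })
        .then(function (save) {
            // reject saves that would lock up the browser
            if (save.grid_cells > MAX_GRID_CELLS || save.plates.length > MAX_PLATES) {
                throw new Error('save file exceeds sanity limits');
            }
            onLoaded(save);
        });
}
```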

Estevo-Aleixo commented 9 years ago

I ran the simulation for 711 MY, saved the sim file, and ran some tests:

sed "s/\],/\n/g;s/,\([0-9]\+\)/\n\1/g" 711.sim | sort | uniq -c | grep -v "^ \+[123] " | sort -k1n

This splits the sim file into one value per line, then counts and sorts the unique strings (ignoring values that appear fewer than four times).

It shows roughly eight thousand occurrences of "2890", 7059 of "7100", and 117 of "2700".

7057 of these are whole "7100,2890" pairs. Perhaps these rock columns could be left undefined, as they seem to be unchanged from the default?

sed "s/\[[0-9]\+,7100,2890\]//g" 711.sim > 711.2.sim

which reduces a 175kiB json to a 65kiB json (removing 7k RockColumns from the 10k currently serialized)

sed "s/\],/\n/g;s/\[//g;" 711.sim | sed "s/\([0-9]\+\),\([0-9]\+\)/\1 \2\n/g" | grep -v "^," | sed -n "s/^\([0-9]\+\) .*/\1/p" | grep "^[0-9]" | wc
  10237   10237   50638    # lines  words  characters

sed "s/\],/\n/g;s/\[//g;" 711.2.sim | sed "s/\([0-9]\+\),\([0-9]\+\)/\1 \2\n/g" | grep -v "^," | sed -n "s/^\([0-9]\+\) .*/\1/p" | grep "^[0-9]" | wc
   2810    2810   13699    # lines  words  characters

davidson16807 commented 9 years ago

There are two defaults. They're defined by World.prototype.ocean and World.prototype.continent.

A check for these defaults can be done. It would drastically reduce file size, at least for the present implementation. One of the things that's on my todo list is to modify the algorithm used for initial generation. I want something like the diamond square algorithm, except for a 3d sphere. If this ever gets implemented, the only defaults used will be during rifting events. Defaults will occur much more rarely, and file size will shoot back up.
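The check itself would amount to something like this sketch (the function and field names are made up for illustration; only World.prototype.ocean and World.prototype.continent are real):

```javascript
// Sketch of skipping default rock columns during serialization.
// The function and field names are illustrative, not the repo's actual API.
function serializeRockColumns(columns) {
    var ocean = World.prototype.ocean;
    var continent = World.prototype.continent;
    var saved = {};
    columns.forEach(function (column, id) {
        var isDefault =
            (column.thickness === ocean.thickness && column.density === ocean.density) ||
            (column.thickness === continent.thickness && column.density === continent.density);
        if (!isDefault) {
            saved[id] = [column.thickness, column.density];
        }
    });
    return saved; // omitted ids are assumed to hold a default column on load
}
```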

davidson16807 commented 9 years ago

Commit 63801f6 is my first stab at using base64 encoded Uint16Arrays for rockColumns. This pushes file sizes down to 92KB, which is still much larger than the 40KB I was expecting. However, file loads are dreadfully slow. I won't merge until I can get load times down to something comparable to the mainline. If you want to take a stab at the time/space problems you're certainly welcome.
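The encoding itself boils down to something like this (a simplified sketch of the general technique, not the exact code in the commit):

```javascript
// Sketch of base64 encoding a Uint16Array, chunked to avoid call stack limits.
// Not the exact code from commit 63801f6; just the general technique.
function uint16ToBase64(typedArray) {
    var bytes = new Uint8Array(typedArray.buffer);
    var binary = '';
    var CHUNK = 0x8000;
    for (var i = 0; i < bytes.length; i += CHUNK) {
        binary += String.fromCharCode.apply(null, bytes.subarray(i, i + CHUNK));
    }
    return btoa(binary);
}

function base64ToUint16(base64) {
    var binary = atob(base64);
    var bytes = new Uint8Array(binary.length);
    for (var i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    return new Uint16Array(bytes.buffer);
}
```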

Ideally, I'd want save/load to be effortless. I've had an idea for some time for a slider that lets you look into the past and see in real time how the planet has evolved. This would require save events to be done automatically at regular intervals, followed by load events done in rapid succession.

davidson16807 commented 9 years ago

ed54e06 merges base64 to master. I discovered most of the lag in file loads was caused by event loop blocking from the model/view. File loads are consistently under 100 ms. If json conversion gets any faster I could probably ship off model code to a web worker.
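The worker idea would look roughly like this (a sketch; "save-parser.js" and applyToModel are hypothetical names, not files in the repo):

```javascript
// Sketch of parsing a save file off the main thread with a web worker.
// "save-parser.js" and applyToModel are hypothetical, used only for this example.
var worker = new Worker('save-parser.js');
worker.onmessage = function (event) {
    applyToModel(event.data); // results come back to the main thread here
};
worker.postMessage(fileText); // hand the raw JSON text to the worker

// inside save-parser.js:
// onmessage = function (event) { postMessage(JSON.parse(event.data)); };
```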

davidson16807 commented 9 years ago

Save/load events occur in under 10ms as of b6dda07. 77cf6c4 on the zip-save branch attempts to compress file sizes. At maximum compression, file size is down to 30KB. In addition, 55bfbd4 allows some means to specify a save file through query string, provided the user hosts that save file publicly somewhere, for instance Dropbox. I think this is about as close to the original goal as can be realistically expected.
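For reference, the compression step amounts to something like this (a sketch assuming the pako library; the zip-save branch may do it differently):

```javascript
// Sketch of compressing a save string at maximum compression.
// Assumes the pako library is loaded globally, e.g. via a <script> tag;
// the zip-save branch may use a different library or settings.
function compressSave(saveText) {
    var bytes = new TextEncoder().encode(saveText);
    return pako.deflate(bytes, { level: 9 }); // returns a Uint8Array
}

function decompressSave(compressedBytes) {
    return new TextDecoder().decode(pako.inflate(compressedBytes));
}
```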

Estevo-Aleixo commented 9 years ago

Could only the change between each cell be stored instead? Would that improve storage?

Something like cells that horizontally look like this, [123,145,123,124,125,123,123,123], could be stored/saved as differences from the previous value: thickness_scanline=123, thickness_scan=[0,22,-22,1,1,-2,0,0]. Would this bring our numbers within tighter byte constraints? (We would not need to store all those 0's.) The first value could be stored outside of the tight byte constraints.

This would drop the option of external modification without de-serialization software or an alternate save format.

Off the cuff idea. Feel free to ignore : )

davidson16807 commented 9 years ago

Yes, it could. A thought I considered was to store offsets from a mean thickness/density. RockColumn ids could be stored in the way you described, with id_scanline = 0. Either way would allow space savings, but only in the event you could store things with an Int8Array or a Uint8Array instead of the Uint16Arrays we use currently. RockColumns could not be spaced further than 255 cells apart. Thickness could not differ by more than 127 meters from the mean. Densities could differ by no more than 127 kg/m^3. You could introduce an additional term expressing maximum deviation from the mean, but this comes at the expense of complicating code and reducing precision.
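As a sketch of the id delta idea (the function here is made up, not code from the repo):

```javascript
// Sketch of delta encoding RockColumn ids into a Uint8Array, starting from id_scanline = 0.
// Fails if consecutive ids are more than 255 cells apart - the constraint described above.
function encodeIdDeltas(sortedIds) {
    var deltas = new Uint8Array(sortedIds.length);
    var previous = 0; // id_scanline
    for (var i = 0; i < sortedIds.length; i++) {
        var delta = sortedIds[i] - previous;
        if (delta > 255) {
            throw new Error('RockColumns spaced further than 255 cells apart');
        }
        deltas[i] = delta;
        previous = sortedIds[i];
    }
    return deltas;
}
```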

The very best case scenario means halving your file size. 15KB. If I get a pull request that shrinks it consistently below 8KB without reducing cell counts I'll merge it, but I don't consider that scenario likely enough to pursue it personally.

davidson16807 commented 6 years ago

I'm closing this issue out. It's been here a few years and I see no way to make progress on it. If anything, it's going to become harder to implement as I continue adding things to the model and increasing save file size.

We at least made some useful progress. Save files wouldn't have been implemented if not for this issue. Also, in case I never mentioned it: if you store your save files somewhere public, you can reference them using the load querystring, like so:

http://davidson16807.github.io/tectonics.js/?load=http://foo.com/blah.sim