Dadido3 / noita-mapcap

A tool to capture a Noita world as a huge image
MIT License

Map capturing process is frequently interrupted by crashes and "ASSERT FAILED!" error messages #7

Closed · ghost closed this issue 4 years ago

ghost commented 4 years ago

Bug description

The map capturing process proceeds smoothly until around 4,627/14,400, at which point the game starts crashing and spitting out error messages at seemingly arbitrary intervals.


Dadido3 commented 4 years ago

I haven't done a map capture recently, so I don't know if it got worse in the latest versions. But capturing the extended map (3.8 gigapixels) has always been a sure way to get around 5 to 10 crashes and a few ASSERT FAILED! message boxes.

Unfortunately there is not much I can do. The mod basically lets the viewport jump around the world and takes screenshots of the game.

There are basically two problems here:

Address space

Noita is a 32-bit program and, for legacy reasons on Windows, can by default only access the lower half of its virtual address space, i.e. 2 GiB. Even with the executable's Large Address Aware flag set to true, the program can only use 4 GiB. This would theoretically be enough if Noita didn't need more and more memory over time.

I don't know if there is a memory leak in map generation/chunk loading/... or if Noita just frees that memory too late (maybe some sort of time- or frame-based caching). But if the process' working set size (shown in the Task Manager) goes above 3 GiB, the chance of a crash is high. (You will rarely see the working set get near the full 4 GiB, because other things may be mapped into the virtual address space that occupy room without counting towards the working set, and memory fragmentation reduces space efficiency, too.) Anyway, the details don't matter; the result is that Noita needs more and more memory until it runs out of virtual address space and crashes.
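For anyone who wants to watch for that without keeping the Task Manager open, here is a rough sketch of a watcher. It relies on the third-party gopsutil library, and the process name noita.exe and the polling interval are my own assumptions:

```go
// Polls Noita's working set and warns when it crosses the ~3 GiB
// danger zone described above. Sketch only; adjust names as needed.
package main

import (
	"fmt"
	"time"

	"github.com/shirou/gopsutil/v3/process"
)

func main() {
	const limit = 3 << 30 // ~3 GiB
	for {
		procs, err := process.Processes()
		if err != nil {
			panic(err)
		}
		for _, p := range procs {
			name, err := p.Name()
			if err != nil || name != "noita.exe" { // assumed executable name
				continue
			}
			mi, err := p.MemoryInfo()
			if err != nil {
				continue
			}
			// On Windows, RSS corresponds to the working set.
			fmt.Printf("working set: %.2f GiB\n", float64(mi.RSS)/float64(1<<30))
			if mi.RSS > limit {
				fmt.Println("WARNING: above 3 GiB, a crash is likely soon")
			}
		}
		time.Sleep(10 * time.Second)
	}
}
```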

The mod even tries to reduce the number of chunks that need to be reloaded/generated on each jump by traversing the capture area along a plane-filling Hilbert curve (sketched below).
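The mod's actual traversal code is Lua and isn't shown here, but the underlying idea is the classic Hilbert index-to-coordinate mapping. A minimal Go sketch of it:

```go
// Walk an n×n grid in Hilbert-curve order: consecutive indices always
// map to adjacent cells, which is why consecutive viewport jumps share
// most of their loaded chunks.
package main

import "fmt"

// d2xy converts a distance d along the Hilbert curve into (x, y)
// coordinates on an n×n grid, where n is a power of two.
func d2xy(n, d int) (x, y int) {
	t := d
	for s := 1; s < n; s *= 2 {
		rx := 1 & (t / 2)
		ry := 1 & (t ^ rx)
		if ry == 0 { // rotate the quadrant
			if rx == 1 {
				x = s - 1 - x
				y = s - 1 - y
			}
			x, y = y, x
		}
		x += s * rx
		y += s * ry
		t /= 4
	}
	return x, y
}

func main() {
	for d := 0; d < 16; d++ {
		x, y := d2xy(4, d)
		fmt.Printf("d=%2d -> (%d,%d)\n", d, x, y)
	}
}
```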

I hope that they either publish a 64-bit version or fix this memory-leak-like behaviour. Or, even better, do both.

ASSERT FAILED!

I don't know what exactly is failing, but the message comes from some internal check the devs put in place, which means something is happening that isn't supposed to happen. I have no influence on those checks or on the circumstances that cause them to fail.

It would be nice if I knew of a way to disable these checks (or to have them ignored automatically). I suppose they are disabled in the normal release version; at least I have never seen them there myself.

Using the normal release version has some other disadvantages though:

Conclusion and notes

I wish capturing were a simple fire-and-forget process, but some of these things would need to be changed/fixed for that to happen. Unfortunately, that's not in my control.

ghost commented 4 years ago

@Dadido3 Thanks for the detailed feedback! It's good to have confirmation that these are known issues unrelated to my setup. Just to clarify: I was able to finish the capture successfully by clicking "Ignore" on the error prompts and resuming the process after every crash, so this bug presents itself as only a small annoyance. I'll close this issue since it can't be fixed on the mod's side, but it might be worthwhile to add a "known issues" section to README.md.

The rest of this comment is unrelated to the bug, but I learned some neat things about the world of ultra-high-resolution images once I finished capturing the map for the first time a few days ago and thought you might be interested in my findings.

The resolution of the stitched PNG is so massive that opening it in any conventional image viewing software (HoneyView, IrfanView, JPEGView, XnViewMP, nomacs, FastStone Image Viewer, Imagine, etc.) is impossible on conventional hardware, because all of them try to load the uncompressed image into memory before displaying it, and even 16 GB isn't sufficient for that. I was able to open it in image editing tools like Photoshop and Paint.NET, but they took a long time to display it and completely exhausted the system's resources in the process.

These performance issues led me to research how biologists who work with tissue scans, and astronomers, deal with images whose resolution is so high that even 1 TB of RAM wouldn't be enough to hold the uncompressed data. It turns out that viewing such images with reasonable performance requires two things.

The first step is to convert the image to a format which was created with ultra-high-resolutions in mind, e.g. TIFF.

A standard TIFF file isn't suited for this purpose because it has a maximum theoretical size of 4 GB, for essentially the same reason that 32-bit Noita can't address more memory: the TIFF format uses 32-bit byte offsets internally. There is a variant of the TIFF format called BigTIFF that uses 64-bit offsets internally, which allows for a maximum theoretical size of around 18,000 PB (2^64 bytes). TIFF files can use either strips (the default) or tiles to store subimages internally; tiles are preferred these days.

The most important aspect of the TIFF format is that it supports pyramids, i.e. saving multiple resolutions of the same image in one file. The first level of the pyramid is the original resolution. The dimensions are then halved and saved as the second level, quartered and saved as the third level, and so on until the final level (1x1 pixels) is reached.

This graphic illustrates the concept of pyramidal tiled TIFF nicely:

[Figure: a pyramidal tiled TIFF, with the full-resolution level at the base and progressively halved levels stacked above it]
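As a back-of-the-envelope illustration of how those levels shrink, here is a small Go sketch; the full-resolution dimensions are made up for the example, and real pyramid writers may round slightly differently:

```go
// Print the dimensions of every pyramid level: each level halves the
// previous one (rounding up) until the 1x1 top level is reached.
package main

import "fmt"

func main() {
	w, h := 46080, 36864 // hypothetical full-resolution capture size
	for level := 0; ; level++ {
		fmt.Printf("level %d: %d x %d\n", level, w, h)
		if w == 1 && h == 1 {
			break
		}
		if w > 1 {
			w = (w + 1) / 2
		}
		if h > 1 {
			h = (h + 1) / 2
		}
	}
}
```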

I used the command-line client of the image processing library VIPS to perform this conversion:

vips tiffsave output.png output.tif --tile --pyramid --bigtiff --compression deflate
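(A sanity check I'd suggest, not something covered above: vipsheader, which ships with VIPS, can print header fields, so something like `vipsheader -f n-pages output.tif` should report the number of pyramid levels stored as TIFF pages.)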

The second step is to use an image viewer that can open pyramidal tiled TIFFs. There are a couple of good choices for this. The fastest FOSS tool I've found for this purpose is OpenSlide. Another solid option is nip2, which uses the image processing library VIPS internally. The latter is also able to open the original PNG file while using only ~300 MB of RAM, because VIPS doesn't process entire images in memory; it streams them as a series of small regions.

Another thing I asked myself once I reached this point was how I could allow other people to view this map in their browser. As you've discovered, there are a few hosted services like EasyZoom with a free plan that's generous enough for this, but I'm a big self-hosting enthusiast, so I prefer not to rely on third-party services whenever feasible.

The most well-known image servers that specialize in streaming ultra-high-resolution images over the web are IIPImage and Cantaloupe. I might look into using one of them some day, but for now, I opted for OpenSeadragon instead, a JavaScript viewer which allows one to serve ultra-high-resolution images as a static website (provided one converts them to a compatible tile format first). I went with the open format IIIFv3, which the VIPS command-line client can generate like this:

vips dzsave --layout iiif3 --tile-size 256 output.png iiif3
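Serving that output doesn't need a dedicated image server; any static file server works. Here is a minimal Go sketch; the directory name comes from the command above, while the port and the permissive CORS header are my own assumptions (viewers hosted elsewhere fetch tiles cross-origin):

```go
// Serve the generated IIIF tile directory as a static website so a
// viewer like OpenSeadragon can fetch the tiles over HTTP.
package main

import (
	"log"
	"net/http"
)

func main() {
	fs := http.FileServer(http.Dir("./iiif3")) // output dir of dzsave above
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Allow cross-origin tile requests, in case the viewer page
		// is served from a different host.
		w.Header().Set("Access-Control-Allow-Origin", "*")
		fs.ServeHTTP(w, r)
	})
	log.Println("serving tiles at http://localhost:8080/")
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```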

You can see what the result looks like on my website.

Edit: Updated the URL to my website and switched from DeepZoom to IIIFv3.

Dadido3 commented 4 years ago

Thanks for the information.

When I created this mod and the tool to stitch the image, I also looked a bit into how to store the end result. But I noticed that the final raw image of a large map capture is "only" between 11 and 15 GB in size (depending on the exact resolution and whether the software adds an alpha channel), so I decided not to put much effort into using a file format which supports tiling.

I just use Go's native PNG library to stream the image data into a file and let it compress on the fly. For viewing the result I use the old Windows Photo Viewer. No idea if it still comes with a fresh Windows 10 installation, but it works well enough for me. Even on my older system with 16 GB of RAM I was able to view the images without any input lag after letting them load for a minute or so. On my current system I can open several of those images at once, and it takes about 25 seconds to load them initially:

[Screenshot: several map captures open in Windows Photo Viewer]

It's fine for a quick glance at the result: it retains the pixelated look (nearest-neighbour interpolation), and zooming/panning reacts instantly. Also, paint.net opens the files even faster and takes "only" just under 11 GB of RAM.
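For anyone curious how that streaming works, here is a minimal Go sketch of the idea (not the stitcher's actual code): implement image.Image so pixels are produced on demand, and let the standard encoder compress row by row while writing straight to the file. The gradient in At is just a stand-in for reading the captured tiles:

```go
package main

import (
	"image"
	"image/color"
	"image/png"
	"os"
)

// lazyImage satisfies image.Image without holding a pixel buffer;
// At computes each pixel on demand (a stitcher would read the
// matching capture tile here instead).
type lazyImage struct{ w, h int }

func (m lazyImage) ColorModel() color.Model { return color.RGBAModel }
func (m lazyImage) Bounds() image.Rectangle { return image.Rect(0, 0, m.w, m.h) }
func (m lazyImage) At(x, y int) color.Color {
	return color.RGBA{R: uint8(x), G: uint8(y), B: 0, A: 255}
}

func main() {
	f, err := os.Create("output.png")
	if err != nil {
		panic(err)
	}
	defer f.Close()
	// png.Encode pulls pixels via At and writes compressed rows to f,
	// so the full uncompressed image never has to fit in memory.
	if err := png.Encode(f, lazyImage{w: 4096, h: 4096}); err != nil {
		panic(err)
	}
}
```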

But I suppose there are people who want to do a bit more with the result; for them, I'll link your write-up in the README.md. Also, thanks for the information on how to self-host the images.