Generate entire collection in parts
🧪 BETA FEATURE
Let's say you want to generate a 10k collection where each edition has 100 frames. Most computers don't have enough space to handle hundreds of GB of data, plus the whole process can take days and might get interrupted halfway.
Instead of paying for a remote server and tons of storage, you can now run the whole generation process locally in parts! This is different from the "batching" mentioned in previous sections, which splits an edition's frames into smaller batches. Generating in parts means you can generate only part of the collection at a time.
For example, let's say you have a 10k collection with 120 frames. Your global config might look like:
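As a rough sketch, the relevant settings might look like the following. The key names here are illustrative assumptions, not the tool's documented settings; check your own config file for the actual names.

```python
# Hypothetical global config -- the real key names depend on your setup.
TOTAL_EDITIONS = 10000    # size of the whole collection
FRAMES_PER_EDITION = 120  # frames rendered for each edition

# Range of editions to generate in this run (END_EDITION is exclusive).
START_EDITION = 0
END_EDITION = 1000
```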
Now, instead of just running `make all` (which will most likely error), you can generate only 1k editions at a time. You will need to edit `all.py`. Look for `START_EDITION` and `END_EDITION`. These set the range of editions you want to generate. For example, first we could generate editions 0 - 1000: edit the file with `START_EDITION = 0` and `END_EDITION = 1000`. Under the hood, this will generate all 10k JSON files but only render the first 1k editions, so you can check rarity and other metadata info now. NOTE: `END_EDITION`
is EXCLUSIVE, meaning this will only generate editions 0 - 999 (a total of 1000).
After this finishes, move the `build` folder somewhere else on your computer (or to an external hard drive), then set `START_EDITION = 1000` and `END_EDITION = 2000`. This will generate editions 1000 - 1999. Repeat the process, moving the files out of the `build` folder after each run, until the final batch with `START_EDITION = 9000` and `END_EDITION = 10000`.
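To make the exclusive-end bookkeeping concrete, here is a small sketch (not part of the tool itself) that computes the `START_EDITION` / `END_EDITION` pair for each run of a 10k collection generated 1k editions at a time:

```python
def part_ranges(total_editions, part_size):
    """Split a collection into (start, end) pairs, where end is exclusive,
    matching the START_EDITION / END_EDITION semantics described above."""
    return [(start, min(start + part_size, total_editions))
            for start in range(0, total_editions, part_size)]

# Each pair is what you would set START_EDITION / END_EDITION to for one run,
# moving the build folder out of the way between runs.
for start, end in part_ranges(10_000, 1_000):
    print(f"START_EDITION = {start}, END_EDITION = {end}")
```

The first pair is `(0, 1000)` and the last is `(9000, 10000)`, exactly the sequence of edits described above.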