Closed scheibo closed 5 years ago
Some more thoughts:
The type structure I'm going for is as follows:
```ts
type DeepPartial<T> = {
  [P in keyof T]?: T[P] extends (infer I)[]
    ? (DeepPartial<I>)[]
    : DeepPartial<T[P]>;
};
```
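As a quick sanity check of the mapped type, here's a hypothetical usage (the type is repeated so the snippet is self-contained, and the `PokemonSet` fields shown are placeholders, not the real interface):

```typescript
type DeepPartial<T> = {
  [P in keyof T]?: T[P] extends (infer I)[]
    ? (DeepPartial<I>)[]
    : DeepPartial<T[P]>;
};

// Placeholder subset of PokemonSet, for illustration only.
interface PokemonSet {
  species: string;
  moves: string[];
  evs: {hp: number; atk: number; def: number};
}

// Every field becomes optional at every depth, so sparse sets typecheck:
const set: DeepPartial<PokemonSet> = {
  species: 'Alakazam',
  evs: {atk: 0},
};
console.log(set.species); // 'Alakazam'
```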
```ts
// eg. 'gen1.json'
export interface GenerationData {
  [formatid: string]: FormatData;
}

// eg. 'gen7balancedhackmons.json'
export interface FormatData {
  sets: {
    [source: string]: {
      [speciesid: string]: {
        [name: string]: DeepPartial<PokemonSet>;
      };
    };
  };
  weights: {
    species: {[id: string]: number};
    abilities: {[id: string]: number};
    items: {[id: string]: number};
    moves: {[id: string]: number};
  };
}
```
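For concreteness, here is a hypothetical fragment of data in this shape and how a client might slice it by source (the species, set name, and weight values are all invented for illustration):

```typescript
type DeepPartial<T> = {
  [P in keyof T]?: T[P] extends (infer I)[]
    ? (DeepPartial<I>)[]
    : DeepPartial<T[P]>;
};

// Placeholder subset of PokemonSet, for illustration only.
interface PokemonSet {
  species: string;
  item: string;
  moves: string[];
}

interface FormatData {
  sets: {
    [source: string]: {
      [speciesid: string]: {[name: string]: DeepPartial<PokemonSet>};
    };
  };
  weights: {
    species: {[id: string]: number};
    abilities: {[id: string]: number};
    items: {[id: string]: number};
    moves: {[id: string]: number};
  };
}

// Invented sample data in the FormatData shape:
const gen7bh: FormatData = {
  sets: {
    'smogon.com/dex': {
      kyogreprimal: {
        'Specs Attacker': {
          species: 'Kyogre-Primal',
          item: 'Choice Specs',
          moves: ['Water Spout'],
        },
      },
    },
  },
  weights: {
    species: {kyogreprimal: 0},
    abilities: {primordialsea: 0},
    items: {choicespecs: 0},
    moves: {waterspout: 0},
  },
};

// Because sets are keyed by source first, clients can filter before merging:
const dexOnly = gen7bh.sets['smogon.com/dex'];
console.log(Object.keys(dexOnly)); // ['kyogreprimal']
```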
`sets` will be further keyed by 'source' (eg. https://smogon.com/dex, https://smogon.com/stats, https://damagecalc.trainertower.com/, etc) to allow clients to filter out certain sets.

`weights` will be `ID: index` - this uses up more space than simply doing sorted arrays, but I expect the space to be negligible compared to `sets` and after compression, and it will make `weights` more immediately useful without requiring transformation. I will get concrete data about the size cost here, but I think it'll be trivial.

The logic for generating the sets data package will live in tools/set-import/index.ts
(built to `tools/set-import/index.js`), and `tools/{simulate.js,SIMULATE.md}` will be moved to its own subdir within tools (ie. `tools/simulate/index.js`, so `node tools/simulate` still works). The directory will be set up as follows (build artifacts indicated with `*`):
```
tools/
  set-import/
    index.ts
    index.js*
    sets/
      package.json
      index.js
      index.d.ts
      gen1.json*
      ...
      gen7zu.json*
```
Thus, one will run `npm build` at the top level, then `node tools/set-import`, which will write new files to the `tools/set-import/sets/` subdir (and bump the version in its `package.json`?), and then `cd tools/set-import/sets && npm publish` can be run to release a new package.
This is now code complete and verified (yay!). Assigning to @Zarel to resolve logistics with Smogon.
Which logistics do you need me to resolve?
In order to support easily accessing sets in the client teambuilder, the newly committed `sets` package from the damage calc will be moved to this repository.

1. The `@pokemon-showdown/sets` package will be hosted in the main https://github.com/Pokemon-Showdown/Pokemon-Showdown repo. The code will be modified to drop the `@pokemon-showdown/calc` dep and will depend directly on `Dex` without a shim. Because the `@pokemon-showdown/sets` package will only be exporting the data (see 3 below), isolating the generation code into a subpackage is less important (though it can be done at a later date once https://pkmn.cc/ps-core-design is complete).
2. The data will use `PokemonSet` conventions instead of those used by the damage calc (ie. standard as opposed to abbreviated `StatName`).
3. The `sets` package will actually publish the data, not the code for fetching the data (ie. two 'build' folders: the compiled TS, which can then be run to fetch the data, and the actual data files, which are what developers importing `@pokemon-showdown/sets` will be accessing). The npm package will also include an `index.js` and `index.d.ts` for loading logic and typings, but Pokemon-Showdown-Client will likely just pull the raw JSON files out of the `dist` folder.
4. When the `sets` package pushes data, sets are updated before `npm publish`. `node build full` on the client will simply copy the data out of `node_modules/`, so updating sets actually involves running the set importer locally, bumping the version, publishing, updating the client's package.json, and running `node build full`.
5. `@pokemon-showdown/sets` and Parcel will handle bundling and lazy loading for us. The current UI can simply copy the files like the client.

Initially the `@pokemon-showdown/sets` package will support slicing by generation only (`gen1.json` etc), but I would like to support slicing by generation + format (`gen1lgpeou.json`) as well.
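If slicing by generation + format does land, the artifact naming described above could be computed with a tiny helper like this (a sketch; `artifactName` is a hypothetical name, not part of the committed package):

```typescript
// Hypothetical helper mapping a generation (and optional format suffix)
// to the JSON artifact names described above.
function artifactName(gen: number, format?: string): string {
  return format ? `gen${gen}${format}.json` : `gen${gen}.json`;
}

console.log(artifactName(1));           // 'gen1.json'
console.log(artifactName(1, 'lgpeou')); // 'gen1lgpeou.json'
```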