tnc-ca-geo / animl-api

Backend for https://animl.camera

Initial TS setup + port ML handler #176

Closed alukach closed 1 month ago

alukach commented 2 months ago

What I'm changing

This PR adds initial TypeScript tooling to our project and ports the ml API handler to TypeScript.

How I did it

tsconfig.json

I've added a tsconfig.json file to the base of this repo. This defines the rules for how TS compiles our repository to standard JS. I'm extending the @tsconfig/node14 base to target the version of JS available in the Node.js 14 environment on AWS Lambda. The notable bits of this tsconfig.json are that 1) we are running strict: true, which applies a whack of strict rules to ensure that our system is well typed (docs), and 2) we are using "allowJs": true to permit us to gradually port the codebase over (we will eventually flip this to false).
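For context, a minimal sketch of what a tsconfig.json along these lines could look like (the sourceMap, outDir, and include values below are illustrative assumptions, not necessarily what's in this PR):

```jsonc
{
  // Target the JS features available in the Node 14 Lambda runtime
  "extends": "@tsconfig/node14/tsconfig.json",
  "compilerOptions": {
    // Apply the full set of strict type-checking rules
    "strict": true,
    // Accept plain .js files so the codebase can be ported gradually
    "allowJs": true,
    // Emit source maps so stack traces can point back to the TS source (assumed)
    "sourceMap": true,
    // Compiled output directory (assumed)
    "outDir": "dist"
  },
  "include": ["src"]
}
```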

serverless-plugin-typescript

I've integrated the serverless-plugin-typescript plugin into our Serverless setup. I admit that I'm not 100% sure of everything this plugin does, but I believe it hooks the TS build process into the Serverless operations (e.g. sls package, sls deploy). I've pointed it to our tsconfig.json to avoid any surprises; the plugin brings its own tsconfig.json, but I think it's preferable for us to control the TS rules and to ensure that local dev builds exactly match the Serverless builds.
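A sketch of the relevant serverless.yml wiring, assuming we use the plugin's tsConfigFileLocation option to point it at our own tsconfig.json (the actual config in this PR may differ):

```yaml
plugins:
  - serverless-plugin-typescript

custom:
  serverlessPluginTypescript:
    # Use the repo's own tsconfig.json rather than the plugin's built-in defaults
    tsConfigFileLocation: './tsconfig.json'
```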

source-map-support/register

I've added import 'source-map-support/register' to the head of all of our handlers. This is a bit of a hassle, but it allows emitted stack traces to point to the source code in our TS files rather than the compiled JS files, which is useful when debugging errors.
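Roughly, each handler now begins like this (the handler signature below is illustrative; the remapping also relies on the compiler emitting source maps):

```ts
// Must be registered before anything else so stack traces are remapped to the .ts sources
import 'source-map-support/register';

// Illustrative handler signature -- the real handlers keep their existing exports
export async function handler(event: unknown): Promise<void> {
  // ... existing handler logic
}
```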

types

I did my best to deduce the types of the code we were using based on the properties that get accessed. However, I am missing some context with regard to a few types:

  1. I'm a bit confused by this bit of code:

    https://github.com/tnc-ca-geo/animl-api/blob/f78c738b891282f232f8dd11f04f8f53ce3f6781/src/ml/handler.ts#L83

    We later pass that config into inference operations, meaning that it must contain the ModelInterfaceParams. Is that what comes from remoteConfig from SSM here?

    https://github.com/tnc-ca-geo/animl-api/blob/f78c738b891282f232f8dd11f04f8f53ce3f6781/src/config/config.js#L111-L116

    TS is able to infer the following type for the getConfig() function by looking at our codebase:

    (alias) getConfig(): Promise<{
        APIKEY: any;
        TIME_FORMATS: {
            EXIF: string;
        };
        EMAIL_ALERT_SENDER: string;
        CSV_EXPORT_ERROR_COLUMNS: string[];
        CSV_EXPORT_COLUMNS: string[];
    }>
    import getConfig

    @nathanielrindlaub Please let me know what we can expect from remoteConfig at runtime and I can use that to properly type the config module (see the rough sketch after this list for one possible approach).

  2. We need to flesh out these types: https://github.com/tnc-ca-geo/animl-api/blob/f78c738b891282f232f8dd11f04f8f53ce3f6781/src/ml/modelInterfaces.ts#L224-L226

    @nathanielrindlaub Any tips on the expected shapes of data?
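In the meantime, one possible approach (purely a sketch; the remote keys and field types still need to be confirmed) is to declare a provisional interface for the SSM-backed values and tighten it as we learn the runtime shape:

```ts
// Hypothetical sketch -- keys returned via remoteConfig/SSM are still unconfirmed
interface RemoteConfig {
  // Params consumed by the model-interface / inference code; shape TBD
  [key: string]: unknown;
}

// Local values that TS already infers for getConfig(), plus the remote values
interface Config extends RemoteConfig {
  APIKEY: string; // currently inferred as `any`; assumed here to be a string
  TIME_FORMATS: { EXIF: string };
  EMAIL_ALERT_SENDER: string;
  CSV_EXPORT_ERROR_COLUMNS: string[];
  CSV_EXPORT_COLUMNS: string[];
}

// getConfig() could then be annotated to return Promise<Config>
```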

package.json and Dockerfile-test

I've added a build command to our package.json to make it clear how to build the project (i.e. transpile TS to JS). I've also updated Dockerfile-test to run the build during CI/CD, which is where errors will be thrown if TS types don't line up across the codebase.
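For example, the scripts entry could look roughly like this (assuming the build script simply invokes the TypeScript compiler; the actual script may differ):

```json
{
  "scripts": {
    "build": "tsc"
  }
}
```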

How you can test it

If you're using an editor with TS support (e.g. VSCode), IntelliSense should now pick up types as you work with code exported from our module. Also, while writing code, you can run npm run build -- -w to rebuild the project every time files change (this will likely become standard practice when developing on this project).