googleapis / google-cloud-node

Google Cloud Client Library for Node.js
https://cloud.google.com/nodejs
Apache License 2.0

Webpack and other bundler support #2933

Open callmehiphop opened 5 years ago

callmehiphop commented 5 years ago

Seems like a lot of users want to use webpack with a number of the nodejs client libraries.

Threads where this comes up:

Threads demonstrating use of fallback true:

alexander-fenster commented 5 years ago

There are short-term and long-term plans for this.

The short-term solution is to make google-gax webpack'able by replacing all the gRPC specifics with grpc-fallback, which is, essentially, sending serialized protobuf to HTTP/1 endpoints. Our two interns, @libanodowa and @ramya-ramalingam, are working on this project. We expect it to be done some time this June/July.

The long-term solution is to use the grpc-web library, which (unfortunately) is not compatible with grpc / @grpc/grpc-js in terms of the interface it provides, so plugging it into gax would not be that easy. @eoogbe is working on making grpc-web work for our use cases.

guybedford commented 5 years ago

We've been able to support Webpack builds of google cloud libraries in ncc through the use of static analysis in our asset relocation loader (https://github.com/zeit/webpack-asset-relocator-loader), which should support both client and server use cases. We've been fixing bugs as we find them on the grpc libraries, and need to support this anyway.

If it would be useful to provide some examples of browser workflows with this approach we could gladly assist with this.

bcoe commented 5 years ago

@guybedford :wave: would love to work with you to start adding some web examples over time.

guybedford commented 5 years ago

@bcoe more than happy to share how we achieved this with ncc, and possibly even make our approach available to other webpack users, but it does involve an asset analysis running against all source code to determine the ".proto" references.

Having a fallback JSON-loading / browser bundling approach seems to make sense though architecturally for this project. Perhaps a simple "browser": { "./loader.js": "./loader-browser" } mapping in the package.json could fork between the loader for Node.js and browsers, switching between the binary and JSON formats appropriately. I think that's a well-enough established pattern to work for users, and might simplify the detections and code inclusions.
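A minimal sketch of the package.json mapping described above (the file names here are purely illustrative, not the library's actual layout):

```json
{
  "name": "some-gapic-library",
  "main": "./src/proto-loader.js",
  "browser": {
    "./src/proto-loader.js": "./src/proto-loader-browser.js"
  }
}
```

Bundlers that honor the `browser` field would then swap in the JSON-loading variant for browser builds, while Node keeps resolving the binary-proto loader via `main`.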

bcoe commented 4 years ago

@alexander-fenster is there a thread you can link to here that demonstrates how to use the grpc-fallback? I know you documented this in another thread.

alexander-fenster commented 4 years ago

@bcoe To use it with Node.js and a service account, just pass fallback: true in the client constructor; it will take the fallback route without gRPC and use require() to read the JSON proto file:

const fooClient = new client.FooClient({fallback: true});

To use it in a non-Node setting, you cannot use a service account and will need to use the OAuth2 workflow instead. It can be done with the following code. The details depend on how your application works (whether it's a website, an Electron app, or something else), so I'm just showing the general idea of the OAuth2 workflow and leaving the specifics out (e.g. how to redirect the user to the auth URI, how to extract the auth code from the query string, etc.).

The parameters client_id and client_secret should be obtained from the Google Cloud Console.

const client = require('@google-cloud/foo'); // any GAPIC library

const auth = require('google-auth-library');
const oauth2client = new auth.OAuth2Client(client_id, client_secret, callback_uri);
const authUrl = oauth2client.generateAuthUrl({
  access_type: 'offline',
  scope: client.FooClient.scopes
});
// redirect user to authUrl and wait for them coming back to callback_uri

// in callback_uri handler, get the auth code from query string and obtain a token:
const tokenResponse = await oauth2client.getToken(code);
oauth2client.setCredentials(tokenResponse.tokens);

// now use this oauth2client!
const fooClient = new client.FooClient({ auth: oauth2client }); // <-- auth passed here

If the library detects it's running in a browser (has a window global object), it will follow the fallback route automatically (no need to pass fallback: true).
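The detection logic described above can be sketched roughly like this (illustrative only; the real implementation lives inside google-gax and differs in details):

```javascript
// Rough sketch of the browser/fallback detection described above.
// Assumption: google-gax's actual check is more involved than this.
function isBrowser() {
  return typeof window !== 'undefined';
}

function shouldUseFallback(opts = {}) {
  // fallback is forced on in browsers, opt-in everywhere else
  return isBrowser() || opts.fallback === true;
}

console.log(shouldUseFallback());                   // false under plain Node
console.log(shouldUseFallback({ fallback: true })); // true
```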

ddittoPersevere commented 4 years ago

I am very new to development, so please forgive me if I'm overlooking the answer to this. I am using the Google Drive API as a database of sorts: an Electron application sends JSON to Drive, and I then need to retrieve that JSON in a web application. I have to do it this way because the Electron application is used offline. Since the web application uses webpack (with React and Redux), it won't let me pull the data from Drive without running the script that pulls the data independently from the command line.

ERROR in ./node_modules/google-auth-library/build/src/auth/googleauth.js
Module not found: Error: Can't resolve 'child_process' in 'C:\Users\SSC05\Desktop\Class\indecision\pid_web\node_modules\google-auth-library\build\src\auth'
 @ ./node_modules/google-auth-library/build/src/auth/googleauth.js 16:24-48
 @ ./node_modules/google-auth-library/build/src/index.js
 @ ./node_modules/googleapis-common/build/src/index.js
 @ ./node_modules/googleapis/build/src/googleapis.js
 @ ./node_modules/googleapis/build/src/index.js
 @ ./src/scripts/dataPull.js
 @ ./src/routers/AppRouter.js
 @ ./src/app.js

ERROR in ./node_modules/https-proxy-agent/index.js
Module not found: Error: Can't resolve 'net' in 'C:\Users\SSC05\Desktop\Class\indecision\pid_web\node_modules\https-proxy-agent'
 @ ./node_modules/https-proxy-agent/index.js 5:10-24
 @ ./node_modules/gaxios/build/src/gaxios.js
 @ ./node_modules/gaxios/build/src/index.js
 @ ./node_modules/google-auth-library/build/src/transporters.js
 @ ./node_modules/google-auth-library/build/src/index.js
 @ ./node_modules/googleapis-common/build/src/index.js
 @ ./node_modules/googleapis/build/src/googleapis.js
 @ ./node_modules/googleapis/build/src/index.js
 @ ./src/scripts/dataPull.js
 @ ./src/routers/AppRouter.js
 @ ./src/app.js

ERROR in ./src/scripts/dataPull.js
Module not found: Error: Can't resolve 'readline' in 'C:\Users\SSC05\Desktop\Class\indecision\pid_web\src\scripts'
 @ ./src/scripts/dataPull.js 3:15-34
 @ ./src/routers/AppRouter.js
 @ ./src/app.js

ERROR in ./node_modules/https-proxy-agent/index.js
Module not found: Error: Can't resolve 'tls' in 'C:\Users\SSC05\Desktop\Class\indecision\pid_web\node_modules\https-proxy-agent'
 @ ./node_modules/https-proxy-agent/index.js 6:10-24
 @ ./node_modules/gaxios/build/src/gaxios.js
 @ ./node_modules/gaxios/build/src/index.js
 @ ./node_modules/google-auth-library/build/src/transporters.js
 @ ./node_modules/google-auth-library/build/src/index.js
 @ ./node_modules/googleapis-common/build/src/index.js
 @ ./node_modules/googleapis/build/src/googleapis.js
 @ ./node_modules/googleapis/build/src/index.js
 @ ./src/scripts/dataPull.js
 @ ./src/routers/AppRouter.js
 @ ./src/app.js

This, of course, isn't going to work for the web application, because it needs to pull data from Drive every time the site is visited. I know this is a very specific situation, but I'm just not sure where to implement the fallback route, or if that is even what needs to happen. Again, please forgive me if this is basic knowledge.

alexander-fenster commented 4 years ago

Hi @ddittoPersevere, thank you for asking! It's possible to use the Drive library in your web application if you prepare a separate Webpack bundle by following these steps. Please let us know if anything is unclear!
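For reference, a browser-oriented bundle along those lines usually needs the Node-only core modules stubbed out. A minimal sketch using webpack 5 syntax (the module list below just mirrors the errors quoted above; adjust it to whatever your build actually complains about):

```javascript
// webpack.config.js (sketch, webpack 5): stub out Node built-ins that
// the fallback (HTTP/1) transport does not need in the browser.
module.exports = {
  target: 'web',
  resolve: {
    fallback: {
      child_process: false,
      fs: false,
      net: false,
      tls: false,
      readline: false,
    },
  },
  // entry, output, etc. omitted
};
```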

omrib15 commented 4 years ago

Hey @alexander-fenster, I'm trying to use googleapis/nodejs-bigtable on the express server of an Angular Universal project (which uses webpack). I've read this thread and many others related to it, but couldn't work out the current status of webpack support in the googleapis Node.js client libraries. I tried the loader @guybedford provided and it works great, but I guess it's not the official solution to this problem. So basically what I'm asking is:

  1. Is there going to be an inherent webpack support in those libraries soon?
  2. What is the team's suggested solution right now? Is it @guybedford's loader?
alexander-fenster commented 4 years ago

@omrib15 The webpack support across libraries works using the gRPC-fallback mode (enabled automatically if you're in browser, or can be enabled explicitly by passing { fallback: true } to a client constructor). This is the suggested solution. One thing that is different is the authentication. There are two options:

When I say "it works", I mean that (a) it's implemented, and (b) we do have real users using it for different libraries, and (c) we are not going to break it or remove it, and (d) we consider the compatibility when we do semver releases.

That said, at this time this feature is considered experimental, which is why there is no official documentation yet. I'll be happy to provide any support you might need: share your code and we'll try to figure out what's wrong.

RXminuS commented 4 years ago

How does it work with something like @google-cloud/kms where you create the client like

import {KeyManagementServiceClient} from "@google-cloud/kms";

const client = new KeyManagementServiceClient({...}); // I don't see  fallback: true here?
omrib15 commented 4 years ago

@alexander-fenster Thanks a lot for the support, I really appreciate it. Up until recently we used Bigtable only in pure Node projects. Now we want to integrate it into our Angular Universal projects to get our Bigtable data for server-side rendering. I'll emphasise that no access to it will be made from the browser.

So, my plan is to create the bigtable client instance on our express server and inject it into angular server services (achieved by having different service implementation provided for server and browser). Note that the express server is built with webpack for internal reasons.

I first tried creating the client the way I did on our node projects:

const Bigtable = require('@google-cloud/bigtable');
// also tried import as part of my sanity check
const bigtableClient = new Bigtable({projectId: 'my-project-id'})

and got the runtime error:

path.js:39
    throw new ERR_INVALID_ARG_TYPE('path', 'string', path);
    ^

TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received type number
    at assertPath (path.js:39:11)
    at Object.dirname (path.js:1268:5)
    at Object.<anonymous> (.../dist/server.js:503027:36)
    at __webpack_require__ (.../dist/server.js:20:30)
    at Object.<anonymous> (.../dist/server.js:502530:25)
    at Object.defineProperty.value (.../dist/server.js:502790:30)
    at __webpack_require__ (.../dist/server.js:20:30)
    at Object.defineProperty.value (.../dist/server.js:502451:16)
    at __webpack_require__ (.../dist/server.js:20:30)
    at Object.defineProperty.value (.../dist/server.js:501408:22)

Tried creating the bigtable client with fallback set to true as you said: const bigtableClient = new Bigtable({fallback: true, projectId: 'my-project-id'}) and without projectId: const bigtableClient = new Bigtable({fallback: true}) but still got the same path.js error.

I also tried adding the same ignore rules as in bigtable's webpack.config and installing null-loader, and got a different error:

ERROR in ./node_modules/@grpc/grpc-js/package.json
Module parse failed: Unexpected end of JSON input while parsing near ''
You may need an appropriate loader to handle this file type.
SyntaxError: Unexpected end of JSON input while parsing near ''

Last, I tried @zeit/webpack-asset-relocator-loader, and it worked. The only change it made was relocating the protobuf and protos dirs, along with the operations.json and protos.json files, into the dist folder. I couldn't really figure out why this worked.

I realise that using the fallback option is the suggested solution right now, but it is not working for me with the bigtable client.

Thanks again for helping me with this. Using bigtable on our universal projects is really crucial to us, and this compatibility issue is the only thing holding it back.

alexander-fenster commented 4 years ago

@omrib15 Just to confirm, you're going to webpack the package with "target": "node" and the environment you're going to run it in will have a real Node.js environment (with http2 module), is that right? If so, you don't need fallback, it should just work, but...

I was able to reproduce the problem you mentioned,

ERROR in ./node_modules/google-gax/node_modules/@grpc/grpc-js/package.json
Module parse failed: Unexpected end of JSON input while parsing near ''
File was processed with these loaders:
 * ./node_modules/null-loader/dist/cjs.js

The package.json file mentioned is perfectly valid (it can be parsed without any problems); I will dig deeper to see what the issue might be.

omrib15 commented 4 years ago

@alexander-fenster Yup, I webpack the express server file (the only one that imports and instantiates Bigtable) with target: 'node' and then run the webpacked file with node v10.13.0.

RXminuS commented 4 years ago

Last, I tried @zeit/webpack-asset-relocator-loader, and It worked. The only change it made was relocating protobuf and protos dirs, along with operations.json and protos.json files in the dist folder. Couldn't really figure out why this worked.

@omrib15 Could you share your asset-relocator config?

RXminuS commented 4 years ago
  • if you're in Node or in any other environment that supports reading service account keys from the filesystem using Node-compatible fs, the regular Node.js authentication should just work for fallback mode.

What about GKE workload identity? When I set fallback: true I get an error that I need to provide auth

omrib15 commented 4 years ago

@RXminuS webpack config for the asset relocator is as mentioned in the readme:

module: {
    rules: [
      {
        // For node binary relocations, include ".node" files as well here
        test: /\.(m?js|node)$/,
        // it is recommended for Node builds to turn off AMD support
        parser: { amd: false },
        use: {
          loader: '@zeit/webpack-asset-relocator-loader',
          options: {
            // optional, base folder for asset emission (eg assets/name.ext)
            outputAssetBase: 'assets',
            // optional, restrict asset emissions to only the given folder.
            filterAssetBase: process.cwd(),
            // optional, permit entire __dirname emission
            // eg `const nonAnalyzable = __dirname` can emit everything in the folder
            emitDirnameAll: false,
            // optional, permit entire filterAssetBase emission
            // eg `const nonAnalyzable = process.cwd()` can emit everything in the cwd()
            emitFilterAssetBaseAll: false,
            // optional, a list of asset names already emitted or
            // defined that should not be emitted
            existingAssetNames: [],
            wrapperCompatibility: false, // optional, default
            // build for process.env.NODE_ENV = 'production'
            production: true, // optional, default is undefined
            cwd: process.cwd(), // optional, default
            debugLog: false, // optional, default
          }
        }
      }
    ]
  }
RXminuS commented 4 years ago

@omrib15 worked like a charm! Thank you ❤️

omrib15 commented 4 years ago

@alexander-fenster Hey, any updates on the issue?

JustinBeckwith commented 4 years ago

No updates. If there are, we'll happily post them!

Dio-sam commented 4 years ago

@alexander-fenster hey, I have the same bug as https://github.com/googleapis/nodejs-speech/issues/489 when I run the serve:ssr command. Can you help me fix it?

throw err;
^

Error: ENOENT: no such file or directory, open 'google/protobuf/api.proto'
    at Object.openSync (fs.js:440:3)
    at Object.readFileSync (fs.js:342:35)

Thank you !

BenLirio commented 4 years ago

@alexander-fenster Thank you for the example above. I am attempting to use this flow with Firebase, using their SDK for Google OAuth2 and following the documentation here (https://firebase.google.com/docs/auth/web/google-signin). Upon user sign-in I receive response.credential and response.user. Will I be able to authenticate a Google API with this information, or do I need to set up a custom back-end client? Thank you, and please let me know if there is a more standard workflow.

Use case: text detection within a web app. Although there are ML Kits for mobile, I was unable to find one for web applications. Previously I have used Cloud Run with a service account to pass data, but due to file uploading and such, I think having a direct link to the API within my web app will reduce complexity and help with speed (no need to upload and then check).

a281983 commented 3 years ago

> (quoting @alexander-fenster's earlier fallback / OAuth2 instructions in full; see above)

Using {fallback: true} in the constructor within a Node implementation gives a new error: TypeError: fetch is not a function. Apologies, newbie here, but I'm finding it really tough to implement the GCP Vision Node client library.

bcoe commented 3 years ago

@a281983 What environment are you attempting to run the @google-cloud/vision library in? Would you be able to provide a code example that reproduces the issue you're seeing?

alexander-fenster commented 3 years ago

@a281983 What should happen is: in Node, fetch will be taken from node-fetch, while in browser it will use window.fetch. Some corner cases might include Electron. How exactly do you use it?

en4letto commented 3 years ago

Hi @alexander-fenster , I have a problem similar to the one @a281983 posted here. I am using the @google-cloud/speech library and bundling an express app with webpack.

I had the proto error and fixed it by passing {fallback: true} to the constructor. Right now I have the error UnhandledPromiseRejectionWarning: TypeError: fetch is not a function, which is triggered when my code executes the client's recognize function. Here are the related portions of my code:

// THIS IS THE CONFIG PASSED TO THE CONSTRUCTOR
const authSTTConfig = {
    projectId: process.env.STT_PROJECT_ID,
    keyFilename: process.env.STT_KEY_FILENAME,
    fallback: true,
};

...

class GCloudSpeechAPIv2 {
    constructor() {
        this.audioChunks = [];
        this.finalAudio = null;

        this.instanceSTT = new speech.SpeechClient(authSTTConfig);

        this.requestSTT = {
            // interimResults: true,
            config: cloudSTTConfig,
        };
    }

    async transcribeAudio() {
        this.requestSTT.audio = {
            content: this.finalAudio.toString("base64"),
        };

        // console.log("transcribeAudio");
        const [response] = await this.instanceSTT.recognize(this.requestSTT); // ERROR IS THROWN HERE
        const transcription = response.results
            .map((result) => result.alternatives[0].transcript)
            .join("");
        return transcription;
    }

...code continues
alexander-fenster commented 3 years ago

@en4letto If you pass {fallback: true}, the execution should go to node_modules/google-gax/build/src/fallback.js. There is just one fetch call there, could you add some console.log statements to see what kind of fetch is being called there? Should be window.fetch, not the one from node-fetch.

en4letto commented 3 years ago

@alexander-fenster thanks for the insight!

I did some additional checks, here's what I found: As you said the problematic fetch lies in the node_modules/google-gax/build/src/fallback.js, the code is:

const fetch = isbrowser_1.isBrowser()
                    ? // eslint-disable-next-line no-undef
                        window.fetch
                    : nodeFetch;

I've added some logs and the code correctly assigns the nodeFetch variable to the fetch one.

I found an issue about webpack and node-fetch and, as suggested in this comment, I tried to change const nodeFetch = require("node-fetch"); to const nodeFetch = require("node-fetch").default;.

With this edit the code works perfectly! It seems the problem is related to webpack and node-fetch, so the speech library is not directly involved.
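For anyone curious why `.default` helps: node-fetch v2 ships CommonJS transpiled from ESM, so some bundlers hand `require()` a namespace object rather than the function itself. A tiny illustration of the defensive pattern (this is not the actual google-gax code):

```javascript
// Simulate what a bundler may return for a transpiled dual-mode module:
// sometimes the function itself, sometimes a { default: fn } wrapper.
const mod = { default: function fetchImpl() { return 'ok'; } };

// Defensive pick that works for both shapes:
const fetchFn = typeof mod === 'function' ? mod : mod.default;

console.log(typeof fetchFn); // "function"
```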

Do you happen to have some suggestions on how to possibly apply this fix when bundling the code with webpack?

Thanks again!

alexander-fenster commented 3 years ago

Thank you for these details! I will test that adding .default does not break any of the scenarios covered by CI, and then release google-gax with this fix. If we are lucky it will happen early next week, so you won't need any hacks to temporarily work around this.

Having said that - @en4letto can you tell us more about the environment you're running your code in? If this is an express app that runs with Node, why do you need to webpack it? Fallback mode was initially supposed to help run the code in browsers (where gRPC is not available), so I just want to understand your use case and maybe make some recommendations.

en4letto commented 3 years ago

@alexander-fenster I am setting up a production environment in GCP. I'm using the managed instance groups, and I wanted the lowest startup time possible for the instances inside the node group. The deploy is managed via Gitlab CI and the build job (npm install and webpack bundling) is done by a "service" instance (a simple compute engine VM outside of the groups). I didn't want to move the node_modules folder between different VMs and I found a way to bundle the whole express app in a single JS file with webpack so that the instances inside the managed group only have to get this JS bundle and execute it with node.

Hope this is clear! If you have some recommendations and suggestions I'll be happy to know them, as Node.js is not my specialty :)

denniskribl commented 3 years ago

Hey @alexander-fenster @en4letto,

is there an issue I can follow for the google-gax node-fetch thing? I had no luck finding the right one. I'm trying to use the @google-cloud/recaptcha-enterprise package with the fallback option. We are bundling our Node lambdas with webpack and running into this error.

Thanks in advance :)

gmcdev commented 3 years ago

@alexander-fenster confirming that I run across this fetch problem in an Electron environment, as you noted above

MrHash commented 3 years ago

I was generally able to workaround webpack issues using { fallback: true } in client constructors with target: 'node' & { resolve: { mainFields: ['main'] } } (https://webpack.js.org/configuration/resolve/#resolvemainfields) in the webpack config. This also seems to work for other libs using gax.
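In config form, the workaround above amounts to something like this (a sketch to merge into your existing webpack config):

```javascript
// webpack.config.js (sketch): bundle for Node and resolve packages via
// their "main" entry only, skipping "browser"/"module" fields that can
// pull in browser-oriented builds of gax dependencies.
module.exports = {
  target: 'node',
  resolve: {
    mainFields: ['main'],
  },
  // entry, output, etc. omitted
};
```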

alexander-fenster commented 3 years ago

The fix is coming to all the libraries that will stop using fs and will start using require to load the JSONs instead. It should help all of you folks who use webpack with target: 'node'. Stay tuned :)

ghost commented 2 years ago

Any updates on this? I need to use @google-cloud/trace-agent in my cloudfunctions bundled via esbuild.

alexander-fenster commented 2 years ago

@s-dschro any update on what exactly? :) This issue is a collection of various problems related to webpack and/or other bundlers. What is the exact error you have with @google-cloud/trace-agent?

ghost commented 2 years ago

@alexander-fenster https://github.com/googleapis/cloud-trace-nodejs/issues/1393

alexander-fenster commented 2 years ago

Thanks @s-dschro, I'll try to route that issue as appropriate.

JJPell commented 2 years ago

> (quoting @alexander-fenster's earlier fallback / OAuth2 instructions in full; see above)

@alexander-fenster When trying the non-Node sample provided within a React app, I get this error message:

Module not found: Can't resolve 'http2' in ...

Any ideas what could be causing this?

alexander-fenster commented 2 years ago

@JJPell http2 only exists in Node.js and is used by gRPC transport. In React app, try using fallback: true when you create an instance of the client, and set your bundler config to ignore http2.
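With webpack 5, ignoring http2 can be expressed roughly as follows (a sketch to merge into your existing config):

```javascript
// webpack.config.js (sketch, webpack 5): tell webpack not to polyfill
// the Node-only http2 module; the fallback transport never touches it.
module.exports = {
  resolve: {
    fallback: {
      http2: false,
    },
  },
};
```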

Stevemoretz commented 1 year ago

Hi, I'm trying to use this package (or https://github.com/google/google-api-javascript-client) for Google Tag Manager on React Native. React Native has fetch but no window object; does anyone have a working build configuration for it?

ErwinThompsonF commented 6 months ago

Hi, I'm trying to use Google Cloud Secret Manager, but an error keeps showing up whenever I import SecretManagerServiceClient. I tried following the documentation here but still can't make it run. I also tried adding fallback: true to the constructor, but it keeps showing this error.

There are multiple errors similar to this one, so I didn't copy them all, but essentially every error comes from tunnel-agent:

./node_modules/tunnel-agent/index.js:6:12-28 - Error: Module not found: Error: Can't resolve 'https' in '/[file path]/node_modules/tunnel-agent'

BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.

If you want to include a polyfill, you need to:
        - add a fallback 'resolve.fallback: { "https": require.resolve("https-browserify") }'
        - install 'https-browserify'
If you don't want to include a polyfill, you can use an empty module like this:
        resolve.fallback: { "https": false }

sample code

async accessSecret() {
        const client = new SecretManagerServiceClient({ fallback: true });
        const [secret] = await client.accessSecretVersion({
            name: 'projects/my-project/secrets/my-secret',
        });

        // The secret payload is a base64-encoded string.
        const payload = secret.payload.data.toString();

        console.log(`The secret payload is: ${payload}`);
        console.log('test')
    }

btw I am using Angular 14.2.12 and Google Cloud Secret Manager version 5.2.0.

Any help would be much appreciated thank you.