jeanbmar / payload-s3-upload

Send Payload CMS uploads to Amazon S3

Cannot read properties of undefined (reading 'config') #1

Closed — quornik closed this issue 2 years ago

quornik commented 2 years ago

I'm trying to set up S3 uploads with your plugin, and it keeps returning this error:

TypeError: Cannot read properties of undefined (reading 'config')
    at Payload.find (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\find.ts:28:25)
    at Payload.find (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\local\find.ts:35:38)
    at Payload.find (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\index.ts:221:12)
    at module.exports (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload-s3-upload\lib\file-exists.js:2:33)
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload-s3-upload\lib\index.js:28:44
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\create.ts:186:19
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\create.ts:184:5
    at Payload.create (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\create.ts:183:3)
    at Payload.create (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\requestHandlers\create.ts:14:17)

(fabricstudio-backend is obviously my repo.) Any ideas why this occurs? I tried it with a TypeScript installation and also in plain JS; the result is the same either way.

The payload.config file has this declaration:

```js
plugins: [
  s3Upload({
    region: process.env.AWS_REGION,
    credentials: {
      accessKeyId: process.env.AWS_KEY,
      secretAccessKey: process.env.AWS_SECRET,
    },
  }),
],
```

This is exactly like the one you provided in the instructions. The AWS credentials are valid; I double-checked.
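
For completeness, this is roughly how the plugin is wired into the full config (a trimmed-down sketch rather than my exact file; the import paths follow the plugin README and the Payload docs, and Media is the upload collection shown further down):

```js
// payload.config.js (sketch, not the exact file)
import { buildConfig } from 'payload/config';
import s3Upload from 'payload-s3-upload';
import Media from './collections/Media'; // illustrative path

export default buildConfig({
  collections: [Media],
  plugins: [
    s3Upload({
      region: process.env.AWS_REGION,
      credentials: {
        accessKeyId: process.env.AWS_KEY,
        secretAccessKey: process.env.AWS_SECRET,
      },
    }),
  ],
});
```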

Would appreciate help!

jeanbmar commented 2 years ago

Hi there. Does your upload collection have a slug defined? Does it work if you use the upload collection with local storage?
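
Something minimal like this should be enough for a local-storage check (just a sketch; adjust the slug and paths to your project):

```js
// Minimal upload collection for a local-storage sanity check (sketch)
const Media = {
  slug: 'media', // collection slug
  upload: {
    staticURL: '/media',
    staticDir: 'media',
  },
  fields: [],
};

export default Media;
```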

quornik commented 2 years ago

Hi, thanks for the quick reply! It works with local storage, and the upload collection does have a slug defined. That said, there was a mistake there: the slug was capitalized. I changed it to lowercase, and uploads started partially working with S3 (local storage worked before):

BUT:

Cannot read properties of undefined (reading 'startsWith')
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\uploads\mimeTypeValidator.ts:6:44
    at Array.some (<anonymous>)
    at Object.validate (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\uploads\mimeTypeValidator.ts:6:28)
    at validationPromise (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\fields\validationPromise.ts:31:47)
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\fields\traverseFields.ts:377:56
    at C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\fields\performFieldOperations.ts:113:43
    at Array.forEach (<anonymous>)
    at Payload.performFieldOperations (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\fields\performFieldOperations.ts:113:22)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at Payload.create (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\create.ts:197:29)

[nodemon] app crashed - waiting for file changes before starting...

My media config looks like this:

```js
const Media = {
  slug: 'media',
  labels: {
    singular: 'Media',
    plural: 'Media',
  },
  upload: {
    staticURL: '/media',
    staticDir: 'media',
    disableLocalStorage: true,
    s3: {
      bucket: process.env.AWS_BUCKET_NAME,
      prefix: 'fabricstudio/assets',
      commandInput: {
        ACL: 'public-read',
      },
    },
    adminThumbnail: ({ doc }) => `https://{mybucket-name}.s3.eu-central-1.amazonaws.com/fabricstudio/assets/${doc.filename}`,
    imageSizes: [
      {
        name: 'thumbnail',
        width: 400,
        height: 300,
        crop: 'centre',
      },
      {
        name: 'card',
        width: 768,
        height: 1024,
        crop: 'centre',
      },
      {
        name: 'tablet',
        width: 1024,
        height: null,
        crop: 'centre',
      },
    ],
    mimeTypes: ['image/*'],
  },
  fields: [],
};

export default Media;
```

AND, if I comment out the mimeTypes line, the error changes to:

ValidationError: The following field is invalid: sizes.card.filename
    at new ExtendableError (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\errors\APIError.ts:26:11)
    at new APIError (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\errors\APIError.ts:43:5)
    at new ValidationError (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\errors\ValidationError.ts:6:5)
    at Payload.create (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\operations\create.ts:236:11)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at Payload.create (C:\DEV PROJECTS\FABRIC_STUDIO\fabricstudio-backend\node_modules\payload\src\collections\requestHandlers\create.ts:14:17)

What do you think?

quornik commented 2 years ago

I tried uploading a very large image with the mimeTypes line commented out, and it works! It looks like the error was ultimately caused by images that were not large enough for the imageSizes to be generated. I'll dig deeper into the Payload documentation to see how to keep images that are too small from being processed at all while still being uploaded.
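
I haven't found a config switch for that yet. As a stopgap, I could at least reject uploads smaller than the largest configured size with a beforeValidate hook, so they don't crash the create operation. A rough, untested sketch (it assumes the incoming upload is exposed on req.files.file, as with express-fileupload; property names may differ):

```js
// collections/Media.js (sketch): reject images smaller than the largest imageSizes entry.
// Untested; assumes req.files.file holds the incoming upload (express-fileupload style).
import sharp from 'sharp';

const MIN_WIDTH = 1024;  // widest imageSizes entry (tablet)
const MIN_HEIGHT = 1024; // tallest imageSizes entry (card)

const Media = {
  slug: 'media',
  hooks: {
    beforeValidate: [
      async ({ req, data }) => {
        const file = req?.files?.file;
        if (file?.data) {
          // Read the dimensions of the uploaded buffer without writing it anywhere.
          const { width, height } = await sharp(file.data).metadata();
          if (width < MIN_WIDTH || height < MIN_HEIGHT) {
            throw new Error(
              `Image is ${width}x${height}px; please upload at least ${MIN_WIDTH}x${MIN_HEIGHT}px.`,
            );
          }
        }
        return data;
      },
    ],
  },
  // ...same upload config as above
  fields: [],
};

export default Media;
```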

Thanks for your help!

If you have something to add, please do. If not, you may close this issue; my problem has been solved.

jeanbmar commented 2 years ago

Ok, great then :). It might be worth telling the Payload team so they can maybe add a check on image size.