Autodesk-Forge / design.automation-.net-custom.activity.sample

Design Automation Sample in C#: C# sample to demonstrate custom Activities and AppPackages creation
MIT License

Error: Upload failed. Reason = Response status code does not indicate success: 403 (Forbidden). #7

Open NitinMalave20 opened 3 years ago

NitinMalave20 commented 3 years ago

Hello Autodesk team & @szilvaa,

I am not sure whether this is the correct place to ask my question, but I would appreciate any help on this topic.

We have a drawing template file (.dwg) that has block attributes with default values. The goal is to fetch the template, read and update its block attributes with data, and return a new instance of the drawing file to the client/frontend.

After a lot of googling, we came across the Autodesk Forge Design Automation API and are trying to use it to solve the above problem. I watched your video tutorial and followed those steps by cloning this repo. I faced some minor issues related to updated command syntax and fixed those. But now I am stuck at the last stage, where the output of Design Automation should be uploaded to an S3 bucket, and the error is as below:

```
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId>AK**************PV</AWSAccessKeyId><StringToSign>AWS4-HMAC-SHA256
20210721T101227Z
20210721/us-west-2/s3/aws4_request
5210d32b88b1d453934d9e59fa8f40df8b9e1d6fa7da19e4dd5028f43e67934ba4a</StringToSign><SignatureProvided>5a878d21601643551f8e6fb279559213d22a2ce5143313263aa886a684e9739d246c</SignatureProvided><StringToSignBytes>41 57 53 34 2d 48 4d 41 43 2d 53 48 41 32 35 36 0a 32 30 32 31 30 37 32 31 54 31 30 31 32 32 37 5a 0a 32 30 32 31 30 37 32 31 2f 75 73 2d 77 65 73 74 2d 32 2f 73 33 2f 61 77 73 34 5f 72 65 71 75 65 73 74 0a 35 32 31 30 64 33 32 62 38 38 62 31 64 33 39 33 34 64 39 65 35 39 66 61 38 66 34 30 64 66 38 62 39 65 31 64 36 66 61 37 64 61 31 39 65 34 64 64 35 30 32 38 66 34 33 65 36 37 39 33 62 61 34 61</StringToSignBytes><CanonicalRequest>PUT
/result.zip
X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AK***************PV%2F20210721%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210721T101227Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host
host:engsw-equipment-designer-client-ak.s3.amazonaws.com
```

I created the presigned S3 URL with the AWS CLI using the command `aws s3 presign s3://bucket-name/result.zip --region us-west-2`

and replaced the upload URL with it. I tried with another AWS account and its credentials, but I am still getting the same error, and the final outputs are not reaching S3. I am attaching the entire log file as well for a better understanding of the problem.
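One thing worth checking (an assumption based on the log, not something the sample dictates): `aws s3 presign` generates a URL signed for a GET request, while Design Automation uploads the output with a PUT, and SigV4 signs the HTTP method itself. A stdlib-only Python sketch of the presigning arithmetic (dummy secret key and simplified query string, purely illustrative) shows that the same URL signed for GET and for PUT yields different signatures:

```python
import hashlib
import hmac

def _sign(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def presign_signature(method: str, secret_key: str, amz_date: str, region: str,
                      canonical_uri: str, canonical_query: str, host: str) -> str:
    """Compute the X-Amz-Signature of a SigV4-presigned S3 URL (UNSIGNED-PAYLOAD)."""
    scope = f"{amz_date[:8]}/{region}/s3/aws4_request"
    canonical_request = "\n".join([
        method,                 # the HTTP method is part of what gets signed
        canonical_uri,
        canonical_query,
        f"host:{host}\n",       # canonical headers (each ends with \n)
        "host",                 # signed headers list
        "UNSIGNED-PAYLOAD",
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Signing key: chained HMACs over date, region, service, terminator
    key = _sign(_sign(_sign(_sign(b"AWS4" + secret_key.encode(),
                amz_date[:8]), region), "s3"), "aws4_request")
    return hmac.new(key, string_to_sign.encode(), hashlib.sha256).hexdigest()

args = ("dummy-secret", "20210721T101227Z", "us-west-2",
        "/result.zip", "X-Amz-Expires=3600", "bucket.s3.amazonaws.com")
get_sig = presign_signature("GET", *args)
put_sig = presign_signature("PUT", *args)
assert get_sig != put_sig  # a GET-presigned URL cannot authorize a PUT upload
```

So a URL from `aws s3 presign` will always fail signature validation when used as a PUT destination; S3 recomputes the signature with the method actually used.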

How can I fix the above issue and get the output result.zip in the S3 bucket?

We want to move this entire code to microservices and a C# Lambda hosted on AWS, but before that we are testing it in a standalone C# project. Can you please share any available resources/GitHub repos/YouTube videos/blogs that can help us with integrating Design Automation with serverless microservices and AWS?

NitinMalave20 commented 3 years ago

Here is a detailed log of the process: Das-report.txt

MadhukarMoogala commented 3 years ago

@NitinMalave20 - Can you confirm that the AWS URL is write-access enabled? Did you test uploading a zip file to the AWS URL with some REST client, or with curl?

NitinMalave20 commented 3 years ago

Hi @MadhukarMoogala, the bucket has public access, but I have not checked with a curl or REST client request. How should I test with a curl request?

MadhukarMoogala commented 3 years ago

@NitinMalave20 - Assuming it is a public URL, I tried at my end and got a 403 Access Denied.

With curl it should be fairly straightforward:

```
curl "https://fpd-uploads.s3.us-west-2.amazonaws.com/test.zip?X-Amz-Expires=3555&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAJIN5W34I637RGLCA/20210721/us-west-2/s3/aws4_request&X-Amz-Date=20210721T155032Z&X-Amz-SignedHeaders=content-type;host&X-Amz-Signature=c3f43550a91f1efd91d5c2e0942fdb015c95750bb676b4fd671d6e6b3030cbc3" --upload-file "D:\Work\DWG\DWG\test.zip" -H "Content-Type: application/octet-stream"
```

Common root causes for `SignatureDoesNotMatch`:

- The URL was signed for a different HTTP method (e.g., a GET-presigned URL used for a PUT upload).
- A header sent with the request (such as `Content-Type`) was not included in `X-Amz-SignedHeaders` when the URL was signed.
- The URL was truncated, re-encoded, or otherwise modified after signing.
- The region or credentials used for signing do not match the bucket.

NitinMalave20 commented 3 years ago

Hi @MadhukarMoogala, thank you so much for the direction and the response from your end. I really appreciate your help 😊

Actually, the issue was with the S3 presigned URL that I was creating through the AWS CLI as the destination. It kept throwing the signature mismatch error even though I had followed the usual checks.

I think the problem was with the presigned URL generated through the AWS CLI, as described in this article; they recommend generating the URL through the SDK instead.

What worked for me?

Thank you so much 😊.

Now I am looking forward to integrating this code with a C# Lambda and microservices, and I hope it will work as intended.
Can you please share any available resources/GitHub repos/YouTube videos/blogs that can help us with integrating Design Automation with serverless microservices and AWS?

The underlying tech stack is:

MadhukarMoogala commented 3 years ago

@NitinMalave20 - I did something similar, but on Azure Functions: https://github.com/MadhukarMoogala/da-azfunc. You can easily integrate it with a C# Lambda; do let me know if you have any questions.

Don't hesitate to reach us at forge.help@autodesk.com

NitinMalave20 commented 3 years ago

Hi @MadhukarMoogala, as per our last discussion, I tried to set up the code in a C# microservice by creating a custom plugin to modify the block attributes of the drawing file, and it worked as intended. I am getting the output in the S3 bucket from Forge Design Automation.

The next step is to fetch the file from S3 and pass it to the frontend client so it can be downloaded to the user's local system.

We are trying to read the file from S3 with a StreamReader, store it as string content, and pass this string content to the Angular frontend, where the data is received as a Blob object and the file is saved with FileSaver. The code looks like below:

Backend API:

```csharp
private async Task<Template> GetDataFromS3(string incomingObjKey)
{
    var s3ObjectKey = incomingObjKey;
    var responseBody = "";
    try
    {
        var request = new GetObjectRequest { BucketName = s3Bucket, Key = s3ObjectKey };
        using (var response = await s3Client.GetObjectAsync(request))
        using (var reader = new StreamReader(response.ResponseStream))
        {
            responseBody = reader.ReadToEnd();
        }
    }
    catch (AmazonS3Exception ex)
    {
        Console.WriteLine($"Failed to get object from S3. Amazon S3 exception error message: {ex.Message}");
    }
    return new Template { Body = responseBody };
}
```

Frontend:

```typescript
this.DAAService.getFileData().subscribe((data) => {
  let myBlob: Blob = new Blob([data.body, 'binary'], { type: data.ContentType });
  saveAs(myBlob, "output-file.dwg");
}, (err) => {
  console.error(err);
});
```

After this step, the file is downloaded, but somehow it gets corrupted: AutoCAD shows an "AutoCAD file is not valid" prompt when accessing it.

Can you please tell me where I am going wrong, and what would be an alternative approach to pass the file content from the backend to the client?

MadhukarMoogala commented 3 years ago

@NitinMalave20 - Can you write the stream to a file on the server side, and check whether the chunk downloads correctly without fragmentation?

> We are trying to read a file from s3 with streamreader and then store it as string content

Shouldn't it be octet-stream ?

I did something similar - https://github.com/MadhukarMoogala/azBlobZip/blob/master/Controllers/DownloadController.cs
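The corruption is consistent with decoding binary DWG bytes as text: a StreamReader decodes the stream as UTF-8, and any byte sequence that is not valid UTF-8 gets replaced, so the round trip back to bytes no longer matches the original file. A few lines of Python illustrate the same failure mode (the byte values here are arbitrary, chosen only to resemble a binary file header):

```python
# Arbitrary binary payload containing bytes that are not valid UTF-8
data = b"AC1024\x00\xff\x8a\x0d\x0a\x1a"

# Decoding as text (what a text reader does) silently replaces invalid bytes...
as_text = data.decode("utf-8", errors="replace")
round_trip = as_text.encode("utf-8")
assert round_trip != data      # the binary content has been destroyed

# ...while copying raw bytes preserves the payload exactly
copied = bytes(bytearray(data))
assert copied == data
```

This is why streaming the raw bytes (e.g., as `application/octet-stream`) works where string conversion does not: a binary file must never pass through a text encoding.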

NitinMalave20 commented 3 years ago

Thank you @MadhukarMoogala for the quick replies. I eventually tried the octet-stream header mentioned above as well, but it didn't work either.

What worked for me?

I got a presigned URL for the file on S3 (the output file from Design Automation in the S3 bucket), passed it to the client, and made the HTTP request on the client side to download the file, and it worked! I initially faced CORS issues, but solved them by setting CORS policies on the bucket.
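For reference, a bucket CORS configuration along these lines is what lets a browser fetch a presigned URL cross-origin (the origin below is a placeholder to adapt; this is a generic sketch, not the exact policy used here):

```json
[
  {
    "AllowedOrigins": ["https://your-app.example.com"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedHeaders": ["*"],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
```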

Thanks & Regards, Nitin Malave