RamjiP / winston-azure-table-storage

Apache License 2.0
0 stars 1 forks source link

SAS Token Not Working #2

Open ps-from-md opened 2 years ago

ps-from-md commented 2 years ago

I am attempting to create a SAS-based connection to a table that works when using the account name and key:

new azureTableTransport ({
    //account: "MYACCOUNTNAME",  //works when enabled
    //key: "MYACCOUNTKEY",  //works when enabled
    host:"MYACCOUNTNAME.table.core.windows.net",
    sas: "MYACCOUNTHOSTQUERYSTRING",
    table: "log",
    partition: require('os').hostname(),
    level: 'silly',
    metaAsColumns: true
})

If I comment out HOST and SAS and uncomment ACCOUNT and KEY, everything works. I have verified that it's not the SAS token itself by connecting with the full SAS URL in Azure Storage Explorer.

Specifically, I am getting a 403 error followed by `URL scheme "requestid" is not supported` and `URL scheme "time" is not supported`.

Here is my package file if needed:

    {
      "name": "------",
      "version": "0.1.0",
      "description": "-------",
      "private": true,
      "dependencies": {
        "@azure/msal-browser": "^2.14.2",
        "@azure/msal-react": "^1.0.0",
        "@fluentui/react": "^8.30.2",
        "@fluentui/react-file-type-icons": "^8.3.1",
        "@microsoft/applicationinsights-react-js": "^3.2.0",
        "@microsoft/applicationinsights-web": "^2.7.0",
        "@microsoft/file-browser": "^1.0.0-preview.0",
        "@types/jest": "^27.0.1",
        "@types/node": "^16.7.10",
        "@types/react": "^16.14.14",
        "@types/react-dom": "^16.9.14",
        "bootstrap": "^4.5.3",
        "jquery": "^3.5.1",
        "office-ui-fabric-react": "^5.123.0",
        "popper.js": "^1.16.1",
        "prop-types": "^15.7.2",
        "react": "^17.0.1",
        "react-bootstrap": "^1.3.0",
        "react-dom": "^17.0.0",
        "react-router": "^5.2.1",
        "react-router-dom": "^5.3.0",
        "react-scripts": "^4.0.2",
        "uuid": "^8.3.2",
        "winston": "^3.3.3",
        "winston-azure-table-storage": "^1.0.9",
        "winston3-azureblob-transport": "^0.1.1"
      },
      "scripts": {
        "start": "react-scripts start",
        "build": "react-scripts build"
      },
      "browserslist": {
        "production": [">0.2%", "not dead", "not op_mini all"],
        "development": ["last 1 chrome version", "last 1 firefox version", "last 1 safari version"]
      },
      "devDependencies": {
        "@types/react-router": "^5.1.16",
        "@types/react-router-dom": "^5.1.8",
        "@types/uuid": "^8.3.1",
        "@typescript-eslint/eslint-plugin": "^4.30.0",
        "@typescript-eslint/parser": "^4.30.0",
        "eslint": "^7.32.0",
        "eslint-config-airbnb": "^18.2.1",
        "eslint-plugin-import": "^2.24.2",
        "eslint-plugin-jsx-a11y": "^6.4.1",
        "eslint-plugin-react": "^7.25.1",
        "eslint-plugin-react-hooks": "^4.2.0",
        "typescript": "^4.4.2"
      }
    }

ps-from-md commented 2 years ago

I was attempting to generate a policy and token from the actual table rather than the storage account. Using a SAS from the storage account itself (with Table Access) worked. Sorry for the issue.

ps-from-md commented 2 years ago

I wanted to follow up on this after experimenting a bit, just to ensure that it's the intended behavior. This transport works when you generate a SAS token in Azure (not from Azure Storage Explorer) on the storage account. That kind of token is not specific to any one Table in the storage account; it applies to all resources of the selected type or types (blob, queue, file, table).

When you attempt to create an access policy on a specific Table and then generate a SAS from that policy (see tutorial example), the transport does not work.

Additionally, when you create a SAS token directly on the Table rather than on the storage account, that SAS token does not work either. It would appear that only a token generated at the storage account level will work, which can be problematic.
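For reference, here is how I understand the two token types map onto the Azure CLI (a sketch; the account name, table name, permissions, and expiry are placeholders, not values from my setup):

```shell
# Account-level SAS -- the kind that works with the transport.
# It is scoped to whole services (--services t = Table) and resource
# types, not to a single table.
az storage account generate-sas \
    --account-name MYACCOUNTNAME \
    --services t \
    --resource-types sco \
    --permissions rwdlacu \
    --expiry 2030-01-01

# Table-level (service) SAS -- the kind that does NOT work here.
# It is scoped to one table, and the resulting query string carries
# the table name (the tn= parameter mentioned below).
az storage table generate-sas \
    --account-name MYACCOUNTNAME \
    --name log \
    --permissions raud \
    --expiry 2030-01-01
```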

I would like to be able to limit the token and control it using Access Policies or even use a token generated at the Table level and not the storage account level.

My guess is that this issue is related to how the URLs are built in the transport. Working SAS tokens (those generated at the storage account level) don't have the name of the table in them, whereas those generated at the Table level carry an additional query string parameter (tn=log).
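To illustrate the difference (this is just a sketch, not the transport's actual code): the two token types can be told apart from the query string alone, since a table-level SAS carries tn= while an account-level SAS carries ss= and srt= instead.

```javascript
// Sketch: classify a SAS token by inspecting its query string parameters.
// Parameter names come from the Azure SAS format; sasScope is a hypothetical
// helper, not part of winston-azure-table-storage.
function sasScope(sas) {
    const params = new URLSearchParams(sas.replace(/^\?/, ''));
    if (params.has('tn')) {
        // Table (service) SAS: scoped to a single table named by tn=.
        return { scope: 'table', table: params.get('tn') };
    }
    if (params.has('ss') && params.has('srt')) {
        // Account SAS: ss= lists the signed services, srt= the resource types.
        return { scope: 'account' };
    }
    return { scope: 'unknown' };
}

// Account SAS (the kind that currently works with the transport):
console.log(sasScope('?sv=2020-08-04&ss=t&srt=sco&sp=rwdlacu&sig=XXX'));
// Table SAS (the kind that currently fails):
console.log(sasScope('?sv=2020-08-04&tn=log&sp=raud&sig=XXX'));
```

If the transport's URL builder only handles the account-SAS shape, a tn=-style token would plausibly produce exactly the kind of malformed request seen above.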

Hopefully this is an easy fix and something worth implementing!