Closed igilham closed 1 year ago
This happens as data.Body is now of type Readable | ReadableStream | Blob:
https://github.com/aws/aws-sdk-js-v3/blob/25cb359e69966c549eb505956c2aeee809819245/clients/client-s3/models/models_0.ts#L6560
For your specific example, you can write a streamToString function to convert ReadableStream to a string.
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

(async () => {
  const region = "us-west-2";
  const client = new S3Client({ region });

  const streamToString = (stream) =>
    new Promise((resolve, reject) => {
      const chunks = [];
      stream.on("data", (chunk) => chunks.push(chunk));
      stream.on("error", reject);
      stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
    });

  const command = new GetObjectCommand({
    Bucket: "test-aws-sdk-js-1877",
    Key: "readme.txt",
  });
  const { Body } = await client.send(command);
  const bodyContents = await streamToString(Body);
  console.log(bodyContents);
})();
@igilham Does this resolve your query?
Thanks, @trivikr. This works in my application but raises a few concerns about the library that are worth sharing:
- My IDE can't tell me what the type of response.Body is. It tells me that it's any. Perhaps the library configuration could be improved to export the correct type information.
- I can't find an export of ReadableStream and Blob, so it appears to be impossible to make this code type-safe.
For reference, I've rewritten streamToString with the missing types added back in to comply with my team's linter settings.
import { Readable } from 'stream';

// Apparently the stream parameter should be of type Readable | ReadableStream | Blob.
// The latter 2 don't seem to exist anywhere.
async function streamToString(stream: Readable): Promise<string> {
  return await new Promise((resolve, reject) => {
    const chunks: Uint8Array[] = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf-8')));
  });
}
There is no documentation for clients, and the GetObjectCommand is not documented in the user guide or sample code. The project Readme file implies I could expect the same behaviour as SDK v2.
Documentation for the getObject operation lists that GetObjectOutput.Body is Readable | ReadableStream | Blob.
API Reference: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/s3.html#getobject
My IDE can't tell me what the type of response.Body is. It tells me that it's any. Perhaps the library configuration could be improved to export the correct type information.
I'm using Visual Studio Code, and it shows the type of response.Body as internal.Readable | ReadableStream<any> | Blob on hover.
Please create a new issue with details of your IDE and code if problem persists.
As this code is run on Node.js, you can pass Body as Readable as follows:
const bodyContents = await streamToString(Body as Readable);
Thanks for following up.
I didn't realise the methods and types were documented. I took the description on the client landing page (go to the README) to mean it was a dead-end. Perhaps improving the wording should be a separate issue.
I can't explain the IDE issue. I'm also on VSCode and it says it's an any. I find the IDE quite unstable though, so maybe it's just me.
I've created documentation update request at https://github.com/aws/aws-sdk-js-v3/issues/1878
Thanks. I think that covers my remaining frustrations. I appreciate that it can take time for documentation elsewhere to catch up when a major version is released.
The code snippet above works in a Node.js environment; in the browser, you would have a ReadableStream instead of a Readable.
Here is my implementation of handling the ReadableStream:
const streamToString = (stream) => {
  return new Promise((resolve, reject) => {
    if (!(stream instanceof ReadableStream)) {
      // Bail out early; without the `return`, execution would fall through to getReader()
      return reject(
        new Error(
          "Expected stream to be instance of ReadableStream, but got " +
            typeof stream
        )
      );
    }
    let text = "";
    const decoder = new TextDecoder("utf-8");
    const reader = stream.getReader();
    const processRead = ({ done, value }) => {
      if (done) {
        // Flush the decoder and resolve with the accumulated text
        text += decoder.decode();
        resolve(text);
        return;
      }
      // stream: true keeps multi-byte characters intact across chunk boundaries
      text += decoder.decode(value, { stream: true });
      // Not done, keep reading
      reader.read().then(processRead, reject);
    };
    // Start the read loop
    reader.read().then(processRead, reject);
  });
};
@trivikr Thanks for that link to the docs! I didn't even know they existed till just now.
I also wasted lots of time on GetObject and the trifecta of its types. Also, the fact that ReadableStream | Blob is browser-only, and Readable Node-only, made it extremely annoying :)
The streamToString solution posted above works for Node. For the browser, I found that using the Response object from fetch seems a shorter solution:
new Response(response!.body, {});
This will return a Response object, which will then allow us to use any of the helper methods it has to convert to String, Buffer, JSON, etc. See more at https://developer.mozilla.org/en-US/docs/Web/API/Response#methods.
Full example:
const s3 = new S3({
  region: "us-east-1",
  credentials: {
    accessKeyId: "replace-it",
    secretAccessKey: "replace-it",
  },
});
const resp = await s3.getObject({
  Bucket: "your-bucket",
  Key: "your-object-key",
});
console.info(await new Response(resp.Body, {}).text());
It's quite unfortunate that everybody has to go through these hoops to get the content out of the response though. Especially considering that we have to do type checking with things like if (resp.Body instanceof Readable), or declare special interfaces to avoid differences between browser/Node.
The use of Response looks like the neatest solution right now, for JSON and text payloads.
I've been running into these pain points as well, including Lambda invocation. The payload returned is now a Uint8Array, so it takes a few hoops to get it into a usable format:
const payload = JSON.parse(Buffer.from(data.Payload).toString());
Whereas in the previous JS SDK, it was simply:
const payload = JSON.parse(data.Payload);
I don't understand this new direction with the SDK. I can't say I'm a fan. Maybe @trivikr can weigh in.
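For what it's worth, the Uint8Array-to-JSON hop above can also be done with the standard TextDecoder, avoiding Buffer entirely (a sketch; decodePayload is a made-up helper name, and data.Payload is assumed to be a Uint8Array as described above):

```javascript
// Sketch: decode a Uint8Array payload (e.g. Lambda's data.Payload) into JSON.
// TextDecoder is a global in Node.js 11+ and in all modern browsers.
function decodePayload(payload) {
  return JSON.parse(new TextDecoder("utf-8").decode(payload));
}
```

Usage would then look like const payload = decodePayload(data.Payload);.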
Reopening as a lot of customers have raised questions. Tagging @AllanZhengYP for comment.
export interface GetObjectOutput {
  /**
   * <p>Object data.</p>
   */
  Body?: Readable | ReadableStream | Blob;
  // ... snip
}
This is throwing an error in a NodeJS app, because TS config does not load DOM libs.
This results in the Body being set to any.
I'm also very confused about how to read S3 Body responses with SDK v3. The SDK documentation for GetObjectCommand does not describe how to do it, and the SDK examples are also missing it (https://github.com/awsdocs/aws-doc-sdk-examples/issues/1677).
I would ask the AWS SDK team to include in the SDK a simple way to read S3 Body responses. We don't want to re-implement complicated event handlers and helper functions for this simple purpose every time we use GetObject in a project.
In v2 we could just say something like JSON.parse(response.Body?.toString()). Please make it as simple in v3. Stream-based processing is also important, but it should only be an alternative to the simple case of parsing small JSON objects.
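A minimal sketch of the kind of one-call JSON read being asked for, assuming a Node.js Readable Body (readBodyAsJson is a hypothetical name, not an SDK API):

```javascript
// Buffer a Node.js Readable via async iteration, then parse it as JSON.
async function readBodyAsJson(body) {
  const chunks = [];
  for await (const chunk of body) {
    // Chunks may be Buffers or strings depending on the stream's encoding
    chunks.push(typeof chunk === "string" ? Buffer.from(chunk) : chunk);
  }
  return JSON.parse(Buffer.concat(chunks).toString("utf8"));
}
```

A call site would then read const data = await readBodyAsJson(response.Body);.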
For reference, I was able to do this in Node.js by utilizing node-fetch. I would like something like this be included in AWS SDK.
npm install node-fetch
npm install --save-dev @types/node-fetch
import { Response } from 'node-fetch'
const response = new Response(s3Response.Body)
const data = await response.json()
A one-line alternative is to use get-stream package, as posted here: https://github.com/aws/aws-sdk-js-v3/issues/1096#issuecomment-616743375
I understand the reason for returning a ReadableStream, but a built-in helper method would be nice. Reading the whole body into string in memory is probably good for 99% of cases.
If some helper method were part of the SDK, we could just call it as readStream(response.Body) and everyone would be happy, not having to add another dependency or 10 lines of boilerplate code to every new project.
I've been running into these pain points as well, including Lambda invocation. The payload returned is now a Uint8Array, so it takes a few hoops to get it into a usable format:
const payload = JSON.parse(Buffer.from(data.Payload).toString());
Whereas in the previous JS SDK, it was simply:
const payload = JSON.parse(data.Payload);
I don't understand this new direction with the SDK. I can't say I'm a fan. Maybe @trivikr can weigh in.
This was by far the easiest solution with the least overhead. Worked for me at least! Much appreciated!
@sinsys where do you get data.Payload from? I do not see that in the GetObject model. I only see data.Body.
@porteron That was my code snippet - apologies for the confusion. In my comment, I was calling out Lambda, which uses data.Payload to return a response.
@ffxsam I attempted your approach using Buffer.from, but for some reason data.Body is not being recognized as a Buffer. It's just an instance of IncomingMessage.
const payload = JSON.parse(Buffer.from(data.Body).toString());
Leads to the following error:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received an instance of IncomingMessage
Some others are having this same issue, e.g. https://github.com/aws/aws-sdk-js/issues/3649
Going to try out this approach: https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-764404713
@porteron This might help too. This is what I use in my code:
const data = await S3Object.s3.getObject({
  Bucket: this.bucket,
  Key: this.key,
});
return assembleStream(data.Body);
import { Readable } from 'stream';

// Options for assembleStream; `string: true` decodes the buffer as UTF-8.
interface AssembleStreamOptions {
  string?: boolean;
}

export async function assembleStream(
  stream: Readable,
  options: AssembleStreamOptions = {}
): Promise<string | Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Uint8Array[] = [];
    stream.on('data', chunk => chunks.push(chunk));
    stream.on('error', reject);
    stream.on('end', () => {
      const result = Buffer.concat(chunks);
      resolve(options.string ? result.toString('utf-8') : result);
    });
  });
}
If you find a method that works that's shorter than mine, let me know!
If you're ok with using a (very popular) 3rd party library, this is shorter - https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-799697205
@ohana54 Ahhh yes, thank you for that!
Hard to understand why getting a string from a simple storage service is unnecessarily complicated.
@HerrSchwarz I imagine it was for flexibility, so the dev can decide if they want to stream data into another process or just grab a string directly. This saves having to come up with multiple API functions (getObject, getStream).
But I agree, it is a bit of a pain. There might've been a simpler way to accomplish this and have better DX (developer experience).
I encountered this issue too, in a Node.js application, since TypeScript won't compile due to types not being found (ReadableStream and Blob here, but also File from the DynamoDB native attribute types), complaining with error TS2304: Cannot find name 'ReadableStream'. What should be the workaround/definitive fix for a Node.js-only application?
I can solve the compiling issue by adding "DOM" to my lib compiler options, but I would rather not if possible.
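For reference, the workaround being described would look roughly like this in tsconfig.json (a sketch; the surrounding options are placeholders for your own config):

```json
{
  "compilerOptions": {
    "target": "ES2019",
    "lib": ["ES2019", "DOM"],
    "strict": true
  }
}
```

Another commonly used escape hatch is "skipLibCheck": true, which stops the compiler from type-checking the SDK's .d.ts files and so avoids pulling in the DOM lib, at the cost of less checking of all declaration files.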
@kristiannotari rather than adding "DOM" to tsconfig, you should use one of the solutions suggested above - they require additional code, but work in Node+TS. You can use the assembleStream function from @ffxsam's comment or the get-stream package I posted.
Thanks, but that's not the issue I'm facing. I correctly fixed my S3 output Body stream retrieval using get-stream to convert it to a Buffer. The problem I have now is that I have the TypeScript compiler set to check definition files for libraries. This creates the errors I cited above because, rightly so, it doesn't find the ReadableStream and Blob types, among others (see File from the DynamoDB native attribute types). I don't know how to manage a scenario like this one, where a library supports multiple environments but I only need type definitions for the Node.js env. Obviously, this was not an issue with AWS SDK v2.
I just noticed that a lot of the comments here are also covered in the v3 documentation, see section "Getting a file from an Amazon S3 bucket".
That is awesome @berenddeboer, it has been added recently to the documentation. For me, the biggest problem is that I can't be sure my own ad-hoc stream implementations are 100% correct and error-resilient in every situation. When AWS provides the implementation, I will trust it.
A bit late to the party, but I just cannot hold it:
Guys, when you designed this API - did you really try it yourself? I understand why it was improved the way it was, but this improvement shouldn't come at the cost of practicality. Like for real, do you think it is ok to write this every time I simply need to read an object into memory:
const streamToString = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", (chunk) => chunks.push(chunk));
    stream.on("error", reject);
    stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
  });
Should I now just memorize it? Keep it in my personal list of handy AWS snippets? Add an entire 3rd party dependency that does it in one line?
I bet this is how the API design session goes: -- Hey Dave, we're doing a new v3 API and it's really not a simple thing to read a file, looks like 99.999999% of our users will suffer from that. Whaddaya think, should we improve it? -- Nah.
Like seriously, the most complex and obscure thing in the API of the file storage... is the file read itself. Come on, guys.
@berenddeboer you are right, it was added to the documentation, but the documentation itself perfectly matches the poor attention to detail this whole issue is about. It is poorly indented (not a problem in itself) and not behaving as one would expect, as there's a return statement that might work for the guy responsible for the unit tests (and apparently also for the documentation), but not for the other poor lost guy who has to face the complex use case of reading a text file and after some time may overlook a return data; // For unit tests.
so, @dzmitry-kankalovich feel free to add to the script -- Ok, so whaddaya think, should we improve its documentation? -- Nah...
maybe it's just our fault to expect a usable SDK for the biggest object storage product from the biggest cloud provider
@dzmitry-kankalovich Amazon is well-known to not have a solid focus on DX. Often times they build tools that work, but are not great experiences to work with.
IMO, every internal team at Amazon should be looking at Amplify as a model of how a library or SDK should be designed and documented.
Jokes aside, I do want to mention that yes, documentation is poor.
Google still returns many links pointing to the v2 docs (good luck spotting that you're actually reading docs for v2 instead of v3), not to mention that finding the example here took me an eternity, since it's somewhere in the middle of the page, lost between less frequently used stuff.
Where it really should be is in the JS docs for GetObjectCommand or S3 Client, so I don't need to google such a basic thing, but rather simply look at the source code.
Does anyone know in what circumstances Body will be a Blob or a ReadableStream (for a browser env)?
My code is currently returning ReadableStreams, but it makes me wonder what APIs or configs result in blobs being returned.
File inherits from Blob.
<!-- Pardon the Vue syntax here, but you get the idea -->
<input type="file" @change="choseFile">
and then:
choseFile({ target }) {
  const file = target.files[0];
  console.log(file instanceof Blob); // true
}
This is true in Chrome, Firefox, and Safari (14.x) and is native—not dependent on any type of config or framework, as far as I know.
Oh yes, I use Blobs a lot in browser. I'm just wondering in what circumstances the AWS SDK will return a Blob rather than a ReadableStream since it has typings for both. I'm guessing it might be if the browser doesn't support ReadableStreams...
Turns out the body (ReadableStream) property of fetch responses was implemented slightly later than the other methods (like blob(), text(), etc.). So it looks like AWS is trying to support a few older browsers by falling back on Blob when body/ReadableStream isn't available.
So if you want to support those older browsers, you need to be prepared to handle both a Blob and a ReadableStream in the browser.
This kind of thing should really be in the docs...
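Putting the two browser cases together, a defensive helper might look like this (a sketch; bodyToString is a made-up name, and it assumes a runtime where Response is available, as in browsers and Node.js 18+):

```javascript
// Normalize a browser-side GetObject Body (Blob or ReadableStream) to a string.
async function bodyToString(body) {
  if (typeof Blob !== "undefined" && body instanceof Blob) {
    // Older-browser fallback path: Blob has a built-in text() helper
    return body.text();
  }
  // ReadableStream path: wrap it in a Response to reuse its text() helper
  return new Response(body).text();
}
```

This covers both shapes the SDK may hand back in a browser without the caller having to care which one arrived.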
I just noticed that a lot of the comments here are also covered in the v3 documentation, see section "Getting a file from an Amazon S3 bucket".
It boggles my mind that I had to read this open issue to find what I was looking for: how to read objects in v3. Really, this is what everyone would expect to find in the information hierarchy:
There's no mention of the word "object" in this entire list. This should not be just a section many pages below a headline.
For anyone that got stuck trying to download the S3 item to a local file (I'm doing so in Lambda), you can reference this code; it utilizes Node.js Readable streams.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { Readable } from "stream";
import { finished } from "stream/promises";
import fs from "fs";
import path from "path";

const command = new GetObjectCommand(params);
const object = await s3Client.send(command);
const { ContentType, Body } = object;
const body = Body as Readable;

const tempFileName = path.join("/tmp", "downloadedimage." + tmpFileSuffix);
const tempFile = fs.createWriteStream(tempFileName);
body.pipe(tempFile);
// Wait for the write to complete before using the file (important in Lambda)
await finished(tempFile);
Be sure to replace tmpFileSuffix with the correct file suffix.
Has anyone gotten this to work with binary data in a React Native environment? I get a Blob from the SDK, but I'm having a doozy of a time getting that blob saved on disk.
The response method above https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-776187712 works great for text, but doesn't seem to do the trick for binary data.
RN Blobs don't implement stream https://github.com/facebook/react-native/blob/main/Libraries/Blob/Blob.js and the rn-fetch-blob library the open source project I'm using expects a stream. (I'm new to modern JS and just wanna help fix this bug 😞)
I've tried my hand at a few different methods of taking the blob to a stream but all seem to not work on React Native...
Hey Dave, we're doing a new v3 API and it's really not a simple thing to read a file, looks like 99.999999% of our users will suffer from that. Whaddaya think, should we improve it?
My thoughts exactly. I am easily in that % above.
This definitely looks like it would simply take 2-3 days dev time to implement an AWS official package to define a specific async class to provide the data stream conversion for all supported libraries under a common model. Nothing else to add beyond @igilham 's posts. Concerning, and not a good sign that this wasn't even considered when creating the API and user guide. Consider this a nonconstructive bump.
Yeah, it's really quite unbelievable this hasn't been fixed yet. Creates the impression that nobody at AWS cares. Unfortunately, that impression is also in line with the SDK documentation at https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/. The documentation is uncomfortable to browse, with lots of unnecessary duplication, verbosity and link clicking back and forth. Overall, I think something is wrong as developer experience seems to be getting worse.
@dzmitry-kankalovich definitely summed it up correctly, I believe. A high level manager that doesn't actually care about usability is in charge of the documentation specs as well as things like reading an object to memory (pretty sure it's all about total lines of docs, not whether or not they matter?). No doubt this SDK is really powerful, and well written (props to the devs)... but reading an object to memory clearly has slipped through the cracks, as well as the entirety of dev UX concerning docs. Been using AWS for years, but this issue (existence, duration, lack of quick response time from AWS team) is pretty jaw dropping - to say the least.
There are 3 clear implementations/packages suggested by the community. Adopt one. Ensure type safety. Patch it. Make hundreds, if not thousands, of other devs' lives easier... that, or have us devs asking our companies to never have to use AWS S3 with Node again, I guess.
Also, as far as documentation goes... take a look at the Google/Angular team and take some notes @aws
The stream approach didn't work for me. This is how I ended up getting a JSON file from the S3 bucket:
export const getFileData = async (Bucket: string, Key: string) => {
  const decoder = new TextDecoder()
  const client = new S3Client({ region: 'us-west-3' })
  const command = new GetObjectCommand({ Bucket, Key })
  const response = await client?.send(command)
  // @ts-ignore - thanks aws
  const reader = response.Body.getReader()
  // Read every chunk; a single read() only works for bodies that fit one chunk
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    text += decoder.decode(value, { stream: true })
  }
  return JSON.parse(text)
}
i think you can now use s3.GetObject to return the result as a string: https://newbedev.com/how-to-get-response-from-s3-getobject-in-node-js
@dike321 the article you linked refers to AWS JS SDK v2, not v3.
Throwing into the pot here, I lost 4 hours of time trying to track this down.
@igilham has it right in his earlier message.
It's still not noted in the docs that the latter two shapes are from DOM and only apply in browsers. I agree it's frustrating that helper methods for common scenarios aren't included with the client.
We've run into similar issues so often we've taken to wrapping all SDK clients in our own classes to extend their operations to handle situations like this. S3 shouldn't be one of the services we should have to do this for. With such a large library of offerings, I know its hard to keep up with everything, but S3 is hands down one of the most used resources Amazon has available, so it should be receiving the most attention when it comes to DX. Bad experiences on the most common use cases definitely sour the impression of your products, and lowers the likelihood of developer evangelization.
Here's to hoping 2021 closes out with a packaged implementation for this scenario.
Has anyone gotten this to work with binary data a react native environment? I get a Blob from the SDK, but I'm having a doozy of a time to get that blob saved on disk.
I heard you want to save result of getObject to disk? Who would want to do that anyway? Sounds crazy. Anyway, here is how I managed to do it.
import { S3 } from '@aws-sdk/client-s3';
import fs from 'fs';
import stream from 'stream';

const params = {
  Bucket: bucketName,
  Key: filePath,
};
const client = new S3({
  region: 'us-east-1',
});

const s3Object = await client.getObject(params);

// if, to make node typescript happier
if (s3Object.Body && s3Object.Body instanceof stream.Readable) {
  const writeStream = fs.createWriteStream('/tmp/file.txt');
  s3Object.Body.pipe(writeStream);
  // magic
}
Oh hmm i see it was kind of mentioned here https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-886308913
Describe the bug
I'm using the GetObjectCommand with an S3Client to pull a file down from S3. In v2 of the SDK I can write response.Body.toString('utf-8') to turn the response into a string. In v3 of the SDK response.Body is a complex object that does not seem to expose the result of reading from the socket. It's not clear if the SDK's current behaviour is intentional, but the change in behaviour since v2 is significant and undocumented.
SDK version number 3.1.0
Is the issue in the browser/Node.js/ReactNative? Node.js
Details of the browser/Node.js/ReactNative version
v12.18.0
To Reproduce (observed behavior)
Expected behavior It should print the text of the file.
Additional context
data.Body is a complex object with circular references. Object.keys(data.Body) returns the following: