What
When the Accept header of the outgoing request is either "text/plain" or "application/json", set the response type of the underlying XHR request to "text".
Why
This polyfill sets the responseType of the underlying XMLHttpRequest to "blob" if the global context supports it: https://github.com/JakeChampion/fetch/blob/main/fetch.js#L595
In turn, the call to fetchResponse.json(); does the following:
```js
function readBlobAsText(blob) {
  var reader = new FileReader()
  var promise = fileReaderReady(reader)
  var match = /charset=([A-Za-z0-9_-]+)/.exec(blob.type)
  var encoding = match ? match[1] : 'utf-8'
  reader.readAsText(blob, encoding)
  return promise
}
```
That is, it takes the response body (which is already in JSON format), constructs a FileReader, runs a regex match against the blob's type to find the encoding, and then reads the blob as text to finally return the string.
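To make that overhead concrete, the chain behind fetchResponse.json() on the blob path looks roughly like the sketch below. This is a simplified illustration rather than the polyfill's exact code; fileReaderReady is paraphrased, and jsonFromBlob/jsonFromText are hypothetical names for the two paths.

```js
// Promise wrapper around the FileReader events (approximately what fileReaderReady does).
function fileReaderReady(reader) {
  return new Promise(function (resolve, reject) {
    reader.onload = function () {
      resolve(reader.result)
    }
    reader.onerror = function () {
      reject(reader.error)
    }
  })
}

// Blob path: blob -> FileReader -> string -> JSON.parse (current behaviour).
function jsonFromBlob(blob) {
  return readBlobAsText(blob).then(JSON.parse)
}

// Text path: the XHR already returned a string, so parsing is all that's left (proposed behaviour).
function jsonFromText(text) {
  return Promise.resolve(text).then(JSON.parse)
}
```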
That’s a tonne of overhead that could be avoided by just setting the responseType of the XMLHttpRequest to "text" when we expect text to be returned in the response.
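As a rough sketch of what that could look like (the helper name is hypothetical, and the exact integration point inside fetch.js is an assumption; request.headers.get and support.blob follow the polyfill's existing conventions):

```js
// Hypothetical helper: choose the XHR responseType from the request's Accept header.
function chooseResponseType(acceptHeader, blobSupported) {
  if (acceptHeader === 'text/plain' || acceptHeader === 'application/json') {
    // A plain string response lets .json()/.text() skip the Blob + FileReader round trip.
    return 'text'
  }
  // Otherwise keep the current behaviour and use "blob" where supported.
  return blobSupported ? 'blob' : ''
}

// e.g. inside the polyfill's fetch(), before xhr.send():
// xhr.responseType = chooseResponseType(request.headers.get('Accept'), support.blob)
```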
Performance improvement
Background
This polyfill is used to provide the fetch API for React Native, here.
At Relive, our app is built using React Native. We came across this when looking at using the resourceEventMapper provided by Datadog, where the resulting network request is exposed as the underlying XMLHttpRequest object with its response type set to "blob".
Experiment
We ran an experiment to compare the impact of this overhead.
Definitions
"test" refers to the cohort of users that received the modified behaviour of this fetch polyfil, whereby the response type of the underlying XHR request is set to "text" as described above
"control" refers to the cohort of users that received the unmodified behaviour of this fetch polyfil, where the response type is always "blob"
the time measured in the results are in milliseconds, and measures the time it takes to make the request and convert it to JSON (end - start):
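For illustration only, that measurement could be taken with a wrapper along these lines (a hypothetical sketch, not the actual instrumentation used in the experiment):

```js
// Hypothetical timing wrapper: measures request + JSON conversion in milliseconds.
async function timedFetchJson(url, options) {
  const start = Date.now()
  const response = await fetch(url, options)
  const json = await response.json()
  const elapsed = Date.now() - start // the "end - start" value recorded per request
  return { json, elapsed }
}
```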
Results
With about 29k users in each of the control and test cohorts, and each cohort making about 3.1M requests, we see the following:
NB: at p95 and above we start seeing a lot of noise due to network errors, so the most interesting bits are p90 and downward.
| (ms) | avg | p25 | p50 | p75 | p90 | p95 | p99 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| test | 1369.85 | 318 | 507 | 865 | 1744 | 4132 | 21969.17 |
| control | 1426.37 | 328 | 528 | 909 | 1836 | 4318 | 23086.31 |
| % | 3.96% | 3.05% | 3.98% | 4.84% | 5.01% | 4.31% | 4.84% |
That is, on average, a 4% performance improvement, and a 5% improvement at p90.
Locally
Since the change proposed in the PR really should have the biggest effect on the .json() call, I was also able to confirm a performance improvement locally by creating some performance tests based on the existing test suite here. A rough sketch of the kind of comparison involved follows the table below.
Again, times are in ms.
| (ms) | blob | text | % |
| --- | --- | --- | --- |
| small | 0.5455999999 | 0.4628999991 | 15.16% |
| medium | 0.6596000004 | 0.4721000018 | 28.43% |
| large | 4.524300001 | 4.141100001 | 8.47% |
NB: "small" is a JSON object w/ 10 fields, "medium" has 100 fields and "large" has 10000 fields.
Open questions
- [ ] Since we're only using this polyfill in React Native, we could test it there. But where else is this polyfill being widely used, and is this proposed change safe for those uses?
- [ ] Is there a more elegant way to set the response type other than looking at the Accept header?
- [ ] XMLHttpRequest also supports json as a response type. Perhaps it's worth considering setting the response type to json and delegating the JSON parsing to XMLHttpRequest (see the sketch after this list). Could this lead to even further performance improvements?
- [ ] What happens as the response object gets even bigger than the "large" version above?
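On the responseType "json" question, delegating parsing to the XHR implementation would look roughly like this plain-XMLHttpRequest sketch (illustrative only, not a change to the polyfill itself):

```js
// With responseType "json", the XHR implementation parses the body itself and
// xhr.response is already a JavaScript object (or null if parsing fails).
function xhrJson(url) {
  return new Promise(function (resolve, reject) {
    var xhr = new XMLHttpRequest()
    xhr.open('GET', url)
    xhr.responseType = 'json'
    xhr.onload = function () {
      resolve(xhr.response)
    }
    xhr.onerror = function () {
      reject(new TypeError('Network request failed'))
    }
    xhr.send()
  })
}
```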
Tests
All passing ✅