cypress-io / cypress

Fast, easy and reliable testing for anything that runs in a browser.
https://cypress.io
MIT License
47.02k stars · 3.18k forks

Too large read data error in Cypress during multiple test case execution #30186

Open Nilesh10101998 opened 2 months ago

Nilesh10101998 commented 2 months ago

Current behavior

Description: Hello Cypress Support,

I am encountering an issue when running multiple test cases in both headless and headed modes using Cypress Test Runner. During execution, the tests fail with the following error:

```
Too large read data is pending: capacity=104857600, max_buffer_size=104857600, read=104857600
```

This error seems to indicate that Cypress is hitting a buffer size limit. I have attempted to resolve this issue by upgrading to various versions of Cypress, including the latest version (13.14.1), but the problem persists. The issue occurs consistently when running multiple tests, and I suspect it may be related to resource management or limitations in how Cypress handles data buffering during test execution. I would appreciate any guidance or suggestions you can provide to help resolve this issue.
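For reference, the three values in the error message are identical and work out to exactly 100 MiB, meaning the read buffer was completely full when the error fired. A quick arithmetic check in plain Node (not part of any Cypress API):

```javascript
// The capacity / max_buffer_size / read values from the error message.
const capacity = 104857600;

// 104857600 bytes is exactly 100 MiB (100 * 1024 * 1024),
// i.e. the buffer hit its limit rather than an arbitrary value.
console.log(capacity === 100 * 1024 * 1024); // true
console.log(`${capacity / (1024 * 1024)} MiB`); // "100 MiB"
```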

If you need any additional information, such as configuration files or logs, please let me know. Thank you for your assistance.

Desired behavior

The user should be able to run multiple test cases in both headless and headed modes using the Cypress Test Runner.

Test code to reproduce

Cypress Terminal issue

Cypress Version

13.14.1

Node version

22.8.0

Operating System

Windows 10 (version 1809, build 17763.6189)

Debug Logs

No response

Other

No response

jennifer-shehane commented 2 months ago

@Nilesh10101998 We haven't actually seen this error before. I have a few follow-ups:

You're hitting this error from within Chromium specifically: https://source.chromium.org/chromium/chromium/src/+/main:net/server/http_connection.cc;l=41?q=http_connection.cc&ss=chromium%2Fchromium%2Fsrc

vladttt1123 commented 1 month ago

After upgrading from 13.2.0 to 13.15.0, I am also receiving a similar error when uploading a large file of around 80 MB. The error I am facing is:

```
[434:0930/103650.171610:ERROR:http_connection.cc(37)] Too large read data is pending: capacity=104857600, max_buffer_size=104857600, read=104857600
```

Once I hit this error on the pipeline, the job gets stuck. The issue happens both when running locally and on CI, using the node:18 image in both cases.

Could you please advise what could be done to resolve this? The local machine I am using is a MacBook Pro 2020 (Intel).

I have researched it further by trying to narrow down the version where this issue was introduced, and it appears to be 13.6.0.

harrisj commented 1 month ago

We have observed this also with a GitHub Action running on ubuntu-latest and Node 20, so unfortunately it does not seem to be linked to a specific operating system.

sami-neara commented 1 month ago

This 100 MiB limit is hardcoded in Chrome, so to bypass it we'd either have to recompile Chrome with a higher limit, or Cypress could potentially chunk larger requests into multiple messages.

In our case, it turned out our interceptor rules were too broad and catching too many requests. After we made sure the larger requests weren't being intercepted, this bug went away.

(In particular, if you're using https://github.com/bahmutov/cypress-network-idle, all GET requests will be intercepted even after cy.waitForNetworkIdle() returns.)

SadiqRahim commented 4 days ago

Yeah, to resolve those persistent messages, we had to revert to v13.5.1. Is there an estimated timeline for when this issue will be resolved?