Automatically exported from code.google.com/p/jquery-csv

Chrome crashes with "Aw, Snap!" on large CSV files #19


GoogleCodeExporter commented 8 years ago
What steps will reproduce the problem?
1. Try to process a 79 MB CSV file.

What is the expected output? What do you see instead?
Display of the table is expected, but Chrome crashes instead.

What version of the product are you using? On what operating system?
Chrome Version 23.0.1271.64 m // Windows 7 Home Premium 64 bit SP1

Please provide any additional information below.
I tried raising the priority of the browser process for the page to 
Realtime/High, but this didn't solve the issue.

The CSV file is too big to attach, but I uploaded it to Google Drive and 
shared it with evanplaice@gmail.com:
https://docs.google.com/open?id=0B1nPY5NSSvp1OF81TkdiLW11WWc

Original issue reported on code.google.com by alan.lin...@gtempaccount.com on 25 Nov 2012 at 11:19

GoogleCodeExporter commented 8 years ago
P.S. This is in reference to:
http://jquery-csv.googlecode.com/git/examples/file-handling.html

Original comment by alan.lin...@gtempaccount.com on 26 Nov 2012 at 12:14

GoogleCodeExporter commented 8 years ago
That's hilarious...

I assume that you're using Google Chrome.

Without even looking at your code, I'm going to guess that you're hitting a 
memory limit.

Before it even starts processing, the parser needs to load all of the file's 
data into memory.

There are two things you might want to try:

1. Try breaking up the dataset into smaller chunks until you figure out the 
maximum file size that can be processed in one call.

2. Take a look at this link 
(http://www.html5rocks.com/en/tutorials/file/filesystem/). It looks like 
window.requestFileSystem() takes a size parameter that lets you request 
storage space before you read the file; a rough sketch follows below.
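
Here's a minimal sketch of that second suggestion, assuming the 
Chrome-prefixed window.webkitRequestFileSystem of that era (a Chrome-only 
API, not part of jquery-csv); the 100 MB quota figure is an arbitrary 
illustration, comfortably above the 79 MB file:

```js
// Sketch only: request a storage quota up front via the (Chrome-prefixed,
// now-deprecated) FileSystem API. The second argument is the number of
// bytes being requested; 100 MB here is an arbitrary illustrative figure.
window.webkitRequestFileSystem(
  window.TEMPORARY,
  100 * 1024 * 1024,
  function (fs) {
    console.log('Filesystem granted:', fs.name);
    // fs.root.getFile(...) could then be used to stage the CSV data
  },
  function (err) {
    console.error('Quota request failed:', err);
  }
);
```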

I'd play with it myself, but I have other pressing needs ATM. If you make any 
progress, post your results here. Large file support is on the TODO list, but 
it's complicated, especially in the browser where the limitations are much 
more extreme.

Original comment by evanpla...@gmail.com on 26 Nov 2012 at 1:54

GoogleCodeExporter commented 8 years ago
Any progress on this?

Another option would be to slice the file into smaller parts. A sensible 
approach would be to detect the maximum file size, then process the file in 
chunks.

One solution could be to add another state-tracking variable to count the 
number of characters that have been processed, then have the parser pause and 
load more data after a pre-defined portion of the chunk has been consumed 
(e.g. 75%). A rough sketch of the slicing part follows.
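
A minimal sketch of that chunked-read idea, not jquery-csv's actual API; 
readInChunks, CHUNK_SIZE, and parseChunk are hypothetical names used only 
for illustration:

```js
// Read a File in fixed-size slices so the whole 79 MB never has to sit in
// memory at once. Each slice's text is handed to a parse callback before
// the next slice is requested.
var CHUNK_SIZE = 1024 * 1024; // 1 MB per slice (illustrative)

function readInChunks(file, parseChunk, done) {
  var offset = 0;
  var reader = new FileReader();

  reader.onload = function (e) {
    parseChunk(e.target.result); // process this slice's text
    offset += CHUNK_SIZE;
    if (offset < file.size) {
      readNext(); // pull the next slice only after this one is handled
    } else {
      done();
    }
  };

  function readNext() {
    reader.readAsText(file.slice(offset, offset + CHUNK_SIZE));
  }

  readNext();
}
```

One caveat: a byte-offset slice can cut a row (or even a multi-byte 
character) in half, so a real implementation would also need the 
state-tracking described above to carry the incomplete tail of each chunk 
over to the next one.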

This isn't as much a limitation of jquery-csv as it is a limitation of the 
HTML5 File API. It takes a little creativity to handle large files, but it 
should be far from impossible.

The next question I'd ask is: can JavaScript even store 79 MB of data in an 
array to begin with? If not, the whole data-processing chain (i.e. load, 
parse, save) will need to be done in chunks.
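
Back-of-envelope: JavaScript strings are UTF-16, so a 79 MB mostly-ASCII 
file occupies roughly 158 MB once loaded as a single string, and parsing it 
into an array of row arrays adds per-object overhead on top of that. Peak 
memory could plausibly reach several times the file size, which is more 
than enough to take down a single tab.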

Here's some reading material to help:
http://www.html5rocks.com/en/tutorials/file/dndfiles/

Original comment by evanpla...@gmail.com on 25 Jan 2013 at 5:03

GoogleCodeExporter commented 8 years ago
Apologies for the late reply. 
It was for a school project, and it wasn't very important to have it working 
for big files, so I didn't look into it any further. 
If I have more time, probably from June onwards, I might have a look at it 
again.

Original comment by alan.lin...@gtempaccount.com on 28 Jan 2013 at 7:22