s-alad / toofake

View friends' BeReals without posting or them knowing. Post custom BeReals whenever. Add custom realmojis.
https://toofake.lol/
MIT License

Too many memories to download into zip? #124


softgrass commented 2 weeks ago

I'm currently trying to download all my memories (I haven't missed a day in almost two years), but when the site says "Zip will download shortly", nothing downloads. I checked the console and saw this error:

```
memories-e8d40e471d9d4123.js:1 Object
memories-e8d40e471d9d4123.js:1 newmemories
memories-e8d40e471d9d4123.js:1 Array(724)
354-383234f0868aca6d.js:1 Uncaught (in promise) RangeError: Array buffer allocation failed
    at new ArrayBuffer (<anonymous>)
    at new Uint8Array (<anonymous>)
    at 354-383234f0868aca6d.js:1:29429
    at c.<anonymous> (354-383234f0868aca6d.js:1:29598)
    at 354-383234f0868aca6d.js:1:36931
    at 354-383234f0868aca6d.js:1:98298
    at d (354-383234f0868aca6d.js:1:98413)
    at p (354-383234f0868aca6d.js:1:98514)
```

I believe I may have too many memories to download, and the browser doesn't have enough resources to hold the array of data for all of them at once while building the zip. Is there any fix or way around this?

Edit: fixed formatting lol

softgrass commented 2 weeks ago

Getting this error when I run my own instance:


```
Unhandled Runtime Error
Error: Bug : can't construct the Blob.

Call Stack
[32]</a.newBlob
node_modules\jszip\dist\jszip.min.js (13:31867)
l/</</e<
node_modules\jszip\dist\jszip.min.js (13:26703)
l/</<
node_modules\jszip\dist\jszip.min.js (13:26832)
[32]</a.delay/<
node_modules\jszip\dist\jszip.min.js (13:34948)
c/<
node_modules\jszip\dist\jszip.min.js (13:96815)
c
node_modules\jszip\dist\jszip.min.js (13:96932)
d
node_modules\jszip\dist\jszip.min.js (13:96971)
```
s-alad commented 2 weeks ago

Are you running your own instance locally or hosted?

softgrass commented 2 weeks ago

Locally

s-alad commented 2 weeks ago

Just tested and it worked fine, so I'm not sure. Could be some unsupported image format or something like that. I wasn't the one who added the memories dump code, so I don't really know.

Your best bet with a large memories dump would probably be to write a Python script that runs through the auth + dump flow for you, something like the sketch below.
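
A minimal sketch of what such a script could look like, assuming you already have a BeReal access token (e.g. copied from the browser after logging in through toofake) and exported it as `BEREAL_TOKEN`. The `mobile.bereal.com/api/feeds/memories` endpoint and the response fields (`data`, `memoryDay`, `primary.url`, `secondary.url`) are what unofficial clients currently use, not a documented API, so all of those names are assumptions that may break without notice:

```python
# Sketch of a local memories dump. Assumes BEREAL_TOKEN holds a valid access
# token; the endpoint and response fields below are the ones unofficial
# clients use today, not a documented API.
import os

import requests

API = "https://mobile.bereal.com/api/feeds/memories"  # unofficial endpoint
OUT = "memories"

headers = {"authorization": f"Bearer {os.environ['BEREAL_TOKEN']}"}
os.makedirs(OUT, exist_ok=True)

resp = requests.get(API, headers=headers, timeout=30)
resp.raise_for_status()
memories = resp.json().get("data", [])
print(f"found {len(memories)} memories")

for i, m in enumerate(memories):
    day = m.get("memoryDay") or f"memory-{i:04d}"  # e.g. "2023-05-01"
    for side in ("primary", "secondary"):
        url = m.get(side, {}).get("url")
        if not url:
            continue
        path = os.path.join(OUT, f"{day}-{side}.webp")
        if os.path.exists(path):
            continue  # resume-friendly: skip already-downloaded images
        try:
            img = requests.get(url, timeout=30)
            img.raise_for_status()
            with open(path, "wb") as f:
                f.write(img.content)  # only one image in memory at a time
        except requests.RequestException as e:
            print(f"failed {day} {side}: {e}")  # log offenders, keep going
```

Since each image goes straight to disk and only one is held in memory at a time, this sidesteps the array buffer allocation failure entirely; the folder can be zipped afterwards with any archiver.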

softgrass commented 2 weeks ago

I think you're right, because it works perfectly fine when I download the primary and secondary as separate images.

s-alad commented 2 weeks ago

Yeah, unsure. You could try seeing if it fails on a specific image or on all of them.
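
If it helps to narrow that down, a quick probe over the same memories list can tally the content type the CDN reports for each image; any outlier would support the unsupported-format theory. This reuses the same assumed token and unofficial endpoint as the dump sketch above:

```python
# Hypothetical probe: count the content types reported for every memory image
# to check the unsupported-format theory. Same BEREAL_TOKEN and unofficial
# endpoint assumptions as the dump sketch above.
from collections import Counter
import os

import requests

headers = {"authorization": f"Bearer {os.environ['BEREAL_TOKEN']}"}
resp = requests.get("https://mobile.bereal.com/api/feeds/memories",
                    headers=headers, timeout=30)
resp.raise_for_status()

types = Counter()
for m in resp.json().get("data", []):
    for side in ("primary", "secondary"):
        url = m.get(side, {}).get("url")
        if not url:
            continue
        try:
            head = requests.head(url, timeout=30, allow_redirects=True)
            types[head.headers.get("content-type", "unknown")] += 1
        except requests.RequestException:
            types["request failed"] += 1

# Mostly image/webp or image/jpeg is expected; any outlier is a suspect.
print(types)
```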