Open dimaslanjaka opened 2 years ago
Hello, why am I still getting errors?
FATAL {
err: RangeError: Invalid string length
at JSON.stringify (<anonymous>)
at _Model._export (D:\Repositories\gh-pages-old\node_modules\warehouse\lib\model.js:932:17)
at exportAsync (D:\Repositories\gh-pages-old\node_modules\warehouse\lib\database.js:50:44)
} Something's wrong. Maybe you can find the solution here: %s https://hexo.io/docs/troubleshooting.html
FATAL RangeError: Invalid string length
at JSON.stringify (<anonymous>)
at _Model._export (D:\Repositories\gh-pages-old\node_modules\warehouse\lib\model.js:932:17)
at exportAsync (D:\Repositories\gh-pages-old\node_modules\warehouse\lib\database.js:50:44)
I've tried removing all plugins and changing themes.
@dimaslanjaka
It seems to be an out-of-memory exception, and it comes from V8.
https://github.com/nodejs/node-v0.x-archive/issues/14170 https://stackoverflow.com/questions/29175877/json-stringify-throws-rangeerror-invalid-string-length-for-huge-objects
I've tried with this, with the same result:
name: Build
on:
  push:
    branches:
      - compiler # run when this branch is pushed

# cancel previous workflows, run only one workflow
concurrency:
  group: build-${{ github.event.push.number || github.event.pull_request.number || github.ref }}
  #cancel-in-progress: true

jobs:
  build:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 120
    env:
      NODE_OPTIONS: "--max_old_space_size=8192" # 8192 4096 --expose-gc
      GITFLOW: true
    steps:
      - name: Checkout repository and submodules
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: "${{ secrets.GITHUB_TOKEN }}"
      - name: Setup Node.js 16.x
        uses: actions/setup-node@v1
        with:
          node-version: "16.x"
      - ....
I can't increase --max_old_space_size beyond 8192; it's limited,
and my computer only has 8 GB of RAM.
Now I've forked some hexojs functions to create my own generator: https://github.com/dimaslanjaka/dimaslanjaka.github.io/tree/compiler
My generator works for huge numbers of posts; it might be helpful to the hexojs developers for improving the HexoJS package.
HexoJS needs 16 GB of RAM to process 1000+ posts, I guess. I've tried forking the hexojs/site repository and copying in all of my posts: same result (error).
HexoJS is not yet suitable for an article writer like me. I hope I can use HexoJS in the future, so I can create articles without worrying about a lot of coding.
SUGGESTION:
hexo generate tag        # generate only tag archives
hexo generate categories # generate only category archives
hexo generate posts      # generate posts
hexo generate index      # generate the index
hexo generate page       # generate all pages except `source/_posts`
hexo generate sitemap    # generate sitemaps and RSS
Also, having the generator cache all post objects and reuse them on the next run would be a good idea.
Perhaps this issue is not caused by the number of posts, but by the size of the files. Do you have a large article or page?
HexoJS needs 16 GB of RAM to process 1000+ posts, I guess.
I have 1200+ posts and I tried to generate them with NODE_OPTIONS=--max_old_space_size=512. It works.
set NODE_OPTIONS=--max_old_space_size=512 && hexo generate
INFO Validating config
INFO Start processing
INFO Files loaded in 20 s
.....
INFO Generated: 2016/06/24/jenkins-install/1.jpg
INFO Generated: 2018/12/01/update-site-layout/header.gif
INFO Generated: 2019/01/02/create-photo-gallery/gallery.gif
INFO 3918 files generated in 28 s
256 MB fails:
set NODE_OPTIONS=--max_old_space_size=256 && hexo generate
INFO Validating config
INFO Start processing
INFO Files loaded in 19 s
<--- Last few GCs --->
[7404:000001F892D401D0] 30883 ms: Scavenge (reduce) 241.0 (260.3) -> 240.8 (261.0) MB, 1.4 / 0.0 ms (average mu = 0.780, current mu = 0.194) allocation failure
[7404:000001F892D401D0] 31016 ms: Mark-sweep (reduce) 242.5 (261.8) -> 241.0 (261.8) MB, 130.3 / 0.1 ms (average mu = 0.661, current mu = 0.121) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 00007FF68A2CE3EF v8::internal::CodeObjectRegistry::~CodeObjectRegistry+111951
2: 00007FF68A25DA36 v8::internal::WebSnapshotDeserializer::context_count+65446
3: 00007FF68A25E8ED node::OnFatalError+301
I have dynamic posts (such as the legend of Neverland quiz) that are actually generated from user input (based on game updates). I thought maybe it is because of that. Try searching "webmanajemen.com quiz the legend of Neverland" on Google.
I don't know the details, but the full log is here: https://github.com/dimaslanjaka/dimaslanjaka.github.io/runs/5711205131?check_suite_focus=true
hexo generate
gives the same error. The errors are random: sometimes "Invalid string length", sometimes like this screenshot.
For now I've backed up the hexo project at https://github.com/dimaslanjaka/dimaslanjaka.github.io/tree/hexo-compiler
I have confirmed the memory leak myself: hexo doesn't separate the stages into different processes but wraps them all into one process, which makes resource usage (memory/CPU) spike instantly.
I made a project like hexo: https://github.com/dimaslanjaka/dimaslanjaka.github.io/tree/compiler (if I wrap everything into one gulp command, the result is a memory-leak error, but if I run the gulp tasks one by one, the result is perfect even though there are hundreds of large posts/pages).
HEXO NEEDS AN UPDATE
We released hexo 6.2.0 just now. It includes a workaround for this issue.
I found that the external_link filter is consuming a huge amount of memory. You can try to disable it in _config.yml:
external_link:
enable: false
See also #3886
You can use Heap Profiler to inspect memory issues.
This function takes up about 2 GB of memory, and I'm not sure why.
Now I'm using my own tasks and have removed a lot of plugins from the hexo project. I can now fix SEO, anonymize external links, and optimize images automatically with separate gulp tasks. The disadvantage is that the process takes more time, but it works.
Total posts: 1419 (excluding pages). My PC's RAM: 8 GB. Hard disk type: HDD. Processor engine: https://github.com/dimaslanjaka/static-blog-generator-hexo/tree/master/packages/gulp-sbg Deployed to: https://github.com/dimaslanjaka/dimaslanjaka.github.io -> https://www.webmanajemen.com/
@dimaslanjaka How many tags and categories are there for your 1400+ posts? I have also found that Hexo takes a lot of time querying posts and tags, because the database query algorithm is not optimized.
Hmm, around 100+. You can check it in the sidebar at https://webmanajemen.com
which has counters for posts, tags, and categories.
This blog was imported from Blogger using my plugin hexo-blogger-xml,
so old tags and "uncategorized" still exist.
The hexo project is at: https://github.com/dimaslanjaka/static-blog-generator-hexo
Check List
Please check the following before submitting a new issue.
(run hexo version to check)
I don't know why this error occurs, even though yesterday it was okay.
hexo generate
Environment