sulthonzh closed this issue 5 years ago.
Logparser parses Scrapy log files periodically and incrementally,
so there's no need to worry about the size of your log file.
The screenshot shows that a MemoryError occurred when saving the parsed result.
Use the top command to check the free RAM on your server instead.
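If you prefer to check from Python rather than an interactive top session, here is a minimal sketch, assuming a Linux server where /proc/meminfo is available:

# Rough free-RAM check on Linux, as an alternative to running `top`.
with open("/proc/meminfo") as f:
    meminfo = dict(line.split(":", 1) for line in f)
print("MemAvailable:", meminfo["MemAvailable"].strip())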
https://github.com/my8100/logparser/blob/c2cff79af45fde303d9a8a254cf3f2cc727a626d/logparser/settings.py#L70-L72
Add self.logger.error(sys.getsizeof(data))
before the line below
to log the size of the data object in bytes:
https://github.com/my8100/logparser/blob/c2cff79af45fde303d9a8a254cf3f2cc727a626d/logparser/logparser.py#L464
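For reference, a minimal standalone sketch of what that debug line reports; the data dict here is just a stand-in for the real parsed result:

import logging
import sys

logging.basicConfig()
logger = logging.getLogger(__name__)

data = {"status": "ok", "pages": 123}  # stand-in for the parsed result
# Note: sys.getsizeof() reports only the shallow size of the object in
# bytes; it does not include the sizes of nested values.
logger.error(sys.getsizeof(data))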
I just created a PR for this: https://github.com/my8100/logparser/pull/6
Could you share the free RAM and the data size?
File size:
Free RAM:
The key factor is the size of the parsed data (the JSON file) produced when parsing the log file, not the size of the log file itself.
Could you post the content of the settings.py file?
This is my settings.py; I'm using the default settings.py.
Marked as wontfix since it's not a common case. See https://github.com/my8100/logparser/pull/6#issuecomment-503409528 for more info.
Added the LOG_CATEGORIES_LIMIT option in https://github.com/my8100/logparser/commit/349613ef7885bcfeaa94235f527e51f3be0db7dc:
# Keep only the last N logs of each item (e.g. critical_logs) in log_categories.
# The default is 10, set it to 0 to keep all.
LOG_CATEGORIES_LIMIT = 10
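To illustrate the intended effect (the helper below is hypothetical, not logparser's actual code): with log_categories mapping each category name to a list of log entries, the option keeps only the last N entries per list.

LOG_CATEGORIES_LIMIT = 10

def trim_log_categories(log_categories, limit=LOG_CATEGORIES_LIMIT):
    # A limit of 0 means keep all entries.
    if limit <= 0:
        return log_categories
    return {name: logs[-limit:] for name, logs in log_categories.items()}

trimmed = trim_log_categories({"critical_logs": list(range(25))})
print(len(trimmed["critical_logs"]))  # -> 10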
@sulthonzh
Run pip install -U git+https://github.com/my8100/logparser.git to upgrade, set the LOG_CATEGORIES_LIMIT option in the config file, then run logparser --delete_json_files.
I very often get the following error: