Open adanribeiro opened 3 years ago
Could it be that the bkp.sql file has only a single line?
Sorry. By "single line" do you mean "just one large, unbroken line"? No idea. Unfortunately, I have no control over how the file is created; it's sent to me through a client's third-party software. When I run glogg it hangs for a while at line 240. It seems to be a long string, but the fact that there is a line 240 may tell us something.
@GeadSolutions could you run `wc -L bkp.sql`?
This command should print the length of the longest line in the file bkp.sql.
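As a hedged illustration (file name invented, not from the thread), this is how `wc -L` behaves on a small file — it reports the longest line's length in characters, not the file size. Note that `-L` is a GNU coreutils extension, not POSIX:

```shell
# Hypothetical demo file: two lines of different lengths.
printf 'short\nmuchlongerline\n' > demo.txt
# GNU wc -L prints the length of the longest line ("muchlongerline" = 14 chars).
wc -L demo.txt    # prints "14 demo.txt"
```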
glogg uses the QByteArray/QString/QRegularExpression classes from the Qt library. These classes can't handle huge lines. Changing this would require major refactoring, as glogg currently needs to keep the whole line in memory to do search pattern matching.
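Given that constraint, one possible workaround — my suggestion, not something proposed in the thread — is to pre-split overlong lines with GNU `fold` before opening the file. The trade-off is that line numbering changes and a match can be cut in two at a fold boundary:

```shell
# Sketch with a stand-in file (the real file in the thread is bkp.sql).
# Create one 10000-character line, then wrap it at 4000 characters so no
# single line has to be held whole in memory by the viewer.
head -c 10000 /dev/zero | tr '\0' 'a' > long.sql
fold -w 4000 long.sql > folded.sql
wc -L folded.sql    # longest line is now 4000
```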
in: `wc -L bkp.sql`
out: `1043925 bkp.sql`
It is definitely not a huge length. I have no more ideas at the moment. You could try klogg; I've rewritten a lot of glogg's code, so this particular issue might be fixed there.
Well, I tried to run klogg with that file, but the whole system froze and crashed. If you have another guess, let me know, but I think perhaps my system isn't prepared to handle it properly.
Sorry about that. Looks like an out-of-memory issue.
Could you restart klogg (without any files)? It might have created a crash dump, and on next start it should ask permission to upload it for analysis.
When I restarted it, it didn't show me any report, so I created one myself and sent it to you. I've created this issue with the incoming content for anyone who wants to consult it too.
Using glogg_1.1.4-1.1build1_amd64 on Ubuntu 20.04.2 LTS:

`QIODevice::read (QFile, "/home/ar/scripts/bkp.sql"): maxSize argument exceeds QByteArray size limit`
This message is returned when I try to search for a string like `Plan` in "bkp.sql" (4 GB). Are there any settings or known issues that I can consult about this?
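As a hedged aside (not something suggested in the thread), a streaming tool like `grep` can at least confirm the pattern exists in the file without a GUI viewer having to hold huge lines in memory:

```shell
# Sketch with a stand-in file; the real file in the thread is bkp.sql.
# grep -c reads the input line by line and counts matching lines.
printf 'Plan name\nother line\n' > sample.sql
grep -c 'Plan' sample.sql    # prints 1
```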