robotframework / OldSeleniumLibrary

Deprecated Selenium library for Robot Framework
Apache License 2.0

"page should contain..." causes XML logs to become unparseable #116

Closed spooning closed 9 years ago

spooning commented 9 years ago

Originally submitted to Google Code by bryan.oakley on 12 May 2010

When "page should contain..." keywords fail, they log the entire page. This dumps a potentially huge amount of escaped HTML into the log file. In our case, we can no longer parse the Robot XML file and convert it to an HTML report, because we get a "SAXParseException: Parser has reached the entity expansion limit" error.

I'm not sure what the right solution is, but logging all of the HTML is probably not the best one. A better approach might be to have the Selenium library write all this HTML to a separate log file, and have the Robot log say something like "page source has been logged to foo.html".
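The side-file idea could look roughly like the sketch below. This is not part of the library; it is a minimal illustration, and the function name `log_page_source_to_file`, the file name `page_source.html`, and the default directory are all made up for the example.

```python
import os
import tempfile


def log_page_source_to_file(page_source, log_dir=None):
    """Write a (potentially huge) page source to its own file and
    return a short message suitable for the Robot Framework log,
    instead of dumping the escaped HTML into output.xml.

    `log_dir` defaults to a fresh temporary directory here; a real
    library would likely use the output directory of the current run.
    """
    log_dir = log_dir or tempfile.mkdtemp()
    path = os.path.join(log_dir, "page_source.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write(page_source)
    return "Page source has been logged to %s" % path
```

A failing keyword would then log only the returned one-line message, keeping the XML small and parseable.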

spooning commented 9 years ago

Originally submitted to Google Code by Andreas.EbbertKarroum on 14 May 2010

Maybe we should log that only when in debug mode?
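The debug-only suggestion can be sketched with the standard `logging` module (a real Robot Framework library would use `robot.api.logger` instead; the names here are illustrative only). At the default INFO level only a short notice is recorded, and the full page source is emitted only when the level is lowered to DEBUG:

```python
import logging

logger = logging.getLogger("SeleniumLibrarySketch")


def log_page_source(page_source):
    """Log the full page source only at DEBUG level.

    At INFO and above, only a short pointer message is recorded,
    so output.xml stays small on ordinary failing runs.
    """
    logger.info("Page check failed; set log level to DEBUG "
                "to record the full page source.")
    logger.debug("Page source:\n%s", page_source)
```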

spooning commented 9 years ago

Originally submitted to Google Code by @pekkaklarck on 10 Nov 2010

Forgot this issue was already reported when we started to fix issue 147.