I have a fairly large HTML page that contains several JSON objects and arrays. I use a PushbackReader to read the file, handle the text I'm interested in, and decide when to start a JsonReader read. If I pass my Reader (or any Reader) to JsonReader, it reads ahead up to 1024 characters regardless of where the JSON value ends, which means I may miss some JSON.
For example, I may miss myObject2 in the following snippet because it can remain unprocessed in JsonReader's internal buffer:
<h1>some html</h1>
<script language=javascript>
var myObject1={"a":1};
alert("some non-JSON that may or may not exceed the JsonReader buffer size");
var myObject2={"b":2};
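For context, my reading loop looks roughly like the sketch below (simplified; findNextJsonStart and handleJsonValue are placeholders for my actual scanning and handling code):

import com.google.gson.stream.JsonReader;
import java.io.IOException;
import java.io.PushbackReader;

public class JsonScanSketch {
    void extract(PushbackReader in) throws IOException {
        // findNextJsonStart scans forward for the next "var myObjectN=" marker
        // and leaves the PushbackReader positioned at the opening '{'.
        while (findNextJsonStart(in)) {
            JsonReader json = new JsonReader(in);
            json.setLenient(true);
            handleJsonValue(json); // consume one JSON value
            // Problem: JsonReader has read ahead up to 1024 characters past the
            // value it just parsed, so "var myObject2=" may already be trapped
            // in its internal buffer and is no longer visible through 'in'.
        }
    }

    boolean findNextJsonStart(PushbackReader in) throws IOException { /* placeholder */ return false; }
    void handleJsonValue(JsonReader json) throws IOException { /* placeholder */ }
}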
NOTE: This is a much simplified version of my problem. The actual JSON data is quite large, as is the HTML content.
As a temporary workaround, I've added the following method to JsonReader. It returns the characters still sitting in JsonReader's buffer so that I can unread them into my PushbackReader and resume searching for "var myObject2=":
// Returns the characters JsonReader has buffered from the underlying Reader
// but not yet consumed (buffer[pos..limit)).
public char[] getUnreadCharacters() {
    char[] output = new char[limit - pos];
    if (pos < limit) {
        System.arraycopy(buffer, pos, output, 0, limit - pos);
    }
    return output;
}
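With that method in place, after each JsonReader finishes I push its leftover look-ahead back into the PushbackReader, roughly like this (assumes the PushbackReader was created with a pushback buffer of at least 1024 characters, i.e. JsonReader's buffer size):

// Return JsonReader's unread look-ahead to the PushbackReader so the outer
// scan for "var myObject2=" sees those characters again.
void recoverLookahead(JsonReader json, PushbackReader in) throws IOException {
    char[] leftover = json.getUnreadCharacters();
    if (leftover.length > 0) {
        in.unread(leftover);
    }
}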
I also added a constructor that allows me to modify the buffer size.
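Roughly, the extra constructor looks like this (a sketch; it assumes the buffer field is no longer initialized inline to a fixed 1024 characters but is instead allocated by the constructors):

// Added constructor: same as JsonReader(Reader) but with a caller-chosen
// buffer size, so less of the surrounding HTML gets pulled into the buffer.
public JsonReader(Reader in, int bufferSize) {
    if (in == null) {
        throw new NullPointerException("in == null");
    }
    this.in = in;
    this.buffer = new char[bufferSize];
}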
Original issue reported on code.google.com by christia...@gmail.com on 6 Oct 2013 at 3:44