Gadsoc opened this issue 9 years ago
I have edited the source code to fix this issue. I changed the SmartSocketClient class as follows (this should be around line 98):
//# Sometimes two or more sends get jammed together when they are sent from the server right next to each other.
//# This separates them into individual packets for processing / routing.
var arr:Array = incoming.split("\r\n");
arr.pop();
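For anyone skimming, here is roughly what that existing split does when two complete messages arrive in a single read; the sample strings are made up for illustration:

// Two complete messages delivered in one socket read, each terminated by "\r\n".
var incoming:String = '{"a":1}' + "\r\n" + '{"b":2}' + "\r\n";

var arr:Array = incoming.split("\r\n");
// arr is now ['{"a":1}', '{"b":2}', ''] -- the trailing empty string appears
// because the buffer ends with the delimiter, which is why arr.pop() follows.
arr.pop();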
I added a class variable as follows:
/**
* This string will be used to store JSON data when it has been broken up into 8 KB chunks.
* The JSON data will be continuously appended to this string until the onJSON function
* recognizes that the entire JSON object has been received. Once all data is received,
* this string will be parsed through the onJSON function and then reset to an empty string.
*/
private var dataStorage:String = "";
I also added the following below the array-creation line shown above, inside the onJSON function:
var arr:Array = incoming.split("\r\n"); // <- existing line
if (dataStorage.length > 0) {
    // Prepend whatever partial data was left over from the previous read.
    arr[0] = dataStorage + arr[0];
    dataStorage = "";
}
if (arr.length == 1) {
    // No delimiter in this read: the whole thing is a partial message, so store it.
    dataStorage += arr[0];
} else if (arr[arr.length - 1] != "") {
    // The read ended mid-message. Find the last complete object.
    var lastCompleteIndex:int = -1;
    for (var j:int = arr.length - 1; j > 0; j--) {
        if (arr[j - 1].slice(-1) == "}") {
            lastCompleteIndex = j - 1;
            break;
        }
    }
    // Add everything after the last complete object to dataStorage.
    for (j = lastCompleteIndex + 1; j < arr.length; j++) {
        dataStorage += arr[j];
    }
    // Resize arr to drop everything after the last complete object.
    arr.length = lastCompleteIndex + 1;
    // If any complete objects remain, add a blank entry so the existing
    // arr.pop() below still works; otherwise wait for the next read.
    if (arr.length > 0) {
        arr.push("");
    } else {
        return;
    }
}
arr.pop(); // <- existing line
I know very little about this library, so use this at your own risk, but I believe the adjustment is straightforward. When the JSON object I'm sending is broken up into chunks because of its size (at least in my tests), the initial chunks arrive as a single string with no breaks (all crammed into arr[0]). When the end of the JSON arrives, it has a break at the end, which produces an empty arr[1].
Since an incomplete JSON chunk leaves arr.length == 1, and the final chunk always produces arr.length > 1, the code above simply stores and concatenates the preceding data until the end of the JSON object arrives, then performs one final concatenation before processing.
*Edited to handle cases where multiple JSON objects are processed together but the last object is unfinished.
**Edited again to handle cases where ... it gets even messier.
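For readers trying to follow the flow, here is the same buffering idea pulled out into a stripped-down, self-contained sketch. ChunkBuffer and feed are hypothetical names, not part of SmartSocket, and this version deliberately skips the trailing-"}" scan that the patch above adds for the messier cases mentioned in the edit notes:

// ChunkBuffer.as -- hypothetical illustration, not part of SmartSocket.
package {
    public class ChunkBuffer {
        // Partial data carried over between socket reads.
        private var dataStorage:String = "";

        // Feed one socket read; get back only the messages it completes.
        public function feed(incoming:String):Array {
            var arr:Array = incoming.split("\r\n");
            // Glue any leftover partial data from the previous read onto the front.
            if (dataStorage.length > 0) {
                arr[0] = dataStorage + arr[0];
            }
            // The last element is either "" (the read ended exactly on a delimiter)
            // or a partial message that has to wait for the next read.
            dataStorage = String(arr.pop());
            // Every remaining element is a complete, parseable message.
            return arr;
        }
    }
}

Feeding the successive reads of one oversized message through feed() returns an empty array until the read that finally contains the trailing "\r\n" arrives; at that point the reassembled JSON string comes back as the single element, ready for the parser.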
Hello again :)
When I send calls from the server and they contain a field that is rather large (in this case about 12 KB), the JSON object on the receiving end is broken up into multiple chunks and a parse error is thrown.
"JSONParseError: Unexepected (char)"
Apparently the deserialization process tries to run through the object before it has the whole thing, and then when the next chunk is processed it is improperly formatted because of the arbitrary place where the message was split.
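That matches what any JSON parser does with a truncated string. A rough illustration, using the built-in AS3 JSON class purely as a stand-in for whatever parser the library bundles (the JSONParseError above comes from that one, but the failure mode is the same):

// Illustrative only: 'full' stands in for a large serialized message.
var full:String = '{"items":["aaaa","bbbb","cccc"]}';

// Parsing the complete string is fine.
var ok:Object = JSON.parse(full);

// Parsing a chunk that was cut off mid-stream throws, because the parser
// reaches the end of input before the object is closed.
var truncated:String = full.substr(0, 20); // '{"items":["aaaa","bb'
try {
    JSON.parse(truncated);
} catch (e:Error) {
    trace("parse failed on partial chunk: " + e.message);
}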
Is there something that can be done to handle these large transfers? Or should I restrict the size of messages to avoid this problem (and if so, how big is too big)?
Currently the passed object is a JsonArray made up of JsonPrimitive Strings. The Array has a length/size of 402.