Closed: AziiMaymon closed this issue 12 months ago.
At first glance, you are using the library against its philosophy. This line $dataToFetch[$id][$key] = $value;
saves all the items into memory, and all of them are then processed again by array_values
on the next line. The main purpose of JSON Machine is not to process all items in memory, but rather one by one.
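For illustration, here is a minimal sketch of the intended streaming usage; the file name and the processItem() callback are placeholders, not part of your code:

use JsonMachine\Items;

$items = Items::fromFile('big.json');

foreach ($items as $key => $item) {
    // Only the current item is decoded and held in memory; handle it
    // here rather than collecting everything into an array first.
    processItem($key, $item);
}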
If you look above, I am only fetching the records that match in the JSON, with this: $jsonData = Items::fromFile($jsonFilePath, ['pointer' => $array]);
$jsonData only returns the records that match; I then save those into the variable $dataToFetch and return it. Without a foreach it is not possible for me to save them into a variable. If there is an alternative, please do share.
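(As an aside, one alternative sketched here, assuming the caller can consume the records lazily: return a generator instead of a built-up array, so nothing has to accumulate in memory. The function name streamCronJob is made up for illustration.)

use JsonMachine\Items;

// Sketch only: yields each matched record to the caller one by one
// instead of collecting everything into an array.
function streamCronJob(array $keys): \Generator
{
    $pointers = array_map(function ($k) { return '/' . $k; }, $keys);
    $items = Items::fromFile(dirname(__FILE__) . '/cronJob.json', [
        'pointer' => $pointers,
    ]);
    foreach ($items as $key => $value) {
        yield $key => $value;
    }
}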
As you said, I have removed the array_values call, but it is still taking a lot of time. Please help; this is my new code:
use JsonMachine\Items; // requires halaxa/json-machine (composer require halaxa/json-machine)

function readCronJob($array)
{
    $dataToFetch = [];
    $jsonFilePath = dirname(__FILE__) . "/cronJob.json";

    // Build a JSON Pointer ("/key") for every requested record.
    $compare = [];
    foreach ($array as $value) {
        $compare[] = "/" . $value;
    }

    try {
        $jsonData = Items::fromFile($jsonFilePath, [
            'pointer' => $compare,
        ]);
    } catch (\Throwable $e) {
        return $dataToFetch;
    }

    $increment = -1;
    try {
        $newPointer = "";
        foreach ($jsonData as $key => $value) {
            // A change of the current pointer means the iterator has
            // moved on to the next matched record.
            if ($jsonData->getCurrentJsonPointer() != $newPointer) {
                $newPointer = $jsonData->getCurrentJsonPointer();
                $increment++;
            }
            $dataToFetch[$increment][$key] = $value;
        }
    } catch (\Throwable $e) {
        return $dataToFetch;
    }

    return $dataToFetch;
}
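A hypothetical call, assuming cronJob.json maps top-level keys to records (the key names are made up):

// Hypothetical usage; "record_1" and "record_2" are placeholder keys.
$records = readCronJob(['record_1', 'record_2']);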
How big is the JSON, and how big is the part you are iterating over? If your pointer is far into the document, the parser has to parse all of the JSON content before getting to it, which takes the same time as iterating over the whole thing.
I have more than 2 million records in the JSON file. Previously I was running queries with joins while fetching data from the database; it worked well until my database exceeded 1 million records. Now, as an alternative, I am saving all the records to a JSON file via a cron job and reading them from there.
But if I am using a pointer, why would it parse the whole document?
The speed of parsing on my average PC is about 15 MB/s. Optimizing the database will bring you far better results than file scanning in PHP; about two orders of magnitude better, I'd say.
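For instance (a sketch only; the table, column, and connection details are invented for illustration), indexing the columns used in joins usually keeps queries fast well past a few million rows:

// Sketch: index the join column once, then the join stops scanning
// the whole table. All names and credentials here are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->exec('CREATE INDEX idx_records_user_id ON records (user_id)');

$userId = 42; // placeholder id
$stmt = $pdo->prepare(
    'SELECT u.name, r.payload
     FROM users u
     JOIN records r ON r.user_id = u.id
     WHERE u.id = ?'
);
$stmt->execute([$userId]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);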
But if I am using a pointer, why would it parse the whole document?
It has to scan the whole file to find the place specified by the pointer. JSON has no index, so there is no way of knowing beforehand where in the file that place is.
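You can observe this yourself; a sketch assuming the debug option and getPosition() available in json-machine 1.x, with a made-up key near the end of the file:

use JsonMachine\Items;

$file = dirname(__FILE__) . '/cronJob.json';
$items = Items::fromFile($file, [
    'pointer' => '/some_key_near_the_end', // hypothetical key
    'debug'   => true,                     // enables byte-position tracking
]);

foreach ($items as $key => $value) {
    // Bytes the parser had to consume before the pointer matched;
    // for a key near the end this approaches the full file size.
    echo $items->getPosition() . ' of ' . filesize($file) . " bytes scanned\n";
    break;
}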
Is there anything else I can do for you?
The foreach over the parsed data is taking a lot of time on the live server, much more than locally. I need help. Here is my code: