Closed: Alvazz closed this issue 4 years ago
I guess it may also be a cache issue.
i haven't tried this, so it might be a performance issue. i think we can have the request work in chunks instead of trying to load all of them at once
so bro, what should we do next? let's perfect it and make it more robust, the sooner the better.
we will basically change the current return data to become paginated instead of one bulk, and use an intersection observer to load more as you scroll.
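A minimal sketch of the paging idea in plain JS (the `paginate` function and its parameters are hypothetical names, not the package's API): the server returns one slice of the list plus a `hasMore` flag, so the client can keep requesting pages while the user scrolls.

```javascript
// Hypothetical server-side slicing: return one page of items plus paging info.
// `page` is 1-based; `perPage` controls how many items each request returns.
function paginate(items, page, perPage) {
    const start = (page - 1) * perPage

    return {
        data: items.slice(start, start + perPage),
        page: page,
        total: items.length,
        hasMore: start + perPage < items.length,
    }
}

// e.g. with 10,000 files and 100 per page, the first request returns only 100 items
const files = Array.from({length: 10000}, (_, i) => `img-${i}.jpg`)
const first = paginate(files, 1, 100)
```

The client would then request page 2, 3, … as the scroll position approaches the bottom, instead of receiving all 10,000 entries in one response.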
it will take some time to get this one done, if u want to give it a try then go for it and i will help with any questions u might have
OK, I hope you'll stay online for a few hours; I'm starting on this work now.
awesome, i will try to answer ur questions when possible.
Hey @ctf0, are you here? What's your email address? I'll send you some resources so you can try them.
no need for the email, u can create a test repo and send me the link through here
ok, forget it. I was thinking of sending you a zip file with 10,000 images for you to test with.
I looked at the source code, and it seems the place where the files are loaded is the getData method inside GetContent.php, which uses two foreach loops:
```php
foreach ($storageFolders as $folder) {
    $path = $folder['path'];
    $time = $folder['timestamp'];

    if (!preg_grep($pattern, [$path])) {
        if ($this->GFI) {
            $info = $this->getFolderInfo($path);
        }

        $list[] = [
            'name' => $folder['basename'],
            'type' => 'folder',
            'path' => $this->resolveUrl($path),
            'size' => isset($info) ? $info['size'] : 0,
            'count' => isset($info) ? $info['count'] : 0,
            'last_modified' => $time,
            'last_modified_formated' => $this->getItemTime($time),
        ];
    }
}

foreach ($storageFiles as $file) {
    $path = $file['path'];
    $time = $file['timestamp'];

    if (!preg_grep($pattern, [$path])) {
        $list[] = [
            'name' => $file['basename'],
            'type' => $file['mimetype'],
            'path' => $this->resolveUrl($path),
            'size' => $file['size'],
            'visibility' => $file['visibility'],
            'last_modified' => $time,
            'last_modified_formated' => $this->getItemTime($time),
        ];
    }
}
```
When I load a lot of files, why does it get stuck? Probably because of these two loops?
and this function:
```php
protected function getFolderListByType($list, $type)
{
    $list = collect($list)->where('type', $type);

    $sortBy = $list->pluck('basename')->values()->all(); // here..
    $items = $list->values()->all(); // here..

    array_multisort($sortBy, SORT_NATURAL, $items);

    return $items;
}
```
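As a side note, the `array_multisort($sortBy, SORT_NATURAL, $items)` call above is what produces the natural ordering (img2 before img10). If the sorting ever moved to the front end as part of this rework, the same ordering is available in JS via `localeCompare` with the `numeric` option; a small sketch (not part of the package):

```javascript
// Natural sort: numeric runs inside names compare by value, so 'img2' < 'img10'.
function naturalSort(names) {
    return [...names].sort((a, b) =>
        a.localeCompare(b, undefined, {numeric: true, sensitivity: 'base'}))
}

const sorted = naturalSort(['img10.jpg', 'img2.jpg', 'img1.jpg'])
```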
So, do we need to use Laravel's chunk() function here to solve this issue? Is that right?
based on my previous suggestion:

> we will basically change the current return data to become paginated instead of one bulk, and use an intersection observer to load more as you scroll.
so what we need is to first change https://github.com/ctf0/Laravel-Media-Manager/blob/652b7a6e74a6cfeaa582f136cb649373b1e72406/src/Controllers/Modules/GetContent.php#L31
to return paginated data, then call https://github.com/ctf0/Laravel-Media-Manager/blob/652b7a6e74a6cfeaa582f136cb649373b1e72406/src/resources/assets/js/modules/form.js#L128 periodically, depending on the scroll events, to load the extra images.
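The intersection-observer half of that suggestion could be sketched like this (the sentinel element id and the load callback are made-up names, not the package's code). Keeping the trigger logic as a pure function makes it testable without a DOM:

```javascript
// Pure trigger logic: call `loadMore` each time the sentinel becomes visible.
// Returns how many loads were triggered, which makes the function easy to test.
function handleIntersections(entries, loadMore) {
    let triggered = 0

    for (const entry of entries) {
        if (entry.isIntersecting) {
            loadMore()
            triggered++
        }
    }

    return triggered
}

// DOM wiring (browser only): observe a sentinel element placed after the file grid.
// IntersectionObserver is a standard browser API; '#load-more-sentinel' is hypothetical.
if (typeof IntersectionObserver !== 'undefined') {
    const observer = new IntersectionObserver(
        (entries) => handleIntersections(entries, () => {/* fetch the next page here */}),
        {rootMargin: '200px'} // start loading shortly before the sentinel scrolls into view
    )

    observer.observe(document.querySelector('#load-more-sentinel'))
}
```

The `rootMargin` makes loading start a little before the user actually reaches the bottom, which hides most of the fetch latency.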
the main problem would be caching the returned data as a stack rather than as one bulk, as it currently does
In fact, I think that instead of paginating the data in `'items' => $this->getData($folder)`, it should be handled in the `getFolderListByType($list, $type)` method, because, as you said, in `getData` there is still no way to avoid a foreach over large amounts of data. Do you mean doing the following in the foreach? Like this:

```php
foreach ($storageFiles->chunk(100) as $file) {...}
```

Written like that it seems problematic (chunk(100) yields groups of 100 items, so the loop variable would receive a whole chunk, not a single file). So I tried this in getFolderListByType instead:
```php
protected function getFolderListByType($list, $type)
{
    $list = collect($list)->where('type', $type)->forPage(1, 100);

    Log::info('list --> : '.$list);

    $sortBy = $list->pluck('basename')->values()->all();
    $items = $list->values()->all();

    array_multisort($sortBy, SORT_NATURAL, $items);

    return $items;
}
```
But thinking about it, this doesn't seem right: done this way, the folder will only ever show 100 items. Of course, that is only the pagination side; the other issue is, as you said, the caching mechanism, which should also be very important.
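That observation is correct: forPage(1, 100) hard-codes page 1, so the result is always capped at the first 100 items. The page number needs to come from the request, and the totals should be computed before slicing so the client still knows how many items and pages exist. A rough sketch of that shape in plain JS (hypothetical names, not the package's code):

```javascript
// Return one page of a pre-sorted list without losing the overall totals.
function pageOf(sortedItems, page, perPage) {
    return {
        items: sortedItems.slice((page - 1) * perPage, page * perPage),
        total: sortedItems.length,                    // full count, not just this page
        lastPage: Math.ceil(sortedItems.length / perPage),
    }
}

// page 2 of 250 items at 100 per page: items 100..199, with totals preserved
const result = pageOf(Array.from({length: 250}, (_, i) => i), 2, 100)
```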
I personally think performance when loading lots of data is a very important feature, but I have only been using PHP for a week and still don't understand Laravel's whole system. If you can, I hope you will improve this large-data loading performance. It is very important; otherwise many projects won't dare to use this library.
Thank you very much.
actually it will need more work than a shallow pagination, as we will need to
It feels quite complicated. Maybe at the beginning of your design you didn't consider the performance of loading lots of files; it may be a design flaw, which is why these problems have to be fixed now.
I will try again..
I'll keep trying, but if this large-scale data loading needs to be supported, maybe more than half of the logic has to be redesigned, and it might be better to design it from scratch. Maybe my skills just aren't good enough, hahaha, but thank you for your enthusiastic replies.
@ctf0 Hi bro, please fix this issue. I've tried many times and can't solve it.
Thanks and please!!
this will take some time to get done, as i have other obligations, however u r free to work on it if u want
@ctf0 After some hard work, I've almost completed this feature, but I have a few questions. Can you help me? For example, I want to use console.log to print some logs in folder.js, but nothing shows up, even for a simple log.
E.g:
```js
openFolder(file) {
    console.log('openFolder + 1'); // here

    if (this.fileTypeIs(file, 'folder')) {
        this.folders.push(file.name)
        this.getFiles(this.folders).then(() => {
            this.updatePageUrl()
        })
    }
},

goToFolder(index) {
    console.log('goToFolder + 1');

    if (!this.isBulkSelecting()) {
        let folders = this.folders
        let prev_folder_name = folders[index]

        this.folders = folders.splice(0, index)
        this.getFiles(this.folders, prev_folder_name).then(() => {
            this.updatePageUrl()
        })
    }
},

goToPrevFolder(e = null, cls = null) {
    console.log('goToPrevFolder + 1');

    EventHub.fire('stopHammerPropagate')

    let manager = this

    function run() {
        if (manager.restrictModeIsOn()) {
            return false
        }

        let length = manager.folders.length

        return length == 0
            ? false
            : manager.goToFolder(length - 1)
    }

    e && e.target.classList.contains(cls) ? run() : run()
},
```
Is it because I'm running this in a Laravel project, with the package installed via require, that console.log inside the package doesn't print?
no, not like that, but u need to recompile the js/css after any changes, thats why u should run npm run watch
on a side note i would recommend u stop the caching functionality on the front end and start with the data chunking + intersection observer
yes, I've disabled all the front-end caching features; they look nice but don't actually meet my needs.
@ctf0
> need to recompile the js/css after any changes

do you mean recompile in the Laravel project, or in Laravel-Media-Manager? I ran npm run watch in the Laravel project folder, but I still can't see the console.log output.
my current idea is to start the paging from the openFolder(file) method. E.g:
```js
openFolder(file) {
    console.log('openFolder', file);

    if (this.fileTypeIs(file, 'folder')) {
        this.folders.push(file.name)
        this.getFiles(this.folders).then(() => {
            this.updatePageUrl()
        })
    }
},
```
Get the total number of files from file.count and pass it along, e.g. this.getFiles(this.folders, file.count, currentPage), then do the rest inside the getFiles method. Does this approach seem feasible?
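That plan boils down to one small decision per scroll event: given file.count and the current page, is there another page to fetch? A tiny helper for that check (a sketch; `perPage` and the extended getFiles signature are assumptions, not the package's current API):

```javascript
// True while more pages remain for a folder containing `count` files in total.
// `currentPage` is 1-based; `perPage` is the assumed page size.
function hasNextPage(count, currentPage, perPage) {
    return currentPage * perPage < count
}
```

getFiles would then stop issuing requests as soon as this returns false, instead of relying on an empty response.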
all the work should be done inside https://github.com/ctf0/Laravel-Media-Manager/blob/652b7a6e74a6cfeaa582f136cb649373b1e72406/src/resources/assets/js/modules/form.js#L128-L188
this is the function responsible for the file fetching
yes
hi again, sorry for the late reply, i haven't forgotten about you.
this will be added in the next release.
Version: 5.7

Usage: no obvious errors, maybe performance issues
...

Steps to reproduce:
First, manually put a folder containing 10,000 images into the storage public directory.
Then open a browser and go to media.test/media; a loading prompt appears, the animation freezes after a while, and then the browser crashes with no response.
If I reopen the browser and visit again, the same thing happens.
So is this a performance issue? Or does this project just not support opening so many files?
In fact, I'm still wondering about one thing: would it be fine if I added the pictures one by one through the media manager? I'm not sure.