bnomei opened 7 years ago
cached hash to file maps

Instead of using `site()->pages()->index()` every time, one could also cache the results to make lookups faster. That helps when a site has a lot of pages and `index()` is slower than a file read.
```php
public static function getPageByAutoID($autoid) {
    // try the cached uri first to avoid the expensive index() call
    $f = kirby()->roots()->cache().DS.'autoid-'.$autoid.'.txt';
    if(f::exists($f)) {
        $uri = f::read($f);
        if($page = page($uri)) {
            return $page;
        }
    }
    // cache miss (or stale uri): fall back to a full index scan
    $pageCollection = site()->pages()->index()->filterBy(c::get('autoid.name', 'autoid'), $autoid);
    if($pageCollection->count() > 0) {
        $page = $pageCollection->first();
        // remember the uri so the next lookup is a plain file read
        f::write($f, $page->uri());
        return $page;
    }
    return null;
}
```
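For context, a hypothetical call site for the helper above; the class name `AutoId` and the literal autoid value are placeholders, not part of the plugin:

```php
// e.g. in a template: resolve a page from a stored autoid value.
// 'AutoId' stands in for whatever class holds getPageByAutoID().
$related = AutoId::getPageByAutoID('8f3a2c');
if($related) {
    echo $related->title();
}
```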
You are making a valid point, `site()->pages()->index()` is indeed quite slow on bigger sites. Will look into it soon!
**hash**: you could forward the `$page->diruri()` into `getUniqueAutoId()` and use it as an additional param to create the hash. It would be a pretty good unique value, and the very expensive check using `site()->pages()->index()` in `getUniqueAutoId()` could be avoided.

**id**: why not write/read the latest value in an `autoid-storage` file at `/cache/`, and only check the index if the file is missing? That is certainly faster than calling `site()->pages()->index()` every time.
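The counter-file idea could be sketched roughly like this, reusing the Kirby toolkit calls from the snippet above; the file name `autoid-storage.txt`, the method name `nextAutoId()`, and the assumption of integer autoids are all illustrative, not part of the plugin:

```php
// Sketch: keep the highest autoid issued so far in one cache file,
// so a new id can be handed out without scanning the whole index.
public static function nextAutoId() {
    $f = kirby()->roots()->cache().DS.'autoid-storage.txt';
    if(f::exists($f)) {
        $last = intval(f::read($f));
    } else {
        // file missing: pay for one full scan to rebuild the counter
        $last = 0;
        $key = c::get('autoid.name', 'autoid');
        foreach(site()->pages()->index() as $p) {
            $last = max($last, intval($p->content()->get($key)->value()));
        }
    }
    $next = $last + 1;
    f::write($f, $next);
    return $next;
}
```

One trade-off worth noting: a single counter file is shared mutable state, so concurrent page creations could race on it, whereas the per-autoid cache files in the snippet further up are write-once and safe to regenerate at any time.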