I have successfully used the command line to scrape tiles using this command:
tl copy -z 0 -Z 10 -b '-180.0 -85.05 180.0 85.05' 'http://tile.stamen.com/watercolor/{z}/{x}/{y}.png' mbtiles://./tiles.mbtiles
However, when attempting the next zoom level:
tl copy -z 11 -Z 11 -b '-180.0 -85.05 180.0 85.05' 'http://tile.stamen.com/watercolor/{z}/{x}/{y}.png' mbtiles://./11.mbtiles
the process consistently stops at folder 11/1607/8, quite soon after it begins.
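For scale, my rough back-of-the-envelope figures for the full extent (my own estimates, not output from tl):

tiles at zoom 11 alone: 2^11 * 2^11 = 2048 * 2048 ≈ 4.19 million
tiles at zooms 0-10 combined: (4^11 - 1) / 3 ≈ 1.40 million

so this single-level run requests roughly three times as many tiles as the entire previous run.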
I thought perhaps this was related to the open-files limit, but changing ulimit had no effect on the process.
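For reference, this is roughly how I changed the limit (bash on Linux; 4096 is just an example value, not anything tl documents):

ulimit -n          # show the current per-process open-files limit
ulimit -n 4096     # raise it in the same shell, then re-run tl copy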
Other system resources do not seem to be adversely affected.
Could this be a bug, or should I be focusing on some other system limitation?