EOL / deprecated_eol_php_code

Encyclopedia of Life
http://eol.org/

Temporary workaround: If cropping a remote file, get dimensions from remote too. #116

Closed hyanwong closed 9 years ago

hyanwong commented 9 years ago

Also ensure orig file exists

Note that using the remote file rather than a local copy has 2 problems: 1) we lose the original file (gif, png, whatever); 2) for Biopix images (which have the _580_360 version deliberately constrained to smaller than normal size), we do not preserve that constraint.

hyanwong commented 9 years ago

NB - this assumes that the EoL php instance allows getimagesize() on http files.
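(In stock PHP, getimagesize() on an http:// URL only works when allow_url_fopen is enabled.) The workaround's logic is: use the local original for dimensions if it exists, otherwise read them from the remote copy. A minimal illustrative sketch of that logic in Python, with a toy PNG-header parser standing in for getimagesize(); the helper names are hypothetical, not the EoL code:

```python
import os
import struct

def png_dimensions(data: bytes):
    """Parse width/height from a PNG byte stream (IHDR chunk).
    Stand-in for PHP's getimagesize(), which the workaround calls
    on the remote URL when no local original exists."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG stream")
    # IHDR is the first chunk: bytes 16-23 hold width and height
    # as big-endian 32-bit integers.
    width, height = struct.unpack(">II", data[16:24])
    return width, height

def dimension_source(local_path: str, remote_url: str) -> str:
    """The workaround in one line: prefer the local original,
    fall back to the remote file only if it is missing."""
    return local_path if os.path.exists(local_path) else remote_url
```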

JRice commented 9 years ago

Thanks.

We should have everything on the same filesystem in a few months, actually, but we do need this workaround for now: the images are NOT otherwise available on that machine. :\

I'll have to talk to the master curators about the BioPix problem.

Thanks so much!

JRice commented 9 years ago

Hmmmn... It looks like we've lost the date information in the path we pass to getimagesize:

Undefined variable: num in /opt/eol_php_code/lib/ContentManager.php on line 672
Warning: getimagesize(http://content71.eol.org/content/////_580_360.jpg): failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found in /o

JRice commented 9 years ago

Looks perhaps as if self::cache_num2path($num) should have been self::cache_num2path($data_object->object_cache_url); I am going to try that.
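The empty segments in the 404 URL (content/////_580_360.jpg) are consistent with a date-stamped cache number that was never set. A hypothetical sketch of how such a number might map to a path, assuming a YYYY/MM/DD/HH/rest split (this is a guess at cache_num2path's behavior, not the actual EoL code); note how an undefined $num collapses every segment:

```python
def cache_num2path(cache_num: str) -> str:
    """Hypothetical: split a date-stamped cache number into path
    segments. An empty input yields the empty segments seen in
    the 404 above."""
    y, m, d, h, rest = (cache_num[:4], cache_num[4:6],
                        cache_num[6:8], cache_num[8:10], cache_num[10:])
    return f"content/{y}/{m}/{d}/{h}/{rest}"

# With an unset cache number, every segment is empty:
bad_url = "http://content71.eol.org/" + cache_num2path("") + "_580_360.jpg"
```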

JRice commented 9 years ago

Yup, that fixed it. Will commit.

hyanwong commented 9 years ago

On 20 Dec 2014, at 14:57, Jeremy Rice notifications@github.com wrote:

Looks perhaps as if self::cache_num2path($num) should have been self::cache_num2path($data_object->object_cache_url); I am going to try that.

Oh, sorry. You are right. That was left in from my testing. Apologies.

Yan

hyanwong commented 9 years ago

On 20 Dec 2014, at 14:52, Jeremy Rice notifications@github.com wrote:

We should have everything on the same filesystem in a few months, actually, but we do need this workaround for now: the images are NOT otherwise available on that machine. :\

OK, this should be fine as a temporary workaround then.

I'll have to talk to the master curators about the BioPix problem.

Well, the Biopix exception (which I’m assuming is because we only have permission from them to use 300x300px versions) didn’t work until I fixed it with Patrick about a year ago anyway.

Since we don’t check where the pictures are coming from in grab_file(), we can’t apply the biopix exception when we re-crop. But this only applies to the few images that are from biopix and which get a custom crop, so that should be OK.
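If grab_file() did know the provider, the constraint could in principle be applied at re-crop time as a per-provider size cap. A hypothetical sketch, assuming the 300x300px Biopix limit mentioned above (names and structure are illustrative, not the EoL code):

```python
from typing import Optional, Tuple

# Assumed values from this thread: the standard crop is 580x360 and
# Biopix images are limited to 300x300 for licensing reasons.
DEFAULT_CROP = (580, 360)
PROVIDER_CAPS = {"biopix": (300, 300)}

def crop_target(provider: Optional[str]) -> Tuple[int, int]:
    """Return the target crop size, capped for providers with
    licensing limits. Unknown or missing providers get the default,
    which is exactly the gap discussed above."""
    cap = PROVIDER_CAPS.get((provider or "").lower())
    if cap is None:
        return DEFAULT_CROP
    return (min(DEFAULT_CROP[0], cap[0]), min(DEFAULT_CROP[1], cap[1]))
```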

Cheers

Yan

hyanwong commented 9 years ago

On 20 Dec 2014, at 22:58, Jeremy Rice notifications@github.com wrote:

Yup, that fixed it. Will commit.

Seems to all be working now: I’ve been doing a couple of crops which I guess are being saved to the image_sizes table. I’m guessing that the reharvesting

Do you want me to run the python code to get the sizes for previously cropped images, or do you want to? I’m happy to do it if you send me a list of data_objectIDs.

Yan

JRice commented 9 years ago

Sorry, been on vacation!

Just ran the check for the ids; I don't quite have the time to run the Python, so if you wouldn't mind:

[ids emailed]