flowdee closed this issue 9 years ago.
I haven't tried it myself, but I guess you could write a cron script to periodically download the contents of an S3 bucket to the appropriate directory? s3cmd also has a "synchronize a directory tree to S3" command.
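Untested sketch of what I mean; the bucket name and paths are just placeholders for your own setup:

```
# crontab entry: every hour, pull new/changed ZIPs from the bucket
# into wp-update-server's packages directory using s3cmd sync.
0 * * * * s3cmd sync s3://example-bucket/plugins/ /var/www/update-server/packages/
```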
Hmm, but in general I would prefer to host the files on Amazon S3 and prevent users from downloading them from my server. That's why I thought about something like a symlink for the packages folder.
Ah, so you want the `download_url` to point directly to S3? That's certainly doable, but you'd need some extra code. You could subclass `Wpup_UpdateServer` and override the method `filterMetadata` to set `$meta['download_url']` to whatever you want.
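Roughly like this (untested; the bucket name and the assumption that each ZIP is stored as "slug.zip" are just placeholders):

```php
<?php
require __DIR__ . '/wp-update-server/loader.php';

class S3_UpdateServer extends Wpup_UpdateServer {
	protected function filterMetadata($meta, $request) {
		$meta = parent::filterMetadata($meta, $request);

		// Point the download at S3 instead of this server's own download action.
		// Assumes each plugin ZIP is stored in the bucket as "<slug>.zip".
		$meta['download_url'] = 'https://example-bucket.s3.amazonaws.com/' . $request->slug . '.zip';

		return $meta;
	}
}

$server = new S3_UpdateServer();
$server->handleRequest();
```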
Does the download URL extract the readme.txt file as well? In fact, I don't want to host any plugin ZIP files on my server, just the PHP application.
> Does the download URL extract the readme.txt file as well?
No. Right now, the only supported way to extract metadata is to have the ZIPs in the "packages" directory.
Hmm, then there wouldn't be an additional benefit. I don't want to place every new ZIP in both the packages folder AND an S3 bucket. That's double work :(
Thanks anyway!
You could automate the "place new ZIPs in the `packages` folder" part, but yes, overall it's pretty inconvenient.
Has anyone managed (if it's possible at all) to combine wp-update-server with Amazon S3? I thought about linking the packages folder to an S3 bucket.