davidmkrtchian opened this issue 10 months ago
Might be a dumb question, but what is `${module.aws_iam_user.iam_access_key_id}`? I'm not sure Laravel can parse that.
It's just an environment variable in Terraform. I've changed them all to XXX here. :)
@jerm can you put eyes on this when you get a moment?
Also sharing another problem. After executing php artisan backup:clean, I encountered the following error message:
Cleanup failed because: Could not connect to disk backup because: InvalidArgumentException: Driver [s3_private] is not supported. in /var/www/html/vendor/laravel/framework/src/Illuminate/Filesystem/FilesystemManager.php:144
I followed the documentation provided here to install the prerequisites for S3 using Composer, but the error persists. https://laravel.com/docs/5.3/filesystem#driver-prerequisites
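(For context, and not specific to Snipe-IT: Laravel's FilesystemManager resolves each disk's 'driver' value against the drivers it actually ships with -- 'local', 'ftp', 'sftp', 's3', and so on -- plus anything registered via Storage::extend(). "Driver [s3_private] is not supported" therefore means some disk ended up with 'driver' => 's3_private', which in this setup is a disk name rather than a driver name. A minimal sketch with made-up disk names, not Snipe-IT's shipped config:)

```php
<?php

// config/filesystems.php -- illustrative fragment only.

return [
    'disks' => [
        // Fine: 's3' is a driver Laravel knows about. The disk itself can be
        // named anything, including 's3_private'.
        'example_private' => [
            'driver' => 's3',
            'key'    => env('PRIVATE_AWS_ACCESS_KEY_ID'),
            'secret' => env('PRIVATE_AWS_SECRET_ACCESS_KEY'),
            'region' => env('PRIVATE_AWS_DEFAULT_REGION'),
            'bucket' => env('PRIVATE_AWS_BUCKET'),
        ],

        // Broken: 's3_private' is a disk name, not a driver, so resolving this
        // disk throws "Driver [s3_private] is not supported".
        'example_backup' => [
            'driver' => 's3_private',
        ],
    ],
];
```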
You don't ever want to mess with composer with this project. That's for us to do. If you change composer.json or composer.lock, you're gonna have a very bad time updating in the future. Snipe-IT already has S3 support, so there's nothing extra you should need to add. (This is open source, so obviously you can mess with composer, you're just going to have a bad time keeping your version of the code and ours in sync.)
IIRC, for our testing, we use:
PRIVATE_FILESYSTEM_DISK=s3_private
PUBLIC_FILESYSTEM_DISK=s3_public
PUBLIC_AWS_SECRET_ACCESS_KEY='OUR-KEY'
PUBLIC_AWS_ACCESS_KEY_ID=OUR-ACCESS-ID
PUBLIC_AWS_DEFAULT_REGION='us-west-2'
PUBLIC_AWS_BUCKET=OUR-BUCKET-NAME
PUBLIC_AWS_URL='https://snipe-flysystem-public-test.s3-us-west-2.amazonaws.com'
PRIVATE_AWS_SECRET_ACCESS_KEY='OUR-KEY'
PRIVATE_AWS_ACCESS_KEY_ID=OUR-ACCESS-ID
PRIVATE_AWS_DEFAULT_REGION='us-west-2'
PRIVATE_AWS_BUCKET=snipe-flysystem-private-test
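(A minimal sketch of how PUBLIC_*/PRIVATE_* variables like the ones above typically feed two S3-backed disks in a Laravel config/filesystems.php -- the definitions Snipe-IT actually ships may differ in detail, but the key point is that both disks use the real 's3' driver and only the credentials, bucket, and visibility change:)

```php
<?php

// config/filesystems.php -- hedged sketch, not the project's actual file.

return [
    'disks' => [
        's3_public' => [
            'driver'     => 's3',
            'key'        => env('PUBLIC_AWS_ACCESS_KEY_ID'),
            'secret'     => env('PUBLIC_AWS_SECRET_ACCESS_KEY'),
            'region'     => env('PUBLIC_AWS_DEFAULT_REGION'),
            'bucket'     => env('PUBLIC_AWS_BUCKET'),
            'url'        => env('PUBLIC_AWS_URL'),
            'visibility' => 'public',
        ],

        's3_private' => [
            'driver' => 's3',
            'key'    => env('PRIVATE_AWS_ACCESS_KEY_ID'),
            'secret' => env('PRIVATE_AWS_SECRET_ACCESS_KEY'),
            'region' => env('PRIVATE_AWS_DEFAULT_REGION'),
            'bucket' => env('PRIVATE_AWS_BUCKET'),
        ],
    ],
];
```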
I'm not using any public S3 bucket, that's why I am not adding that config. Also an update on the configuration: when I change the PRIVATE_FILESYSTEM_DISK value from s3_private to local, the backup finishes successfully and the files are saved to my storage/app/backups.
The bucket doesn't actually need to be public, but the system has a concept of "public" files -- website images and css, etc. -- and "private" files -- your asset images, documents, etc. So you need to define that and the settings that go with it. It'll all still pass through Snipe-IT, same as ./public/ would on the regular filesystem.
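(A hypothetical illustration of that public/private split using the standard Laravel Storage facade -- the route, paths, and middleware here are made up for the example and are not Snipe-IT's actual code:)

```php
<?php

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// "Public" files (site images, CSS, etc.) can be linked to directly by URL:
$logoUrl = Storage::disk('s3_public')->url('uploads/logo.png');

// "Private" files (asset images, documents, etc.) are streamed back through
// the application, so the bucket never has to be world-readable:
Route::middleware('auth')->get('/files/{path}', function (string $path) {
    abort_unless(Storage::disk('s3_private')->exists($path), 404);

    return Storage::disk('s3_private')->response($path);
})->where('path', '.*');
```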
@snipe Am I correct in understanding that all backups and data get into S3 through a public endpoint? If yes, is it possible to make them go through a private one?

Thanks @jerm, I configured the settings for public access and encountered a situation where, when PRIVATE_FILESYSTEM_DISK=s3_private and PUBLIC_FILESYSTEM_DISK=s3_public, the issue remains the same. However, when PRIVATE_FILESYSTEM_DISK=local and PUBLIC_FILESYSTEM_DISK=s3_public, backups are performed and saved locally.
Debug mode
- [x] I have enabled debug mode
- [x] I have read and checked the Common Issues page
Describe the bug
When I try to run the command for backups -- php artisan snipeit:backup -- I get this output.
Starting backup...
Dumping database snipeit...
Determining files to backup...
Zipping 24 files and directories...
Created zip containing 24 files and directories. Size is 196.49 KB
Copying zip to disk named backup...
Copying zip failed because: There is a connection error when trying to connect to disk named `backup`.
Backup completed!
What's even weirder is that even though php artisan snipeit:backup gives that error message, the pictures and similar files I add to my assets are uploaded to my S3 bucket; only the backup is not getting done.
Reproduction steps
...
Expected behavior
Screenshots
No response
Snipe-IT Version
Using Snipe-IT Helm chart 3.4.1
Operating System
Ubuntu
Web Server
Apache
PHP Version
7.4.3
Operating System
No response
Browser
No response
Version
No response
Device
No response
Operating System
No response
Browser
No response
Version
No response
Error messages
No response
Additional context
Here are my environment variables for S3
PRIVATE_FILESYSTEM_DISK: s3_private
PRIVATE_AWS_ACCESS_KEY_ID: XXX
PRIVATE_AWS_SECRET_ACCESS_KEY: XXX
PRIVATE_AWS_DEFAULT_REGION: us-east-1
PRIVATE_AWS_BUCKET: XXX
AWS_DEFAULT_REGION: us-east-1
Also here is the part from config/backup.php
/*
 * The filename prefix used for the backup zip file.
 */
'filename_prefix' => 'snipe-it-',

/*
 * The disk names on which the backups will be stored.
 */
'disks' => [
    'backup',
and config/filesystems.php
'backup' => [
    'driver' => env('PRIVATE_FILESYSTEM_DISK', 'local'),
    'root' => storage_path('app'),
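(If that snippet is read literally, this may be where the two symptoms meet: the backup disk's 'driver' is taken straight from PRIVATE_FILESYSTEM_DISK, so setting that variable to s3_private makes Laravel look for a driver literally named s3_private -- the same "Driver [s3_private] is not supported" / connection error reported above -- while local resolves to a real driver and works. Purely as a hypothetical sketch, not a patch to the shipped config, an explicitly S3-backed backup disk would look something like this:)

```php
<?php

// config/filesystems.php -- hypothetical fragment for illustration only.

return [
    'disks' => [
        'backup' => [
            'driver' => 's3',   // an actual driver name, not the disk name 's3_private'
            'key'    => env('PRIVATE_AWS_ACCESS_KEY_ID'),
            'secret' => env('PRIVATE_AWS_SECRET_ACCESS_KEY'),
            'region' => env('PRIVATE_AWS_DEFAULT_REGION'),
            'bucket' => env('PRIVATE_AWS_BUCKET'),
            'root'   => 'backups',   // optional key prefix inside the bucket
        ],
    ],
];
```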
Hi team, I am also facing the same issue when I change PRIVATE_FILESYSTEM_DISK=s3_private and PUBLIC_FILESYSTEM_DISK=s3_public. When the value is local, I am able to take a backup using the command php artisan snipeit:backup.
root@b89421dd1769:/var/www/html# php artisan snipeit:backup -vvv
Starting backup...
Dumping database snipeit...
Determining files to backup...
Zipping 41 files and directories...
Created zip containing 41 files and directories. Size is 28.76 KB
Copying zip to disk named backup...
Copying zip failed because: There is a connection error when trying to connect to disk named `backup`.
Backup completed!
root@b89421dd1769:/var/www/html#