Krielkip closed this issue 2 years ago.
Hi @Krielkip, can you explain exactly what you are trying to do? Any FTP'ing/uploading will definitely be slower than syncing them with something like rsync or docker cp.
@markshust sure 😄
At this moment we are developing on Mac with PhpStorm. Because of the file sync (cached/delegated mounts), the Docker setup is really slow compared to our Linux coworkers' setups.
For Shopware we are using Dockware (https://dockware.io/docs#start-developing), which uses docker cp at the start of the setup. While you code, the IDE (PhpStorm) uses SFTP auto-upload to push the changed files into the Docker container.
This is for on-the-fly development, not for uploading the whole codebase. See for example the docs: https://dockware.io/docs#persisting-data (and the note about slow uploads a little bit above it).
With this we wouldn't need to "sync" the whole app folder like we do now, and we could get a speed boost. Thanks in advance.
Yea, I've already gone through all of the folder/filesystem performance work: https://markshust.com/2018/12/30/docker-mac-filesystem-volume-mount-approach-performance/
I do an initial docker cp, then selectively choose which folders to mount. This project does not mount all folders. I think this will fall back to PhpStorm's support for Docker and remote filesystems.
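The copy-then-selectively-mount approach described above can be sketched as a compose override. This is a hypothetical fragment for illustration only — the service name and folder list are assumptions, not this project's actual file:

```yaml
# Hypothetical docker-compose override: the full codebase lives inside the
# container (primed once with "docker cp"), and only the folders a developer
# actively edits are bind-mounted from the host, keeping the sync surface small.
services:
  phpfpm:
    volumes:
      - ./src/app/code:/var/www/html/app/code:cached
      - ./src/app/design:/var/www/html/app/design:cached
```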
Hello @markshust,
Sorry for adding another reply. For us it would be a winner if we could have SFTP in the container, so the IDE only uploads changed files, instead of mounting a couple of folders.
Today I ran some tests and it wins 200 ms on a "basic" setup. With app/design & app/code mounted, Lighthouse:
Without mounts, Lighthouse:
Network loading in Lighthouse; left: no mount/sync [ready in 7,668 ms], right: sync (app/code, app/design) [ready in 19,268 ms].
That's a difference of roughly 12,000 ms on our side. It would be great if you could add SFTP without changing anything for all other users 💯
Hi @Krielkip, thanks much for the perf reports, I reopened the ticket. Do you have instructions for setting up PHPStorm with this SFTP setup? If so, I can test this locally and will definitely merge in if we can get this working as desired.
Sorry for my late response.
I have used the example from this project: https://dockware.io/assets/docs/img/sftp-connection.png
The user account & password can both be magento :)
More information: https://www.jetbrains.com/help/phpstorm/uploading-and-downloading-files.html
@Krielkip no worries, I mute GitHub notifications now so I'm only on here once a week anyways. Thank you for the info, I will look into this.
@Krielkip I think I'll need some additional input/debugging from you.
I created a new Docker image for SSH, creating a Dockerfile with the contents:
```dockerfile
FROM debian:buster-slim
RUN apt-get update && apt-get install -y ssh
RUN groupadd -g 1000 app \
  && useradd -g 1000 -u 1000 -d /var/www -s /bin/bash app
RUN echo 'app:app' | chpasswd
# sshd needs its privilege-separation directory; note that running
# "service ssh start" at build time would not persist into the container
RUN mkdir -p /run/sshd
EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
```
Then build it with: `docker build -t markoshust/ssh .`
This creates an SSH server image with the username & password both being app.
Then, add these lines to the docker-compose.dev.yml file:

```yaml
...
  ssh:
    image: markoshust/ssh
    ports:
      - "22:22"
```
Then after running bin/start, you'll have the SSH server running. I have verified this by connecting to it with PhpStorm:
However, when I right click on a file, then go to Deployment > Upload to server:
I get a "[12/5/20, 10:49 AM] Upload to m241.test failed: could not list the contents of folder "sftp://localhost/". (No such file)" error:
I'm using the instructions at https://www.jetbrains.com/help/phpstorm/uploading-and-downloading-files.html#551d8
I think we are so close, but I'm not sure why it's not receiving the directory contents. Any help would be appreciated.
Ah, it may be because I didn't prime the entire filesystem on the container. Let me try a completely new install with everything residing on the container volume and retest.
Yes, I think you are missing the volumes :) I will also try it with your instructions.
```yaml
ssh:
  image: markoshust/ssh
  ports:
    - "22:22"
  volumes: *appvolumes
```
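For context on the `*appvolumes` syntax: it is a plain YAML alias, so it only works if the matching `&appvolumes` anchor is defined earlier in the same file. A minimal illustration (the volume path is a placeholder, not the project's real list):

```yaml
services:
  app:
    volumes: &appvolumes          # defines the anchor on app's volume list
      - ./src:/var/www/html:cached
  phpfpm:
    volumes: *appvolumes          # alias: reuses the exact same list
  ssh:
    volumes: *appvolumes          # the ssh container sees the same files
```

Without the alias, the ssh container has an empty filesystem, which would explain the "could not list the contents of folder" error above.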
Hello @markshust ,
I have tested it with your post. My docker-compose.yml:
```yaml
services:
  app:
    image: markoshust/magento-nginx:1.13-7
    ports:
      - "80:8000"
      - "443:8443"
    links:
      - db
      - phpfpm
      - ssh
  ...
  ssh:
    image: markoshust/ssh
    ports:
      - "22:22"
    volumes: *appvolumes
  ...
```
So I have edited it: added your ssh image, added the volumes to your code, and linked the ssh service to the nginx container.
My Remote Browser in PHPStorm;
And a trial remote upload:
I forgot; this is my docker-compose.dev.yml:
```yaml
version: "3"
services:
  app:
    volumes: &appvolumes
      - ./src/nginx.conf.sample:/var/www/html/nginx.conf:cached
  phpfpm:
    volumes: *appvolumes
  mailhog:
    image: mailhog/mailhog
    ports:
      - "1025"
      - "8025:8025"
```
Maybe copying nginx.conf.sample onto the server, instead of mounting it, would be another optimization.
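Baking the config into the image instead of bind-mounting it could look something like this — a hypothetical Dockerfile fragment, with the base image and paths assumed from the compose file above:

```dockerfile
# Hypothetical: extend the nginx image and bake the config in at build time,
# so no host mount (and no mount-sync cost) is needed at runtime.
FROM markoshust/magento-nginx:1.13-7
COPY src/nginx.conf.sample /var/www/html/nginx.conf
```

The trade-off is that config changes then require an image rebuild rather than a simple restart.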
I have done all this and everything seems to be working, but for some reason seconds after saving a file it's just completely emptied in the remote and then on the host. Does anyone know any reason for this to happen? I'm really struggling to figure out what exactly is happening.
@joaolisboa I believe this is a really old ticket that was never closed.
Have you seen https://github.com/markshust/docker-magento/#ssh? If possible, can you try those instructions and see if it works better than anything defined in this ticket? Once confirmed I can close this one out.
> I have done all this and everything seems to be working, but for some reason seconds after saving a file it's just completely emptied in the remote and then on the host. Does anyone know any reason for this to happen? I'm really struggling to figure out what exactly is happening.
This looks like some files are still being synced in your Docker setup. This ticket is rather old, as markshust said.
In 2021 I used the SSH setup from this ticket a lot, so the solution worked 👍
I need to find my 2021 files and share the yml I used. But then again, this project has been updated since.
Thanks. I'm going to close this out, but if you are still having issues feel free to leave another comment 👍
Description: The wish is to add (S)FTP support to the Docker setup, to upload the files into the Docker container instead of mirroring them. With Dockware (Shopware) this gives some performance wins over the cached way of syncing.