Closed: jordic closed this issue 4 years ago
Dockerfile fails
Removing intermediate container a539cb7e61e5
---> 1c55da0171a4
Step 11/12 : COPY . /app
---> e444d1c19eb3
Step 12/12 : RUN pip install /app
---> Running in ef12fa8581bf
Processing /app
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
ERROR: Complete output from command /usr/local/bin/python /usr/local/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /tmp/tmp_uqb6lmu:
ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 207, in <module>
main()
File "/usr/local/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 197, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/usr/local/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py", line 54, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/pip-build-env-m_p85jvo/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 148, in get_requires_for_build_wheel
config_settings, requirements=['wheel'])
File "/tmp/pip-build-env-m_p85jvo/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 128, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-m_p85jvo/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 250, in run_setup
self).run_setup(setup_script=setup_script)
File "/tmp/pip-build-env-m_p85jvo/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 143, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 7, in <module>
long_description = open("README.rst").read() + "\n"
FileNotFoundError: [Errno 2] No such file or directory: 'README.rst'
----------------------------------------
Getting requirements to build wheel: finished with status 'error'
ERROR: Command "/usr/local/bin/python /usr/local/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /tmp/tmp_uqb6lmu" failed with error code 1 in /tmp/pip-req-build-f04y9rl8
WARNING: You are using pip version 19.1.1, however version 20.0.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Removing intermediate container ef12fa8581bf
The command '/bin/sh -c pip install /app' returned a non-zero code: 1
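The traceback above shows `setup.py` opening `README.rst` unconditionally, so the install fails whenever the file is missing from the build context. A minimal defensive sketch of that read follows; the helper name and the empty-string fallback are my own illustrative choices, not the project's actual code:

```python
# Sketch of a setup.py guard against a missing README.rst
# (e.g. when .dockerignore excludes it from the build context).
# The helper name and fallback value are hypothetical.
import os


def read_long_description(path="README.rst"):
    """Return the README text, or "" when the file is absent."""
    if os.path.exists(path):
        with open(path) as f:
            return f.read() + "\n"
    return ""


# In setup.py one would then, hypothetically, call:
#   setup(..., long_description=read_long_description())
```

With this guard, excluding the README from the Docker build context degrades the package metadata instead of breaking `pip install /app` entirely.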
@svx excluding README.rst via .dockerignore breaks `pip install`. I don't think a two-stage build justifies excluding a file as small as README.rst.
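For context, this is the shape of `.dockerignore` that triggers the failure when the Dockerfile does `COPY . /app` followed by `pip install /app` (entries here are illustrative, not the project's actual file):

```
# .dockerignore (sketch)
.git
__pycache__
*.pyc
# README.rst   <- must NOT be listed here if setup.py opens it
```

Anything ignored here is absent from the image's `/app`, so any file `setup.py` reads at build time has to stay out of this list.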
IMHO, generating the Docker image in GitHub Actions instead of Docker Hub makes it possible to check, at each commit, that the resulting image builds and works...
EDIT: now I see the dockerci would also run from GitHub.
I think the use cases are different. You need a Docker image to test guillotina_react, and that image is going to be hosted on Docker Hub, so I think the Docker Hub service itself is great. GitHub Actions are fast and great for testing. Keep things simple to maintain :)
We build and pack the app, and later use this image to run all the tests on it. At least this way we can catch an image that doesn't build.. :/
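That build-then-test flow could be sketched as a GitHub Actions workflow like the one below; the workflow name, image tag, and smoke-test command are all assumptions on my part, not the project's actual CI:

```yaml
# Sketch of a build-check workflow; all names are placeholders.
name: docker-build-check
on:
  push:
    branches: [master]
  pull_request:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build the image
        run: docker build -t guillotina:ci .
      - name: Smoke-test the image
        run: docker run --rm guillotina:ci guillotina --help
```

Running this on every push would have surfaced the README.rst breakage at the commit that introduced it, rather than at release time.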
@bloodbare Yes! Sorry for breaking the image builds! I just created a PR for master. I was a bit too fast and forgot to check setup.py.
A clash between two worlds: what is common in Python is not always common in the container world.
Nevertheless, I should have checked better, sorry again!
@jordic Builds are only done on tag and on master; you should never merge something broken to master, let alone tag it.
Closing, as there is already a docker image in the plone namespace.
Sorry, but I think building something (and using the artifact to run tests) is different from publishing and tagging that artifact :)
Agree on the second part, but this is something that happens from time to time on guillotina, mostly because we don't have a proper e2e testing story, and things get merged without a proper QA process. That's why we want something automated that detects failures as soon as possible...
How can we know that the tag is built? (Can we rely on guillotina:latest?)
Personally I always use version/build numbers and do not trust latest 😄
Yes, it's only to run guillotina_tests on :latest
Do we have any kind of automatic docker image builds? (On every commit to master, or on every new release)
We need it for guillotina_react, to be able to launch Guillotina (ideally) from a Docker container and run some Cypress tests against it.
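A rough sketch of that setup, assuming a published image; the image name, port, and test invocation below are all hypothetical placeholders, not the project's actual commands:

```
# Hypothetical flow: start Guillotina from an image, wait for it,
# run the Cypress suite against it, then clean up.
docker run -d --name g-e2e -p 8080:8080 guillotina/guillotina:latest
until curl -sf http://localhost:8080/ > /dev/null; do sleep 1; done
npx cypress run --config baseUrl=http://localhost:8080
docker rm -f g-e2e
```

The readiness loop matters: without it, Cypress may start before the server accepts connections and the run fails spuriously.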