Closed by jeanconn 9 years ago
There is a way: dynamically look for apps and conditionally add them to INSTALLED_APPS and urls.py. It's not that hard, but what worries me is that this leaves the production site vulnerable to silent breakage when failed imports are ignored. In that case something like the find_attitude utility might not be available, and we wouldn't know without actually trying it.
So then you need more code to distinguish between test and production, and it starts looking like too much.
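The "dynamically look for apps" idea above could be sketched roughly as below. This is a hedged illustration, not the project's actual settings code: the app names are made up, and `find_spec` is just one way to probe importability. Note it also illustrates the stated worry: a missing app is silently skipped unless you log it.

```python
import importlib.util

def available_apps(candidate_apps):
    """Return the subset of candidate_apps whose top-level package imports.

    Sketch of conditionally extending INSTALLED_APPS: settings.py would
    add only the apps that are actually importable, so a missing optional
    app (e.g. find_attitude) does not crash the whole site. App names
    here are illustrative, not the project's real list.
    """
    found = []
    for app in candidate_apps:
        top_level = app.split(".")[0]
        # find_spec returns None when the package cannot be located,
        # without raising ImportError for a top-level name.
        if importlib.util.find_spec(top_level) is not None:
            found.append(app)
    return found

# "json" is importable everywhere; "no_such_app_xyz" stands in for a
# missing optional app. A real settings.py would also want to log the
# apps that were skipped, so breakage is visible.
print(available_apps(["json", "no_such_app_xyz"]))
```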
Closed by 55c61deb9e, which outlines a procedure that should reliably allow imports of all dependencies.
Which channel is supposed to provide pyyaks for the conda test setup directions?
That would be ska, but of course that channel is out of date. So the "low-fidelity" test is not really an option now, I think.
> That would be ska, but of course that channel is out of date. So the "low-fidelity" test is not really an option now, I think.
For the time being, I was thinking about updating the directions to just suggest cloning a root dev ska.
The three words clone, root, and dev seem a bit contradictory and leave me confused. Do you mean more like "make a dev ska"?
> The three words clone, root, and dev seem a bit contradictory and leave me confused. Do you mean more like "make a dev ska"?
I meant

```
conda create -n mykadi --clone root
```

from my dev ska, so I'd still have a flight-copy dev ska but would make an env in an area I can write to.
I didn't realize that cloning like this also gets the pip-installed packages. (I don't understand how that can work, but I believe it does.) So your process will work, but the more I think about it the more I prefer option 1, since it faithfully replicates how the deployed site is actually run: PYTHONPATH + flight Ska. Using your dev ska means trusting that it is up to date.
> I didn't realize that cloning like this also gets the pip-installed packages. (I don't understand how that can work, but I believe it does.)
I was also surprised, but when I saw that it seemed to work with the pip-installed packages I was sold on it as a possible solution.
> So your process will work, but the more I think the more I prefer option 1 since it faithfully replicates how the deployed site is actually run using PYTHONPATH + flight Ska. Using your dev ska means trusting that it is up to date.
Right, but that's not that much of a stretch if my plan is to keep the root dev ska at flight and do any playing and updates in separate conda environments.
> Right, but that's not that much of a stretch if my plan is to keep the root dev ska at flight and do any playing and updates in separate conda environments.
OK, but then you need to add a step to the procedure: explicitly do a full update of the dev ska root at the time of testing, to be sure. For instance, I might have updated a package in skare by a GitHub edit and forgotten to mention it.
> OK, but you need to add to the procedure to explicitly do a full update of the dev ska root at the time of doing testing in order to be sure. For instance I might have updated a package in skare by github edit and forgotten to mention it.
Do we have notes for how to do a full update? A "make all" isn't going to work anyway, right, because anaconda doesn't have an overwrite install mode. Did you have in mind something like
```
conda install --file=pkgs.conda
make python_modules
```
as a process?
What you wrote is probably right. I would still vote for using PYTHONPATH in this particular case as being simpler, faster, and more accurate. Just install the web branches of kadi and mica into the PYTHONPATH dir and it's done.
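The PYTHONPATH mechanism relied on here can be demonstrated in isolation: a module placed in a PYTHONPATH directory is found ahead of anything later on sys.path, which is why a dev checkout there shadows the flight Ska copy. This is a minimal sketch, with `kadi_stub` as a made-up stand-in name; the real step would be installing the web branches of kadi and mica into that directory.

```python
import os
import subprocess
import sys
import tempfile

# Write a stand-in "dev checkout" module into a temporary directory,
# then import it from a subprocess whose PYTHONPATH points there.
with tempfile.TemporaryDirectory() as dev_dir:
    with open(os.path.join(dev_dir, "kadi_stub.py"), "w") as fh:
        fh.write("VERSION = 'web-branch'\n")
    env = dict(os.environ, PYTHONPATH=dev_dir)
    out = subprocess.run(
        [sys.executable, "-c", "import kadi_stub; print(kadi_stub.VERSION)"],
        env=env,
        capture_output=True,
        text=True,
    )
    # The subprocess resolves kadi_stub from the PYTHONPATH dir first.
    print(out.stdout.strip())  # web-branch
```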
I made some updates in d59e6d8.
Is there a way to keep the web branch running even if the runtime environment doesn't have all the requested apps?