Closed: smattis closed this pull request 4 years ago.
Merging #382 into master will decrease coverage by 1.90%. The diff coverage is 78.26%.
@@ Coverage Diff @@
## master #382 +/- ##
==========================================
- Coverage 80.08% 78.18% -1.91%
==========================================
Files 23 23
Lines 4394 4263 -131
==========================================
- Hits 3519 3333 -186
- Misses 875 930 +55
| Impacted Files | Coverage Δ | |
|---|---|---|
| bet/Comm.py | 82.00% <ø> (ø) | |
| bet/__init__.py | 100.00% <ø> (ø) | |
| bet/postProcess/__init__.py | 100.00% <ø> (ø) | |
| bet/postProcess/plotVoronoi.py | 81.15% <ø> (ø) | |
| bet/postProcess/postTools.py | 75.00% <ø> (ø) | |
| bet/sampling/LpGeneralizedSamples.py | 94.73% <ø> (ø) | |
| bet/sensitivity/__init__.py | 100.00% <ø> (ø) | |
| bet/surrogates.py | 77.90% <ø> (ø) | |
| bet/postProcess/plotP.py | 69.78% <62.31%> (+7.03%) | :arrow_up: |
| bet/util.py | 69.23% <66.66%> (-2.38%) | :arrow_down: |
| ... and 19 more | | |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b30b664...03e52c0.
@eecsu All ready for you.
I feel hesitant to bring this up, but I'm having trouble getting the tests to pass. I ended up building a Docker container to mirror what Travis is doing, but something seems to be fighting success. I even tried the very specific `git checkout -qf 191b09b18e693b7d7ec228924c2bb49055dfbc62` and it didn't make a difference.
```dockerfile
FROM python:3.7-slim
USER root

# need git to install from source
# this install pattern keeps layers small
RUN apt-get update && \
    apt-get install -y \
        git \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* \
    && apt-get autoremove -y

# formatting tools for scripts
RUN pip install \
        pyprind \
        autopep8 \
        black \
        pytest-cov \
        codecov Sphinx sphinx_rtd_theme \
    && rm -rf /tmp/* /var/tmp/*

# parallelism
RUN apt-get update && \
    apt-get install -y \
        gfortran \
        libblas-dev \
        liblapack-dev \
        mpich \
        libmpich-dev \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* \
    && apt-get autoremove -y

### Install package from source
# bet - dependencies for inverse problem
ENV GIT_ACCOUNT=UT-CHG
ENV CLONE_BRANCH=v3-dev
ENV REPO_NAME=bet
ENV DISPLAY=:99.0
RUN cd /tmp && \
    git clone --single-branch --branch ${CLONE_BRANCH} https://github.com/${GIT_ACCOUNT}/${REPO_NAME}.git --depth=1 && \
    cd ${REPO_NAME} && \
    git checkout -qf 191b09b18e693b7d7ec228924c2bb49055dfbc62 && \
    pip install git+https://github.com/CU-Denver-UQ/LUQ && \
    pip install .
RUN cd /tmp/bet && pytest --cov=./bet/ ./test/
WORKDIR /tmp

# overwrite default python from fenics container (only needed if quay.io is source)
#RUN ln -f /usr/bin/python3 /usr/bin/python
#RUN ln -f /usr/local/bin/pip3 /usr/local/bin/pip

# a bare ENTRYPOINT is a build error; an empty array resets it
ENTRYPOINT []
CMD ["python"]
```
The error I'm seeing is `TypeError: 'numpy.float64' object cannot be interpreted as an integer`, so it's something silly like `linspace` not getting the right type.
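The same class of failure can be reproduced without numpy at all; the sketch below is only an illustration of the error class (the variable names are made up, this is not BET code). Division always produces a float, even when the result "looks" integral, and integer-only APIs such as `range` (or `num=` in newer numpy's `linspace`) reject it:

```python
# Illustrative only: reproduce the "cannot be interpreted as an integer"
# error class in pure Python. n_samples is a hypothetical name.
n_samples = 1000 / 10          # -> 100.0, a float, even though it looks integral

try:
    list(range(n_samples))     # integer-only API rejects the float
except TypeError as exc:
    print(exc)                 # 'float' object cannot be interpreted as an integer

samples = list(range(int(n_samples)))  # an explicit cast is the usual fix
print(len(samples))            # 100
```

The fix in the library would be the same: cast the computed count with `int(...)` before handing it to an integer-only argument.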
Tracking it down: it seems to be in the basic sampling tests, regular Voronoi, something in the common setup files. Honestly this wouldn't be an issue if Python enforced types more strictly. I have no idea why Travis is fine but this Docker image, which should be pretty much the same thing, is not.
I also built on top of `ubuntu:latest` and installed python3 fresh; same problems. I think it's just one or two spots where an `int` needs to be enforced. Also, I ran into this easy fix (towards the bottom):
```dockerfile
#FROM python:3.7-slim
FROM ubuntu:latest
USER root
ENV DEBIAN_FRONTEND="noninteractive"
ENV TZ=America/Denver

# need git to install from source
# this install pattern keeps layers small
RUN apt-get update && \
    apt-get install -y \
        git \
        python3-dev python3 python3-pip \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* \
    && apt-get autoremove -y

# overwrite default python from fenics container (only needed if using non-python source image)
RUN ln -f "$(which python3)" /usr/local/bin/python
RUN ln -f "$(which pip3)" /usr/local/bin/pip

# formatting tools for scripts
RUN pip install \
        pyprind \
        autopep8 \
        black \
        pytest-cov \
        codecov Sphinx sphinx_rtd_theme \
    && rm -rf /tmp/* /var/tmp/*

# parallelism
RUN apt-get update && \
    apt-get install -y \
        gfortran \
        libblas-dev \
        liblapack-dev \
        mpich \
        libmpich-dev \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* \
    && apt-get autoremove -y

### Install package from source
# bet - dependencies for inverse problem
ENV GIT_ACCOUNT=UT-CHG
ENV CLONE_BRANCH=v3-dev
ENV REPO_NAME=bet
ENV DISPLAY=:99.0
RUN cd /tmp && \
    git clone --single-branch --branch ${CLONE_BRANCH} https://github.com/${GIT_ACCOUNT}/${REPO_NAME}.git --depth=1 && \
    cd ${REPO_NAME} && \
    pip install git+https://github.com/CU-Denver-UQ/LUQ && \
    pip install .
RUN cd /tmp/bet && pytest --cov=./bet/ ./test/
WORKDIR /tmp

# a bare ENTRYPOINT is a build error; an empty array resets it
ENTRYPOINT []
CMD ["python"]
```
Travis is using Xenial (Ubuntu 16.04), so the versions of all the C libraries could be different. When I tried to build from Xenial, it didn't like that it ships with Python 3.5, since 3.6 is the minimum version (and I'm not using conda/virtualenv to manage a version). Those are the differences I can think of. But the workflow I proposed in the Dockerfile seems reasonable to me, starting with the newest OS available. I'll try a virtualenv in Xenial tomorrow to really mirror Travis, but I do think something needs to change for these tests to pass. This install pattern works on the master branch right now with no problem.
@mathematicalmichael I believe that #392 should take care of all of the problems. The new release of numpy a few days ago removed some stuff that was deprecated and throwing warnings, but I was ignoring for the time being. I thought that the deprecation period would last much longer than it did. Travis is using a cached version of numpy I think, so it didn't catch it yet, but it would eventually. I don't think it is necessary to use multiple CI services. It can overly complicate things.
@smattis I agree. I just want to get it installed and working on my desktop and thought the CI would help. The default Actions workflow I used was just closer to my Dockerfile than Travis was, so I thought it could be useful to have GitHub checking on my behalf automatically.
Okay, everything seems to be working now... hopefully I don't jinx it.
Cool, got my sanity check; now I can start playing with it in earnest. Thanks for addressing the bugs so quickly tonight, @smattis.
Also, that makes sense. Nothing (and I mean nothing) I do with testing relies on cached files. I've been burned too many times lol
Actual user testing of the measure methods:

- [ ] `FEniCS/BET_multiple_serial_models_script.py` fails on `AttributeError`. (I'm using fenics installed with `conda install fenics`, which I assume is how most people will get it.) That said, they've largely dropped support for updates to the project and are starting over with dolfinx. Don't even get me started on trying to build it from source... I have, and the dependency lags are bad, so it's understandable that they scrap a lot of the complexity and start over. My point is, I wouldn't worry about this example. Drop the fenics examples altogether and use something simple like pyPDE for "complex" PDE examples (yes, it's less accurate because it's finite differences rather than finite elements, but it's lightweight and demonstrates the same point).
- [ ] `FEniCS/BET_script.py` fails on `AttributeError` (the rest of the Python files in `FEniCS` run fine).
- [ ] `fromFile_ADCIRCMap` (all examples) fails for a simple reason: the scripts expect `../Q_ND.mat`, but the files present are `./Q_ND`. `mv *.mat ..` lets them all run successfully.
- [ ] `linearMap/BET_linearMapUniformSampling.py` (same in the `nonlinear` folders and in `validationExample`): suggest saving `png` instead of `eps`, as many warnings are thrown, and it freezes up my system due to lack of transparency support in the matplotlib backend (I think it's throwing one warning for every scatterplot point or something and overwhelming stdout). Some images are saved to `figs/` subdirectories and others aren't; I think all should be. I'd be willing to fix it, easy change.

I don't think these need addressing for merge, though.

Useful one-liner: `for p in *.py; do python "$p"; done`
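A slightly more robust version of that loop, sketched here in Python (`run_examples` is a hypothetical helper, assuming all example scripts sit in one directory), keeps going past failures and reports them at the end instead of stopping at the first broken script:

```python
import subprocess
import sys
from pathlib import Path

def run_examples(directory="."):
    """Run every .py script in `directory`; return the names of those that fail."""
    failures = []
    for script in sorted(Path(directory).glob("*.py")):
        # run each example in its own interpreter, mirroring the shell loop
        result = subprocess.run([sys.executable, str(script)])
        if result.returncode != 0:
            failures.append(script.name)
    return failures

# usage: print(run_examples("examples/linearMap"))
```

This is handy when triaging a batch of examples, since one `AttributeError` doesn't hide the results of the rest.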
As a general comment in relation to something Michael brought up, I agree that the plotting file-extensions should all be .png for the examples. Better for anyone using this in windows.
Handled in #393
@eecsu Should be ready for you to look at
@eecsu See https://github.com/UT-CHG/BET/compare/af59f09...03e52c0 for a diff between the current version of the branch and the last commit before your review.
This PR contains all of the updates for the soft release of version 3.0. It mostly maintains backward compatibility with version 2.0, but that is not guaranteed, especially in cases of saving and loading.

Major updates are:

- Densities can now be defined on `bet.sample.sample_set_base` objects by Voronoi tessellations, random variables with certain statistics, Gaussian mixture models, Gaussians, or weighted Kernel Density Estimates, supporting density-based inversion in `bet.calculateP.calculateR`.
- `bet.sampling.adaptiveSampling` is completely removed. Closes #366.
- `bet.sampling.basicSampling` is completely overhauled to handle any random variable of type `scipy.stats.rv_continuous`. Closes #362.
- New `save_object` and `load_object` functions in `bet.util`. Closes #360.
- Tests are run with `pytest`. Closes #364.
- `bet.postProcess.compareP` is overhauled with new functionality and support for many types of sets. Comparison of marginals is added.
- The package is installable with `pip`. Closes #356. Versions for requirements are enforced and work. Closes #357.
- Support for `LUQ` is added with `bet.sampling.useLUQ`; `LUQ`-type cluster data structures are supported for density-based inversion.
- `compute_QoI_and_create_discretization` is deprecated. Going forward, use `compute_qoi_and_create_discretization`.
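For the last item, a common way to keep an old name working during a deprecation window is a thin shim that warns and forwards to the new name. The sketch below is illustrative only, not BET's actual implementation; the body of the new function is a placeholder:

```python
import warnings

def compute_qoi_and_create_discretization(*args, **kwargs):
    """Stand-in for the new function (the real one lives in BET)."""
    return ("discretization", args, kwargs)   # placeholder return value

def compute_QoI_and_create_discretization(*args, **kwargs):
    """Deprecated alias: warn once per call site, then forward to the new name."""
    warnings.warn(
        "compute_QoI_and_create_discretization is deprecated; "
        "use compute_qoi_and_create_discretization instead",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not at this shim
    )
    return compute_qoi_and_create_discretization(*args, **kwargs)
```

Callers of the old name keep working but see a `DeprecationWarning`, which gives downstream users a release cycle to migrate.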