Closed by gpsaggese 1 year ago
I'm ready to work on this issue. Is the skeleton in place for this, @gpsaggese @jsmerix @PomazkinG? I'm not sure where to start.
Basically, the problem we are trying to solve is: we want new contributors to be able to run the unit tests successfully, which is also one of the items on the #189 checklist. However, some of the unit tests depend on AWS, which is not yet configured for new contributors.
We may need to modify `pytest.ini` in the root of the repository to add more markers for segregating the test functions, and then use the `pytest -m <marker>` command to check whether the configuration works. You will get a clearer idea once you learn how the ini file and the pytest markers work together.
Am I right @gpsaggese?
Correct.
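As a sketch of what this could look like (the marker names below are hypothetical examples, not markers that already exist in the repo), `pytest.ini` registers the markers and individual tests are then tagged with `@pytest.mark.<marker>`:

```ini
# pytest.ini (repo root): register custom markers so `pytest -m` can
# select on them and pytest doesn't warn about unknown marks.
[pytest]
markers =
    requires_aws: test needs configured AWS credentials
    requires_docker: test needs a running Docker daemon
```

Then `pytest -m "not requires_aws"` deselects the AWS-dependent tests.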
1) I would run the tests in the usual way, e.g., `i run_fast_tests`
@DanilYachmenev / @samarth9008 can we assign someone to convert the gdoc about running tests? https://docs.google.com/document/d/1M8I2qt5CuCw7537_1yayeZhIfh_VoeHT26h5jgSEm3Q/edit Maybe we can file a bug for @aryananwar since it's part of this work.
2) Find which tests fail on a private laptop and categorize (e.g., "need AWS", "need XYZ")
3) Then we can plan what test lists we need (ideally we want a negative one like `skip_tests_requiring_AWS`, so that if you don't specify the marker everything works fine).
4) If the command line becomes too complicated, we can wrap it into an invoke target or use a special option in our existing command.
@jsmerix can you please keep an eye on this / help with it, since you know the system?
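As a sketch of point 4, a small wrapper could assemble the marker expression so contributors don't have to type it by hand (the marker names here are hypothetical):

```python
def build_pytest_cmd(skip_markers):
    # Turn a list of markers to exclude into a pytest command line,
    # e.g. ["requires_aws"] -> pytest -m "not requires_aws".
    if not skip_markers:
        return "pytest"
    expr = " and ".join(f"not {m}" for m in skip_markers)
    return f'pytest -m "{expr}"'
```

With no markers to skip, everything runs; each extra marker narrows the selection.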
@aryananwar How is the progress?
Hi, sorry for the late reply; I haven't been able to get started yet. I will begin on Wednesday. I'm not sure if you saw my previous email: since I'm currently working full time, I can't work on this daily.
I understand, but you yourself said you were ready to work on an issue, so we thought you would dedicate some time to it.
So I think my understanding is that:
- Run the tests
- Determine which tests fail on a private laptop (just running them on my laptop?)
Yes
- Based on the reason why the tests fail, create markers in pytest.ini that describe the dependency (e.g., `requires_aws`, `requires_xyz`)
- Update the functions in test/test_tasks.py with the correct markers.
Not just in this file; you have to update the markers in all the files whose tests failed.
- Run the tests with the markers to determine if working correctly.
- Create negative markers (so users can run only the tests that work on their machine)
- I'm not sure how to use invoke or what it means, however.
Learn what invoke is in Python and how we have used it in our code base. Basically, invoke is used to run tasks that are defined as functions.
Is this the correct idea, or am I missing something else?
Yes.
Also, I don't have access to the gdoc for running tests
Request it.
Hi, when running `i run_fast_tests`, I am seeing this error
This is the summary:
I have tried re-pulling the master branch as well as just doing `pip install brownie`; is there anything else that could be wrong?
I think the issue is related to the fact that we should skip the tests in defi in Sorrentum for now. E.g., those are tests that should go in the defi test list, which should be excluded in some builds.
INFO: > cmd='/venv/bin/pytest -m not slow and not superslow . -o timeout_func_only=true --timeout 5 --reruns 2 --only-rerun Failed: Timeout'
report_memory_usage=False report_cpu_usage=False
INFO: Saving log to file 'tmp.pytest.log'
collected 2531 items / 6 errors / 185 deselected / 2340 selected
Adding @jsmerix and @tamriq
This change in pytest.ini should fix the problem by skipping the problematic tests
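The actual change isn't shown here, but one way `pytest.ini` can express this kind of exclusion is (a sketch, assuming the defi tests live under `defi/`; the real fix may differ):

```ini
[pytest]
# Keep pytest from collecting the defi tree in builds that lack its deps.
addopts = --ignore=defi
```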
@jsmerix are you seeing the same problems when running the regressions? Also, in an ideal world we should control which tests are executed (and parsed) through test lists, and we shouldn't have to tweak what pytest parses. Not sure how to fix this; pytest has a mechanism for handling this kind of issue: https://docs.pytest.org/en/latest/how-to/skipping.html#skipping-on-a-missing-import-dependency
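As a sketch of that mechanism: per the linked docs, one can call `pytest.importorskip("pyarrow")` at the top of a test module, or feed a condition like the one below to `pytest.mark.skipif` (`module_available` is a hypothetical helper, not existing repo code):

```python
import importlib.util

def module_available(name):
    # True if `name` is importable in the current environment.
    return importlib.util.find_spec(name) is not None

# With pytest this would become, e.g.:
#   requires_pyarrow = pytest.mark.skipif(
#       not module_available("pyarrow"), reason="pyarrow not installed")
```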
I'll do that, thank you!
How can I determine why a test has failed? I'm failing some tests, getting some errors, skips, and reruns. I'm not sure which tests to work on or why they are failing.
So for this you need to run a test in bash with the following pattern:
> pytest {test_file_name}::{test_class_name}::{particular_test_name} -s --dbg
e.g.
pytest defi/tulip/test/test_order_matching.py::TestMatchOrders1::test1 -s --dbg
`-s --dbg` will give you detailed logs of the test.
We're currently in the process of publishing documentation on unit tests, but for now use the approach above and read carefully https://github.com/sorrentum/sorrentum/blob/master/docs/Coding_Style_Guide.md#unit-tests
Thank you, I see. After running these on some of the tests, I'm getting some ModuleNotFoundError failures. Should I just skip over these, or mark them with some sort of dependency? For example, helpers/test/test_hparquet.py::TestListAndMergePqFiles::test_list_and_merge_pq_files_duplicate_drop says the pyarrow module is missing.
> @jsmerix are you seeing the same problems when running the regressions?
I have fixed this in https://github.com/sorrentum/sorrentum/pull/325
> I'm getting some ModuleNotFoundError failures, should I just skip over these? or mark these with some sort of dependency? for example, with helpers/test/test_hparquet.py::TestListAndMergePqFiles::test_list_and_merge_pq_files_duplicate_drop. it says missing pyarrow module
The general convention is to post a trace (e.g., the last 10-15 lines before the error) and explain how you ran it, whether you are inside the sorrentum container, etc.
@aryananwar What's the update on this? Any kind of ETA?
@aryananwar Since this is a high-priority task and we need someone who can contribute to it ASAP, we will assign a new task to you.
Hi, sorry about the late response and the late work. I would appreciate being reassigned to something that can be worked on over a longer period of time, if possible. Again, sorry about that.
After activating the dev environment:
(base) alejandros-MacBook-Pro:sorrentum1 jandro$ source dev_scripts/setenv_amp.sh
I receive output:
# Activate virtual env '/Users/jandro/src/venv/amp.client_venv/bin/activate'
which python=/Users/jandro/src/venv/amp.client_venv/bin/python
python -v=Python 3.9.13
# Set path
PATH=
/Users/jandro/src/sorrentum1/dev_scripts/testing
/Users/jandro/src/sorrentum1/dev_scripts/notebooks
/Users/jandro/src/sorrentum1/dev_scripts/install
/Users/jandro/src/sorrentum1/dev_scripts/infra
/Users/jandro/src/sorrentum1/dev_scripts/git
/Users/jandro/src/sorrentum1/dev_scripts/aws
/Users/jandro/src/sorrentum1/dev_scripts
/Users/jandro/src/sorrentum1
.
/Users/jandro/src/venv/amp.client_venv/bin
/Users/jandro/.local/share/solana/install/active_release/bin
/Users/jandro/.cargo/bin
/opt/anaconda3/bin
/opt/anaconda3/condabin
/Applications/Docker.app/Contents/Resources/bin
/usr/local/bin
/usr/bin
/bin
/usr/sbin
/sbin
/Library/TeX/texbin
# Set PYTHONPATH
PYTHONPATH=
/Users/jandro/src/sorrentum1
# Configure env
which gh=
AM_AWS_PROFILE=am
AM_AWS_S3_BUCKET=alphamatic-data
AM_ECR_BASE_PATH=665840871993.dkr.ecr.us-east-1.amazonaws.com
CK_AWS_PROFILE=ck
CK_AWS_S3_BUCKET=cryptokaizen-data
TERM_PROGRAM_VERSION=3.4.19
alias i='invoke'
alias ih='invoke --help'
alias il='invoke --list'
alias it='invoke traceback'
alias itpb='pbpaste | traceback_to_cfile.py -i - -o cfile'
==> SUCCESS <==
I then run the script for obtaining a branch name from an issue number:
(amp.client_venv) (base) alejandros-MacBook-Pro:sorrentum1 jandro$ i gh_issue_title -i 283
and receive output:
INFO: > cmd='/Users/jandro/src/venv/amp.client_venv/bin/invoke gh_issue_title -i 283'
report_memory_usage=False report_cpu_usage=False
## gh_issue_title: issue_id='283', repo_short_name='current'
## gh_login:
11:33:58 - INFO lib_tasks_gh.py gh_login:48 account='sorrentum'
11:33:58 - WARN lib_tasks_gh.py gh_login:56 Can't find file '/Users/jandro/.ssh/id_rsa.sorrentum.github'
11:33:58 - WARN lib_tasks_gh.py gh_login:67 Can't find file '/Users/jandro/.ssh/github_pat.sorrentum.txt'
11:33:59 - ERROR hsystem.py _system:272
################################################################################
cmd='(gh issue view 283 --repo github.com/sorrentum/sorrentum --json title,url) 2>&1' failed with rc='127'
################################################################################
Output of the failing command is:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
/bin/bash: gh: command not found
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Traceback (most recent call last):
File "/Users/jandro/src/venv/amp.client_venv/bin/invoke", line 8, in <module>
sys.exit(program.run())
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/invoke/program.py", line 398, in run
self.execute()
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/invoke/program.py", line 583, in execute
executor.execute(*self.tasks)
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/invoke/executor.py", line 140, in execute
result = call.task(*args, **call.kwargs)
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/invoke/tasks.py", line 138, in __call__
result = self.body(*args, **kwargs)
File "/Users/jandro/src/sorrentum1/helpers/lib_tasks_gh.py", line 340, in gh_issue_title
title, url = _get_gh_issue_title(issue_id, repo_short_name)
File "/Users/jandro/src/sorrentum1/helpers/lib_tasks_gh.py", line 299, in _get_gh_issue_title
_, txt = hsystem.system_to_string(cmd)
File "/Users/jandro/src/sorrentum1/helpers/hsystem.py", line 343, in system_to_string
rc, output = _system(
File "/Users/jandro/src/sorrentum1/helpers/hsystem.py", line 276, in _system
raise RuntimeError(
RuntimeError: cmd='(gh issue view 283 --repo github.com/sorrentum/sorrentum --json title,url) 2>&1' failed with rc='127'
truncated output=
/bin/bash: gh: command not found
I suspect this error is related to switching my default shell to bash instead of zsh using the following command:
chsh -s /bin/bash
Any ideas here on how to fix?
@alejandroBallesterosC seems like the issue may be connected with this: https://github.com/sorrentum/sorrentum/pull/324#discussion_r1227361105; let us work it out.
but since this is not the point of your issue, for now use the following pattern:
SorrTask{issue_number}_{issue_name_through_underscores}
So for this one it is
SorrTask283_Create_test_list_to_run_with_Sorrentum
You need to install `gh`, which is the CLI version of GitHub.
Do we have instructions on how to install / configure it? Otherwise you can just check the instructions on the web and / or pick a branch name manually.
Since I had to change my default shell to bash, I also had to add some paths to my .bash_profile, including my Homebrew bin, and then install the GitHub CLI through Homebrew and authenticate via web.
After doing that and running:
(base) alejandros-MacBook-Pro:sorrentum1 jandro$ source dev_scripts/setenv_amp.sh
I get the following output:
# Activate virtual env '/Users/jandro/src/venv/amp.client_venv/bin/activate'
which python=/Users/jandro/src/venv/amp.client_venv/bin/python
python -v=Python 3.9.13
# Set path
PATH=
/Users/jandro/src/sorrentum1/dev_scripts/testing
/Users/jandro/src/sorrentum1/dev_scripts/notebooks
/Users/jandro/src/sorrentum1/dev_scripts/install
/Users/jandro/src/sorrentum1/dev_scripts/infra
/Users/jandro/src/sorrentum1/dev_scripts/git
/Users/jandro/src/sorrentum1/dev_scripts/aws
/Users/jandro/src/sorrentum1/dev_scripts
/Users/jandro/src/sorrentum1
.
/Users/jandro/src/venv/amp.client_venv/bin
/opt/homebrew/sbin
/opt/homebrew/bin
/usr/local/bin
/Users/jandro/.local/share/solana/install/active_release/bin
/Users/jandro/.cargo/bin
/opt/anaconda3/bin
/opt/anaconda3/condabin
/Applications/Docker.app/Contents/Resources/bin
/usr/bin
/bin
/usr/sbin
/sbin
/Library/TeX/texbin
# Set PYTHONPATH
PYTHONPATH=
/Users/jandro/src/sorrentum1
# Configure env
which gh=/opt/homebrew/bin/gh
AM_AWS_PROFILE=am
AM_AWS_S3_BUCKET=alphamatic-data
AM_ECR_BASE_PATH=665840871993.dkr.ecr.us-east-1.amazonaws.com
CK_AWS_PROFILE=ck
CK_AWS_S3_BUCKET=cryptokaizen-data
TERM_PROGRAM_VERSION=3.4.19
alias i='invoke'
alias ih='invoke --help'
alias il='invoke --list'
alias it='invoke traceback'
alias itpb='pbpaste | traceback_to_cfile.py -i - -o cfile'
==> SUCCESS <==
and, finally, after running:
(amp.client_venv) (base) alejandros-MacBook-Pro:sorrentum1 jandro$ i gh_issue_title -i 283
I receive this output:
INFO: > cmd='/Users/jandro/src/venv/amp.client_venv/bin/invoke gh_issue_title -i 283'
report_memory_usage=False report_cpu_usage=False
## gh_issue_title: issue_id='283', repo_short_name='current'
## gh_login:
12:02:44 - INFO lib_tasks_gh.py gh_login:48 account='sorrentum'
12:02:44 - WARN lib_tasks_gh.py gh_login:56 Can't find file '/Users/jandro/.ssh/id_rsa.sorrentum.github'
12:02:44 - WARN lib_tasks_gh.py gh_login:67 Can't find file '/Users/jandro/.ssh/github_pat.sorrentum.txt'
# Copied to system clipboard:
CmTask283_Create_test_list_to_run_with_Sorrentum: https://github.com/sorrentum/sorrentum/issues/283
Very good. Looks correct to me
For a starting reference point on the status of local tests, I ran:
i run_fast_tests
and got this output:
========================================================= short test summary info =========================================================
SKIPPED [1] defi/tulip/implementation/optimize.py:18: could not import 'pulp': No module named 'pulp'
SKIPPED [7] im_v2/test/test_im_lib_tasks.py: CMTask #789.
SKIPPED [1] im_v2/test/test_im_lib_tasks.py:241: amp #1189
SKIPPED [1] test/test_tasks.py:81: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:63: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:88: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:75: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:67: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:71: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:55: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:59: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:141: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:102: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:106: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:161: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:131: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:148: Test needs to be run outside Docker
SKIPPED [1] test/test_tasks.py:153: Test needs to be run outside Docker
SKIPPED [1] optimizer/test/test_single_period_optimization.py:82: CmTask #1607 Flaky opt tests fail.
SKIPPED [1] optimizer/test/test_single_period_optimization.py:93: CmTask #1607 Flaky opt tests fail.
SKIPPED [1] optimizer/test/test_single_period_optimization.py:128: Fails with cvxpy.error.SolverError: Solver 'OSQP' failed.
SKIPPED [1] optimizer/test/test_single_period_optimization.py:262: TODO(gp): @Paul test asserting.
SKIPPED [1] oms/test/test_target_position_and_order_generator.py: AmpTask1786_Integrate_20230104
SKIPPED [1] oms/test/test_process_forecasts_opt.py:52: Generate manually files used by other tests
SKIPPED [1] oms/test/test_process_forecasts_opt.py:244: CmTask #1607 Flaky opt tests fail.
SKIPPED [1] oms/test/test_process_forecasts_.py:601: This test times out because nothing interesting happens after the first set of orders.
SKIPPED [1] oms/test/test_pnl_simulator.py:432: For performance measurement
SKIPPED [1] oms/test/test_oms_db.py: Run manually to clean up the DB
SKIPPED [1] oms/test/test_ccxt_broker.py: Run manually.
SKIPPED [1] oms/test/test_call_optimizer.py: Cannot update the goldens, see CmTask1357.
SKIPPED [1] oms/test/test_api.py:162: unconditional skip
SKIPPED [1] oms/test/test_api.py:191: unconditional skip
SKIPPED [1] market_data/test/test_im_client_market_data.py:196: CmTask #1633 Add unit test for universe in MarketData.
SKIPPED [1] market_data/test/test_im_client_market_data.py:712: CmTask #1633 Add unit test for universe in MarketData.
SKIPPED [6] im_v2/talos/data/extract/test/test_talos_extractor.py: Talos as a vendor is deprecated.
SKIPPED [2] im_v2/talos/data/extract/test/test_talos_download_historical_data.py: Talos as a vendor is deprecated.
SKIPPED [1] im_v2/ig/data/client/test/test_taq_bars_utils.py:303: This is used to generate the frozen input
SKIPPED [4] im_v2/common/data/transform/test/test_transform_pq_by_date_to_by_asset.py: TODO(gp): Need to update this tests after transform v1.3
SKIPPED [1] im_v2/common/data/transform/test/test_convert_csv_to_pq.py:60: CmTask1305: after removing circular dependencies in `hio.from_file`, this test fails reading a parquet file
SKIPPED [1] im_v2/common/data/extract/test/test_extract_utils.py:539: CMTask2089 and CmTask3359
SKIPPED [1] im_v2/common/data/extract/test/test_extract_utils.py:877: CMTask2089
SKIPPED [1] im_v2/common/data/extract/test/test_extract_utils.py:954: Cannot be run from the US due to 451 error API error. Run manually.
SKIPPED [1] im_v2/common/data/extract/test/test_extract_utils.py:971: Cannot be run from the US due to 451 error API error. Run manually.
SKIPPED [1] im_v2/common/data/extract/test/test_extract_utils.py: File '/home/.aws/credentials' doesn't exist
SKIPPED [1] im_v2/common/data/client/test/test_historical_pq_clients.py:274: Enable after Lime477
SKIPPED [1] im_v2/common/data/client/test/test_historical_pq_clients.py:284: Enable after Lime477
SKIPPED [1] im_v2/common/data/client/test/test_historical_pq_clients.py:204: CMTask1510: Faulty symbol not detected.
SKIPPED [1] im_v2/ccxt/data/extract/test/test_download_exchange_data_to_db_periodically.py: Replace with smoke test in CmTask #2083
SKIPPED [1] im_v2/ccxt/data/extract/test/test_download_exchange_data_to_db_exchange.py:68: Cannot be run from the US due to 451 error API error. Run manually.
SKIPPED [1] im_v2/ccxt/data/client/test/test_ccxt_clients.py:1289: Run only in //amp
SKIPPED [1] im_v2/ccxt/data/client/test/test_ccxt_clients.py:1302: Run only in //amp
SKIPPED [1] im_v2/ccxt/data/client/test/test_ccxt_clients.py:1315: Run only in //amp
SKIPPED [1] im_v2/ccxt/data/client/test/test_ccxt_clients.py:1381: Enable when unit test data needs to be generated.
SKIPPED [11] im/kibot/test/test_kibot_sql_writer_backend.py: CmTask666
SKIPPED [1] im/kibot/metadata/test/test_load.py:47: Disabled waiting for PTask4139
SKIPPED [1] im/kibot/metadata/test/test_load.py:66: Disabled waiting for PTask4139
SKIPPED [10] im/kibot/data/load/test/test_sql_data_loader.py: CmTask666
SKIPPED [1] im/kibot/data/load/test/test_s3_data_loader.py:24: Not implemented yet
SKIPPED [11] im/ib/test/test_ib_sql_writer_backend.py: CmTask666
SKIPPED [4] im/ib/data/transform/test/test_transform.py: CmTask666
SKIPPED [1] im/ib/connect/test/test_im_tasks.py: unconditional skip
SKIPPED [1] helpers/test/test_unit_test.py:345: This is only used to debug the debugging the infrastructure
SKIPPED [1] helpers/test/test_repo_config_amp.py:36: Only run in amp as supermodule
SKIPPED [1] helpers/test/test_repo_config_amp.py:47: Only run in amp as submodule
SKIPPED [1] helpers/test/test_repo_config_amp.py:190: Run only in //amp
SKIPPED [1] helpers/hunit_test_utils.py:400: Only run in CI
SKIPPED [2] helpers/hunit_test_utils.py:394: Only run on amp and not sorrentum
SKIPPED [3] helpers/test/test_open.py: See cryptomtc/cmamp#321
SKIPPED [1] helpers/test/test_lib_tasks_utils.py:43: This test makes sense for a branch
SKIPPED [1] helpers/test/test_lib_tasks_pytest.py:83: Only run in amp
SKIPPED [1] helpers/test/test_lib_tasks_gh.py:20: CmTask #2362.
SKIPPED [1] helpers/test/test_lib_tasks_gh.py:32: CmampTask #683.
SKIPPED [1] helpers/test/test_lib_tasks_find.py:180: Only run in amp
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:80: Only run in amp as submodule
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:108: Only run in amp as submodule
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:144: Only run in amp as submodule
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:174: Only run in amp as submodule
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:210: Only run in amp as supermodule
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:243: It changes a Docker file creating permission issues
SKIPPED [1] helpers/test/test_lib_tasks_docker.py:279: Only run in amp as submodule
SKIPPED [1] helpers/test/test_lib_tasks.py:234: AmpTask1347: Add support for mocking `system*()` functions to unit test
SKIPPED [1] helpers/test/test_lib_tasks.py:226: AmpTask1347: Add support for mocking `system*()` functions to unit test
SKIPPED [1] helpers/test/test_lib_tasks.py:218: AmpTask1347: Add support for mocking `system*()` functions to unit test
SKIPPED [1] helpers/test/test_lib_tasks.py:194: This is actually run deleting files
SKIPPED [1] helpers/test/test_lib_tasks.py:189: This is actually run
SKIPPED [1] helpers/test/test_lib_tasks.py:184: This is actually run
SKIPPED [1] helpers/test/test_joblib_helpers.py: Just for experimenting with joblib
SKIPPED [1] helpers/test/test_hparquet.py:774: CmTask1305: after removing circular dependencies in `hio.from_file`, this test fails reading a parquet file
SKIPPED [27] helpers/test/test_hpandas.py: Used for comparing speed of different trimming methods (CmTask1404).
SKIPPED [1] helpers/test/test_git.py:217: Run only in amp as super-module
SKIPPED [1] helpers/test/test_git.py:229: Run only in amp as sub-module
SKIPPED [1] helpers/test/test_cache.py:723: See CMTask #952.
SKIPPED [1] dev_scripts/infra/test/test_all.py: unconditional skip
SKIPPED [1] dev_scripts/git/git_hooks/test/test_install_hooks.py:21: There are no Git credentials inside Docker
SKIPPED [1] dataflow/system/test/test_real_time_runner.py:131: Too slow for real time
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:259: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:280: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:359: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:381: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:403: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:428: unconditional skip
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:502: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_volatility_models.py:519: See CmTask #2975.
SKIPPED [1] dataflow/core/nodes/test/test_sarimax_models.py:39: cmamp #654.
SKIPPED [1] dataflow/core/nodes/test/test_regression_models.py:36: This test fails on some computers due to AmpTask1649
SKIPPED [1] dataflow/core/nodes/test/test_regression_models.py:19: This test generates the input data
SKIPPED [1] core/test/test_features.py:527: Apparent instability
SKIPPED [1] core/test/test_features.py:534: Apparent instability
SKIPPED [1] core/test/test_features.py:541: Apparent instability
SKIPPED [1] core/test/test_features.py:573: Apparent instability
SKIPPED [1] core/test/test_features.py:586: Apparent instability
SKIPPED [1] core/test/test_explore.py:25: https://github.com/.../.../issues/3676
SKIPPED [1] core/test/test_data_adapters.py:146: Disabled because of PTask2440
SKIPPED [1] core/test/test_data_adapters.py:161: Disabled because of PTask2440
SKIPPED [1] core/test/test_data_adapters.py:118: Disabled because of PTask2440
SKIPPED [1] core/test/test_data_adapters.py:177: Disabled because of PTask2440
SKIPPED [1] core/test/test_data_adapters.py:132: Disabled because of PTask2440
SKIPPED [1] core/test/test_backtest.py:27: Disabled because of PTask2440
SKIPPED [1] core/test/test_backtest.py:69: Disabled because of PTask2440
SKIPPED [1] core/test/test_backtest.py:111: Disabled because of PTask2440
SKIPPED [1] core/statistics/test/test_requires_statsmodels.py:271: cmamp #654.
SKIPPED [1] core/statistics/test/test_requires_statsmodels.py:284: cmamp #654.
SKIPPED [1] core/statistics/test/test_requires_statsmodels.py:297: cmamp #654.
SKIPPED [1] core/statistics/test/test_requires_statsmodels.py:304: cmamp #654.
SKIPPED [1] core/statistics/test/test_requires_statsmodels.py:316: cmamp #654.
SKIPPED [1] core/statistics/test/test_regression.py:46: This test fails on some computers due to AmpTask1649
SKIPPED [1] core/statistics/test/test_regression.py:17: This test generates the input data
SKIPPED [1] core/statistics/test/test_regression.py:137: This test fails on some computers due to AmpTask1649
SKIPPED [1] core/statistics/test/test_regression.py:108: This test generates the input data
SKIPPED [1] core/config/test/test_config.py:521: See AmpTask1573
XFAIL core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2
XFAIL core/statistics/test/test_requires_statsmodels.py::TestMultiTTest::test7
ERROR oms/test/test_restrictions.py::TestRestrictions1::test2 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_postg...
ERROR oms/test/test_process_forecasts_.py::TestMockedProcessForecasts1::test_mocked_system1 - RuntimeError: cmd='(docker container ls --...
ERROR oms/test/test_process_forecasts_.py::TestMockedProcessForecasts2::test_mocked_system4 - RuntimeError: cmd='(docker container ls --...
ERROR oms/test/test_portfolio_example.py::Test_Portfolio_builders2::test1 - RuntimeError: cmd='(docker container ls --filter name=/compo...
ERROR oms/test/test_portfolio.py::TestDatabasePortfolio1::test2 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_pos...
ERROR oms/test/test_portfolio.py::TestDatabasePortfolio2::test1 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_pos...
ERROR oms/test/test_portfolio.py::TestDatabasePortfolio3::test1 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_pos...
ERROR oms/test/test_order_processor.py::TestOrderProcessor1::test_submit_order_and_timeout2 - RuntimeError: cmd='(docker container ls --...
ERROR oms/test/test_broker_example.py::Test_Broker_builders2::test1 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms...
ERROR oms/test/test_broker.py::TestDatabaseBroker1::test1 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_postgres1...
ERROR market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_twap_price1 - RuntimeError: cmd='(docker contain...
ERROR im_v2/talos/data/client/test/test_talos_clients.py::TestTalosSqlRealTimeImClient1::test_round_start_timestamp_behavior - RuntimeEr...
ERROR im_v2/talos/data/client/test/test_talos_clients.py::TestMockSqlRealTimeImClient1::test_read_data7 - RuntimeError: cmd='(docker con...
ERROR im_v2/ccxt/data/extract/test/test_compare_realtime_and_historical_data.py::TestCompareRealtimeAndHistoricalData1::test_parser - Ru...
ERROR im_v2/ccxt/data/client/test/test_ccxt_clients.py::TestCcxtSqlRealTimeImClient1::test_read_data7 - RuntimeError: cmd='(docker conta...
FAILED oms/test/test_ccxt_broker.py::TestCcxtBroker1::test_log_into_exchange3 - AssertionError: Tuples differ: (('binance.preprod.tradin...
FAILED market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_data_at_timestamp1 - TypeError: get_mock_realti...
FAILED market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_data_for_interval1 - TypeError: get_mock_realti...
FAILED market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_data_for_interval2 - TypeError: get_mock_realti...
FAILED market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_data_for_last_period1 - TypeError: get_mock_rea...
FAILED market_data/test/test_real_time_market_data.py::TestRealTimeMarketData2::test_get_twap_price1 - TypeError: get_mock_realtime_clie...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_at_timestamp1 - TypeError: get_DataFrameIm...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_interval2 - TypeError: get_DataFrameIm...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_interval3 - TypeError: get_DataFrameIm...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_interval4 - TypeError: get_DataFrameIm...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_interval5 - TypeError: get_DataFrameIm...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period1 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period2 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period3 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period4 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period5 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period6 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_data_for_last_period7 - TypeError: get_DataFram...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_last_end_time1 - TypeError: get_DataFrameImClie...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_last_price1 - TypeError: get_DataFrameImClient_...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_get_twap_price1 - TypeError: get_DataFrameImClient_...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_is_online1 - TypeError: get_DataFrameImClient_examp...
FAILED market_data/test/test_im_client_market_data.py::TestImClientMarketData2::test_should_be_online1 - TypeError: get_DataFrameImClien...
FAILED im_v2/talos/data/client/test/test_talos_clients.py::TestTalosHistoricalPqByTileClient2::test_get_end_ts_for_symbol1 - configparse...
FAILED im_v2/talos/data/client/test/test_talos_clients.py::TestTalosHistoricalPqByTileClient2::test_get_start_ts_for_symbol1 - configpar...
FAILED im_v2/talos/data/client/test/test_talos_clients.py::TestTalosHistoricalPqByTileClient2::test_read_data1 - configparser.NoSectionE...
FAILED im_v2/talos/data/client/test/test_talos_clients.py::TestTalosHistoricalPqByTileClient2::test_read_data6 - configparser.NoSectionE...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestTaqBarsUtils1::test_filter_dates1 - configparser.NoSectionError: No section...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestTaqBarsUtils1::test_filter_dates2 - configparser.NoSectionError: No section...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestTaqBarsUtils1::test_filter_dates3 - configparser.NoSectionError: No section...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestTaqBarsUtils1::test_filter_dates4 - configparser.NoSectionError: No section...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestTaqBarsUtils1::test_get_available_dates1 - configparser.NoSectionError: No ...
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestGetBarData1::test1 - RuntimeError: date=2019-01-07
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::TestGetBarData1::test2 - RuntimeError: date=2019-01-07
FAILED im_v2/ig/data/client/test/test_taq_bars_utils.py::Test_get_cached_bar_data_for_date_interval1::test_tsla1 - configparser.NoSectio...
FAILED im_v2/ig/data/client/test/test_ig_historical_pq_by_date_taq_bar_client.py::TestIgHistoricalPqByDateTaqBarClient1::test_read_data1
FAILED im_v2/ig/data/client/test/test_ig_historical_pq_by_date_taq_bar_client.py::TestIgHistoricalPqByDateTaqBarClient1::test_read_data2
FAILED im_v2/common/data/extract/test/test_extract_utils.py::TestDownloadHistoricalData1::test_empty_dataset - configparser.NoSectionErr...
FAILED im_v2/common/data/client/test/test_im_raw_data_client.py::TestImRawDataClient::test_build_s3_pq_file_path1 - AttributeError: modu...
FAILED im_v2/common/data/client/test/test_im_raw_data_client.py::TestImRawDataClient::test_build_s3_pq_file_path2 - AttributeError: modu...
FAILED im_v2/common/data/client/test/test_im_raw_data_client.py::TestImRawDataClient::test_build_s3_pq_file_path3 - AttributeError: modu...
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_filter_columns1 - Failed: Timeou...
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_filter_columns2 - Failed: Timeou...
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_get_end_ts_for_symbol1 - Failed:...
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_get_start_ts_for_symbol1 - Faile...
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data1 - Failed: Timeout >5.0s
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data2 - Failed: Timeout >5.0s
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data3 - Failed: Timeout >5.0s
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data4 - Failed: Timeout >5.0s
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data5 - Failed: Timeout >5.0s
FAILED im_v2/common/data/client/test/test_historical_pq_clients.py::TestHistoricalPqByTileClient1::test_read_data7 - Failed: Timeout >5.0s
FAILED im_v2/ccxt/db/test/test_archive_db_data_to_s3.py::TestArchiveDbDataToS3Mode::test_archive_and_delete_mode - AttributeError: modul...
FAILED im_v2/ccxt/db/test/test_archive_db_data_to_s3.py::TestArchiveDbDataToS3Mode::test_archive_only_mode - AttributeError: module 'dat...
FAILED im_v2/ccxt/db/test/test_archive_db_data_to_s3.py::TestArchiveDbDataToS3Mode::test_delete_only_mode - AttributeError: module 'data...
FAILED im_v2/ccxt/data/client/test/test_ccxt_clients.py::TestCcxtHistoricalPqByTileClient1::test_get_end_ts_for_symbol1 - configparser.N...
FAILED im_v2/ccxt/data/client/test/test_ccxt_clients.py::TestCcxtHistoricalPqByTileClient1::test_get_start_ts_for_symbol1 - configparser...
FAILED im_v2/ccxt/data/client/test/test_ccxt_clients.py::TestCcxtHistoricalPqByTileClient1::test_read_data1 - configparser.NoSectionErro...
FAILED im_v2/ccxt/data/client/test/test_ccxt_clients.py::TestCcxtHistoricalPqByTileClient1::test_read_data6 - configparser.NoSectionErro...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_expiry_contract_slow1 - configparser.NoSectionError: N...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_futures_slow1 - configparser.NoSectionError: No sectio...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_futures_slow2 - configparser.NoSectionError: No sectio...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_metadata_slow1 - configparser.NoSectionError: No secti...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_metadata_slow2 - configparser.NoSectionError: No secti...
FAILED im/kibot/metadata/test/test_kibot_metadata.py::TestKibotMetadata::test_get_metadata_slow3 - configparser.NoSectionError: No secti...
FAILED im/kibot/data/load/test/test_s3_data_loader.py::TestKibotS3DataLoader::test1 - configparser.NoSectionError: No section: 'am'
FAILED im/kibot/data/extract/test/test_kibot_data_download.py::TestKibotDownload::test_extract_payload_links - Failed: Timeout >5.0s
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_dtypes1 - configparser.NoSectionError: No section: 'am'
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_read_data1 - configparser.NoSectionError: No section: 'am'
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_read_data2 - configparser.NoSectionError: No section: 'am'
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_read_data3 - configparser.NoSectionError: No section: 'am'
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_read_data_check_date_type - configparser.NoSectionError: N...
FAILED im/ib/data/load/test/test_s3_data_loader.py::TestS3IbDataLoader1::test_read_data_with_start_end_ts - configparser.NoSectionError:...
FAILED im/ib/data/load/test/test_file_path_generator.py::TestIbFilePathGenerator::test_get_latest_symbols_file1 - configparser.NoSection...
FAILED helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2 - RuntimeError:
FAILED helpers/test/test_unit_test.py::Test_purify_from_env_vars::test_end_to_end - RuntimeError:
FAILED helpers/test/test_s3.py::Test_s3_get_credentials1::test1 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_s3.py::Test_s3_1::test_exists1 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_s3.py::Test_s3_1::test_exists2 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_s3.py::Test_s3_1::test_exists3 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_s3.py::Test_s3_1::test_glob1 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_s3.py::Test_s3_1::test_ls1 - configparser.NoSectionError: No section: 'am'
FAILED helpers/test/test_playback.py::TestPlaybackFileMode1::test1 - Failed: Timeout >5.0s
FAILED helpers/test/test_playback.py::TestPlaybackFileMode1::test2 - Failed: Timeout >5.0s
FAILED helpers/test/test_playback.py::TestPlaybackFileMode1::test3 - Failed: Timeout >5.0s
FAILED helpers/test/test_lib_tasks_gh.py::TestLibTasks1::test_get_gh_issue_title4 - RuntimeError: cmd='(gh auth status) 2>&1' failed wit...
FAILED helpers/test/test_lib_tasks.py::TestGhLogin1::test_gh_login - RuntimeError: cmd='(gh auth status) 2>&1' failed with rc='1'
FAILED helpers/test/test_joblib_helpers.py::Test_parallel_execute1::test_parallel_loky2 - Failed: Timeout >5.0s
FAILED helpers/test/test_joblib_helpers.py::Test_parallel_execute3::test_parallel_loky2 - Failed: Timeout >5.0s
FAILED helpers/test/test_hsecrets.py::TestCreateClient::test_create_client1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hsecrets.py::TestGetSecret::test_get_secret - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hsecrets.py::TestStoreSecret::test_store_secret1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestToFileAndFromFile1::test_from_file_invalid1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestToFileAndFromFile1::test_to_file_and_from_file1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestToFileAndFromFile1::test_to_file_and_from_file2 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestToFileAndFromFile1::test_to_file_invalid1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestListdir1::test_listdir1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestListdir1::test_listdir2 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestListdir1::test_listdir3 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestListdir1::test_listdir4 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hs3.py::TestDu1::test_du1 - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hparquet.py::TestListAndMergePqFiles::test_list_and_merge_pq_files - configparser.NoSectionError: No section: 'ck'
FAILED helpers/test/test_hparquet.py::TestListAndMergePqFiles::test_list_and_merge_pq_files_duplicate_drop - configparser.NoSectionError...
FAILED helpers/test/test_hpandas.py::TestReadDataFromS3::test_read_csv1 - configparser.NoSectionError: No section: 'am'
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_System_CheckConfig::test_freeze_config1 - TypeError: get...
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_NonTime_ForecastSystem_FitPredict::test_fit_over_backtest_period1
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_NonTime_ForecastSystem_FitPredict::test_fit_over_period1
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_NonTime_ForecastSystem_FitPredict::test_fit_vs_predict1
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_NonTime_ForecastSystem_FitInvariance::test_invariance1
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_NonTime_ForecastSystem_CheckPnl::test_fit_run1 - TypeErr...
FAILED dataflow_amp/system/mock1/test/test_mock1_forecast_system.py::Test_Mock1_Time_ForecastSystem1::test1 - Failed: Timeout >5.0s
FAILED dataflow/system/test/test_real_time_runner.py::TestRealTimeDagRunner1::test_simulated_replayed_time1 - Failed: Timeout >5.0s
FAILED dataflow/model/test/test_tiled_flows.py::Test_evaluate_weighted_forecasts::test_combine_two_signals - Failed: Timeout >5.0s
FAILED dataflow/model/test/test_model_plotter.py::TestModelPlotter1::test_model_selection1 - Failed: Timeout >5.0s
FAILED dataflow/model/test/test_model_evaluator.py::TestModelEvaluator1::test_calculate_stats1 - Failed: Timeout >5.0s
FAILED dataflow/core/nodes/test/test_volatility_models.py::TestMultiindexVolatilityModel::test1 - RuntimeError:
FAILED dataflow/core/nodes/test/test_volatility_models.py::TestMultiindexVolatilityModel::test2 - RuntimeError:
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_compare_to_linear_regression1 - Failed: Timeout...
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_compare_to_linear_regression2 - Failed: Timeout...
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_fit_with_constant1 - Failed: Timeout >5.0s
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_predict_different_intervals1 - Failed: Timeout ...
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_predict_different_intervals_no_x1 - Failed: Tim...
FAILED dataflow/core/nodes/test/test_sarimax_models.py::TestContinuousSarimaxModel::test_predict_with_nan - Failed: Timeout >5.0s
FAILED dataflow/core/nodes/test/test_gluonts_models.py::TestDeepARGlobalModel::test_fit1 - Failed: Timeout >5.0s
FAILED dataflow/core/nodes/test/test_gluonts_models.py::TestDeepARGlobalModel::test_fit_dag1 - Failed: Timeout >5.0s
FAILED core/config/test/test_config.py::Test_from_env_var1::test1 - Failed: Timeout >5.0s
=============== 125 failed, 2022 passed, 206 skipped, 187 deselected, 2 xfailed, 15 errors, 61 rerun in 1047.69s (0:17:27) ================
12:31:01 @ 2023-06-19 08:24:22 - INFO hcache.py clear_global_cache:292 Before clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=28.0 KB
12:31:01 @ 2023-06-19 08:24:22 - WARN hcache.py clear_global_cache:293 Resetting 'global mem' cache '/mnt/tmpfs/tmp.cache.mem'
12:31:01 @ 2023-06-19 08:24:22 - WARN hcache.py clear_global_cache:303 Destroying '/mnt/tmpfs/tmp.cache.mem' ...
12:31:01 @ 2023-06-19 08:24:22 - INFO hcache.py clear_global_cache:319 After clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=nan
ERROR: 1
When running:
(amp.client_venv) (base) alejandros-MacBook-Pro:sorrentum1 jandro$ pytest core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2 -s --dbg
The following is the resulting output:
WARNING: No module named 'matplotlib'
==================================================================================================================================================== test session starts =====================================================================================================================================================
platform darwin -- Python 3.9.13, pytest-7.3.2, pluggy-1.0.0 -- /Users/jandro/src/venv/amp.client_venv/bin/python3
cachedir: .pytest_cache
rootdir: /Users/jandro/src/sorrentum1
configfile: pytest.ini
plugins: instafail-0.5.0, cov-4.1.0, xdist-3.3.1, anyio-3.7.0
collected 0 items / 1 error
=========================================================================================================================================================== ERRORS ===========================================================================================================================================================
_____________________________________________________________________________________________________________________________ ERROR collecting core/statistics/test/test_requires_statsmodels.py _____________________________________________________________________________________________________________________________
ImportError while importing test module '/Users/jandro/src/sorrentum1/core/statistics/test/test_requires_statsmodels.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../venv/amp.client_venv/lib/python3.9/site-packages/_pytest/python.py:617: in _importtestmodule
mod = import_path(self.path, mode=importmode, root=self.config.rootpath)
../venv/amp.client_venv/lib/python3.9/site-packages/_pytest/pathlib.py:564: in import_path
importlib.import_module(module_name)
/opt/anaconda3/lib/python3.9/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1030: in _gcd_import
???
<frozen importlib._bootstrap>:1007: in _find_and_load
???
<frozen importlib._bootstrap>:972: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:228: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1030: in _gcd_import
???
<frozen importlib._bootstrap>:1007: in _find_and_load
???
<frozen importlib._bootstrap>:972: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:228: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1030: in _gcd_import
???
<frozen importlib._bootstrap>:1007: in _find_and_load
???
<frozen importlib._bootstrap>:986: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:680: in _load_unlocked
???
<frozen importlib._bootstrap_external>:850: in exec_module
???
<frozen importlib._bootstrap>:228: in _call_with_frames_removed
???
core/statistics/__init__.py:7: in <module>
from core.statistics.binning import * # pylint: disable=unused-import # NOQA
core/statistics/__init__.py:7: in <module>
from core.statistics.binning import * # pylint: disable=unused-import # NOQA
core/statistics/binning.py:12: in <module>
import scipy as sp
E ModuleNotFoundError: No module named 'scipy'
================================================================================================================================================== short test summary info ===================================================================================================================================================
ERROR core/statistics/test/test_requires_statsmodels.py
====================================================================================================================================================== 1 error in 0.10s ======================================================================================================================================================
ERROR: not found: /Users/jandro/src/sorrentum1/core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2
(no name '/Users/jandro/src/sorrentum1/core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2' in any of [<Module test_requires_statsmodels.py>])
Upon exporting the virtual environment's dependencies to a text file with
pip freeze > requirements.txt
it is clear that scipy is not included in this development environment, which causes this test to fail.
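To categorize which tests can run on a bare laptop (step 2 in the plan above), it may help to first list which heavy dependencies the host environment is missing. A minimal sketch; the module list is illustrative, not exhaustive:

```python
# Check which optional heavy dependencies are importable on this host,
# without actually importing them (find_spec only probes for the module).
import importlib.util

deps = ["scipy", "statsmodels", "matplotlib", "mxnet", "gluonts"]
missing = sorted(m for m in deps if importlib.util.find_spec(m) is None)
print("missing on this host:", missing)
```

Any test module that imports something in `missing` will fail at collection time on the host, exactly as `test_requires_statsmodels.py` did here.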
@alejandroBallesterosC
Try executing `i docker_bash` before running the test. TL;DR: most of the libraries are installed only in the Docker environment, to keep our virtual env thin.
All correct. We develop and run tests inside Docker so that everyone has the same env.
Don't we have a doc explaining this approach somewhere? Maybe it was not converted to markdown yet?
Not yet; it was re-assigned since the previous student made no progress.
The current PR is close to done, so it can be used for now: https://github.com/sorrentum/sorrentum/pull/307
I first activate the development environment:
(base) alejandros-MacBook-Pro:sorrentum1 jandro$ source dev_scripts/setenv_amp.sh
Then I start the Docker container:
i docker_bash
I receive this output:
INFO: > cmd='/Users/jandro/src/venv/amp.client_venv/bin/invoke docker_bash'
report_memory_usage=False report_cpu_usage=False
## docker_bash:
22:49:39 - INFO lib_tasks_docker.py _docker_cmd:1253 Pulling the latest version of Docker
## docker_pull:
## docker_login:
...
... The config profile (ck) could not be found
22:49:40 - INFO lib_tasks_docker.py _docker_pull:226 image='sorrentum/cmamp:dev'
docker pull sorrentum/cmamp:dev
dev: Pulling from sorrentum/cmamp
Digest: sha256:7d9ee52407e426c8d0c6611bebb3a5e76bf05d504122aaa50bf6765dc500a2f7
Status: Image is up to date for sorrentum/cmamp:dev
docker.io/sorrentum/cmamp:dev
IMAGE=sorrentum/cmamp:dev \
docker-compose \
--file /Users/jandro/src/sorrentum1/devops/compose/docker-compose.yml \
--env-file devops/env/default.env \
run \
--rm \
--name jandro.cmamp.app.sorrentum1.20230620_224939 \
--user $(id -u):$(id -g) \
app \
bash
WARNING: The AM_AWS_ACCESS_KEY_ID variable is not set. Defaulting to a blank string.
WARNING: The AM_AWS_DEFAULT_REGION variable is not set. Defaulting to a blank string.
WARNING: The AM_AWS_SECRET_ACCESS_KEY variable is not set. Defaulting to a blank string.
WARNING: The AM_FORCE_TEST_FAIL variable is not set. Defaulting to a blank string.
WARNING: The AM_TELEGRAM_TOKEN variable is not set. Defaulting to a blank string.
WARNING: The CK_AWS_ACCESS_KEY_ID variable is not set. Defaulting to a blank string.
WARNING: The CK_AWS_DEFAULT_REGION variable is not set. Defaulting to a blank string.
WARNING: The CK_AWS_SECRET_ACCESS_KEY variable is not set. Defaulting to a blank string.
WARNING: The CK_TELEGRAM_TOKEN variable is not set. Defaulting to a blank string.
WARNING: Found orphan containers (compose-im_postgres762-1, compose-oms_postgres2901-1, compose-im_postgres4171-1, compose-oms_postgres2023-1, compose-oms_postgres7684-1, compose-oms_postgres1102-1, compose-oms_postgres8837-1, compose-oms_postgres4068-1, compose-oms_postgres3113-1, compose-im_postgres1489-1, compose-oms_postgres5019-1, compose-im_postgres7402-1, compose-oms_postgres7884-1, compose-oms_postgres1081-1, compose-im_postgres4960-1) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up.
Creating compose_app_run ... done
##> devops/docker_run/entrypoint.sh
UID=501
GID=20
# Activate environment
##> devops/docker_run/setenv.sh
# Set PATH
PATH=/app/documentation/scripts:/app/dev_scripts/testing:/app/dev_scripts/notebooks:/app/dev_scripts/install:/app/dev_scripts/infra:/app/dev_scripts/git:/app/dev_scripts/aws:/app/dev_scripts:/app:.:/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
# Set PYTHONPATH
PYTHONPATH=/app:
# Configure env
git --version: git version 2.25.1
/app
WARNING: AWS credential check failed: can't find /home/.aws/config file.
# Check AWS authentication setup
Name Value Type Location
---- ----- ---- --------
profile am manual --profile
The config profile (am) could not be found
AM_CONTAINER_VERSION='1.4.0'
which python: /venv/bin/python
python -V: Python 3.8.10
helpers: <module 'helpers' from '/app/helpers/__init__.py'>
PATH=/app/documentation/scripts:/app/dev_scripts/testing:/app/dev_scripts/notebooks:/app/dev_scripts/install:/app/dev_scripts/infra:/app/dev_scripts/git:/app/dev_scripts/aws:/app/dev_scripts:/app:.:/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
PYTHONPATH=/app:
entrypoint.sh: 'bash'
<jemalloc>: MADV_DONTNEED does not work (memset will be used instead)
<jemalloc>: (This is the expected behaviour if you are running under QEMU)
INFO: > cmd='/venv/bin/invoke print_env'
report_memory_usage=False report_cpu_usage=False
-----------------------------------------------------------------------------
This code is not in sync with the container:
code_version='1.4.3' != container_version='1.4.0'
-----------------------------------------------------------------------------
You need to:
- merge origin/master into your branch with `invoke git_merge_master`
- pull the latest container with `invoke docker_pull`
# Repo config:
# repo_config.config
enable_privileged_mode='False'
get_docker_base_image_name='cmamp'
get_docker_shared_group=''
get_docker_user=''
get_host_name='github.com'
get_html_dir_to_url_mapping='{'s3://cryptokaizen-html': 'http://172.30.2.44'}'
get_invalid_words='[]'
get_name='//cmamp'
get_repo_map='{'cm': 'sorrentum/sorrentum'}'
get_shared_data_dirs='None'
has_dind_support='False'
has_docker_sudo='True'
is_CK_S3_available='True'
run_docker_as_root='False'
skip_submodules_test='False'
use_docker_db_container_name_to_connect='True'
use_docker_network_mode_host='False'
use_docker_sibling_containers='True'
# hserver.config
is_AM_S3_available()='True'
is_dev4()='False'
is_dev_ck()='False'
is_inside_ci()='False'
is_inside_docker()='True'
is_mac(version='Catalina')='False'
is_mac(version='Monterey')='True'
is_mac(version='Ventura')='False'
# System signature:
# Git
branch_name='CmTask283_Create_test_list_to_run_with_Sorrentum'
hash='80fd8710c'
# Last commits:
* 80fd8710c GP Saggese Update doc ( 2 days ago) Mon Jun 19 11:37:21 2023 (HEAD -> CmTask283_Create_test_list_to_run_with_Sorrentum, origin/master, origin/SorrTask332_Convert_Code_Review_gDoc_to_Markdown, origin/HEAD, master)
* 77f612f75 Yiyun Lei SorrIssue244 CCXT timestamp representation unit test (#317) ( 2 days ago) Mon Jun 19 11:32:32 2023
* 7b381e68c KangmingL SorrTask270_Encryption_flow_for_models (#313) ( 4 days ago) Sat Jun 17 14:01:48 2023
# Machine info
system=Linux
node name=3c790bbec86b
release=5.15.49-linuxkit-pr
version=#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023
machine=x86_64
processor=x86_64
cpu count=5
cpu freq=None
memory=svmem(total=8232833024, available=6069235712, percent=26.3, used=1449070592, free=1683234816, active=1362522112, inactive=4474675200, buffers=109793280, cached=4990734336, shared=506937344, slab=565035008)
disk usage=sdiskusage(total=62671097856, used=13541773312, free=45912608768, percent=22.8)
# Packages
python: 3.8.10
cvxopt: 1.3.0
cvxpy: 1.2.2
gluonnlp: ?
gluonts: 0.6.7
joblib: 1.2.0
mxnet: 1.9.1
numpy: 1.23.4
pandas: 1.5.1
pyarrow: 10.0.0
scipy: 1.9.3
seaborn: 0.12.1
sklearn: 1.1.3
statsmodels: 0.13.5
# Env vars:
AM_AWS_ACCESS_KEY_ID=undef
AM_AWS_DEFAULT_REGION=undef
AM_AWS_PROFILE='am'
AM_AWS_S3_BUCKET='alphamatic-data'
AM_AWS_SECRET_ACCESS_KEY=undef
AM_ECR_BASE_PATH='665840871993.dkr.ecr.us-east-1.amazonaws.com'
AM_ENABLE_DIND='0'
AM_FORCE_TEST_FAIL=''
AM_HOST_NAME='alejandros-MacBook-Pro.local'
AM_HOST_OS_NAME='Darwin'
AM_HOST_USER_NAME='jandro'
AM_HOST_VERSION='21.6.0'
AM_REPO_CONFIG_CHECK='True'
AM_REPO_CONFIG_PATH=''
AM_TELEGRAM_TOKEN=empty
CI=''
CK_AWS_ACCESS_KEY_ID=empty
CK_AWS_DEFAULT_REGION=''
CK_AWS_S3_BUCKET='cryptokaizen-data'
CK_AWS_SECRET_ACCESS_KEY=empty
CK_ECR_BASE_PATH='sorrentum'
GH_ACTION_ACCESS_TOKEN=empty
22:50:04 - INFO hcache.py clear_global_cache:292 Before clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=32.0 KB
22:50:04 - WARN hcache.py clear_global_cache:293 Resetting 'global mem' cache '/mnt/tmpfs/tmp.cache.mem'
22:50:04 - WARN hcache.py clear_global_cache:303 Destroying '/mnt/tmpfs/tmp.cache.mem' ...
22:50:04 - INFO hcache.py clear_global_cache:319 After clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=nan
and then I run:
user_501@3c790bbec86b:/app$ pytest core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2 -s -dbg
and receive the following error:
bash: /app/pytest: Permission denied
any thoughts on how to set adequate permissions here?
@alejandroBallesterosC seems strange, can you run `which pytest` and `ls -la /app` and show the outputs?
Agree with Juraj. I think it’s a problem with the user / group id that is passed to the container.
So far we have only supported macOS on a few laptops with an up-to-date setup, so there might be some weird dependency on the OS version or the Docker version.
I run:
user_501@3c790bbec86b:/app$ which pytest
output:
/app/pytest
I run:
user_501@3c790bbec86b:/app$ ls -la /app
and I receive this output:
total 2292
drwxr-xr-x 56 user_501 dialout 1792 Jun 19 17:15 .
drwxr-xr-x 1 root root 4096 Jun 21 02:49 ..
-rw-r--r-- 1 user_501 dialout 227 Jun 16 15:55 .coveragerc
-rw-r--r-- 1 user_501 dialout 98 Jun 16 15:55 .dockerignore.dev
-rw-r--r-- 1 user_501 dialout 44 Jun 16 15:55 .dockerignore.prod
drwxr-xr-x 14 user_501 dialout 448 Jun 19 16:24 .git
drwxr-xr-x 4 user_501 dialout 128 Jun 16 15:55 .github
drwxr-xr-x 4 user_501 dialout 128 Jun 16 15:55 .github.OLD
-rw-r--r-- 1 user_501 dialout 83 Jun 19 15:23 .gitignore
-rw-r--r-- 1 user_501 dialout 90 Jun 16 15:55 .isort.cfg
drwxr-xr-x 6 user_501 dialout 192 Jun 19 16:31 .pytest_cache
-rw-r--r-- 1 user_501 dialout 36780 Jun 16 15:55 LICENSE
-rw-r--r-- 1 user_501 dialout 2853 Jun 16 15:55 README.md
-rw-r--r-- 1 user_501 dialout 0 Jun 16 15:55 __init__.py
drwxrwxr-x 4 user_501 dialout 128 Jun 19 17:15 __pycache__
-rw-r--r-- 1 user_501 dialout 135 Jun 19 16:16 actual.txt
-rw-r--r-- 1 user_501 dialout 1579 Jun 16 15:55 changelog.txt
drwxr-xr-x 8 user_501 dialout 256 Jun 16 15:55 ck_alembic
-rw-r--r-- 1 user_501 dialout 14117 Jun 16 15:55 code_organization.md
-rw-r--r-- 1 user_501 dialout 5222 Jun 16 15:55 conftest.py
drwxr-xr-x 27 user_501 dialout 864 Jun 19 17:23 core
drwxr-xr-x 8 user_501 dialout 256 Jun 16 15:55 data_schema
drwxr-xr-x 11 user_501 dialout 352 Jun 16 15:55 dataflow
drwxr-xr-x 5 user_501 dialout 160 Jun 16 15:55 dataflow_amp
drwxr-xr-x 14 user_501 dialout 448 Jun 16 15:55 defi
drwxr-xr-x 92 user_501 dialout 2944 Jun 19 15:23 dev_scripts
drwxr-xr-x 10 user_501 dialout 320 Jun 16 15:55 devops
drwxr-xr-x 9 user_501 dialout 288 Jun 16 15:55 docker_common
drwxr-xr-x 21 user_501 dialout 672 Jun 19 15:23 docs
drwxr-xr-x 8 user_501 dialout 256 Jun 16 15:55 documentation
-rw-r--r-- 1 user_501 dialout 135 Jun 19 16:16 expected.txt
drwxr-xr-x 81 user_501 dialout 2592 Jun 19 17:15 helpers
drwxr-xr-x 14 user_501 dialout 448 Jun 16 15:55 im
drwxr-xr-x 21 user_501 dialout 672 Jun 19 15:23 im_v2
drwxr-xr-x 4 user_501 dialout 128 Jun 16 15:55 infra
-rw-r--r-- 1 user_501 dialout 54 Jun 16 15:55 invoke.yaml
drwxr-xr-x 16 user_501 dialout 512 Jun 16 17:46 market_data
-rw-r--r-- 1 user_501 dialout 5279 Jun 16 15:55 mypy.ini
drwxr-xr-x 52 user_501 dialout 1664 Jun 19 17:33 oms
drwxr-xr-x 20 user_501 dialout 640 Jun 16 15:55 optimizer
drwxr-xr-x 3 user_501 dialout 96 Jun 19 16:21 outcomes
-rw-rw-r-- 1 user_501 dialout 0 Jun 19 17:14 pytest
-rw-r--r-- 1 user_501 dialout 850 Jun 16 15:55 pytest.ini
-rw-r--r-- 1 user_501 dialout 14198 Jun 16 15:55 repo_config.py
drwxr-xr-x 6 user_501 dialout 192 Jun 16 15:55 research_amp
-rw-r--r-- 1 user_501 dialout 1413 Jun 16 15:55 setup.py
drwxr-xr-x 9 user_501 dialout 288 Jun 16 15:55 sorrentum_sandbox
-rw-r--r-- 1 user_501 dialout 4641 Jun 16 15:55 tasks.py
drwxr-xr-x 3 user_501 dialout 96 Jun 16 15:55 test
drwxr-xr-x 3 user_501 dialout 96 Jun 16 16:08 tmp.cache.disk
-rw-r--r-- 1 user_501 dialout 21 Jun 19 16:21 tmp.exp_var.txt
-rw-r--r-- 1 user_501 dialout 10675 Jun 19 16:27 tmp.final.actual.txt
-rw-r--r-- 1 user_501 dialout 10675 Jun 19 16:27 tmp.final.expected.txt
-rw-r--r-- 1 user_501 dialout 399 Jun 19 16:23 tmp.parallel_execute.workload.txt
-rw-r--r-- 1 user_501 dialout 1623162 Jun 19 16:31 tmp.pytest.log
-rwxr-xr-x 1 user_501 dialout 310 Jun 19 16:27 tmp_diff.sh
Uhm, just looking at the output and extrapolating, IMO you must have created by mistake a `pytest` file in a directory on your `PATH` that shadows the actual executable.
user_501@5017361985b9:/app$ which pytest
/venv/bin/pytest
user_501@5017361985b9:/app$ ls -l $(which pytest)
-rwxr-xr-x 1 root root 221 Nov 15 2022 /venv/bin/pytest
while your `pytest` is in `/app` and it's created by you:
-rw-rw-r-- 1 user_501 dialout 0 Jun 19 17:14 pytest
I would just delete that `/app/pytest`, verify that the proper `pytest` in the venv is picked up, and then -> enjoy
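For anyone hitting the same thing: the first match on `PATH` wins, so a stray file named `pytest` in an earlier directory hides the real one in the venv. A minimal sketch with invented paths (the stray file is made executable here so the demo is deterministic):

```shell
# Show PATH resolution order with two throwaway `pytest` scripts.
tmpdir=$(mktemp -d)
mkdir "$tmpdir/app" "$tmpdir/venv_bin"
printf '#!/bin/sh\necho stray\n' > "$tmpdir/app/pytest"
printf '#!/bin/sh\necho venv\n'  > "$tmpdir/venv_bin/pytest"
chmod +x "$tmpdir/app/pytest" "$tmpdir/venv_bin/pytest"
export PATH="$tmpdir/app:$tmpdir/venv_bin:$PATH"

command -v pytest           # first match: .../app/pytest
rm "$tmpdir/app/pytest"     # remove the shadowing file, as with /app/pytest
command -v pytest           # now: .../venv_bin/pytest
```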
Great, I've deleted `/app/pytest` using:
rm -rf pytest
Now, when I run `which pytest` I get:
/venv/bin/pytest
however, when I run:
pytest core/statistics/test/test_requires_statsmodels.py::TestMultipleTests::test2 -s -dbg
I get:
bash: /app/pytest: No such file or directory
Looks like `/venv/bin/pytest` is not being added to my PATH variable; I'm assuming the Docker container has a PATH variable that I can just add this path to?
EDIT: I reinstalled pytest in the virtual environment, deactivated and reactivated it, and it seems that this fixed the problem.
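Side note: the intermediate `bash: /app/pytest: No such file or directory` error is consistent with bash's command hash table, which caches resolved command paths, so after deleting `/app/pytest` the shell can keep trying the stale path until the cache is flushed with `hash -r` or the shell is restarted (which deactivating and re-activating the venv effectively did). A minimal sketch with invented paths:

```shell
# Demonstrate bash's command hash cache with two throwaway `pytest` scripts.
tmpdir=$(mktemp -d)
mkdir "$tmpdir/shadow" "$tmpdir/real"
printf '#!/bin/sh\necho shadow\n' > "$tmpdir/shadow/pytest"
printf '#!/bin/sh\necho real\n'   > "$tmpdir/real/pytest"
chmod +x "$tmpdir/shadow/pytest" "$tmpdir/real/pytest"
export PATH="$tmpdir/shadow:$tmpdir/real:$PATH"

hash pytest                  # cache now points at .../shadow/pytest
rm "$tmpdir/shadow/pytest"   # delete it, as done with /app/pytest above
hash -r                      # flush the cache so the next lookup walks PATH again
command -v pytest            # resolves to .../real/pytest
```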
when running:
user_501@6778a6190af5:/app$ pytest oms/test/test_restrictions.py::TestRestrictions1::test2 -s --dbg
I receive the following output:
<jemalloc>: MADV_DONTNEED does not work (memset will be used instead)
<jemalloc>: (This is the expected behaviour if you are running under QEMU)
==================================================================================================================================================== test session starts =====================================================================================================================================================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /venv/bin/python
cachedir: .pytest_cache
rootdir: /app, configfile: pytest.ini
plugins: xdist-3.0.2, instafail-0.4.2, rerunfailures-10.2, anyio-3.6.2, cov-4.0.0, timeout-2.1.0
collecting 1 item -----------------------------------------------------------------------------
This code is not in sync with the container:
code_version='1.4.3' != container_version='1.4.0'
-----------------------------------------------------------------------------
You need to:
- merge origin/master into your branch with `invoke git_merge_master`
- pull the latest container with `invoke docker_pull`
# Git
branch_name='CmTask283_Create_test_list_to_run_with_Sorrentum'
hash='80fd8710c'
# Last commits:
* 80fd8710c GP Saggese Update doc ( 2 days ago) Mon Jun 19 11:37:21 2023 (HEAD -> CmTask283_Create_test_list_to_run_with_Sorrentum, origin/master, origin/SorrTask332_Convert_Code_Review_gDoc_to_Markdown, origin/HEAD, master)
* 77f612f75 Yiyun Lei SorrIssue244 CCXT timestamp representation unit test (#317) ( 2 days ago) Mon Jun 19 11:32:32 2023
* 7b381e68c KangmingL SorrTask270_Encryption_flow_for_models (#313) ( 4 days ago) Sat Jun 17 14:01:48 2023
# Machine info
system=Linux
node name=6778a6190af5
release=5.15.49-linuxkit-pr
version=#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023
machine=x86_64
processor=x86_64
cpu count=5
cpu freq=None
memory=svmem(total=8232833024, available=6073434112, percent=26.2, used=1444085760, free=1653313536, active=1435295744, inactive=4419215360, buffers=141549568, cached=4993884160, shared=507740160, slab=566292480)
disk usage=sdiskusage(total=62671097856, used=13541814272, free=45912567808, percent=22.8)
# Packages
python: 3.8.10
cvxopt: 1.3.0
cvxpy: 1.2.2
gluonnlp: ?
gluonts: 0.6.7
joblib: 1.2.0
mxnet: 1.9.1
numpy: 1.23.4
pandas: 1.5.1
pyarrow: 10.0.0
scipy: 1.9.3
seaborn: 0.12.1
sklearn: 1.1.3
statsmodels: 0.13.5
WARNING: Setting verbosity level to 5
INFO: > cmd='/venv/bin/pytest oms/test/test_restrictions.py::TestRestrictions1::test2 -s --dbg'
report_memory_usage=False report_cpu_usage=False
INFO: Saving log to file 'tmp.pytest.log'
13:03:14 hdbg.py init_logger:987 Effective logging level=5
13:03:14 hlogging.py shutup_chatty_modules:198 Shut up 108 modules: matplotlib.lines, urllib3, matplotlib.axis, botocore.httpsession, botocore.paginate, botocore.auth, botocore.retries.special, matplotlib.axes._base, matplotlib._layoutgrid, matplotlib.image, matplotlib, matplotlib.category, boto3.resources.factory, botocore.history, asyncio, botocore.retries.adaptive, invoke, matplotlib.axes._axes, botocore.awsrequest, matplotlib.dates, botocore.endpoint, botocore.session, urllib3.connectionpool, matplotlib.colorbar, botocore.hooks, botocore.monitoring, boto3.resources, matplotlib.gridspec, matplotlib.dviread, botocore.discovery, matplotlib.font_manager, urllib3.util.retry, urllib3.response, boto3.resources.model, boto3, urllib3.util, matplotlib.style.core, botocore.credentials, botocore.retries, matplotlib.text, botocore.handlers, matplotlib.artist, boto3.resources.action, botocore.retries.standard, botocore.compat, botocore, botocore.waiter, matplotlib.backend_bases, matplotlib.textpath, s3fs, matplotlib._constrained_layout, botocore.utils, matplotlib.ticker, botocore.loaders, botocore.args, matplotlib.style, botocore.client, urllib3.poolmanager, boto3.resources.collection, botocore.parsers, botocore.retryhandler, matplotlib._afm, matplotlib.mathtext, botocore.response, boto3.resources.base, fsspec, matplotlib.pyplot, matplotlib.figure, botocore.regions, botocore.configprovider, urllib3.connection, matplotlib.texmanager, matplotlib.axes
collected 1 item
oms/test/test_restrictions.py::TestRestrictions1::test2 13:03:14 - INFO hsql_test.py setUpClass:74
################################################################################
setUpClass
################################################################################
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hserver.py is_dev4:83 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_dev4:88 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hsystem.py _system:223 > (git rev-parse --show-toplevel) 2>&1
13:03:14 hsystem.py _system:223 > (cd /app; (git remote -v | grep origin | grep fetch)) 2>&1
13:03:14 hgit.py get_repo_full_name_from_dirname:495 data=['origin', 'git@github.com:sorrentum/sorrentum.git', '(fetch)']
13:03:14 hgit.py _parse_github_repo_name:462 host_name=github.com repo_name=sorrentum/sorrentum.git
13:03:14 hgit.py get_amp_abs_path:738 repo_sym_name=sorrentum/sorrentum
13:03:14 hsystem.py _system:223 > (git rev-parse --show-toplevel) 2>&1
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hserver.py is_dev4:83 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_dev4:88 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hserver.py is_dev4:83 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_dev4:88 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:14 hserver.py is_mac:101 version=Monterey
13:03:14 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:14 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:14 hserver.py is_mac:137 macos_tag=21.
13:03:14 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:14 hserver.py is_mac:148 is_mac_=True
13:03:14 hsql_test.py setUpClass:79 connection_info=DbConnectionInfo(host='oms_postgres5450', dbname='oms_postgres_db_local', port=5432, user='aljsdalsd', password='alsdkqoen')
13:03:14 hsql_implementation.py get_connection:55 host='oms_postgres5450', dbname='oms_postgres_db_local', port=5432, user='aljsdalsd'
13:03:14 hsql_test.py setUpClass:98 cmd=sudo docker-compose --file /app/oms/devops/compose/docker-compose_5450.yml --env-file /app/oms/devops/env/local.oms_db_config_5450.env up -d oms_postgres5450
13:03:14 hsystem.py _system:223 > (sudo docker-compose --file /app/oms/devops/compose/docker-compose_5450.yml --env-file /app/oms/devops/env/local.oms_db_config_5450.env up -d oms_postgres5450) 2>&1
... Volume "compose_oms_postgres5450_data" Creating
... Volume "compose_oms_postgres5450_data" Created
... time="2023-06-21T17:03:15Z" level=warning msg="Found orphan containers ([jandro.cmamp.app.sorrentum1.20230621_124828 compose-im_postgres4960-1 compose-im_postgres762-1 compose-im_postgres7402-1 compose-im_postgres4171-1 compose-im_postgres1489-1 compose-oms_postgres1102-1 compose-oms_postgres8837-1 compose-oms_postgres4068-1 compose-oms_postgres7884-1 compose-oms_postgres7684-1 compose-oms_postgres3113-1 compose-oms_postgres1081-1 compose-oms_postgres5019-1 compose-oms_postgres2901-1 compose-oms_postgres2023-1 jandro.cmamp.jupyter_server.sorrentum1.20230616_132944]) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up."
... Container compose-oms_postgres5450-1 Creating
... Container compose-oms_postgres5450-1 Created
... Container compose-oms_postgres5450-1 Starting
... Container compose-oms_postgres5450-1 Started
13:03:15 hsql_implementation.py wait_db_connection:200 dbname=oms_postgres_db_local, port=5432, host=oms_postgres5450
13:03:15 - INFO hsql_implementation.py wait_db_connection:203 Waiting for PostgreSQL to become available...
13:03:15 hsql_implementation.py get_connection:55 host='oms_postgres5450', dbname='oms_postgres_db_local', port=5432, user='aljsdalsd'
13:03:16 - INFO hsql_implementation.py wait_db_connection:203 Waiting for PostgreSQL to become available...
13:03:16 hsql_implementation.py get_connection:55 host='oms_postgres5450', dbname='oms_postgres_db_local', port=5432, user='aljsdalsd'
13:03:16 - INFO hsql_implementation.py wait_db_connection:206 PostgreSQL is available (after 1 seconds)
13:03:16 hsql_implementation.py get_connection:55 host='oms_postgres5450', dbname='oms_postgres_db_local', port=5432, user='aljsdalsd'
13:03:16 - INFO hunit_test.py setUp:1042
################################################################################
TestRestrictions1.test2
################################################################################
13:03:16 hwall_clock_time.py reset_current_bar_timestamp:92 Reset
13:03:16 oms_db.py create_restrictions_table:245 db_connection=<connection object at 0x40af872cc0; dsn: 'user=aljsdalsd password=xxx dbname=oms_postgres_db_local host=oms_postgres5450 port=5432', closed: 0>, incremental=False, asset_id_name='asset_id', table_name='restrictions'
13:03:16 oms_db.py create_restrictions_table:269 query=DROP TABLE IF EXISTS restrictions;
CREATE TABLE IF NOT EXISTS restrictions (
strategyid VARCHAR(64),
account VARCHAR(64),
id INT,
tradedate DATE NOT NULL,
timestamp_db TIMESTAMP NOT NULL,
asset_id INT,
is_restricted BOOL,
is_buy_restricted BOOL,
is_buy_cover_restricted BOOL,
is_sell_short_restricted BOOL,
is_sell_long_restricted BOOL
);
13:03:16 hsql_implementation.py execute_insert_query:718 df=
strategyid account id tradedate timestamp_db asset_id is_restricted is_buy_restricted is_buy_cover_restricted is_sell_short_restricted is_sell_long_restricted
0 SAU1 paper 0 2000-01-01 2000-01-01 21:38:39.419536 101 TRUE TRUE TRUE TRUE TRUE
13:03:16 hsql_implementation.py create_insert_query:667 query=INSERT INTO restrictions(strategyid,account,id,tradedate,timestamp_db,asset_id,is_restricted,is_buy_restricted,is_buy_cover_restricted,is_sell_short_restricted,is_sell_long_restricted) VALUES %s
13:03:16 hsql_implementation.py execute_insert_query:718 df=
strategyid account id tradedate timestamp_db asset_id is_restricted is_buy_restricted is_buy_cover_restricted is_sell_short_restricted is_sell_long_restricted
0 SAU1 paper 0 2000-01-01 2000-01-01 21:38:38.419536 201 TRUE FALSE FALSE FALSE FALSE
13:03:16 hsql_implementation.py create_insert_query:667 query=INSERT INTO restrictions(strategyid,account,id,tradedate,timestamp_db,asset_id,is_restricted,is_buy_restricted,is_buy_cover_restricted,is_sell_short_restricted,is_sell_long_restricted) VALUES %s
13:03:16 hsql_implementation.py execute_insert_query:718 df=
strategyid account id tradedate timestamp_db asset_id is_restricted is_buy_restricted is_buy_cover_restricted is_sell_short_restricted is_sell_long_restricted
0 SAU1 paper 0 1999-12-31 1999-12-31 21:38:56.12345 101 TRUE TRUE TRUE TRUE TRUE
13:03:16 hsql_implementation.py create_insert_query:667 query=INSERT INTO restrictions(strategyid,account,id,tradedate,timestamp_db,asset_id,is_restricted,is_buy_restricted,is_buy_cover_restricted,is_sell_short_restricted,is_sell_long_restricted) VALUES %s
13:03:16 restrictions.py _get_trading_restrictions:68 wall_clock_timestamp=2000-01-01 22:00:00.123450
13:03:16 restrictions.py _get_trading_restrictions:77 query=SELECT * FROM restrictions
WHERE tradedate='2000-01-01' AND account='paper'
ORDER BY asset_id
13:03:17 restrictions.py _get_trading_restrictions:83 restrictions_df=
strategyid account id tradedate timestamp_db asset_id is_restricted is_buy_restricted is_buy_cover_restricted is_sell_short_restricted is_sell_long_restricted
0 SAU1 paper 0 2000-01-01 2000-01-01 21:38:39.419536 101 True True True True True
1 SAU1 paper 0 2000-01-01 2000-01-01 21:38:38.419536 201 True False False False False
/app/helpers/hunit_test.py:768: FutureWarning: column_space is deprecated and will be removed in a future version. Use df.to_string(col_space=...) instead.
pd.set_option(full_key, new_val)
/app/helpers/hsql_implementation.py:582: UserWarning: pandas only supports SQLAlchemy connectable (engine/connection) or database string URI or sqlite3 DBAPI2 connection. Other DBAPI2 objects are not tested. Please consider using SQLAlchemy.
df = pd.read_sql_query(query, connection)
(0.51 s) PASSED13:03:17 - INFO hsql_test.py tearDownClass:112
################################################################################
tearDown
################################################################################
13:03:17 hserver.py is_dev4:83 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:17 hserver.py is_dev4:88 host_name=6778a6190af5 am_host_name=alejandros-MacBook-Pro.local
13:03:17 hserver.py is_mac:101 version=Monterey
13:03:17 hserver.py is_mac:103 os.uname()=posix.uname_result(sysname='Linux', nodename='6778a6190af5', release='5.15.49-linuxkit-pr', version='#1 SMP PREEMPT Thu May 25 07:27:39 UTC 2023', machine='x86_64')
13:03:17 hserver.py is_mac:105 host_os_name=Linux am_host_os_name=Darwin
13:03:17 hserver.py is_mac:137 macos_tag=21.
13:03:17 hserver.py is_mac:142 host_os_version=5.15.49-linuxkit-pr am_host_os_version=21.6.0
13:03:17 hserver.py is_mac:148 is_mac_=True
13:03:17 hdocker.py container_rm:17 container_name='compose-oms_postgres5450-1'
13:03:17 hsystem.py _system:223 > (docker container ls --filter name=/compose-oms_postgres5450-1 -aq) 2>&1
13:03:17 - ERROR hsystem.py _system:272
################################################################################
cmd='(docker container ls --filter name=/compose-oms_postgres5450-1 -aq) 2>&1' failed with rc='1'
################################################################################
Output of the failing command is:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.24/containers/json?all=1&filters=%7B%22name%22%3A%7B%22%2Fcompose-oms_postgres5450-1%22%3Atrue%7D%7D": dial unix /var/run/docker.sock: connect: permission denied
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
oms/test/test_restrictions.py::TestRestrictions1::test2 ERROR
=========================================================================================================================================================== ERRORS ===========================================================================================================================================================
________________________________________________________________________________________________________________________________________ ERROR at teardown of TestRestrictions1.test2 ________________________________________________________________________________________________________________________________________
Traceback (most recent call last):
File "/app/helpers/hsql_test.py", line 124, in tearDownClass
hdocker.container_rm(container_name)
File "/app/helpers/hdocker.py", line 21, in container_rm
_, container_id = hsystem.system_to_one_line(cmd)
File "/app/helpers/hsystem.py", line 400, in system_to_one_line
rc, output = system_to_string(cmd, *args, **kwargs)
File "/app/helpers/hsystem.py", line 343, in system_to_string
rc, output = _system(
File "/app/helpers/hsystem.py", line 276, in _system
raise RuntimeError(
RuntimeError: cmd='(docker container ls --filter name=/compose-oms_postgres5450-1 -aq) 2>&1' failed with rc='1'
truncated output=
Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.24/containers/json?all=1&filters=%7B%22name%22%3A%7B%22%2Fcompose-oms_postgres5450-1%22%3Atrue%7D%7D": dial unix /var/run/docker.sock: connect: permission denied
==================================================================================================================================================== slowest 3 durations =====================================================================================================================================================
2.50s setup oms/test/test_restrictions.py::TestRestrictions1::test2
0.53s call oms/test/test_restrictions.py::TestRestrictions1::test2
0.44s teardown oms/test/test_restrictions.py::TestRestrictions1::test2
================================================================================================================================================== short test summary info ===================================================================================================================================================
ERROR oms/test/test_restrictions.py::TestRestrictions1::test2 - RuntimeError: cmd='(docker container ls --filter name=/compose-oms_postgres5450-1 -aq) 2>&1' failed with rc='1'
================================================================================================================================================ 1 passed, 1 error in 15.02s =================================================================================================================================================
13:03:17 hsystem.py _system:223 > (du -d 0 /mnt/tmpfs/tmp.cache.mem | awk '{print $1}') 2>&1
13:03:17 - INFO hcache.py clear_global_cache:292 Before clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=32.0 KB
13:03:17 - WARN hcache.py clear_global_cache:293 Resetting 'global mem' cache '/mnt/tmpfs/tmp.cache.mem'
13:03:17 - WARN hcache.py clear_global_cache:303 Destroying '/mnt/tmpfs/tmp.cache.mem' ...
13:03:17 - INFO hcache.py clear_global_cache:319 After clear_global_cache: 'global mem' cache: path='/mnt/tmpfs/tmp.cache.mem', size=nan
Is this the currently expected behavior for running this test locally (and therefore this test should be marked as requires ....), or should a collaborator be allowed to access the Docker daemon socket (indicating some error in my container setup)?
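Tying this back to the goal of the issue: tests like this could be auto-skipped via a marker plus a small conftest hook when the Docker daemon is unreachable. Everything below is a hypothetical sketch (the marker name `requires_docker_daemon` and the socket path are illustrative, not this repo's conventions):

```python
# conftest.py-style sketch: skip marker-tagged tests when the Docker daemon
# socket is not accessible to the current user.
import os

import pytest

DOCKER_SOCK = "/var/run/docker.sock"  # typical default; varies by setup


def docker_daemon_accessible(path=DOCKER_SOCK):
    # Cheap proxy: the socket exists and we have read/write permission on it.
    return os.path.exists(path) and os.access(path, os.R_OK | os.W_OK)


def pytest_collection_modifyitems(config, items):
    # Runs at collection time; adds a skip marker to affected tests.
    if docker_daemon_accessible():
        return
    skip = pytest.mark.skip(reason="needs access to the Docker daemon socket")
    for item in items:
        if "requires_docker_daemon" in item.keywords:
            item.add_marker(skip)
```

The marker would also be declared under `markers =` in `pytest.ini`, and contributors could additionally deselect explicitly with `pytest -m "not requires_docker_daemon"`.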
I suspect you will get a similar permission error when attempting (from inside the container):
user_1002@49b2db0f7bf9:/app$ docker run hello-world
correct?
Try to run:
user_1002@49b2db0f7bf9:/app$ printenv | grep DIND
It should give you `AM_ENABLE_DIND=1`.
you are correct, if I run:
user_501@357516e7a236:/app$ docker run hello-world
I obtain:
docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post "http://%2Fvar%2Frun%2Fdocker.sock/v1.24/containers/create": dial unix /var/run/docker.sock: connect: permission denied.
See 'docker run --help'.
but if I enter:
user_501@357516e7a236:/app$ sudo docker run hello-world
I get:
Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
1. The Docker client contacted the Docker daemon.
2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
(arm64v8)
3. The Docker daemon created a new container from that image which runs the
executable that produces the output you are currently reading.
4. The Docker daemon streamed that output to the Docker client, which sent it
to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
$ docker run -it ubuntu bash
Share images, automate workflows, and more with a free Docker ID:
https://hub.docker.com/
For more examples and ideas, visit:
https://docs.docker.com/get-started/
So this is a problem with sudo privileges for my Docker user.
As per your request, when I run:
printenv | grep DIND
I get:
AM_ENABLE_DIND=0
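The pattern above (`sudo` works, the plain user gets permission denied) usually means the current user is not in the group that owns the Docker daemon socket. A quick diagnosis, hedged since the socket path and group name vary by setup:

```shell
# Inspect socket ownership and the current user's groups; the fallback keeps
# this runnable in environments without a Docker socket.
ls -l /var/run/docker.sock 2>/dev/null || echo "no docker socket in this environment"
id -Gn   # to use the socket without sudo, one of these groups normally has
         # to match the socket's group owner (often `docker`)
```

On a plain Linux host the common fix is `sudo usermod -aG docker $USER` plus a re-login; inside this repo's container the uid/gid mapping mentioned earlier in the thread is the more likely culprit.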
On a wild goose chase here, but can you try to run the same `printenv | grep DIND` outside of the container?
Sure, I get no output when I exit the container and run:
printenv | grep DIND
Can you run `docker run hello-world` outside of the container (without sudo)?
Input:
(amp.client_venv) (base) alejandros-MacBook-Pro:sorrentum1 jandro$ docker run hello-world
Output:
Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
1. The Docker client contacted the Docker daemon.
2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
(arm64v8)
3. The Docker daemon created a new container from that image which runs the
executable that produces the output you are currently reading.
4. The Docker daemon streamed that output to the Docker client, which sent it
to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
$ docker run -it ubuntu bash
Share images, automate workflows, and more with a free Docker ID:
https://hub.docker.com/
For more examples and ideas, visit:
https://docs.docker.com/get-started/
Input:
(amp.client_venv) (base) alejandros-MacBook-Pro:sorrentum1 jandro$ git status
Output:
On branch CmTask283_Create_test_list_to_run_with_Sorrentum
Changes to be committed:
(use "git restore --staged <file>..." to unstage)
modified: helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal3/output/test_df.txt
modified: helpers/test/outcomes/TestCheckString1.test_check_string1/output/test.txt
modified: helpers/test/outcomes/TestCheckString1.test_check_string_not_equal1/output/test.txt
modified: helpers/test/outcomes/TestCheckString1.test_check_string_not_equal2/output/test.txt
modified: helpers/test/outcomes/TestCheckString1.test_check_string_not_equal3/output/test.txt
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: market_data/im_client_market_data.py
modified: market_data/notebooks/gallery_market_data.ipynb
Untracked files:
(use "git add <file>..." to include in what will be committed)
actual.txt
dataflow/core/nodes/test/outcomes/TestMultiindexVolatilityModel.test1/tmp.final.actual.txt
dataflow/core/nodes/test/outcomes/TestMultiindexVolatilityModel.test1/tmp.final.expected.txt
dataflow/core/nodes/test/outcomes/TestMultiindexVolatilityModel.test2/tmp.final.actual.txt
dataflow/core/nodes/test/outcomes/TestMultiindexVolatilityModel.test2/tmp.final.expected.txt
devops/compose/docker-compose.yml
expected.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_missing2/
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal1/tmp.final.actual.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal1/tmp.final.expected.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal2/tmp.final.actual.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal2/tmp.final.expected.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal4/tmp.final.actual.txt
helpers/test/outcomes/TestCheckDataFrame1.test_check_df_not_equal4/tmp.final.expected.txt
helpers/test/outcomes/TestCheckString1.test_check_string_missing2/
helpers/test/outcomes/TestCheckString1.test_check_string_not_equal1/tmp.final.actual.txt
helpers/test/outcomes/TestCheckString1.test_check_string_not_equal1/tmp.final.expected.txt
helpers/test/outcomes/TestCheckString1.test_check_string_not_equal3/tmp.final.actual.txt
helpers/test/outcomes/TestCheckString1.test_check_string_not_equal3/tmp.final.expected.txt
helpers/test/outcomes/TestTestCase1.test_assert_equal5/
helpers/test/outcomes/TestTestCase1.test_assert_not_equal1/
helpers/test/outcomes/TestTestCase1.test_assert_not_equal2/
helpers/test/outcomes/Test_purify_from_env_vars.test2/
helpers/test/outcomes/Test_purify_from_env_vars.test_end_to_end/tmp.final.actual.txt
helpers/test/outcomes/Test_purify_from_env_vars.test_end_to_end/tmp.final.expected.txt
im_v2/common/test/generate_pq_test_data.py.log
im_v2/devops/compose/docker-compose_1489.yml
im_v2/devops/compose/docker-compose_4171.yml
im_v2/devops/compose/docker-compose_4960.yml
im_v2/devops/compose/docker-compose_7402.yml
im_v2/devops/compose/docker-compose_762.yml
im_v2/devops/env/local.im_db_config_1489.env
im_v2/devops/env/local.im_db_config_4171.env
im_v2/devops/env/local.im_db_config_4960.env
im_v2/devops/env/local.im_db_config_7402.env
im_v2/devops/env/local.im_db_config_762.env
market_data/im_client_market_data.ipynb
oms/devops/compose/docker-compose_1081.yml
oms/devops/compose/docker-compose_1102.yml
oms/devops/compose/docker-compose_2023.yml
oms/devops/compose/docker-compose_2901.yml
oms/devops/compose/docker-compose_3113.yml
oms/devops/compose/docker-compose_4068.yml
oms/devops/compose/docker-compose_5019.yml
oms/devops/compose/docker-compose_5450.yml
oms/devops/compose/docker-compose_7684.yml
oms/devops/compose/docker-compose_7884.yml
oms/devops/compose/docker-compose_8837.yml
oms/devops/env/local.oms_db_config_1081.env
oms/devops/env/local.oms_db_config_1102.env
oms/devops/env/local.oms_db_config_2023.env
oms/devops/env/local.oms_db_config_2901.env
oms/devops/env/local.oms_db_config_3113.env
oms/devops/env/local.oms_db_config_4068.env
oms/devops/env/local.oms_db_config_5019.env
oms/devops/env/local.oms_db_config_5450.env
oms/devops/env/local.oms_db_config_7684.env
oms/devops/env/local.oms_db_config_7884.env
oms/devops/env/local.oms_db_config_8837.env
tmp.exp_var.txt
tmp.final.actual.txt
tmp.final.expected.txt
tmp.parallel_execute.workload.txt
tmp.pytest.log
tmp_diff.sh
We support two ways of running Docker-based tests (sibling containers and docker-in-docker), and macOS has been changing what it allows out of the box over time. IMO the problem you are seeing is due to granting the docker group permission to access the socket.
I would actually use this issue as a way of finding all the tests that need Docker support and putting them in a test list. I can see that for some users of Sorrentum it may not be useful or necessary to run those tests.
In this way you can keep moving and then we can debug / document what's the problem with your laptop.
Because of all this, we will set you up on the server where we have solved all these issues once and for all instead of fighting with each laptop.
So in short, mark those tests as "need Docker" and skip them. Makes sense?
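A minimal sketch (not the repo's actual config) of how registering markers in `pytest.ini` and deselecting with `pytest -m` work together: the marker names (`requires_docker`, `no_aws`) and the test file are hypothetical, and the whole thing runs in a temp dir via a subprocess just to show the deselection.

```python
# Sketch: register hypothetical markers in pytest.ini, tag one test,
# and deselect it with `pytest -m "not requires_docker"`.
import pathlib
import subprocess
import sys
import tempfile
import textwrap

TEST_SRC = textwrap.dedent('''
    import pytest

    @pytest.mark.requires_docker  # hypothetical marker name
    def test_needs_docker():
        assert True

    def test_runs_anywhere():
        assert True
''')

INI_SRC = textwrap.dedent('''
    [pytest]
    markers =
        requires_docker: test needs a running Docker daemon
        no_aws: test needs access to AWS infra
''')

with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp)
    (root / "test_sample.py").write_text(TEST_SRC)
    (root / "pytest.ini").write_text(INI_SRC)
    # Deselect every test tagged with the marker.
    result = subprocess.run(
        [sys.executable, "-m", "pytest", "-q", "-m", "not requires_docker", tmp],
        capture_output=True,
        text=True,
    )

print(result.stdout)
```

With the "negative" marker approach from point 3, the default `pytest` invocation would instead run everything and contributors would only opt out of what their laptop cannot support.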
Hi all, curious whether this error indicates that the corresponding test requires Docker.
I first activate the virtual environment (not in a Docker container) and then run:
pytest helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2 -s --dbg
I receive the following output:
========================================================================== FAILURES ==========================================================================
______________________________________________________________ Test_purify_from_env_vars.test2 _______________________________________________________________
Traceback (most recent call last):
File "/Users/jandro/src/sorrentum1/helpers/hunit_test.py", line 1070, in setUp
set_pd_default_values()
File "/Users/jandro/src/sorrentum1/helpers/hunit_test.py", line 759, in set_pd_default_values
old_val = pd.get_option(full_key)
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/pandas/_config/config.py", line 261, in __call__
return self.__func__(*args, **kwds)
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/pandas/_config/config.py", line 135, in _get_option
key = _get_single_key(pat, silent)
File "/Users/jandro/src/venv/amp.client_venv/lib/python3.9/site-packages/pandas/_config/config.py", line 121, in _get_single_key
raise OptionError(f"No such keys(s): {repr(pat)}")
pandas._config.config.OptionError: "No such keys(s): 'display.column_space'"
==================================================================== slowest 3 durations =====================================================================
0.00s call helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2
0.00s setup helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2
0.00s teardown helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2
================================================================== short test summary info ===================================================================
FAILED helpers/test/test_unit_test.py::Test_purify_from_env_vars::test2 - pandas._config.config.OptionError: "No such keys(s): 'display.column_space'"
Looks like this is some pandas config error, but I'm unsure whether it is related to the Docker container.
I just pushed a commit to the branch detailing which functions fail because of imported modules that private laptops cannot access without Docker. They are labeled with the marker "requires_docker".
The error above is because you are running outside the container and so you are picking up a version of pandas that is not the one that the tests assume.
In general we assume that pytest and all the commands that run our executables are issued inside the container.
There could be tests that don't require the container but in practice we just assume that everything runs inside the container.
Makes sense?
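For context on the traceback above: `display.column_space` was removed in recent pandas versions, so a helper written against an older pandas blows up when the option no longer exists. A defensive sketch (an assumption about how one might harden such a helper, not the repo's actual fix) guards option writes against `OptionError`:

```python
# Sketch: set a pandas option only if this pandas version still has it,
# instead of letting a removed option (e.g. display.column_space) raise.
import pandas as pd


def set_pd_option_if_exists(key: str, value) -> bool:
    """Set a pandas option; return False if this pandas version lacks it."""
    try:
        pd.set_option(key, value)
        return True
    except pd.errors.OptionError:
        return False


# A real option is set; a removed/unknown one is silently skipped.
ok1 = set_pd_option_if_exists("display.max_rows", 500)
ok2 = set_pd_option_if_exists("display.column_space", 12)
print(ok1, ok2)
```

Inside the container this is moot, since the pinned pandas version matches what the tests assume.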
Makes sense, thanks for clarifying. If we want to label the test function or class responsible for this error, what would be a good label? I used a very basic "requires_docker" pytest marker, but I don't think it is very helpful, given that we assume all tests are run from within the container.
Can you link the PR to this issue? There should be a doc about how to name a PR and how to link it; you can grep for it in docs.
From https://github.com/sorrentum/sorrentum/issues/189#issuecomment-1563426753
Contributors can use repos outside our infra on their laptop, thus some tests might not work (e.g., if there is a dependency on AWS).
We want to mark unit tests based on what kind of support is needed (e.g., no_aws, ...), then contributors can just run our pytest flow, skipping all the tests that are not expected to work outside our infra.
Assigning to @samarth9008 as current master of outsourcing. We can do a quick PR to get the skeleton in place. @PomazkinG and @jsmerix can help.
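One way to get the skeleton in place so that a plain `pytest` run "just works" on a contributor's laptop: a `conftest.py` hook that auto-skips tests tagged with a marker when the required infra is absent. This is a hedged sketch under assumptions — the marker name `requires_aws`, the test file, and using `AWS_ACCESS_KEY_ID` as the credentials probe are all hypothetical — demonstrated end-to-end in a temp dir.

```python
# Sketch: a conftest.py hook that skips `requires_aws`-marked tests
# when no AWS credentials are configured, so the default run passes.
import os
import pathlib
import subprocess
import sys
import tempfile
import textwrap

CONFTEST_SRC = textwrap.dedent('''
    import os
    import pytest

    def pytest_collection_modifyitems(config, items):
        # If AWS is configured, run everything as-is.
        if os.environ.get("AWS_ACCESS_KEY_ID"):
            return
        skip = pytest.mark.skip(reason="AWS credentials not configured")
        for item in items:
            if "requires_aws" in item.keywords:
                item.add_marker(skip)
''')

TEST_SRC = textwrap.dedent('''
    import pytest

    @pytest.mark.requires_aws
    def test_reads_s3():
        assert True  # placeholder for a test that would hit AWS

    def test_local_only():
        assert True
''')

INI_SRC = "[pytest]\nmarkers =\n    requires_aws: test needs AWS access\n"

with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp)
    (root / "conftest.py").write_text(CONFTEST_SRC)
    (root / "test_sample.py").write_text(TEST_SRC)
    (root / "pytest.ini").write_text(INI_SRC)
    # Simulate a contributor laptop: no AWS credentials in the env.
    env = {k: v for k, v in os.environ.items() if k != "AWS_ACCESS_KEY_ID"}
    result = subprocess.run(
        [sys.executable, "-m", "pytest", "-q", tmp],
        capture_output=True,
        text=True,
        env=env,
    )

print(result.stdout)
```

Compared to a bare `-m "not requires_aws"` invocation, this keeps the command line simple (point 4 above) because the skip condition lives in `conftest.py` rather than in every contributor's shell history.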