Closed jackbaker1001 closed 1 year ago
Thank you for this demo @jackbaker1001!
Thank you very much for your work @jackbaker1001! 😄 I am trying to replicate the demo on my computer but can't get it to work: the first cell of imports fails. Maybe requirements.txt needs to be updated, or maybe there is a Mac-specific issue?
No problem! Thanks for taking the time to look through. Which imports are not working?
Did the conda environment install properly?
I use a Mac too.
Apparently the environment is created, but the following error appears during installation:
Pip subprocess error:
Running command git clone --filter=blob:none --quiet https://github.com/gngdb/pytorch-minimize.git /private/var/folders/dr/kq8mcfmd4bx081whqn42ct8h0000gq/T/pip-install-kh09o3sa/pytorch-minimize_9bceb13424a741f4ab2944a55b0e4686
Running command git rev-parse -q --verify 'sha^7cd5e1ffa79d77ab90935b3d4bde4ac7387d49ba'
Running command git fetch -q https://github.com/gngdb/pytorch-minimize.git 7cd5e1ffa79d77ab90935b3d4bde4ac7387d49ba
ERROR: Cannot install -r /Users/guillermo/Documents/GitHub/QuantumVariationalRewinding/requirements.txt (line 21) and psutil==5.9.3 because these package versions have conflicting dependencies.
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
failed
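`ResolutionImpossible` errors like the one above usually mean two pins in the requirement set disagree — here, the direct pin `psutil==5.9.3` clashes with a different `psutil` version required by another dependency. As a rough illustration of the kind of check pip's resolver performs, here is a minimal stdlib sketch; the package names and versions are illustrative, not the actual contents of the repo's requirements.txt:

```python
def find_conflicts(pins):
    """Given (package, version) pins, return packages pinned to more
    than one version -- the situation that makes resolution impossible."""
    seen = {}
    conflicts = {}
    for name, version in pins:
        key = name.lower()
        if key in seen and seen[key] != version:
            conflicts.setdefault(key, {seen[key]}).add(version)
        else:
            seen[key] = version
    return conflicts

# Two different psutil pins (e.g. one direct, one pulled in transitively)
# can never be satisfied at the same time:
pins = [("psutil", "5.9.3"), ("covalent", "0.202.0"), ("psutil", "5.9.2")]
conflicts = find_conflicts(pins)
```

Updating requirements.txt so that only one `psutil` version is pinned (or leaving it unpinned so the resolver can pick a compatible one) resolves this class of error.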
Hey @KetpuntoG @ikurecic, I updated the requirements and environment. Could you please try installing again? I tried the whole install on my Mac M1 and it works.
@KetpuntoG @ikurecic Hi again. Do we have any update on this? Thanks!
Hi @jackbaker1001 , we are still seeing issues running the demo on a Mac M1. I'll check on a different system before tomorrow.
Hi @jackbaker1001 , I tried running it on Windows, but that doesn't work — uvloop doesn't support Windows. Could you please check again? My colleague @CatalinaAlbornoz tried running it on a Mac this week and was unsuccessful. You can get in touch with her with updates.
@ikurecic I know that Covalent does not support Windows. I will ask another colleague to run the build instructions in the repo and see what happens.
@jackbaker1001 — Great, thank you! We're looking forward to hearing back. By the way, it might be helpful to include the Linux/macOS requirement in the demo readme.
@jackbaker1001 / @ikurecic, the installation seems to work for me on Mac following the instructions in the readme.
System details
Software:
System Software Overview:
System Version: macOS 13.0.1 (22A400)
Kernel Version: Darwin 22.1.0
Boot Volume: Macintosh HD
Boot Mode: Normal
Computer Name: --
User Name: --
Secure Virtual Memory: Enabled
System Integrity Protection: Enabled
(Attached are installation logs)
```console (base) ➜ QuantumVariationalRewinding git:(main) ls QVR_example.ipynb data images README.md environment.yml requirements.txt (base) ➜ QuantumVariationalRewinding git:(main) conda env create -f environment.yml Collecting package metadata (repodata.json): done Solving environment: done ==> WARNING: A newer version of conda exists. <== current version: 4.14.0 latest version: 22.11.1 Please update conda by running $ conda update -n base -c defaults conda Downloading and Extracting Packages openssl-1.1.1s | 3.1 MB | ##################################### | 100% pip-22.3.1 | 2.7 MB | ##################################### | 100% xz-5.2.8 | 259 KB | ##################################### | 100% ca-certificates-2022 | 125 KB | ##################################### | 100% libffi-3.4.2 | 115 KB | ##################################### | 100% zlib-1.2.13 | 82 KB | ##################################### | 100% readline-8.2 | 353 KB | ##################################### | 100% sqlite-3.40.0 | 1.1 MB | ##################################### | 100% certifi-2022.12.7 | 151 KB | ##################################### | 100% python-3.9.15 | 12.5 MB | ##################################### | 100% setuptools-65.5.0 | 1.1 MB | ##################################### | 100% tzdata-2022g | 114 KB | ##################################### | 100% Preparing transaction: done Verifying transaction: done Executing transaction: done Installing pip dependencies: - Ran pip subprocess with arguments: ['/opt/homebrew/Caskroom/miniconda/base/envs/QVR/bin/python', '-m', 'pip', 'install', '-U', '-r', '/Users/voldemort/Desktop/projects/QuantumVariationalRewinding/condaenv.79lm0ga0.requirements.txt'] Pip subprocess output: Collecting pytorch-minimize@ git+https://github.com/gngdb/pytorch-minimize.git@7cd5e1ffa79d77ab90935b3d4bde4ac7387d49ba Cloning https://github.com/gngdb/pytorch-minimize.git (to revision 7cd5e1ffa79d77ab90935b3d4bde4ac7387d49ba) to 
/private/var/folders/fd/q_nb_x31067f4kntrmgcykcr0000gn/T/pip-install-5rap75xp/pytorch-minimize_50f518d465e84f37a9b173dceecfd65d Resolved https://github.com/gngdb/pytorch-minimize.git to commit 7cd5e1ffa79d77ab90935b3d4bde4ac7387d49ba Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting aiofiles==22.1.0 Using cached aiofiles-22.1.0-py3-none-any.whl (14 kB) Collecting aiohttp==3.8.1 Downloading aiohttp-3.8.1-cp39-cp39-macosx_11_0_arm64.whl (552 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 552.3/552.3 kB 4.7 MB/s eta 0:00:00 Collecting aiosignal==1.3.1 Downloading aiosignal-1.3.1-py3-none-any.whl (7.6 kB) Collecting alembic==1.8.1 Downloading alembic-1.8.1-py3-none-any.whl (209 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 209.8/209.8 kB 15.4 MB/s eta 0:00:00 Collecting anyio==3.6.2 Downloading anyio-3.6.2-py3-none-any.whl (80 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 80.6/80.6 kB 7.3 MB/s eta 0:00:00 Collecting appdirs==1.4.4 Using cached appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB) Collecting appnope==0.1.3 Using cached appnope-0.1.3-py2.py3-none-any.whl (4.4 kB) Collecting asttokens==2.2.1 Downloading asttokens-2.2.1-py2.py3-none-any.whl (26 kB) Collecting async-timeout==4.0.2 Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB) Collecting attrs==22.1.0 Using cached attrs-22.1.0-py2.py3-none-any.whl (58 kB) Collecting autograd==1.5 Using cached autograd-1.5-py3-none-any.whl (48 kB) Collecting autoray==0.5.3 Downloading autoray-0.5.3-py3-none-any.whl (39 kB) Collecting backcall==0.2.0 Using cached backcall-0.2.0-py2.py3-none-any.whl (11 kB) Collecting bidict==0.22.0 Using cached bidict-0.22.0-py3-none-any.whl (36 kB) Collecting cachetools==5.2.0 Using cached cachetools-5.2.0-py3-none-any.whl (9.3 kB) Requirement already satisfied: certifi==2022.12.7 in /opt/homebrew/Caskroom/miniconda/base/envs/QVR/lib/python3.9/site-packages (from -r 
/Users/voldemort/Desktop/projects/QuantumVariationalRewinding/requirements.txt (line 16)) (2022.12.7) Collecting charset-normalizer==2.1.1 Using cached charset_normalizer-2.1.1-py3-none-any.whl (39 kB) Collecting click==8.1.3 Using cached click-8.1.3-py3-none-any.whl (96 kB) Collecting cloudpickle==2.2.0 Using cached cloudpickle-2.2.0-py3-none-any.whl (25 kB) Collecting comm==0.1.2 Downloading comm-0.1.2-py3-none-any.whl (6.5 kB) Collecting contourpy==1.0.6 Downloading contourpy-1.0.6-cp39-cp39-macosx_11_0_arm64.whl (226 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 226.0/226.0 kB 21.0 MB/s eta 0:00:00 Collecting covalent==0.202.0.post1 Using cached covalent-0.202.0.post1.tar.gz (3.9 MB) Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Collecting cycler==0.11.0 Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB) Collecting dask==2022.9.0 Using cached dask-2022.9.0-py3-none-any.whl (1.1 MB) Collecting debugpy==1.6.4 Downloading debugpy-1.6.4-py2.py3-none-any.whl (4.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.9/4.9 MB 42.0 MB/s eta 0:00:00 Collecting decorator==5.1.1 Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB) Collecting distributed==2022.9.0 Downloading distributed-2022.9.0-py3-none-any.whl (902 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 902.4/902.4 kB 41.4 MB/s eta 0:00:00 Collecting entrypoints==0.4 Using cached entrypoints-0.4-py3-none-any.whl (5.3 kB) Collecting executing==1.2.0 Downloading executing-1.2.0-py2.py3-none-any.whl (24 kB) Collecting fastapi==0.83.0 Downloading fastapi-0.83.0-py3-none-any.whl (55 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.3/55.3 kB 4.6 MB/s eta 0:00:00 Collecting fonttools==4.38.0 Downloading fonttools-4.38.0-py3-none-any.whl (965 kB) 
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 965.4/965.4 kB 34.0 MB/s eta 0:00:00 Collecting frozenlist==1.3.3 Downloading frozenlist-1.3.3-cp39-cp39-macosx_11_0_arm64.whl (35 kB) Collecting fsspec==2022.11.0 Downloading fsspec-2022.11.0-py3-none-any.whl (139 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 139.5/139.5 kB 11.3 MB/s eta 0:00:00 Collecting furl==2.1.3 Using cached furl-2.1.3-py2.py3-none-any.whl (20 kB) Collecting future==0.18.2 Using cached future-0.18.2.tar.gz (829 kB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting h11==0.14.0 Using cached h11-0.14.0-py3-none-any.whl (58 kB) Collecting HeapDict==1.0.1 Using cached HeapDict-1.0.1-py3-none-any.whl (3.9 kB) Collecting httptools==0.5.0 Downloading httptools-0.5.0-cp39-cp39-macosx_10_9_universal2.whl (231 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 231.5/231.5 kB 16.3 MB/s eta 0:00:00 Collecting idna==3.4 Using cached idna-3.4-py3-none-any.whl (61 kB) Collecting ipykernel==6.19.2 Downloading ipykernel-6.19.2-py3-none-any.whl (145 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 145.1/145.1 kB 11.2 MB/s eta 0:00:00 Collecting ipython==8.7.0 Downloading ipython-8.7.0-py3-none-any.whl (761 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 761.7/761.7 kB 35.8 MB/s eta 0:00:00 Collecting jedi==0.18.2 Downloading jedi-0.18.2-py2.py3-none-any.whl (1.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 44.5 MB/s eta 0:00:00 Collecting Jinja2==3.1.2 Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB) Collecting joblib==1.2.0 Using cached joblib-1.2.0-py3-none-any.whl (297 kB) Collecting jupyter_core==5.1.0 Downloading jupyter_core-5.1.0-py3-none-any.whl (92 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 92.7/92.7 kB 7.5 MB/s eta 0:00:00 Collecting kiwisolver==1.4.4 Downloading kiwisolver-1.4.4-cp39-cp39-macosx_11_0_arm64.whl (63 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.2/63.2 kB 5.8 MB/s eta 0:00:00 Collecting locket==1.0.0 Using cached 
locket-1.0.0-py2.py3-none-any.whl (4.4 kB) Collecting Mako==1.2.4 Downloading Mako-1.2.4-py3-none-any.whl (78 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 78.7/78.7 kB 5.1 MB/s eta 0:00:00 Collecting MarkupSafe==2.1.1 Downloading MarkupSafe-2.1.1-cp39-cp39-macosx_10_9_universal2.whl (17 kB) Collecting matplotlib==3.6.2 Downloading matplotlib-3.6.2-cp39-cp39-macosx_11_0_arm64.whl (7.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.2/7.2 MB 81.9 MB/s eta 0:00:00 Collecting matplotlib-inline==0.1.6 Downloading matplotlib_inline-0.1.6-py3-none-any.whl (9.4 kB) Collecting msgpack==1.0.4 Downloading msgpack-1.0.4-cp39-cp39-macosx_11_0_arm64.whl (69 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 69.8/69.8 kB 6.4 MB/s eta 0:00:00 Collecting multidict==6.0.3 Downloading multidict-6.0.3-cp39-cp39-macosx_11_0_arm64.whl (29 kB) Collecting nest-asyncio==1.5.6 Using cached nest_asyncio-1.5.6-py3-none-any.whl (5.2 kB) Collecting networkx==2.8.6 Using cached networkx-2.8.6-py3-none-any.whl (2.0 MB) Collecting ninja==1.11.1 Using cached ninja-1.11.1-py2.py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.macosx_11_0_universal2.whl (270 kB) Collecting numpy==1.23.5 Downloading numpy-1.23.5-cp39-cp39-macosx_11_0_arm64.whl (13.4 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.4/13.4 MB 76.2 MB/s eta 0:00:00 Collecting orderedmultidict==1.0.1 Using cached orderedmultidict-1.0.1-py2.py3-none-any.whl (11 kB) Collecting packaging==22.0 Downloading packaging-22.0-py3-none-any.whl (42 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.6/42.6 kB 3.4 MB/s eta 0:00:00 Collecting parso==0.8.3 Using cached parso-0.8.3-py2.py3-none-any.whl (100 kB) Collecting partd==1.3.0 Using cached partd-1.3.0-py3-none-any.whl (18 kB) Collecting PennyLane==0.27.0 Using cached PennyLane-0.27.0-py3-none-any.whl (1.1 MB) Collecting PennyLane-Lightning==0.27.0 Downloading PennyLane_Lightning-0.27.0-cp39-cp39-macosx_11_0_arm64.whl (897 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 897.7/897.7 kB 40.7 
MB/s eta 0:00:00 Collecting pexpect==4.8.0 Using cached pexpect-4.8.0-py2.py3-none-any.whl (59 kB) Collecting pickleshare==0.7.5 Using cached pickleshare-0.7.5-py2.py3-none-any.whl (6.9 kB) Collecting Pillow==9.3.0 Downloading Pillow-9.3.0-cp39-cp39-macosx_11_0_arm64.whl (2.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.9/2.9 MB 56.2 MB/s eta 0:00:00 Collecting platformdirs==2.6.0 Downloading platformdirs-2.6.0-py3-none-any.whl (14 kB) Collecting prompt-toolkit==3.0.36 Downloading prompt_toolkit-3.0.36-py3-none-any.whl (386 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 386.4/386.4 kB 27.9 MB/s eta 0:00:00 Collecting psutil==5.9.2 Downloading psutil-5.9.2.tar.gz (479 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 479.8/479.8 kB 25.5 MB/s eta 0:00:00 Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting ptyprocess==0.7.0 Using cached ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB) Collecting pure-eval==0.2.2 Using cached pure_eval-0.2.2-py3-none-any.whl (11 kB) Collecting pydantic==1.10.2 Downloading pydantic-1.10.2-cp39-cp39-macosx_11_0_arm64.whl (2.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.6/2.6 MB 54.4 MB/s eta 0:00:00 Collecting Pygments==2.13.0 Using cached Pygments-2.13.0-py3-none-any.whl (1.1 MB) Collecting pyparsing==3.0.9 Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB) Collecting python-dateutil==2.8.2 Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) Collecting python-dotenv==0.21.0 Using cached python_dotenv-0.21.0-py3-none-any.whl (18 kB) Collecting python-engineio==4.3.4 Using cached python_engineio-4.3.4-py3-none-any.whl (52 kB) Collecting python-socketio==5.7.1 Using cached python_socketio-5.7.1-py3-none-any.whl (56 kB) Collecting PyYAML==6.0 Downloading PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl (173 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 174.0/174.0 kB 10.2 MB/s eta 0:00:00 Collecting pyzmq==24.0.1 Downloading pyzmq-24.0.1-cp39-cp39-macosx_10_15_universal2.whl (1.8 
MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 50.7 MB/s eta 0:00:00 Collecting requests==2.28.1 Using cached requests-2.28.1-py3-none-any.whl (62 kB) Collecting retworkx==0.12.1 Downloading retworkx-0.12.1-py3-none-any.whl (10 kB) Collecting rustworkx==0.12.1 Downloading rustworkx-0.12.1-cp39-cp39-macosx_11_0_arm64.whl (1.4 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 52.5 MB/s eta 0:00:00 Collecting scikit-learn==1.2.0 Downloading scikit_learn-1.2.0-cp39-cp39-macosx_12_0_arm64.whl (8.3 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.3/8.3 MB 74.0 MB/s eta 0:00:00 Collecting scipy==1.9.3 Downloading scipy-1.9.3-cp39-cp39-macosx_12_0_arm64.whl (28.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 28.6/28.6 MB 64.0 MB/s eta 0:00:00 Collecting semantic-version==2.10.0 Using cached semantic_version-2.10.0-py2.py3-none-any.whl (15 kB) Collecting simplejson==3.17.6 Downloading simplejson-3.17.6-cp39-cp39-macosx_11_0_arm64.whl (73 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 73.6/73.6 kB 6.6 MB/s eta 0:00:00 Collecting six==1.16.0 Using cached six-1.16.0-py2.py3-none-any.whl (11 kB) Collecting sniffio==1.3.0 Using cached sniffio-1.3.0-py3-none-any.whl (10 kB) Collecting sortedcontainers==2.4.0 Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) Collecting SQLAlchemy==1.4.41 Using cached SQLAlchemy-1.4.41.tar.gz (8.3 MB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting stack-data==0.6.2 Downloading stack_data-0.6.2-py3-none-any.whl (24 kB) Collecting starlette==0.19.1 Using cached starlette-0.19.1-py3-none-any.whl (63 kB) Collecting tblib==1.7.0 Using cached tblib-1.7.0-py2.py3-none-any.whl (12 kB) Collecting threadpoolctl==3.1.0 Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB) Collecting toml==0.10.2 Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB) Collecting toolz==0.12.0 Using cached toolz-0.12.0-py3-none-any.whl (55 kB) Collecting torch==1.13.0 Downloading 
torch-1.13.0-cp39-none-macosx_11_0_arm64.whl (55.7 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.7/55.7 MB 41.8 MB/s eta 0:00:00 Collecting tornado==6.1 Using cached tornado-6.1.tar.gz (497 kB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting traitlets==5.7.1 Downloading traitlets-5.7.1-py3-none-any.whl (109 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 109.9/109.9 kB 12.4 MB/s eta 0:00:00 Collecting typing_extensions==4.4.0 Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB) Collecting urllib3==1.26.13 Using cached urllib3-1.26.13-py2.py3-none-any.whl (140 kB) Collecting uvicorn==0.18.3 Downloading uvicorn-0.18.3-py3-none-any.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.4/57.4 kB 4.7 MB/s eta 0:00:00 Collecting uvloop==0.17.0 Downloading uvloop-0.17.0-cp39-cp39-macosx_10_9_universal2.whl (2.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 MB 50.3 MB/s eta 0:00:00 Collecting watchfiles==0.18.1 Downloading watchfiles-0.18.1-cp37-abi3-macosx_11_0_arm64.whl (367 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 367.2/367.2 kB 21.7 MB/s eta 0:00:00 Collecting wcwidth==0.2.5 Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB) Collecting websockets==10.4 Downloading websockets-10.4-cp39-cp39-macosx_11_0_arm64.whl (97 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.9/97.9 kB 9.1 MB/s eta 0:00:00 Collecting Werkzeug==2.2.2 Downloading Werkzeug-2.2.2-py3-none-any.whl (232 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 232.7/232.7 kB 13.4 MB/s eta 0:00:00 Collecting yarl==1.8.2 Downloading yarl-1.8.2-cp39-cp39-macosx_11_0_arm64.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.8/57.8 kB 5.9 MB/s eta 0:00:00 Collecting zict==2.2.0 Using cached zict-2.2.0-py2.py3-none-any.whl (23 kB) Collecting jupyter-client>=6.1.12 Downloading jupyter_client-7.4.8-py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.5/133.5 kB 12.5 MB/s eta 0:00:00 Downloading 
jupyter_client-7.4.7-py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.5/133.5 kB 10.5 MB/s eta 0:00:00 Downloading jupyter_client-7.4.6-py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.5/133.5 kB 13.5 MB/s eta 0:00:00 Downloading jupyter_client-7.4.5-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.3/132.3 kB 12.6 MB/s eta 0:00:00 Downloading jupyter_client-7.4.4-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.3/132.3 kB 11.1 MB/s eta 0:00:00 Downloading jupyter_client-7.4.3-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.2/132.2 kB 12.5 MB/s eta 0:00:00 Downloading jupyter_client-7.4.2-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.2/132.2 kB 14.6 MB/s eta 0:00:00 Downloading jupyter_client-7.4.1-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.1/132.1 kB 14.1 MB/s eta 0:00:00 Downloading jupyter_client-7.4.0-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.2/132.2 kB 13.1 MB/s eta 0:00:00 Downloading jupyter_client-7.3.5-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.1/132.1 kB 11.9 MB/s eta 0:00:00 Using cached jupyter_client-7.3.4-py3-none-any.whl (132 kB) Building wheels for collected packages: covalent, future, psutil, SQLAlchemy, tornado, pytorch-minimize Building wheel for covalent (pyproject.toml): started Building wheel for covalent (pyproject.toml): finished with status 'done' Created wheel for covalent: filename=covalent-0.202.0.post1-py3-none-any.whl size=4015429 sha256=e32ad16c03fff180e9bd6e7d0f06a95d4fde0ecda0c423c0e108939b873b9a31 Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/94/1c/38/713413923a2258b8cfbec33d4b38d11a7fdc0a9efab7a41c28 Building wheel for future (setup.py): started Building wheel for future (setup.py): finished with status 'done' Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491058 
sha256=ea31f018bd243f6d898cb07be69431a36d738efd07530a2ea84be102036f8aa8 Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/96/66/19/2de75120f5d0bc185e9d16cf0fd223d8471ed025de08e45867 Building wheel for psutil (setup.py): started Building wheel for psutil (setup.py): finished with status 'done' Created wheel for psutil: filename=psutil-5.9.2-cp39-cp39-macosx_11_0_arm64.whl size=238047 sha256=b95af19ff20ed619ad84ff4171d29e8b39871ef946c319a7ff3d744f9cbcfb7a Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/9d/2b/89/974c668340dad3d4af94432c8b2bc11dbc3b37f25a6b909a70 Building wheel for SQLAlchemy (setup.py): started Building wheel for SQLAlchemy (setup.py): finished with status 'done' Created wheel for SQLAlchemy: filename=SQLAlchemy-1.4.41-cp39-cp39-macosx_11_0_arm64.whl size=1548114 sha256=8604557891e4b3b9280073a467ec3e530f5fe435882abb0c94e3330095a36e31 Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/13/a9/ed/708b588dce2840a4af2c4695d09774f418d3571f0bcb2e7aa7 Building wheel for tornado (setup.py): started Building wheel for tornado (setup.py): finished with status 'done' Created wheel for tornado: filename=tornado-6.1-cp39-cp39-macosx_11_0_arm64.whl size=416634 sha256=5dbe4b2f760f146620577d660e46895314dfc2182980bb01236f7e8e753dde88 Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/85/1a/d9/3168afb3db1b8097a8b116cca803ae9e22aa2e325d07ee5efe Building wheel for pytorch-minimize (setup.py): started Building wheel for pytorch-minimize (setup.py): finished with status 'done' Created wheel for pytorch-minimize: filename=pytorch_minimize-0.2.0-py2.py3-none-any.whl size=10278 sha256=c50dabe6989909005fd7754c171faca97a2a79c1859079e9874bc08a5d81c916 Stored in directory: /Users/voldemort/Library/Caches/pip/wheels/e6/57/fd/b294a0e23c4726e0c520d2776934a5e7e819f3b64af6c900df Successfully built covalent future psutil SQLAlchemy tornado pytorch-minimize Installing collected packages: wcwidth, sortedcontainers, pure-eval, 
ptyprocess, pickleshare, ninja, msgpack, HeapDict, executing, backcall, appnope, appdirs, zict, websockets, uvloop, urllib3, typing_extensions, traitlets, tornado, toolz, toml, threadpoolctl, tblib, SQLAlchemy, sniffio, six, simplejson, semantic-version, pyzmq, PyYAML, python-engineio, python-dotenv, pyparsing, Pygments, psutil, prompt-toolkit, platformdirs, Pillow, pexpect, parso, packaging, numpy, networkx, nest-asyncio, multidict, MarkupSafe, locket, kiwisolver, joblib, idna, httptools, h11, future, fsspec, frozenlist, fonttools, entrypoints, decorator, debugpy, cycler, cloudpickle, click, charset-normalizer, cachetools, bidict, autoray, attrs, async-timeout, aiofiles, yarl, Werkzeug, uvicorn, torch, scipy, rustworkx, requests, python-socketio, python-dateutil, pydantic, partd, orderedmultidict, matplotlib-inline, Mako, jupyter_core, Jinja2, jedi, contourpy, comm, autograd, asttokens, anyio, aiosignal, watchfiles, starlette, stack-data, scikit-learn, retworkx, pytorch-minimize, matplotlib, jupyter-client, furl, dask, alembic, aiohttp, ipython, fastapi, distributed, ipykernel, covalent, PennyLane-Lightning, PennyLane Successfully installed HeapDict-1.0.1 Jinja2-3.1.2 Mako-1.2.4 MarkupSafe-2.1.1 PennyLane-0.27.0 PennyLane-Lightning-0.27.0 Pillow-9.3.0 PyYAML-6.0 Pygments-2.13.0 SQLAlchemy-1.4.41 Werkzeug-2.2.2 aiofiles-22.1.0 aiohttp-3.8.1 aiosignal-1.3.1 alembic-1.8.1 anyio-3.6.2 appdirs-1.4.4 appnope-0.1.3 asttokens-2.2.1 async-timeout-4.0.2 attrs-22.1.0 autograd-1.5 autoray-0.5.3 backcall-0.2.0 bidict-0.22.0 cachetools-5.2.0 charset-normalizer-2.1.1 click-8.1.3 cloudpickle-2.2.0 comm-0.1.2 contourpy-1.0.6 covalent-0.202.0.post1 cycler-0.11.0 dask-2022.9.0 debugpy-1.6.4 decorator-5.1.1 distributed-2022.9.0 entrypoints-0.4 executing-1.2.0 fastapi-0.83.0 fonttools-4.38.0 frozenlist-1.3.3 fsspec-2022.11.0 furl-2.1.3 future-0.18.2 h11-0.14.0 httptools-0.5.0 idna-3.4 ipykernel-6.19.2 ipython-8.7.0 jedi-0.18.2 joblib-1.2.0 jupyter-client-7.3.4 jupyter_core-5.1.0 
kiwisolver-1.4.4 locket-1.0.0 matplotlib-3.6.2 matplotlib-inline-0.1.6 msgpack-1.0.4 multidict-6.0.3 nest-asyncio-1.5.6 networkx-2.8.6 ninja-1.11.1 numpy-1.23.5 orderedmultidict-1.0.1 packaging-22.0 parso-0.8.3 partd-1.3.0 pexpect-4.8.0 pickleshare-0.7.5 platformdirs-2.6.0 prompt-toolkit-3.0.36 psutil-5.9.2 ptyprocess-0.7.0 pure-eval-0.2.2 pydantic-1.10.2 pyparsing-3.0.9 python-dateutil-2.8.2 python-dotenv-0.21.0 python-engineio-4.3.4 python-socketio-5.7.1 pytorch-minimize-0.2.0 pyzmq-24.0.1 requests-2.28.1 retworkx-0.12.1 rustworkx-0.12.1 scikit-learn-1.2.0 scipy-1.9.3 semantic-version-2.10.0 simplejson-3.17.6 six-1.16.0 sniffio-1.3.0 sortedcontainers-2.4.0 stack-data-0.6.2 starlette-0.19.1 tblib-1.7.0 threadpoolctl-3.1.0 toml-0.10.2 toolz-0.12.0 torch-1.13.0 tornado-6.1 traitlets-5.7.1 typing_extensions-4.4.0 urllib3-1.26.13 uvicorn-0.18.3 uvloop-0.17.0 watchfiles-0.18.1 wcwidth-0.2.5 websockets-10.4 yarl-1.8.2 zict-2.2.0 done # # To activate this environment, use # # $ conda activate QVR # # To deactivate an active environment, use # # $ conda deactivate Retrieving notices: ...working... done (base) ➜ QuantumVariationalRewinding git:(main) conda activate QVR (QVR) ➜ QuantumVariationalRewinding git:(main) ```
Hi @jackbaker1001 and @santoshkumarradha.
I finally managed to properly create the environment, but now I get this error when trying to train the model:
---------------------------------------------------------------------------
HTTPError Traceback (most recent call last)
Cell In[16], line 2
1 Xtr_path = "".join([base_path, "/data/separated_data/Xtr.pickle"])
----> 2 dispatch_id = ct.dispatch(training_workflow)(Xtr_path=Xtr_path,
3 n_series_batch=10,
4 n_t_batch=10,
5 num_distributions=3,
6 transform_func=qml.templates.StronglyEntanglingLayers,
7 n_qubits=2,
8 transform_func_layers=3,
9 embed_func=qml.templates.AngleEmbedding,
10 N_E=10,
11 k=2,
12 observable=[qml.PauliZ(i) for i in range(n_qubits)],
13 tau=15,
14 optimizer_params={"method": "Powell",
15 "options": {"disp": True, "maxfev": 500,
16 "jac": False, "maxiter": 500}, "jac": False})
18 ct_results = ct.get_result(dispatch_id=dispatch_id, wait=True)
19 opt_results = ct_results.result
File ~/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent/_dispatcher_plugins/local.py:87, in LocalDispatcher.dispatch.<locals>.wrapper(*args, **kwargs)
84 test_url = f"http://{dispatcher_addr}/api/submit"
86 r = requests.post(test_url, data=json_lattice)
---> 87 r.raise_for_status()
88 return r.content.decode("utf-8").strip().replace('"', "")
File ~/miniforge3/envs/QVR/lib/python3.9/site-packages/requests/models.py:1021, in Response.raise_for_status(self)
1016 http_error_msg = (
1017 f"{self.status_code} Server Error: {reason} for url: {self.url}"
1018 )
1020 if http_error_msg:
-> 1021 raise HTTPError(http_error_msg, response=self)
HTTPError: 500 Server Error: Internal Server Error for url: http://localhost:48008/api/submit
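A 500 from `/api/submit` means the request reached the Covalent server but the dispatcher failed internally; a connection error instead would mean the server was never started. A quick way to tell the two apart before calling `ct.dispatch` is to probe the dispatcher address — this helper is my own sketch, not part of the Covalent API, and the port 48008 is taken from the traceback above:

```python
import urllib.request
import urllib.error

def covalent_server_status(addr="localhost:48008", timeout=2.0):
    """Return 'up' if an HTTP server answers at addr (even with an
    error status), 'down' if the connection is refused or times out."""
    try:
        urllib.request.urlopen(f"http://{addr}/", timeout=timeout)
        return "up"
    except urllib.error.HTTPError:
        # The server answered, just with an error status: it is running.
        return "up"
    except (urllib.error.URLError, OSError):
        return "down"
```

If this reports "up" but dispatch still returns 500, the problem is inside the server (e.g. its database), and the server log is the place to look.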
When I navigate to localhost, I see this:
Hi @ikurecic
(1) I updated the readme to specifically state that only Mac and Linux are supported, as per your suggestion.
(2) For the error you are seeing, could you ensure that (i) you only have Covalent installed in this repo's conda environment (i.e., not also in the base conda env), and (ii) the Covalent server is launched while the conda env is activated, before you run any of the notebook cells?
I will also ask one of our engineers to look into this, but please try the above and let me know!
Hey @jackbaker1001, thank you for updating those files! I have been trying again; Covalent now starts correctly and I no longer get any problems importing the libraries. However, I have a problem inside Covalent:
I suspected it might be the Python version, so I switched to 3.10. It gets a little further in that case, but the following error occurs:
so I imagine the error does not come from the Python version. Do you have any idea what happens at that point?
Hi @KetpuntoG. I just tried following the install instructions again on my side, and it all works, so this is rather puzzling.
Could you please verify that (i) the only installation of Covalent is in the QVR conda env, and (ii) you ran `covalent start --ignore-migrations` in a terminal window (not as a magic command in Jupyter) where the QVR environment was already activated?
Also, what OS are you running on?
Also @KetpuntoG, Python 3.10 will not work with Covalent; I would leave the version as defined in environment.yml. Could you try following the instructions in the README.md again from scratch?
If this fails, try purging Covalent before running the notebook:
> covalent purge -Hy
Hey @jackbaker1001, the issue remains the same. The only place where I've gone outside the instructions is that, by default, the environment does not include Jupyter Notebook; when I install it, a dependency fails, so I need to update `tornado` to 6.2. I also have to install `covalent` with pip (which has the same `tornado` problem). Otherwise everything installs correctly.
I'm using a MacBook Air M1 (Monterey 12.6)
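Since the `tornado` pin keeps coming up, a quick sanity check of what is actually installed in the env might help pin this down. Here is a hedged sketch using only the standard library (`packaging` may not be present in a fresh env); `tornado_ok` and the 6.2 minimum reflect the conflict described above, and are my own naming:

```python
from importlib import metadata

def version_tuple(v):
    """Parse a simple 'X.Y.Z' version string into a comparable tuple."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def tornado_ok(minimum="6.2"):
    """Check whether the installed tornado meets the notebook's needs.
    Returns None if tornado is not installed at all."""
    try:
        installed = metadata.version("tornado")
    except metadata.PackageNotFoundError:
        return None
    return version_tuple(installed) >= version_tuple(minimum)
```

Running `tornado_ok()` inside the activated QVR env would show immediately whether the pinned 6.1 survived the Jupyter install or was upgraded to 6.2.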
Hi @KetpuntoG. Looking back at this conversation, the requirements file was changed last week to fix this and now includes jupyter:
jupyter_core==5.1.0
Could you please try the re-install?
edit: ah, jupyter core doesn't include notebook. Let me try and figure this out. Sorry!
OK @KetpuntoG, I hope the new fix now deals with this. Could I ask you, hopefully for the last time, to try the install instructions in the repo again, please?
The requirements were updated and are free of conflicts with Jupyter Notebook on my machine and on a colleague's.
Thanks for your patience here!
Hi @jackbaker1001
As soon as I launch the Covalent server I get the following message:
This is the log message:
Exception in ASGI application
Traceback (most recent call last):
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
self.dialect.do_execute(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
sqlite3.OperationalError: no such table: lattices
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 404, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/fastapi/applications.py", line 269, in __call__
await super().__call__(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/applications.py", line 124, in __call__
await self.middleware_stack(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/middleware/cors.py", line 84, in __call__
await self.app(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/exceptions.py", line 93, in __call__
raise exc
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/exceptions.py", line 82, in __call__
await self.app(scope, receive, sender)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/routing.py", line 670, in __call__
await route.handle(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/routing.py", line 266, in handle
await self.app(scope, receive, send)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/starlette/routing.py", line 65, in app
response = await func(request)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/fastapi/routing.py", line 231, in app
raw_response = await run_endpoint_function(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
return await dependant.call(**values)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent_dispatcher/_service/app.py", line 172, in submit
dispatch_id = await dispatcher.run_dispatcher(data)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent_dispatcher/entry_point.py", line 48, in run_dispatcher
result_object = initialize_result_object(json_lattice)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent_dispatcher/_core/execution.py", line 843, in initialize_result_object
update.persist(result_object, electron_id=parent_electron_id)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent_dispatcher/_db/update.py", line 50, in persist
upsert._lattice_data(record, electron_id=electron_id)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/covalent_dispatcher/_db/upsert.py", line 75, in _lattice_data
session.query(models.Lattice)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/orm/query.py", line 2823, in first
return self.limit(1)._iter().first()
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/orm/query.py", line 2907, in _iter
result = self.session.execute(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/orm/session.py", line 1712, in execute
result = conn._execute_20(statement, params or {}, execution_options)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1705, in _execute_20
return meth(self, args_10style, kwargs_10style, execution_options)
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/sql/elements.py", line 333, in _execute_on_connection
return connection._execute_clauseelement(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1572, in _execute_clauseelement
ret = self._execute_context(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1943, in _execute_context
self._handle_dbapi_exception(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 2124, in _handle_dbapi_exception
util.raise_(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/util/compat.py", line 208, in raise_
raise exception
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
self.dialect.do_execute(
File "/Users/catalina/miniforge3/envs/QVR/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: lattices
[SQL: SELECT lattices.id AS lattices_id, lattices.dispatch_id AS lattices_dispatch_id, lattices.electron_id AS lattices_electron_id, lattices.name AS lattices_name, lattices.docstring_filename AS lattices_docstring_filename, lattices.status AS lattices_status, lattices.electron_num AS lattices_electron_num, lattices.completed_electron_num AS lattices_completed_electron_num, lattices.storage_type AS lattices_storage_type, lattices.storage_path AS lattices_storage_path, lattices.function_filename AS lattices_function_filename, lattices.function_string_filename AS lattices_function_string_filename, lattices.executor AS lattices_executor, lattices.executor_data_filename AS lattices_executor_data_filename, lattices.workflow_executor AS lattices_workflow_executor, lattices.workflow_executor_data_filename AS lattices_workflow_executor_data_filename, lattices.error_filename AS lattices_error_filename, lattices.inputs_filename AS lattices_inputs_filename, lattices.named_args_filename AS lattices_named_args_filename, lattices.named_kwargs_filename AS lattices_named_kwargs_filename, lattices.results_filename AS lattices_results_filename, lattices.transport_graph_filename AS lattices_transport_graph_filename, lattices.deps_filename AS lattices_deps_filename, lattices.call_before_filename AS lattices_call_before_filename, lattices.call_after_filename AS lattices_call_after_filename, lattices.cova_imports_filename AS lattices_cova_imports_filename, lattices.lattice_imports_filename AS lattices_lattice_imports_filename, lattices.results_dir AS lattices_results_dir, lattices.root_dispatch_id AS lattices_root_dispatch_id, lattices.is_active AS lattices_is_active, lattices.created_at AS lattices_created_at, lattices.updated_at AS lattices_updated_at, lattices.started_at AS lattices_started_at, lattices.completed_at AS lattices_completed_at
FROM lattices
WHERE lattices.dispatch_id = ?
LIMIT ? OFFSET ?]
[parameters: ('f94d05c1-692e-4537-88e9-4456d224a13f', 1, 0)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)
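For context, this kind of error is raised by SQLite whenever a query references a table that was never created — here, the Covalent database was missing its schema migrations. A minimal, Covalent-independent reproduction (the table name is taken from the traceback above):

```python
import sqlite3

# A fresh in-memory database with no tables, mimicking a Covalent
# data directory whose schema migrations never ran.
conn = sqlite3.connect(":memory:")
try:
    conn.execute("SELECT * FROM lattices WHERE dispatch_id = ?", ("abc",))
except sqlite3.OperationalError as exc:
    print(exc)  # no such table: lattices
```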
Hi @CatalinaAlbornoz, this error usually occurs when residual files are left over from an earlier installation of Covalent. The following steps should fix the issue for you:
1. Run covalent purge -Hy — this will purge any old files and directories created by Covalent, essentially giving you a fresh start. It should be used sparingly, however, as it also purges the Covalent database.
2. Run covalent start instead of covalent start --ignore-migrations — this will attempt to start Covalent the recommended way.
3. If that still fails, run covalent db migrate first, then covalent start afterwards.
Once again, thanks a lot for your patience!
Hi @kessler-frost, this works now! Thank you for sharing these steps.
Now that this works, I would suggest 3 changes to the README:
A - Before asking people to open the Jupyter notebook, it's better to add the part about starting Covalent.
B - In the README, change covalent start --ignore-migrations to covalent start.
C - Ideally, also add your troubleshooting steps 1 and 2 to the README in case users need them.
Thank you again for your support here and please let me know if you have any questions for me.
Hi @CatalinaAlbornoz thanks for the suggestions. They have now been implemented into the README. What are next steps from here?
Great! Glad it was finally solved 😄 I'll take care of the next steps. We will include the demo in our community section and post a tweet to promote it. For this, I would need (if you agree) your Twitter username to give you visibility (we can also quote @agnostiqHQ). Once we have everything, I will let you know the publication date, which may be soon 💪
Hey @KetpuntoG. I am actually pretty silent on social media. You can go ahead and proceed just with the agnostiqHQ twitter handle! Let me know once dates are sorted!
@KetpuntoG Could you also please include the Covalent twitter handle? @covalentxyz
It has been confirmed that it will be released next Thursday 🚀
Hi @jackbaker1001,
This is the copy we're planning for Thursday's tweet. Please let me know if you need us to make any changes!
--
From financial markets to nuclear fission reactors 💰 🏭
Time series appear everywhere — so you have to be able to detect anomalies when they occur 🔍
@AgnostiqHQ and @Covalentxyz explain how QVR is a solution for detecting these anomalies 👇
Hi @CatalinaAlbornoz we are good with this tweet on our side. Thanks for your efforts and everybody else's in this thread!
We close this issue, thank you for the contribution 💪
General information
Name Jack S. Baker
Affiliation Agnostiq
Twitter agnostiqHQ
Image
Demo information
Title Quantum Variational Rewinding for Time Series Anomaly Detection
Abstract In this demo, we walk through a select number of examples from the paper Quantum Variational Rewinding for Time Series Anomaly Detection. Specifically, we demonstrate the detection of anomalous behaviour in bivariate cryptocurrency time series data and in synthetically generated univariate time series data. The goal of the tutorial is to have others experiment with the code to create new algorithmic variations, as well as to expose the advantages of using the heterogeneous workflow manager Covalent in quantum machine learning workflows.
Relevant links Base Git repo: https://github.com/AgnostiqHQ/QuantumVariationalRewinding Demo notebook: https://github.com/AgnostiqHQ/QuantumVariationalRewinding/blob/main/QVR_example.ipynb Covalent: https://github.com/AgnostiqHQ/covalent Paper: https://arxiv.org/abs/2210.16438