Closed: Toseban closed this 5 years ago
Merging #117 into velvet-revolution will decrease coverage by 7.55%. The diff coverage is 12.13%.
```diff
@@             Coverage Diff              @@
##       velvet-revolution     #117    +/-  ##
==============================================
- Coverage          75.12%   67.56%   -7.56%
==============================================
  Files                 57       63       +6
  Lines               3509     3928     +419
==============================================
+ Hits                2636     2654      +18
- Misses               873     1274     +401
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| modules/ravestate/context.py | 94.96% <ø> (ø) | :arrow_up: |
| modules/ravestate_akinator/__init__.py | 44.44% <0%> (-1.46%) | :arrow_down: |
| modules/ravestate_ros1/__init__.py | 0% <0%> (ø) | |
| modules/ravestate_wildtalk/parlai_backend.py | 0% <0%> (ø) | |
| modules/ravestate_wildtalk/server.py | 0% <0%> (ø) | |
| modules/ravestate_telegramio/telegram_bot.py | 41.84% <0%> (-0.46%) | :arrow_down: |
| modules/ravestate_roboyio/__init__.py | 0% <0%> (ø) | :arrow_up: |
| modules/ravestate_wildtalk/convai_gpt_backend.py | 0% <0%> (ø) | |
| modules/ravestate_wildtalk/gpt2_backend.py | 0% <0%> (ø) | |
| modules/ravestate_ros1/ros1_properties.py | 0% <0%> (ø) | |
| ... and 14 more | | |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 19adeaa...46bca78. Read the comment docs.
Problem: URL dependency breaks PyPI deployment. Use this handy snippet to download & install the model on first run (stolen from spacy):
```python
import os
import subprocess
import sys

from . import about  # module holding __download_url__, as in spaCy


def download_model(filename, user_pip_args=None):
    # Build the full URL of the model archive to install.
    download_url = about.__download_url__ + "/" + filename
    pip_args = ["--no-cache-dir", "--no-deps"]
    if user_pip_args:
        pip_args.extend(user_pip_args)
    # Install the archive via pip, using the current interpreter's environment.
    cmd = [sys.executable, "-m", "pip", "install"] + pip_args + [download_url]
    return subprocess.call(cmd, env=os.environ.copy())
```
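To make "download on first run" concrete, here is a minimal sketch of how the snippet could be wired up: check whether the model package is importable and only invoke the downloader when it is not. `ensure_model` and its parameters are illustrative assumptions, not part of the PR; the downloader argument is expected to behave like `download_model` above and return a pip exit code.

```python
import importlib.util


def ensure_model(package_name, filename, downloader):
    """Install the model package on first run only.

    Returns 0 if the package is already importable; otherwise returns
    the downloader's exit code (pip returns 0 on success).
    """
    if importlib.util.find_spec(package_name) is not None:
        return 0  # model already installed, skip the download
    return downloader(filename)
```

A call site might then look like `ensure_model("roboy_parlai", "roboy_parlai-1.0.0.tar.gz", download_model)` (both names hypothetical), so PyPI deployment stays free of URL dependencies while users still get the model automatically.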
The server-client architecture with on-demand autostart is awesome. But I think we can skip integrating the old Wildtalk; nobody will use it (probably). Opinions, @missxa?
@josephbirkner The old Wildtalk is more reliable at giving "normal" responses, I think, so I would include it. Try the GPT-2 Wildtalk for a few conversations and let me know what you think.
The old Wildtalk module (roboy_parlai) :older_man: is now integrated, as well as the conversational AI from https://github.com/huggingface/transfer-learning-conv-ai :speaking_head:
Models and their options can be set with ravestate configs :balloon: Server can also be started externally on its own :earth_africa:
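For readers wondering what such a config might look like, here is a purely hypothetical sketch; the key names below are illustrative assumptions, not the module's actual config spec, so check the module documentation for the real option names.

```yaml
# Hypothetical sketch -- key names are assumptions, not the actual spec.
ravestate_wildtalk:
  backend: gpt2          # e.g. gpt2, convai_gpt, or parlai (old Wildtalk)
  server_address: http://localhost   # point at an externally started server
  server_port: 5100
```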
First version of Wildtalk using GPT2 (345M) :speaking_head: There are still some things to do: