Closed vishnuravi closed 1 year ago
Merging #26 (09d609a) into main (0dcb663) will decrease coverage by 1.14%. The diff coverage is 3.71%.
@@ Coverage Diff @@
##             main      #26      +/-   ##
==========================================
- Coverage   64.25%   63.11%   -1.14%
==========================================
  Files          29       29
  Lines         993     1011      +18
==========================================
  Hits          638      638
- Misses        355      373      +18
==========================================
Impacted Files | Coverage Δ |
---|---|
HealthGPT/HealthGPT/Message.swift | 0.00% <0.00%> (ø) |
HealthGPT/HealthGPT/MessageManager.swift | 24.62% <0.00%> (-10.16%) :arrow_down: |
HealthGPT/HealthGPT/OpenAIManager.swift | 46.67% <33.34%> (+1.51%) :arrow_up: |
Continue to review full report in Codecov by Sentry.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0dcb663...09d609a. Read the comment docs.
Stream responses from model
:recycle: Current situation & Problem
Currently, we wait until the complete response has been received from the model before displaying it to the user. If the response takes a long time to generate, the user is left waiting with no feedback.
:bulb: Proposed solution
This PR streams the response from the model as it is generated, using an AsyncThrowingStream, which is supported in the MacPaw/OpenAI package as of release 0.2.0.
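A minimal, self-contained sketch of the streaming pattern this PR adopts (the mock stream below stands in for the MacPaw/OpenAI streaming call, and the function names here are illustrative, not the package's actual API): chunks of the reply arrive through an AsyncThrowingStream and are appended to the displayed message as they come in, rather than waiting for the full response.

```swift
import Foundation

// Hypothetical stand-in for the package's streaming call: yields the reply
// in chunks instead of returning one complete string.
func mockChatStream(chunks: [String]) -> AsyncThrowingStream<String, Error> {
    AsyncThrowingStream { continuation in
        for chunk in chunks {
            continuation.yield(chunk)
        }
        continuation.finish()
    }
}

// Consume the stream, building up the visible message incrementally.
// In the app, each append would update the Message shown to the user.
func collectStreamedMessage(
    from stream: AsyncThrowingStream<String, Error>
) async throws -> String {
    var message = ""
    for try await chunk in stream {
        message += chunk
    }
    return message
}

// Usage: assemble a reply from streamed chunks and print it.
let semaphore = DispatchSemaphore(value: 0)
Task {
    let stream = mockChatStream(chunks: ["Hello", ", ", "world", "!"])
    let reply = try await collectStreamedMessage(from: stream)
    print(reply) // Hello, world!
    semaphore.signal()
}
semaphore.wait()
```

The key design point is that the consumer's `for try await` loop runs as each chunk is yielded, so the UI can render partial output immediately; errors thrown mid-stream surface at the same loop and can be handled in one place.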
Code of Conduct & Contributing Guidelines
By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines: