umati-fork of OPC UA cloud publisher --- reference implementation leveraging OPC UA PubSub over MQTT. It runs in a Docker container on standard Docker hosts or on Kubernetes and comes with an easy-to-use web user interface.
chore(deps): update dependency azure.ai.openai to 1.0.0-beta.17 #14
Azure/azure-sdk-for-net (Azure.AI.OpenAI)
### [`v1.0.0-beta.17`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.17)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.16...Azure.AI.OpenAI_1.0.0-beta.17)
#### 1.0.0-beta.17 (2024-05-03)
##### Features Added
- Image input support for `gpt-4-turbo` chat completions now works with image data in addition to internet URLs.
Images may now be used as `gpt-4-turbo` message content items via one of three constructors:
- `ChatMessageImageContent(Uri)` -- the existing constructor, used for URL-based image references
- `ChatMessageImageContent(Stream,string)` -- (new) used with a stream and known MIME type (like `image/png`)
- `ChatMessageImageContent(BinaryData,string)` -- (new) used with a BinaryData instance and known MIME type
Please see the [readme example](https://togithub.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/README.md#chat-with-images-using-gpt-4-turbo) for more details.
##### Breaking Changes
- Public visibility of the `ChatMessageImageUrl` type is removed to promote more flexible use of data sources in
`ChatMessageImageContent`. Code that previously created a `ChatMessageImageUrl` using a `Uri` should simply provide
the `Uri` to the `ChatMessageImageContent` constructor directly.
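Using the constructors named above, a local image could be attached as in the following minimal sketch (the file path is illustrative; only the constructor shapes come from the changelog):

```csharp
// Load a local PNG and wrap it as image content with an explicit MIME type.
using Stream imageStream = File.OpenRead("local-image.png");
var imageContent = new ChatMessageImageContent(imageStream, "image/png");

// The BinaryData-based equivalent:
var imageBytes = BinaryData.FromBytes(File.ReadAllBytes("local-image.png"));
var imageContentFromBytes = new ChatMessageImageContent(imageBytes, "image/png");
```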
### [`v1.0.0-beta.16`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.16)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.15...Azure.AI.OpenAI_1.0.0-beta.16)
#### 1.0.0-beta.16 (2024-04-11)
##### Features Added
**Audio**
- `GetAudioTranscription()` now supports word-level timestamp granularities via `AudioTranscriptionOptions`:
- The `Verbose` option for `ResponseFormat` must be used for any timing information to be populated
- `TimestampGranularityFlags` accepts a combination of the `.Word` and `.Segment` granularity values in
`AudioTimestampGranularity`, joined when needed via the single-pipe `|` operator
- For example, `TimestampGranularityFlags = AudioTimestampGranularity.Word | AudioTimestampGranularity.Segment`
will request that both word-level and segment-level timestamps are provided on the transcription result
- If not otherwise specified, `Verbose` format will default to using segment-level timestamp information
- Corresponding word-level information is found on the `.Words` collection of `AudioTranscription`, peer to the
existing `.Segments` collection
- Note that word-level timing information incurs a small amount of additional processing latency; segment-level
timestamps do not incur this cost
- `GenerateSpeechFromText()` can now use `Wav` and `Pcm` values from `SpeechGenerationResponseFormat`; these new
options provide alternative uncompressed formats to `Flac`
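Putting the audio options above together, requesting both word-level and segment-level timestamps might look like the following sketch (the deployment name and file name are illustrative):

```csharp
// Verbose response format is required for any timestamp information to be populated.
AudioTranscriptionOptions transcriptionOptions = new()
{
    DeploymentName = "whisper-1",  // illustrative deployment name
    AudioData = BinaryData.FromBytes(File.ReadAllBytes("speech.mp3")),
    ResponseFormat = AudioTranscriptionFormat.Verbose,
    // Combine granularities with the single-pipe | operator:
    TimestampGranularityFlags = AudioTimestampGranularity.Word | AudioTimestampGranularity.Segment,
};

Response<AudioTranscription> response = await client.GetAudioTranscriptionAsync(transcriptionOptions);
// Word-level entries appear on response.Value.Words, peer to response.Value.Segments.
```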
**Chat**
- `ChatCompletions` and `StreamingChatCompletionsUpdate` now include the reported `Model` value from the response
- Log probability information is now included in `StreamingChatCompletionsUpdate` when `logprobs` are requested on
`GetChatCompletionsStreaming()`
- \[AOAI] Custom Blocklist information in content filter results is now represented in a more structured
`ContentFilterDetailedResults` type
- \[AOAI] A new `IndirectAttack` content filter entry is now present on content filter results for prompts
##### Breaking Changes
- \[AOAI] `AzureChatExtensionMessageContext`'s `RequestContentFilterResults` now uses the new
`ContentFilterDetailedResults` type, changed from the previous `IReadOnlyList`. The
previous list is now present on `CustomBlockLists.Details`, supplemented with a new `CustomBlockLists.Filtered`
property.
##### Bugs Fixed
- \[AOAI] An issue that sometimes caused `StreamingChatCompletionUpdates` from Azure OpenAI to inappropriately exclude
top-level information like `Id` and `CreatedAt` has been addressed
### [`v1.0.0-beta.15`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.15)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.14...Azure.AI.OpenAI_1.0.0-beta.15)
#### 1.0.0-beta.15 (2024-03-20)
This release targets the latest `2024-03-01-preview` service API label and brings support for the `Dimensions` property when using new embedding models.
##### Features Added
- `EmbeddingsOptions` now includes the `Dimensions` property, new to Azure OpenAI's `2024-03-01-preview` service API.
##### Bugs Fixed
- Several issues with the `ImageGenerations` response object being treated as writeable are fixed:
- `ImageGenerations` no longer has an erroneous public constructor
- `ImageGenerations.Created` no longer has a public setter
- `ImageGenerations.Data` is now an `IReadOnlyList` instead of an `IList`
- A corresponding replacement factory method for mocks is added to `AzureOpenAIModelFactory`
### [`v1.0.0-beta.14`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.Monitor.OpenTelemetry.Exporter_1.0.0-beta.14)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.13...Azure.AI.OpenAI_1.0.0-beta.14)
#### 1.0.0-beta.14 (2023-08-09)
##### Breaking Changes
- Location ip on server spans will now be set using `client.address` tag key on
activity instead of `http.client_ip`.
([#37707](https://togithub.com/Azure/azure-sdk-for-net/pull/37707))
- Removing `ServiceVersion.V2020_09_15_Preview`. This is no longer in use and
the exporter has already defaulted to the latest `ServiceVersion.v2_1`.
([#37996](https://togithub.com/Azure/azure-sdk-for-net/pull/37996))
- Remove Nullable Annotations from the Exporter's public API.
([#37996](https://togithub.com/Azure/azure-sdk-for-net/pull/37996))
##### Bugs Fixed
- Fixed an issue causing no telemetry if SDK Version string exceeds max length.
([#37807](https://togithub.com/Azure/azure-sdk-for-net/pull/37807))
##### Other Changes
- Update OpenTelemetry dependencies
([#37837](https://togithub.com/Azure/azure-sdk-for-net/pull/37837))
- OpenTelemetry 1.5.1
### [`v1.0.0-beta.13`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.Monitor.OpenTelemetry.Exporter_1.0.0-beta.13)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.12...Azure.AI.OpenAI_1.0.0-beta.13)
#### 1.0.0-beta.13 (2023-07-13)
##### Features Added
- Added `ApplicationInsightsSampler` to the exporter, enabling users to customize the sampling rate using the `SamplingRatio` property.
([#36972](https://togithub.com/Azure/azure-sdk-for-net/pull/36972))
##### Other Changes
- Updated Exporter to read v1.21.0 of the OpenTelemetry Semantic Conventions attributes for HTTP.
For more information see [Semantic conventions for HTTP spans](https://togithub.com/open-telemetry/opentelemetry-specification/blob/v1.21.0/specification/trace/semantic_conventions/http.md).
([#37464](https://togithub.com/Azure/azure-sdk-for-net/pull/37464))
([#37357](https://togithub.com/Azure/azure-sdk-for-net/pull/37357))
- Update OpenTelemetry dependencies
([#36859](https://togithub.com/Azure/azure-sdk-for-net/pull/36859))
- OpenTelemetry 1.5.0
- Remove metric namespace mapping.
([#36968](https://togithub.com/Azure/azure-sdk-for-net/pull/36968))
### [`v1.0.0-beta.12`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.12)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.11...Azure.AI.OpenAI_1.0.0-beta.12)
#### 1.0.0-beta.12 (2023-12-15)
Like beta.11, beta.12 is another release that brings further refinements and fixes. It remains based on the `2023-12-01-preview` service API version for Azure OpenAI and does not add any new service capabilities.
##### Features Added
**Updates for using streaming tool calls:**
- A new .NET-specific `StreamingToolCallUpdate` type has been added to better represent streaming tool call updates
when using chat tools.
- This new type includes an explicit `ToolCallIndex` property, reflecting `index` in the REST schema, to allow
resilient deserialization of parallel function tool calling.
- A convenience constructor has been added for `ChatRequestAssistantMessage` that can automatically populate from a prior
`ChatResponseMessage` when using non-streaming chat completions.
- A public constructor has been added for `ChatCompletionsFunctionToolCall` to allow more intuitive reconstruction of
`ChatCompletionsToolCall` instances for use in `ChatRequestAssistantMessage` instances made from streaming responses.
**Other additions:**
- To facilitate reuse of user message contents, `ChatRequestUserMessage` now provides a public `Content` property (`string`) as well as a public `MultimodalContentItems` property (`IList`).
- A new `StreamingResponse` type is introduced that implicitly exposes an `IAsyncEnumerable` derived from
the underlying response.
- `OpenAI.GetCompletionsStreaming()` now returns a `StreamingResponse` that may be directly
enumerated over. `StreamingCompletions`, `StreamingChoice`, and the corresponding methods are removed.
- Because Chat Completions use a distinct structure for their streaming response messages, a new
`StreamingChatCompletionsUpdate` type is introduced that encapsulates this update data.
- Correspondingly, `OpenAI.GetChatCompletionsStreaming()` now returns a
`StreamingResponse` that may be enumerated over directly.
`StreamingChatCompletions`, `StreamingChatChoice`, and related methods are removed.
- For more information, please see
[the related pull request description](https://togithub.com/Azure/azure-sdk-for-net/pull/39347) as well as the
updated snippets in the project README.
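The direct-enumeration pattern described above can be sketched as follows (the deployment name is illustrative):

```csharp
ChatCompletionsOptions options = new()
{
    DeploymentName = "gpt-4",  // illustrative deployment name
    Messages = { new ChatRequestUserMessage("Hello, assistant!") },
};

// GetChatCompletionsStreaming() returns a StreamingResponse that can be
// enumerated directly with await foreach; each iteration yields an update.
await foreach (StreamingChatCompletionsUpdate update in client.GetChatCompletionsStreaming(options))
{
    Console.Write(update.ContentUpdate);
}
```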
##### `deploymentOrModelName` moved to `*Options.DeploymentName`
`deploymentOrModelName` and related method parameters on `OpenAIClient` have been moved to `DeploymentName`
properties in the corresponding method options. This is intended to promote consistency across scenario,
language, and Azure/non-Azure OpenAI use.
As an example, the following:
```csharp
ChatCompletionsOptions chatCompletionsOptions = new()
{
    Messages = { new(ChatRole.User, "Hello, assistant!") },
};
Response response = client.GetChatCompletions("gpt-4", chatCompletionsOptions);
```
...is now re-written as:
```csharp
ChatCompletionsOptions chatCompletionsOptions = new()
{
    DeploymentName = "gpt-4",
    Messages = { new(ChatRole.User, "Hello, assistant!") },
};
Response response = client.GetChatCompletions(chatCompletionsOptions);
```
##### Consistency in complex method options type constructors
With the migration of `DeploymentName` into method complex options types, these options types have now been snapped to
follow a common pattern: each complex options type will feature a default constructor that allows `init`-style setting
of properties as well as a single additional constructor that accepts *all* required parameters for the corresponding
method. Existing constructors that no longer meet that "all" requirement, including those impacted by the addition of
`DeploymentName`, have been removed. The "convenience" constructors that represented required parameter data
differently -- for example, `EmbeddingsOptions(string)` -- have also been removed in favor of consistently providing
the full set of required parameters directly.
More exhaustively, *removed* are:
- `AudioTranscriptionOptions(BinaryData)`
- `AudioTranslationOptions(BinaryData)`
- `ChatCompletionsOptions(IEnumerable)`
- `CompletionsOptions(IEnumerable)`
- `EmbeddingsOptions(string)`
- `EmbeddingsOptions(IEnumerable)`
And *added* as replacements are:
- `AudioTranscriptionOptions(string, BinaryData)`
- `AudioTranslationOptions(string, BinaryData)`
- `ChatCompletionsOptions(string, IEnumerable)`
- `CompletionsOptions(string, IEnumerable)`
- `EmbeddingsOptions(string, IEnumerable)`
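Under the new pattern, for example, an embeddings request passes the deployment name together with the input collection to the single replacement constructor (the deployment name is illustrative):

```csharp
// All required parameters go through one constructor; no per-parameter convenience overloads.
EmbeddingsOptions embeddingsOptions = new("text-embedding-ada-002", new[] { "Sample text to embed" });
Response<Embeddings> response = await client.GetEmbeddingsAsync(embeddingsOptions);
```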
##### Embeddings now represented as `ReadOnlyMemory`
Changed the representation of embeddings (specifically, the type of the `Embedding` property of the `EmbeddingItem` class)
from `IReadOnlyList` to `ReadOnlyMemory` as part of a broader effort to establish consistency across the
.NET ecosystem.
##### `SearchKey` and `EmbeddingKey` properties replaced by `SetSearchKey` and `SetEmbeddingKey` methods
Replaced the `SearchKey` and `EmbeddingKey` properties of the `AzureCognitiveSearchChatExtensionConfiguration` class with
new `SetSearchKey` and `SetEmbeddingKey` methods respectively. These methods simplify the configuration of the Azure Cognitive
Search chat extension by receiving a plain string instead of an `AzureKeyCredential`, promote more sensible key and secret
management, and align with the Azure SDK guidelines.
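As a minimal sketch of the new methods (the endpoint, index name, and keys are illustrative, and aside from `SetSearchKey`/`SetEmbeddingKey` the property names are assumptions):

```csharp
var searchConfig = new AzureCognitiveSearchChatExtensionConfiguration()
{
    SearchEndpoint = new Uri("https://contoso.search.windows.net"),  // illustrative endpoint
    IndexName = "contoso-index",                                     // illustrative index
};

// Plain strings are accepted instead of AzureKeyCredential instances:
searchConfig.SetSearchKey("<search-api-key>");
searchConfig.SetEmbeddingKey("<embedding-api-key>");
```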
### [`v1.0.0-beta.8`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.8)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.7...Azure.AI.OpenAI_1.0.0-beta.8)
#### 1.0.0-beta.8 (2023-09-21)
##### Features Added
- Audio Transcription and Audio Translation using OpenAI Whisper models is now supported. See [OpenAI's API
reference](https://platform.openai.com/docs/api-reference/audio) or the [Azure OpenAI
quickstart](https://learn.microsoft.com/azure/ai-services/openai/whisper-quickstart) for detailed overview and
background information.
- The new methods `GetAudioTranscription` and `GetAudioTranslation` expose these capabilities on `OpenAIClient`
- Transcription produces text in the primary, supported, spoken input language of the audio data provided, together
with any optional associated metadata
- Translation produces text, translated to English and reflective of the audio data provided, together with any
optional associated metadata
- These methods work for both Azure OpenAI and non-Azure `api.openai.com` client configurations
##### Breaking Changes
- The underlying representation of `PromptFilterResults` (for `Completions` and `ChatCompletions`) has had its response
body key changed from `prompt_annotations` to `prompt_filter_results`
- **Prior versions of the `Azure.AI.OpenAI` library may no longer populate `PromptFilterResults` as expected** and it's
highly recommended to upgrade to this version if the use of Azure OpenAI content moderation annotations for input data
is desired
- If a library version upgrade is not immediately possible, it's advised to use `Response.GetRawResponse()` and manually
extract the `prompt_filter_results` object from the top level of the `Completions` or `ChatCompletions` response `Content`
payload
##### Bugs Fixed
- Support for the described breaking change for `PromptFilterResults` was added and this library version will now again
deserialize `PromptFilterResults` appropriately
- `PromptFilterResults` and `ContentFilterResults` are now exposed on the result classes for streaming Completions and
Chat Completions. `Streaming(Chat)Completions.PromptFilterResults` will report an index-sorted list of all prompt
annotations received so far while `Streaming(Chat)Choice.ContentFilterResults` will reflect the latest-received
content annotations that were populated and received while streaming
### [`v1.0.0-beta.7`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.7)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.6...Azure.AI.OpenAI_1.0.0-beta.7)
#### 1.0.0-beta.7 (2023-08-25)
##### Features Added
- The Azure OpenAI "using your own data" feature is now supported. See [the Azure OpenAI using your own data quickstart](https://learn.microsoft.com/azure/ai-services/openai/use-your-data-quickstart) for conceptual background and detailed setup instructions.
- Azure OpenAI chat extensions are configured via a new `AzureChatExtensionsOptions` property on `ChatCompletionsOptions`. When an `AzureChatExtensionsOptions` is provided, configured requests will only work with clients configured to use the Azure OpenAI service, as the capabilities are unique to that service target.
- `AzureChatExtensionsOptions` then has `AzureChatExtensionConfiguration` instances added to its `Extensions` property, with these instances representing the supplementary information needed for Azure OpenAI to use desired data sources to supplement chat completions behavior.
- `ChatChoice` instances on a `ChatCompletions` response value that used chat extensions will then also have their `Message` property supplemented by an `AzureChatExtensionMessageContext` instance. This context contains a collection of supplementary `Messages` that describe the behavior of extensions that were used and supplementary response data, such as citations, provided along with the response.
- See the README sample snippet for a simplified example of request/response use with "using your own data"
### [`v1.0.0-beta.6`](https://togithub.com/Azure/azure-sdk-for-net/releases/tag/Azure.AI.OpenAI_1.0.0-beta.6)
[Compare Source](https://togithub.com/Azure/azure-sdk-for-net/compare/Azure.AI.OpenAI_1.0.0-beta.5...Azure.AI.OpenAI_1.0.0-beta.6)
#### 1.0.0-beta.6 (2023-07-19)
##### Features Added
- DALL-E image generation is now supported. See [the Azure OpenAI quickstart](https://learn.microsoft.com/azure/cognitive-services/openai/dall-e-quickstart) for conceptual background and detailed setup instructions.
- `OpenAIClient` gains a new `GetImageGenerations` method that accepts an `ImageGenerationOptions` and produces an `ImageGenerations` via its response. This response object encapsulates the temporary storage location of generated images for future retrieval.
- In contrast to other capabilities, DALL-E image generation does not require explicit creation or specification of a deployment or model. Its surface as such does not include this concept.
- Functions for chat completions are now supported: see [OpenAI's blog post on the topic](https://openai.com/blog/function-calling-and-other-api-updates) for much more detail.
- A list of `FunctionDefinition` objects may be populated on `ChatCompletionsOptions` via its `Functions` property. These definitions include a name and description together with a serialized JSON Schema representation of its parameters; these parameters can be generated easily via `BinaryData.FromObjectAsJson` with dynamic objects -- see the README for example usage.
- **NOTE**: Chat Functions requires a minimum of the `-0613` model versions for `gpt-4` and `gpt-3.5-turbo`/`gpt-35-turbo`. Please ensure you're using these later model versions, as Functions are not supported with older model revisions. For Azure OpenAI, you can update a deployment's model version or create a new model deployment with an updated version via the Azure AI Studio interface, also accessible through Azure Portal.
- (Azure OpenAI specific) Completions and Chat Completions responses now include embedded content filter annotations for prompts and responses
- A new `Azure.AI.OpenAI.AzureOpenAIModelFactory` is now present for mocking.
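A minimal sketch of defining and attaching a chat function as described above (the function name and JSON Schema are illustrative):

```csharp
var getWeatherFunction = new FunctionDefinition()
{
    Name = "get_current_weather",  // illustrative function name
    Description = "Gets the current weather for a given location",
    // Parameters are a serialized JSON Schema, built here from a dynamic object:
    Parameters = BinaryData.FromObjectAsJson(new
    {
        type = "object",
        properties = new
        {
            location = new { type = "string", description = "The city to look up" },
        },
        required = new[] { "location" },
    }),
};

ChatCompletionsOptions chatOptions = new()
{
    Messages = { new ChatMessage(ChatRole.User, "What's the weather in Paris?") },
};
chatOptions.Functions.Add(getWeatherFunction);
```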
##### Breaking Changes
- `ChatMessage`'s one-parameter constructor has been replaced with a no-parameter constructor. Please replace any hybrid construction with one of these two options that either completely rely on property setting or completely rely on constructor parameters.
Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
- [ ] If you want to rebase/retry this PR, check this box
This PR contains the following updates: `1.0.0-beta.5` -> `1.0.0-beta.17`
This PR was generated by Mend Renovate. View the repository job log.