
Conversation


@joperezr joperezr commented Jul 8, 2025

This PR should be merged and not squashed.

Microsoft Reviewers: Open in CodeFlow

joperezr and others added 15 commits July 2, 2025 23:30
Flow .NET Servicing versions

----
#### AI description (iteration 1)
#### PR Classification
This PR performs a servicing update by bumping .NET dependency versions and refining CI pipeline configurations for stable releases.

#### PR Summary
The changes update dependency versions from 9.0.6 to 9.0.7 (and corresponding LTS versions) and adjust build settings to support release stability while streamlining internal feed configurations.
- `eng/Version.Details.xml`: Updated multiple dependency version numbers and SHA values from 9.0.6 to 9.0.7.
- `eng/Versions.props`: Bumped version properties (including LTS versions from 8.0.17 to 8.0.18) and enabled stable release settings by setting package stabilization to true and DotNetFinalVersionKind to release.
- `NuGet.config`: Modified package source settings by adding new internal feed mappings and removing preexisting package source mapping blocks.
- `azure-pipelines.yml` & `eng/pipelines/templates/BuildAndTest.yml`: Removed the CodeCoverage stage and added tasks for setting up private feed credentials, with integration tests commented out due to authentication requirements.
- `Directory.Build.props`: Suppressed NU1507 warnings to accommodate internal feeds without package source mapping.
* Add DelegatingAIFunction

This simplifies scenarios where someone wants to augment an existing AIFunction's behavior, tweak what one of its properties returns, and so on; see the sketch below.

* Address PR feedback
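
For illustration, here's a minimal sketch of the kind of wrapper this enables, assuming `DelegatingAIFunction` takes the inner function in its constructor and forwards all `AIFunction` members to it unless overridden (the wrapper type and its behavior are hypothetical):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

// Hypothetical wrapper: renames an existing AIFunction and logs each invocation,
// relying on DelegatingAIFunction to forward everything else to the inner function.
public sealed class RenamedLoggingFunction : DelegatingAIFunction
{
    private readonly string _name;

    public RenamedLoggingFunction(AIFunction inner, string name) : base(inner) =>
        _name = name;

    public override string Name => _name;

    // Assumption: InvokeCoreAsync is the virtual invocation hook that DelegatingAIFunction forwards.
    protected override async ValueTask<object?> InvokeCoreAsync(
        AIFunctionArguments arguments, CancellationToken cancellationToken)
    {
        Console.WriteLine($"Invoking {Name}...");
        return await base.InvokeCoreAsync(arguments, cancellationToken);
    }
}
```

Wrapping then looks like `new RenamedLoggingFunction(AIFunctionFactory.Create((int x) => x * 2, "double"), "times_two")`, with no need to reimplement schema or metadata plumbing.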
The conventions we're pushing for on NuGet are:

1. `PackAsTool`
2. `McpServer` package type (in addition to the default `DotnetTool`)
3. Embed server.json

This adds an `mcpserver` template as part of `Microsoft.Extensions.AI.Templates`. It currently covers only a local MCP server using the stdio transport; a rough sketch of the packaging conventions above follows.
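
For context, this is roughly what those three conventions can look like in the server's project file; the exact `PackageType` values and the `server.json` pack path are assumptions here, not the template's literal output:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <!-- 1. Pack the server as a .NET tool. -->
    <PackAsTool>true</PackAsTool>
    <!-- 2. Assumption: declare McpServer alongside the default DotnetTool package type. -->
    <PackageType>McpServer;DotnetTool</PackageType>
  </PropertyGroup>
  <ItemGroup>
    <!-- 3. Assumption: embed server.json in the package under .mcp/. -->
    <None Include=".mcp/server.json" Pack="true" PackagePath=".mcp/" />
  </ItemGroup>
</Project>
```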
We've had a number of requests to customize how function invocation is handled. While that's already possible today by deriving from FunctionInvokingChatClient and overriding its InvokeFunctionAsync, there's a lot of ceremony involved. With a property on the client instance, that behavior can instead be configured as part of a UseFunctionInvocation call, as sketched below.
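
A minimal sketch of what that configuration could look like, assuming the new hook is a delegate-typed `FunctionInvoker` property on `FunctionInvokingChatClient` that receives a `FunctionInvocationContext` (the property name and delegate shape are assumptions here):

```csharp
using System;
using Microsoft.Extensions.AI;

public static class FunctionInvocationCustomizationSketch
{
    // Wraps any IChatClient with function invocation whose behavior is customized
    // inline via the configure callback, instead of a FunctionInvokingChatClient subclass.
    public static IChatClient Build(IChatClient innerClient) =>
        new ChatClientBuilder(innerClient)
            .UseFunctionInvocation(configure: c =>
            {
                c.FunctionInvoker = async (context, cancellationToken) =>
                {
                    Console.WriteLine($"Invoking {context.Function.Name}");
                    return await context.Function.InvokeAsync(context.Arguments, cancellationToken);
                };
            })
            .Build();
}
```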
* Add reporting tests that show NLP results.

* Clean up analyzer errors.

* Add global tags for NLP

* Add more precision to the evaluator timing

* More tags

* Add another partial match test
* Enable specifying "strict" for OpenAI clients via ChatOptions (see the sketch below)

* Address PR feedback
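
Roughly, the usage being enabled looks like the following; the exact `AdditionalProperties` key the OpenAI adapter recognizes is an assumption here:

```csharp
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

public static class StrictOptionSketch
{
    // Passes a "strict" hint through ChatOptions for an OpenAI-backed IChatClient.
    // The key name and the sample tool are illustrative assumptions.
    public static Task<ChatResponse> AskAsync(IChatClient openAIBackedClient) =>
        openAIBackedClient.GetResponseAsync(
            "What's the weather in Madrid?",
            new ChatOptions
            {
                Tools = [AIFunctionFactory.Create((string city) => $"Sunny in {city}", "get_weather")],
                AdditionalProperties = new AdditionalPropertiesDictionary { ["strict"] = true },
            });
}
```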
* AIFunctionFactory: tolerate JSON string function parameters (see the sketch after this commit's notes).

* Add debug assertion.

* Update src/Libraries/Microsoft.Extensions.AI.Abstractions/Functions/AIFunctionFactory.cs

Co-authored-by: Stephen Toub <[email protected]>

* Add regex-based JSON string recognition and add more tests.

---------

Co-authored-by: Stephen Toub <[email protected]>
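
A sketch of the JSON-string tolerance described above, assuming arguments supplied as JSON-encoded strings are deserialized into the declared parameter type (the record and values are made up for illustration):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

public record MapPoint(double Latitude, double Longitude);

public static class JsonStringArgumentDemo
{
    public static async Task RunAsync()
    {
        AIFunction describe = AIFunctionFactory.Create(
            (MapPoint point) => $"({point.Latitude}, {point.Longitude})",
            "describe_point");

        // The model sent the object as a JSON-encoded string; the factory-created
        // function is expected to recognize and deserialize it rather than fail.
        object? result = await describe.InvokeAsync(new AIFunctionArguments
        {
            ["point"] = """{"latitude": 40.4, "longitude": -3.7}"""
        });

        Console.WriteLine(result);
    }
}
```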
@joperezr joperezr requested review from a team as code owners July 8, 2025 20:48
@joperezr joperezr merged commit ba8e28e into release/9.7 Jul 9, 2025
6 checks passed
@joperezr joperezr deleted the InternalMerge97 branch July 9, 2025 16:41
@github-actions github-actions bot locked and limited conversation to collaborators Aug 9, 2025