protocol models responses #828
base: main
Conversation
KrzysztofCwalina
left a comment
This is very nice and close. I added 3 comments, but I'm super happy to see that this does not complicate the namespace too much (or at all).
api/OpenAI.net8.0.cs
Outdated
public virtual Task<ClientResult<ResponseResult>> GetResponseAsync(GetResponseOptions options, CancellationToken cancellationToken = default);
public virtual Task<ClientResult> GetResponseAsync(string responseId, bool? stream, int? startingAfter, RequestOptions options);
public virtual Task<ClientResult<OpenAIResponse>> GetResponseAsync(string responseId, CancellationToken cancellationToken = default);
public virtual ClientResult<ResponseItemList> GetResponseInputItems(GetResponseInputItemsOptions options = default, CancellationToken cancellationToken = default);
public virtual ClientResult<ResponseItemList> GetResponseInputItems(GetResponseInputItemsOptions options = default, CancellationToken cancellationToken = default);
public virtual CollectionResult<ResponseItem> GetResponseInputItems(string responseId, ResponseItemCollectionOptions options = null, CancellationToken cancellationToken = default);
public virtual CollectionResult GetResponseInputItems(string responseId, int? limit, string order, string after, string before, RequestOptions options);
public virtual Task<ClientResult<ResponseItemList>> GetResponseInputItemsAsync(GetResponseInputItemsOptions options = default, CancellationToken cancellationToken = default);
public virtual AsyncCollectionResult<ResponseItem> GetResponseInputItemsAsync(string responseId, ResponseItemCollectionOptions options = null, CancellationToken cancellationToken = default);
public virtual AsyncCollectionResult GetResponseInputItemsAsync(string responseId, int? limit, string order, string after, string before, RequestOptions options);

Looking at the above, I'm a little conflicted about having six methods for each list operation. Additionally, I think it's a little odd that the protocol method has pagination conveniences, while the convenience method that takes protocol models does not (it's as if neither variant is fully protocol nor fully convenience). I wonder if we can simplify this...
What would you think of the following: What if we do not add list methods that return a protocol model? 99% of users won't benefit from these, and they would instead favor the methods that can paginate. If necessary, I would be in favor of shipping the protocol model by itself to support MRW scenarios and server scenarios.
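To make the contrast concrete, here is roughly how the two shapes in the listing above would look at a call site. This is an illustrative sketch against the proposed signatures only, not shipped API; member names such as `item.Id` on the returned types are assumptions.

```csharp
// Sketch against the proposed signatures quoted above (not shipped API).

// Convenience path: the AsyncCollectionResult<ResponseItem> overload
// pages transparently behind await foreach.
await foreach (ResponseItem item in client.GetResponseInputItemsAsync(responseId))
{
    Console.WriteLine(item.Id); // "Id" is an assumed member name
}

// Protocol-model path: a single raw ResponseItemList page; the caller
// drives pagination manually by reading the cursor off the page and
// issuing the next request themselves.
ClientResult<ResponseItemList> page =
    await client.GetResponseInputItemsAsync(new GetResponseInputItemsOptions());
```

The second shape is what the comment below argues is rarely needed, since most callers would reach for the paginating overload.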
I think the protocol model return types give better control over advanced pagination scenarios, but I agree they are probably needed rarely. I do think there is some value in having access to the raw protocol model without the pagination abstractions.
@KrzysztofCwalina What is the story for users growing from paginating protocol methods to convenience methods?
We discussed this offline and arrived at the design described here:
🔗 https://gist.github.com/joseharriaga/8c9c57eca935cbe104477b59b0bce6dc
api/OpenAI.net8.0.cs
Outdated
[Experimental("OPENAI001")]
public class ResponsesClient {
    protected ResponsesClient();
    protected internal ResponsesClient(ClientPipeline pipeline, string model, OpenAIClientOptions options);
@m-nash, it would be good to change this into your new ctor pattern before we GA, i.e. not necessarily in this PR.
public virtual Task<ClientResult> CancelResponseAsync(string responseId, RequestOptions options);
public virtual Task<ClientResult<OpenAIResponse>> CancelResponseAsync(string responseId, CancellationToken cancellationToken = default);
public virtual ClientResult CreateResponse(BinaryContent content, RequestOptions options = null);
public virtual ClientResult<OpenAIResponse> CreateResponse(IEnumerable<ResponseItem> inputItems, ResponseCreationOptions options = null, CancellationToken cancellationToken = default);
In the OpenAI library, we started using this pattern of separating the required properties from the optional properties for a few reasons:
- Users found it awkward to use two different ways to set the properties of the options class: they had to use the constructor for required properties, and initialization syntax for the optional properties. The options classes in the OpenAI API usually have tons of properties, and it is very common to want to set at least one or two, so this issue was very front-and-center. By making the options class all optional properties, this pain point was mitigated.
- Users wanted to re-use an instance of the options class across requests. Consider the following scenario: When using stateful responses, if the input items are in the options class, users need to clear the collection and then re-populate it with the new input items every time. By moving the input items out of the options class, users can forget about the options instance entirely.
Would we want to preserve this functionality?
What if we kept the convenience methods as they are right now?
Here's an example of what I'm thinking...
We keep the following method:
public virtual ClientResult<OpenAIResponse> CreateResponse(IEnumerable<ResponseItem> inputItems, ResponseCreationOptions options = null, CancellationToken cancellationToken = default);

And then, we add a parameterless constructor to the options class. If a user specifies inputItems in both the method parameter and the options property, we throw an exception (similar to what we would do if they set the stream property but called the non-streaming method).
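The guard described above could look roughly like this. This is a hypothetical sketch of the proposal, not the PR's actual implementation, and the `options.Input` property name is an assumption:

```csharp
// Hypothetical sketch of the proposed guard (not the PR's implementation):
// reject input items supplied both as the method parameter and on the
// options instance, mirroring the stream-property check mentioned above.
public virtual ClientResult<OpenAIResponse> CreateResponse(
    IEnumerable<ResponseItem> inputItems,
    ResponseCreationOptions options = null,
    CancellationToken cancellationToken = default)
{
    // "Input" is an illustrative property name for the options-class collection.
    if (inputItems is not null && options?.Input is not null)
    {
        throw new ArgumentException(
            "Input items were specified both as a method argument and on the options instance.");
    }

    // ... build the request from inputItems (or options.Input) and send it ...
    throw new NotImplementedException();
}
```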
We discussed this offline. For now, we're going with the design described here:
🔗 https://gist.github.com/joseharriaga/8c9c57eca935cbe104477b59b0bce6dc