Data plane codegen testing
There are two distinct kinds of evolution for a data-plane generated SDK:
- Developer driven evolution
- Service driven evolution
The main difference between the two is the status of the Swagger. Developer-driven evolutions are not done because the Swagger is changing, but because we have reason to think the API surface should be improved to provide a better customer experience. Tests for those scenarios assume the Swagger is not changing, and there is only one Swagger version for the entire test suite.
Service-driven evolution, on the other hand, means the service has a new API version and is adding behaviors to the service API. Assuming the service team follows the Azure guidelines, those changes should not include breaking changes (it is accepted that breaking changes at the REST API layer will break the generation as a consequence). Service teams are encouraged by the stewardship review board to avoid breaking changes in the REST API anyway.
Changes made to a Swagger "in place" for a given API version do NOT fall into either of those categories; they are bug fixes of the Swagger and should happen only while the service is still in preview. Swagger adjustments (like missing LRO annotations, adding a default client value, paging configuration, etc.) are only acceptable in preview releases, and the Swagger for a given API version will be frozen once the SDK generated from it is declared GA. We therefore do NOT consider it a goal to verify that we can adapt the SDK to those scenarios, and we expect Swagger validation to be done during the various previews, reviews, and SDK testing by the service team before we GA.
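For illustration only, a preview-time Swagger adjustment of the kind described above might add a missing paging or LRO annotation. The path and operation names here are hypothetical, not taken from any real Swagger:

```json
{
  "/widgets": {
    "get": {
      "operationId": "Widgets_List",
      "x-ms-pageable": { "nextLinkName": "nextLink" },
      "responses": { "200": { "description": "A page of widgets." } }
    },
    "put": {
      "operationId": "Widgets_Create",
      "x-ms-long-running-operation": true,
      "responses": { "201": { "description": "Widget creation accepted." } }
    }
  }
}
```

Once the SDK generated from this API version is GA, annotations like these can no longer be added to it; they would have to wait for a new API version.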
While service teams will have requirements on the level of testing they must do to ship a generated data-plane SDK, we want to verify upstream that the codegen is designed in a way that enables the previous scenarios. To do that, the autorest.testserver will provide a set of Swagger scenarios that each language needs to implement to show it is prepared for them.
For the following scenarios, the codegen is not required to generate code that takes care of them automatically, but its design must be flexible enough for an SDK writer to handcraft the improvement in a "reasonable time". The handcrafting must be documented, and its size will determine whether we consider the scenario fully supported by the codegen.
Swagger input: https://github.com/Azure/autorest.testserver/blob/main/swagger/llc-customization.json
Scenarios:
- Improve a GET method that returns raw JSON to return a model
- Improve a PUT polling method that returns raw JSON to return a model
- Improve a GET paging method that returns raw JSON to return a model
- Improve a POST method that reads raw JSON to accept a model
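As a minimal sketch of the first scenario, the handcrafted layer might wrap the generated raw-JSON protocol method with a hand-written model. All names here (`Widget`, `get_widget_raw`, `get_widget`) are hypothetical illustrations, not taken from the testserver Swagger:

```python
# Sketch: hand-crafting a typed model on top of a raw-JSON protocol method.
from dataclasses import dataclass
from typing import Any, Dict


@dataclass
class Widget:
    """Hand-written model layered over the raw JSON response."""
    id: str
    name: str

    @classmethod
    def from_json(cls, data: Dict[str, Any]) -> "Widget":
        return cls(id=data["id"], name=data["name"])


def get_widget_raw(widget_id: str) -> Dict[str, Any]:
    """Stand-in for the generated protocol method that returns raw JSON."""
    return {"id": widget_id, "name": "sample"}


def get_widget(widget_id: str) -> Widget:
    """Hand-crafted convenience method that returns a typed model."""
    return Widget.from_json(get_widget_raw(widget_id))
```

The raw protocol method stays available, so callers who were already parsing the JSON themselves are not broken; the size of this wrapper is what the scenario measures.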
Initial Swagger: https://github.com/Azure/autorest.testserver/blob/main/swagger/llc_initial.json
Updated Swagger: https://github.com/Azure/autorest.testserver/blob/main/swagger/llc_update1.json
Scenarios:
- A required query parameter becomes optional
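A sketch of what this service-driven change could look like in a generated client method, assuming a Python SDK; the operation and parameter names are hypothetical:

```python
# Hypothetical generated signatures before and after the service update.
from typing import Dict, Optional


def list_items_v1(name_filter: str) -> Dict[str, str]:
    """API version 1: 'name_filter' is a required query parameter."""
    return {"filter": name_filter}  # placeholder for building the real request


def list_items_v2(name_filter: Optional[str] = None) -> Dict[str, str]:
    """API version 2: 'name_filter' became optional. Existing calls such as
    list_items_v2("a") keep behaving the same, so regenerating against the
    updated Swagger is non-breaking for SDK users."""
    return {} if name_filter is None else {"filter": name_filter}
```

The test here is that code written against the initial signature still compiles and produces the same request after regeneration from the updated Swagger.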