New Feature: Validate with all properties required. #146
flixx wants to merge 1 commit into python-openapi:master
Conversation
When the new flag `require_all_props` is set to true, validation behaves as if all properties in the specs were required.
        prop_value = value[prop_name]
    except KeyError:
-       if prop_name in all_req_props_names:
+       if (prop_name in all_req_props_names) or require_all_props:
@reviewers This is the important line. All the other changes just pass the flag around.
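To make the effect of that condition concrete, here is a small standalone sketch (not the PR's actual code; the function and variable names are made up for illustration): without the flag, only properties listed as required must be present in the payload, while with `require_all_props=True` every property defined in the schema must be.

```python
# Standalone illustration of the changed condition; names are illustrative
# and are not openapi-core's internal API.
def missing_properties(payload, schema_properties, required_names,
                       require_all_props=False):
    """Return schema property names that are absent from the payload."""
    missing = []
    for prop_name in schema_properties:
        if prop_name in payload:
            continue
        # The changed line: treat every property as required when the flag is on.
        if (prop_name in required_names) or require_all_props:
            missing.append(prop_name)
    return missing


schema_properties = {"id": {"type": "integer"}, "name": {"type": "string"},
                     "nickname": {"type": "string"}}
payload = {"id": 1, "name": "Alice"}

print(missing_properties(payload, schema_properties, {"id"}))
# []           -> "nickname" is optional, so its absence is not reported
print(missing_properties(payload, schema_properties, {"id"}, require_all_props=True))
# ['nickname'] -> with the flag, every documented property must be present
```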
Codecov Report

@@            Coverage Diff             @@
##           master     #146      +/-   ##
==========================================
+ Coverage   96.19%   96.44%   +0.25%
==========================================
  Files          58       58
  Lines        1602     1605       +3
==========================================
+ Hits         1541     1548       +7
+ Misses         61       57       -4

Continue to review full report at Codecov.
1 similar comment
    for t, u in primitive_unmarshallers.items()
)

pass_defaults = lambda f: functools.partial(
I needed to refactor this part because the parameters were not being passed through correctly by the lambda function here.
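For readers without the full diff, here is a hedged sketch of the general pattern involved (the unmarshaller names and signatures are illustrative, not openapi-core's real internals): binding the shared defaults with `functools.partial` inside the comprehension means a newly added option such as `require_all_props` is forwarded consistently as a keyword argument, instead of having to be threaded through an ad-hoc lambda.

```python
import functools

def unmarshal_string(value, custom_formatters=None, require_all_props=False):
    # Illustrative stand-in for a primitive unmarshaller.
    return str(value)

def unmarshal_integer(value, custom_formatters=None, require_all_props=False):
    return int(value)

primitive_unmarshallers = {
    "string": unmarshal_string,
    "integer": unmarshal_integer,
}

custom_formatters = {}
require_all_props = True

# Bind the shared keyword defaults onto every unmarshaller up front.
unmarshallers_with_defaults = {
    type_name: functools.partial(
        func,
        custom_formatters=custom_formatters,
        require_all_props=require_all_props,
    )
    for type_name, func in primitive_unmarshallers.items()
}

assert unmarshallers_with_defaults["integer"]("3") == 3
```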
Could you explain this a little more? If those properties aren't required in the spec, then they are optional. Why is it bad not to return an error when optional properties are omitted? If all the properties are required, they should all be marked required in the spec. Maybe there's something subtle I'm misunderstanding here.
Sure @bjmc @p1c2u: In order to ensure that all fields that are in the specs are also in the response of our backend, we would need to mark every single property as required. I think the term "required" does not really make sense when you talk about responses. In our API documentation we want to tell the API user which properties are required in the request, yes, but marking something as "required" in the response does not really make sense - our backend always returns a response with a complete list of all properties (some of them nullable). So yes, technically it would be most correct to always mark all fields in the response schemas as required. But practically, this is both a lot of work (it is easy to forget to add new properties to the required list) and strange for a reader of the documentation. So I understand your point - but the change I suggest here has already helped us make our API documentation a lot better. Felix
@flixx If this is always the case, then why not add an automated postprocessing step that marks all response schema properties as required and generates an updated spec? That way it will always be done.
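A hedged sketch of what such a postprocessing step could look like, assuming the spec is a YAML file and PyYAML is installed (file names and the helper are illustrative). For brevity this walks the whole document and marks every object schema's properties as required; a real version would restrict the walk to schemas reachable from responses.

```python
import yaml  # assumes PyYAML is installed

def mark_all_properties_required(node):
    """Recursively set `required` to all property names on object schemas."""
    if isinstance(node, dict):
        props = node.get("properties")
        if isinstance(props, dict) and props:
            node["required"] = sorted(props)
        for child in node.values():
            mark_all_properties_required(child)
    elif isinstance(node, list):
        for child in node:
            mark_all_properties_required(child)

with open("openapi.yaml") as fh:
    spec = yaml.safe_load(fh)

mark_all_properties_required(spec)

# Write the "everything required" variant used only for testing.
with open("openapi.required.yaml", "w") as fh:
    yaml.safe_dump(spec, fh, sort_keys=False)
```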
I would love this feature! Interestingly, the AI hallucinated exactly such a flag when I described this problem. @flixx: Or did you find a workaround for this?
A property that will always be present and must always be present is required. Value nullability is a separate concept. If a value may be null, then you should describe it as a list of allowed types, or as type string with nullable: true, etc., depending on your OpenAPI version, since different versions use different JSON Schema versions.
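As a concrete illustration of that distinction, here are two hypothetical schema fragments written as Python dicts: in both, `nickname` is required (the key must appear in the payload) but its value may be null. OpenAPI 3.0.x expresses nullability with `nullable: true`, while OpenAPI 3.1.x uses a JSON Schema type list.

```python
# OpenAPI 3.0.x: a required but nullable property.
schema_30 = {
    "type": "object",
    "required": ["nickname"],
    "properties": {
        "nickname": {"type": "string", "nullable": True},
    },
}

# OpenAPI 3.1.x (JSON Schema 2020-12): null expressed via a type list.
schema_31 = {
    "type": "object",
    "required": ["nickname"],
    "properties": {
        "nickname": {"type": ["string", "null"]},
    },
}
```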
Assume a REST resource that provides CRUD operations. The resource payload is referenced via a shared schema. When running all API tests, I want to verify that all fields are described in the documentation and that all fields described in the documentation are present when retrieving the resource. Using different schemas for retrieving and sending - with a different set of required fields - would solve the problem, but has the inherent problem that the descriptions become outdated or inaccurate. See the sketch below.
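A hedged sketch of that "different schemas for retrieving and sending" alternative, written as Python dicts with hypothetical names: the read variant reuses the shared payload schema via `allOf` and additionally marks every field as required, which is exactly the duplication that tends to drift out of date.

```python
# Hypothetical shared payload schema, used as-is for requests.
pet_base = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "tag": {"type": "string"},
    },
}

# Hypothetical read/response variant: same shape, but every field required.
# The required list has to be kept in sync with pet_base by hand.
pet_read = {
    "allOf": [
        {"$ref": "#/components/schemas/PetBase"},
        {"required": ["name", "tag"]},
    ],
}
```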
Hello! Glad to see that we are not the only ones with this problem. We have not found a workaround for it yet. It keeps being quite an annoyance for us, so we still manually set all fields in the response as `required`. A flag like the one I proposed back in 2019 would still be awesome. However, at the moment I don't have the capacity and knowledge to bring this PR into a mergeable state, so it would be greatly appreciated if someone could pick this up and continue the work. Felix
@spacether mentioned in a previous comment:

> why not add an automated postprocessing step that marks all response schema properties as required and generates the updated spec

I tried that - with the help of my stochastic parrot - but I was not successful. @flixx: Am I right that this would solve your problem too? Greetings from a frozen Berlin :-)
Maybe/partly. As I mentioned, we also use the specs to generate documentation. It would be great to have that, too, but I would find a flag within openapi-core more elegant.
@flixx: No, I meant something like this:
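The snippet that followed this comment is not preserved here; as a hedged reconstruction of the idea, the spec is loaded and modified as a plain dict in memory (using a walk along the lines of `mark_all_properties_required` from the earlier sketch) and only then handed to openapi-core, without writing any intermediate files. The exact openapi-core factory depends on the version in use, so it is only indicated in comments.

```python
import yaml  # assumes PyYAML is installed

def mark_all_properties_required(node):
    """Same walk as in the earlier sketch: mark every object schema's properties required."""
    if isinstance(node, dict):
        props = node.get("properties")
        if isinstance(props, dict) and props:
            node["required"] = sorted(props)
        for child in node.values():
            mark_all_properties_required(child)
    elif isinstance(node, list):
        for child in node:
            mark_all_properties_required(child)

with open("openapi.yaml") as fh:
    spec_dict = yaml.safe_load(fh)

mark_all_properties_required(spec_dict)

# Hand the modified dict to openapi-core without writing it back to disk.
# The exact factory depends on the openapi-core version (assumptions):
#   from openapi_core import create_spec; spec = create_spec(spec_dict)   # older releases
#   from openapi_core import Spec; spec = Spec.from_dict(spec_dict)       # newer releases
```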
@Julius2342 Yeah, more or less what I meant with "pre-process our specs before running the tests automatically". I thought about writing it to temporary YAML/JSON files, but your suggestion handles it all in memory / in code, which is more elegant. Do you happen to have an implementation of that?
I don't remember the exact problem, but I did not get it running. I ended up with this: https://gist.github.com/Julius2342/5a4781bf4f095043c5f98eaf5f991b17
Hello.
We use openapi-core to validate our backend's responses against our specs, so that we can be sure the documentation generated from the specs is up to date.
However, so far the validation only works one way:
- Properties returned in the backend's response that are missing from the specs generate an error (good).
- Properties in the specs that are not returned in the backend's response generate no error unless they are required (bad).
One solution would be, of course, to mark all properties as required. However, we do not want to do that, since it is (1) confusing in the documentation and (2) a lot of work.
This PR introduces a new flag, `require_all_props`. It can be set like this:
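Roughly as follows; the flag name is the one introduced by this PR, while the import paths, the exact place where the flag is accepted, and the placeholder variables below are illustrative assumptions for the openapi-core version this PR targets.

```python
from openapi_core import create_spec
from openapi_core.validation.response.validators import ResponseValidator

spec = create_spec(spec_dict)  # spec_dict: your spec loaded as a dict (placeholder)

# Assumption: the new flag is accepted at the validator entry point and
# forwarded down to the schema validation code touched in this PR.
validator = ResponseValidator(spec, require_all_props=True)

result = validator.validate(openapi_request, openapi_response)  # placeholders
result.raise_for_errors()
```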
If it is set to true, validation behaves as if all properties defined in the specs were required.
Felix