In other API specs, you provide example values in the schema, so the specs
contain examples that can be used for the documentation. But instead of
defining example data separately, we can use the data generated by running
the specs. This way we document real output and don't have to duplicate
documentation.
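As a rough illustration of what that can look like with rswag (the response block below is a placeholder, not the actual spec in this PR), the body returned during the test run can be attached to the example metadata so it ends up in the generated document:

```ruby
response '200', 'success' do
  # Capture the real response body produced by the test run and publish it
  # as the documented example for this response.
  after do |example|
    example.metadata[:response][:content] = {
      'application/json' => {
        example: JSON.parse(response.body, symbolize_names: true)
      }
    }
  end

  run_test!
end
```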
Note that we don't have schema definitions for the DFC API yet, and it
wouldn't make sense to replicate the DFC Ontology manually in JSON Schema
for this purpose. The DFC Connector already ensures that we comply with the
ontology. But I hope that we can use a tool at some point to generate JSON
Schema from the DFC Ontology, which I think would add more detail to the
Swagger docs.
We were hiding that before because the API is not officially released yet,
but that's actually quite annoying. It's very convenient to have the UI on
production and be able to try out endpoints.
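For reference, the UI is exposed by mounting the rswag engines in config/routes.rb; the environment guard shown as removed below is an assumption about how it was hidden before:

```ruby
# config/routes.rb
# Before (assumed): the docs were only mounted outside production, e.g.
#   mount Rswag::Ui::Engine => '/api-docs' unless Rails.env.production?
# Mounting unconditionally makes /api-docs available on production too.
mount Rswag::Api::Engine => '/api-docs'
mount Rswag::Ui::Engine => '/api-docs'
```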
I chose the simplest spec first to demonstrate how it works. The UI at
/api-docs now shows this endpoint with two possible responses.
The docs are still missing an example response, which I hope to add later.
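A minimal sketch of the shape of such a spec with two responses; the path, parameter, and descriptions are placeholders, not the actual endpoint from this PR:

```ruby
require 'swagger_helper'

RSpec.describe 'DFC API', type: :request do
  # Placeholder path; the real spec targets whichever endpoint was chosen.
  path '/api/dfc-v1.7/example/{id}' do
    get 'Show example' do
      produces 'application/json'
      parameter name: :id, in: :path, schema: { type: :string }

      response '200', 'success' do
        let(:id) { '1' }
        run_test!
      end

      response '404', 'not found' do
        let(:id) { 'does-not-exist' }
        run_test!
      end
    end
  end
end
```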
We need to declare in each spec file which API document the spec belongs to,
because Rswag was just choosing the first declared one by default. The first
one was v1, and now it's dfc-v1.7.
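A sketch of that declaration; the document name here is illustrative, and depending on the rswag version the metadata key is swagger_doc or openapi_spec:

```ruby
# Without this metadata, rswag writes the spec's paths into the first
# document listed in swagger_helper.rb (previously v1, now dfc-v1.7).
RSpec.describe 'DFC endpoint', type: :request, swagger_doc: 'dfc-v1.7.yaml' do
  # path / response blocks as usual
end
```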
Rswag doesn't look for specs in engines by default. We haven't added any
Rswag specs in the dfc_provider engine yet but that will come.
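Once they exist, the engine specs will need to be included explicitly when generating the documents. A sketch of one way to do that; the task name, paths, and formatter constant are assumptions and vary by rswag version:

```ruby
# lib/tasks/rswag_dfc_provider.rake (hypothetical)
require 'rspec/core/rake_task'

# Run the engine's request specs with rswag's formatter so their output is
# written into the configured OpenAPI documents as well.
RSpec::Core::RakeTask.new('rswag:dfc_provider:swaggerize') do |t|
  t.pattern = 'engines/dfc_provider/spec/requests/**/*_spec.rb'
  t.rspec_opts = ['--format Rswag::Specs::SwaggerFormatter', '--order defined']
end
```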
The generated API schema has some superfluous whitespace removed due to
a fix in the rswag gems.
The line item sorting by id has been replaced by sorting by completed_at
time: ccb183d60b
While that's a good idea, the query param for ordering was only defined in
the client JavaScript and there was no default ordering. Line items also get
their completed_at date from the order, so it's the same for all items of
the same order, and the ordering within that group of line items was random.
Now we add an additional ordering: items are sorted first by date and then
by id to resolve any ambiguity.
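A hedged sketch of the combined ordering, using standard Spree table and association names as assumptions rather than the exact code:

```ruby
# Primary sort: completion time of the parent order.
# Secondary sort: line item id, so items within the same order keep a
# stable, deterministic position instead of a random one.
Spree::LineItem
  .joins(:order)
  .order('spree_orders.completed_at ASC, spree_line_items.id ASC')
```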
Line items that reference a master variant are a scenario that, in theory, shouldn't have been valid or even possible for at least 5-6 years, and those old bits of data should have been cleaned up when those changes were made. But a couple of servers have some really old data that's not in a nice state.
Here we can just flip the is_master flag to false for those specific (legacy data) cases before deleting any other master variants, to keep the legacy line item data intact.
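A hedged sketch of that data fix; the model and column names follow Spree conventions and are assumptions, not the literal change:

```ruby
# Master variants still referenced by legacy line items are demoted to
# regular variants so those line items keep a valid reference.
legacy_variant_ids = Spree::LineItem
  .joins(:variant)
  .where(spree_variants: { is_master: true })
  .distinct
  .pluck(:variant_id)

Spree::Variant.where(id: legacy_variant_ids).update_all(is_master: false)

# The remaining, unreferenced master variants can then be deleted as planned.
```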
This reverts commit 21b80db0ee.
This fix was needed for an old version of the JSON module with a newer
version of Ruby. But we have since updated the json gem and don't need this
any more.
These shouldn't technically exist, but apparently they can be present if the dataset is old enough. They can trigger a foreign key violation when a master variant is deleted, so they need to be dropped first if present.
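The note doesn't name the table these rows live in, so purely as a hypothetical illustration of the required delete order (placeholder model name):

```ruby
# Hypothetical: drop rows that still reference master variants before the
# variants themselves are deleted, so the foreign key constraint is satisfied.
master_variant_ids = Spree::Variant.where(is_master: true).select(:id)
StaleVariantReference.where(variant_id: master_variant_ids).delete_all # placeholder model
```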
OFN products and variants need more data, such as a price, but the DFC
stores that in a different object. We may get a larger graph containing that
information, but we don't have any test data for it yet.