[WIP] OGC API Features - Part 4 / Support for PostgreSQLProvider #1266
Conversation
The new function 'crs_transform_json_geom', used in the 'crs_transform' decorator, allows the coordinates of a feature (GeoJSON-like dict) to be transformed directly from one coordinate system to another, without first being converted to a shapely geometry object. The new pipeline 'GeoJSON Feature -> coordinate transformation' can replace the current pipeline 'GeoJSON Feature -> conversion to Shapely geometry -> coordinate transformation -> conversion back to GeoJSON Feature'.
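For illustration only (this is not the actual 'crs_transform_json_geom' implementation from the PR): a minimal sketch of the idea, assuming pyproj is available, that transforms the coordinates of a GeoJSON-like geometry dict in place without going through shapely. The function name is illustrative and GeometryCollection is not handled.

```python
from pyproj import Transformer

def transform_geojson_coords(feature, source_crs, target_crs):
    """Transform a GeoJSON-like feature's coordinates in place,
    without converting to a shapely geometry first."""
    transformer = Transformer.from_crs(source_crs, target_crs, always_xy=True)

    def _transform(coords):
        # A position is a flat sequence of numbers; rings, lines and
        # multi-geometries are nested sequences of positions.
        if coords and isinstance(coords[0], (int, float)):
            return list(transformer.transform(*coords))
        return [_transform(c) for c in coords]

    geom = feature['geometry']
    geom['coordinates'] = _transform(geom['coordinates'])
    return feature

# Example usage:
feature = {'type': 'Feature', 'properties': {},
           'geometry': {'type': 'Point', 'coordinates': [10.0, 60.0]}}
transform_geojson_coords(feature, 'EPSG:4326', 'EPSG:3857')
```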
In the diff:
session.commit()
session.refresh(db_obj_mapping)
identifier = getattr(db_obj_mapping, self.id_field)
except DataError as err:
Hi @MTachon, why is invalid data reaching the database session? I would expect a data validation layer to be implemented at the interface level, since pygeoapi should already be able to serve the schema for the collection. If the exposed schema is not validated, the endpoint should just emit a 422 response code with the validation error.
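Not code from this PR, but to illustrate the validation-layer idea: a minimal sketch, assuming the collection schema is available as a JSON Schema document and using the jsonschema package; the function name is hypothetical.

```python
from jsonschema import Draft7Validator

def validation_errors(feature, collection_schema):
    """Return validation error messages for a feature against the
    collection's JSON Schema (an empty list means the feature is valid).

    An API handler could call this before the provider is touched and
    answer with HTTP 422 plus these messages when the list is non-empty.
    """
    validator = Draft7Validator(collection_schema)
    return [error.message for error in validator.iter_errors(feature)]
```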
Hi @francbartoli, I just checked for valid GeoJSON before that point. I was not sure whether instantiating db_obj_mapping with wrong data passed to the table model would throw an error; no error came until the session.commit() statement. I am not sure how it should be implemented. I also think it would be better with a data validation layer using schemas. As of now, I do not think pygeoapi serves collections' schemas, and in order to do so, we would need to implement a schema generation function for all providers. Right now, I am working on adding transactions support for PostgreSQLProvider, and I thought I could let PostgreSQL/PostGIS throw an error if the POST-ed data is invalid and catch the exception. ElasticsearchProvider supports transactions but does not use schema validation.
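A minimal, self-contained sketch of the approach described above (letting PostgreSQL reject invalid data on commit and catching SQLAlchemy's DataError), assuming the SQLAlchemy 1.4+ session API. The function, its parameters and the ValueError mapping are illustrative, not the PR's actual provider code.

```python
from sqlalchemy.exc import DataError
from sqlalchemy.orm import Session

def insert_feature(engine, table_model, column_values, id_field):
    """Insert a row built from a feature's values and return its identifier.

    No validation is done up front: if PostgreSQL rejects the values, the
    DataError only surfaces on commit and is translated into a ValueError
    that the API layer can map to an HTTP error response.
    """
    db_obj_mapping = table_model(**column_values)
    try:
        with Session(engine) as session:
            session.add(db_obj_mapping)
            session.commit()
            session.refresh(db_obj_mapping)
            return getattr(db_obj_mapping, id_field)
    except DataError as err:
        raise ValueError(f'Invalid feature data: {err}') from err
```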
@MTachon how does the specification handle the schema in the OpenAPI document? As a developer, I would expect the data model to be exposed on the swagger page; is the queryables endpoint somehow designed to do that job?
In issue opengeospatial/ogcapi-features#424 the mutation schema is mentioned for that purpose. @tomkralidis, any clue why it is completely missing from Part 4 now?
@francbartoli, I think you will find your answer there: opengeospatial/ogcapi-features#740
Thanks @MTachon, this is a WIP indeed. Can you please add a comment noting that this will be abstracted as soon as this part of the specification is released and implemented in pygeoapi?
@tomkralidis, thanks for pointing it out! I was not aware of it; I will have a look.
My mistake, no server is running here... Going to use methods from
…th APIRequest.get_request_headers
…lt in _load_and_prepare_item
…tgresql_replace, and start get_schema for PostgreSQL
Thanks @MTachon for working on this.
Hi @JakobMiksch. I have worked on other projects lately, so I did not have much time to work on this PR. But now I will have more time again. I will let you know as I advance if some things can be tested :) Thanks!
@MTachon alright, I am happy to test any time :)
Hi @MTachon, what is the status of this?
Hi @francbartoli, I am not really satisfied with the start of my implementation of schemas... I would like to use schema generation to filter out invalid data, and not do exception handling around the database session, as discussed above. I felt that schema generation was too much for the scope of this PR, and wanted to dedicate a new PR to that. I started PR #1349 on GeoJSON schema generation. It is an attempt to use pydantic models to provide a standard way for data validation and schema generation in the plugin provider implementations (see the sketch below). I see that PR #1349 would require an upgrade of the pydantic version, but that should not be too much work, as addressed in issue #1341. I also get CI issues from type annotations and features not yet available in Python 3.7. Upgrading the Python version in CI was discussed in issue #1176. If we do not want to upgrade the Python version, I need to do some refactoring.
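For context, a minimal sketch of the pydantic-based idea mentioned above. This uses the pydantic v1-style API and an illustrative Point-only Feature model, not the models from PR #1349; typing.Literal requires Python 3.8+.

```python
from typing import Any, Dict, List, Literal, Optional
from pydantic import BaseModel, ValidationError

class PointGeometry(BaseModel):
    type: Literal['Point']
    coordinates: List[float]

class PointFeature(BaseModel):
    type: Literal['Feature']
    geometry: PointGeometry
    properties: Dict[str, Any]
    id: Optional[str] = None

# Validation: invalid payloads raise ValidationError, which the API layer
# could translate into a 422 response.
try:
    PointFeature.parse_obj({'type': 'Feature', 'geometry': {'type': 'Point'},
                            'properties': {}})
except ValidationError as err:
    print(err)

# Schema generation: the same model can serve the collection schema.
print(PointFeature.schema())
```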
As per RFC4, this Pull Request has been inactive for 90 days. In order to manage maintenance burden, it will be automatically closed in 7 days. |
As per RFC4, this Pull Request has been closed due to there being no activity for more than 90 days. |
Overview
This PR is intended to add transactions support to the PostgreSQLProvider. Implementation should follow the OGC API - Features - Part 4: Create, Replace, Update and Delete draft specification.
Additional Information
The OGC API - Features - Part 4: Create, Replace, Update and Delete draft specification is not an OGC standard and is subject to change.
As of now, the draft specification does not mention the use of schemas for data validation in POST/PUT/PATCH (create/replace/update) requests. This is a work in progress, and the schema-related content of both Part 3 and Part 4 will likely be moved to a Part 5 specification document. More information about this process is available at opengeospatial/ogcapi-features#740. Once more information about schemas is available and this part of the specification is implemented in pygeoapi, the exception handling around the database transaction code will be removed, and schema validation will be left to do the work and stop requests with invalid data.
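As a usage illustration only (the server URL, collection name and payload below are placeholders): per the draft Part 4 specification, creating a feature amounts to POST-ing a GeoJSON Feature to the collection's items endpoint, e.g. with the requests package.

```python
import json
import requests

# Example GeoJSON Feature; collection name and server URL are placeholders.
feature = {
    'type': 'Feature',
    'geometry': {'type': 'Point', 'coordinates': [10.0, 60.0]},
    'properties': {'name': 'somewhere'},
}

resp = requests.post(
    'http://localhost:5000/collections/obs/items',
    data=json.dumps(feature),
    headers={'Content-Type': 'application/geo+json'},
)
# Per the draft, a successful create answers 201 Created with a Location
# header pointing at the newly created feature.
print(resp.status_code, resp.headers.get('Location'))
```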
Contributions and Licensing
(as per https://github.com/geopython/pygeoapi/blob/master/CONTRIBUTING.md#contributions-and-licensing)