Merge pull request #440 from Deltares/osm
OSM tiling
dalmijn authored Sep 22, 2023
2 parents ae7efdc + 2155a2f commit 8280221
Showing 9 changed files with 564 additions and 46 deletions.
1 change: 1 addition & 0 deletions docs/api.rst
@@ -493,6 +493,7 @@ Raster writing methods
:toctree: _generated

DataArray.raster.to_xyz_tiles
DataArray.raster.to_slippy_tiles
DataArray.raster.to_raster
Dataset.raster.to_mapstack

13 changes: 7 additions & 6 deletions docs/changelog.rst
@@ -11,30 +11,31 @@ Unreleased

Added
-----
-- docs now include a dropdown for selecting older versions of the docs. (#457)
-- Support for loading the same data source but from different places (e.g. local & aws)
+- docs now include a dropdown for selecting older versions of the docs. (PR #457)
+- Support for loading the same data source but from different providers (e.g., local & aws) (PR #438)
- Add support for reading and writing tabular data in ``parquet`` format. (PR #445)
- Add support for reading model configs in ``TOML`` format. (PR #444)
- new ``force-overwrite`` option in ``hydromt update`` CLI to force overwriting updated netcdf files. (PR #460)
- add ``open_mfcsv`` function in ``io`` module for combining multiple CSV files into one dataset. (PR #486)
- Adapters can now clip data that is passed through a python object the same way as through the data catalog. (PR #481)
- Model objects now have a ``_MODEL_VERSION`` attribute that plugins can use for compatibility purposes. (PR #495)
- Model class now has methods for getting, setting, reading and writing arbitrary tabular data. (PR #502)
-- Relevant data adapters now have functionality for reporting and detecting the spatial and temporal extent they cover (# 503)
+- Relevant data adapters now have functionality for reporting and detecting the spatial and temporal extent they cover (PR #503)
- Data catalogs have a ``hydromt_version`` meta key that is used to determine compatibility between the catalog and the installed hydromt version. (PR #506)
- Allow the root of a data catalog to point to an archive, this will be extracted to the ~/.hydromt_data folder. (PR #512)
- Support for reading overviews from (Cloud Optimized) GeoTIFFs using the zoom_level argument of ``DataCatalog.get_rasterdataset``. (PR #514)
- Support for writing overviews to (Cloud Optimized) GeoTIFFs in the ``raster.to_raster`` method. (PR #514)
-- Added documentation for how to start your own plugin (#446)
+- Added documentation for how to start your own plugin (PR #446)
+- New raster method ``to_slippy_tiles``: tiling of a raster dataset according to the slippy tile structure for e.g., webviewers (PR #440).
+- Support for http and other *filesystems* in path of data source (PR #515).

Changed
-------
-- Updated ``MeshModel`` and related methods to support multigrids instead of one single 2D grid. PR #412
+- Updated ``MeshModel`` and related methods to support multigrids instead of one single 2D grid. (PR #412)
- possibility to ``load`` the data in the model read_ functions for netcdf files (default for read_grid in r+ mode). (PR #460)
- Internal model components (e.g. ``Model._maps``, ``GridModel._grid``) are now initialized with None and should not be accessed directly,
  call the corresponding model property (e.g. ``Model.maps``, ``GridModel.grid``) instead. (PR #473)
- Use the Model.data_catalog to read the model region if defined by a geom or grid. (PR #479)
-- Support for http and other *filesystems* in path of data source (PR #515).

Fixed
-----
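The ``zoom_level`` entries above (PR #514) and the tiling notebook below both work with slippy-map zoom levels. As a rough sketch of how a target resolution relates to a zoom level — the arithmetic is the standard slippy-tile convention, and the rounding rule here is illustrative, not hydromt's exact selection logic — at zoom ``z`` a 256-pixel tile spans ``360 / 2**z`` degrees of longitude:

```python
import math


def slippy_pixel_size(zoom: int, tile_size: int = 256) -> float:
    """Longitude degrees per pixel at a given slippy-map zoom level."""
    return 360.0 / (tile_size * 2**zoom)


def zoom_for_resolution(res_deg: float, tile_size: int = 256) -> int:
    """Smallest zoom level whose pixels are at least as fine as res_deg."""
    return max(0, math.ceil(math.log2(360.0 / (tile_size * res_deg))))


print(slippy_pixel_size(9))          # ~0.00275 degrees per pixel at zoom 9
print(zoom_for_resolution(1 / 600))  # a 1/600-degree target needs zoom 10
```

This is why the notebook's ``zoom_level=(1/600, "degree")`` request resolves to one of the finer levels of the tile database.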
2 changes: 1 addition & 1 deletion examples/prep_data_catalog.ipynb
@@ -773,7 +773,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.10.6"
+"version": "3.11.4"
},
"vscode": {
"interpreter": {
201 changes: 193 additions & 8 deletions examples/working_with_tiled_raster_data.ipynb
@@ -27,7 +27,7 @@
"from hydromt.log import setuplog\n",
"from os.path import join\n",
"\n",
-"logger = setuplog('tiling', log_level=20)\n",
+"logger = setuplog(\"tiling\", log_level=20)\n",
"\n",
"# get some elevation data from the data catalog\n",
"data_lib = \"artifact_data\"\n",
@@ -52,7 +52,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"### tiling raster with XYZ stucture\n",
+"### Tiling raster with XYZ structure\n",
"\n",
"First let's have a look at the XYZ structure.\n",
"An xarray.DataArray is simply written to a tile database in XYZ structure via ``.raster.to_xyz_tiles``.\n",
@@ -92,12 +92,172 @@
"At last, a .yml file is produced which can be read by the [DataCatalog](https://deltares.github.io/hydromt/latest/_generated/hydromt.data_catalog.DataCatalog.html) of HydroMT."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Tiling raster with OSM structure\n",
"\n",
"Now let's have a look at tiling according to the OSM structure.\n",
"An xarray.DataArray is simply written to a tile database in OSM structure via ``.raster.to_slippy_tiles``."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Write the tile database in OSM structure\n",
"name_osm = f\"{source}_osm\"\n",
"root_osm = join(\"tmpdir\", name_osm)\n",
"da0.raster.to_slippy_tiles(\n",
" root=root_osm, driver=\"GTiff\", reproj_method=\"average\" # try also 'netcdf4'\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The tiles in the 'merit_hydro_osm' folder now contain all the zoom levels from the starting level (9) down to the lowest level (0).\n",
"\n",
"Every tile, regardless of the zoom level, has a resolution of 256 by 256 pixels.\n",
"\n",
"Zoom level 0 is at the scale of the entire world (one tile) and is the most downscaled. Zoom level 9 contains the highest resolution (most tiles) of this tile database.\n",
"\n",
"A mosaic of these tiles is created per zoom level in a .vrt file.\n",
"\n",
"Finally, a .yml file is produced which can be read by the [DataCatalog](https://deltares.github.io/hydromt/latest/_generated/hydromt.data_catalog.DataCatalog.html) of HydroMT."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Tiling raster for a webviewer\n",
"\n",
"Finally, let's take a look at tiling a raster dataset for viewing the data in a webviewer.\n",
"This is easily done with the ``.raster.to_slippy_tiles`` method, which writes PNG image tiles when no driver is given. \n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Write the data in OSM structure, but to images!\n",
"from matplotlib import pyplot as plt\n",
"\n",
"\n",
"name_png = f\"{source}_png\"\n",
"root_png = join(\"tmpdir\", name_png)\n",
"da0.raster.to_slippy_tiles(\n",
" root=root_png,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that the images are created, let's take a look at an individual tile."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# View the data\n",
"from PIL import Image\n",
"import matplotlib.pyplot as plt\n",
"\n",
"# Create a figure to show the image\n",
"fig = plt.figure()\n",
"fig.set_size_inches(2.5, 2.5)\n",
"ax = fig.add_subplot(111)\n",
"ax.set_position([0, 0, 1, 1])\n",
"ax.axis(\"off\")\n",
"\n",
"# Show the image\n",
"im = Image.open(join(root_png, \"9\", \"273\", \"182.png\"))\n",
"ax.imshow(im)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The image itself looks a bit like an oil painting, but this is expected.\n",
"\n",
"The colors of this image encode the data according to the Terrarium Terrain RGB scheme, so they only render correctly in a viewer that interprets that encoding.\n",
"\n",
"To see what this would look like in e.g. QGIS, a local server is needed. \n",
"With Python this is easily done by running `python -m http.server 8000` from the command line inside the folder where the tiles are located; in this case that would be 'root_png'.\n",
"In QGIS, make a new XYZ Tiles connection. For this new connection the URL becomes 'http://localhost:8000/{z}/{x}/{y}.png' and the interpretation is set to Terrarium Terrain RGB.\n",
"\n",
"However, if the images are meant to be viewed as is, then a custom colormap can be defined to make them look nice!\n",
"\n",
"Let's make another dataset of PNGs!\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from matplotlib import pyplot as plt\n",
"\n",
"\n",
"name_png_cmap = f\"{source}_png_cmap\"\n",
"root_png_cmap = join(\"tmpdir\", name_png_cmap)\n",
"# let us define a nice color for a terrain image\n",
"da0.raster.to_slippy_tiles(\n",
" root=root_png_cmap,\n",
" cmap=\"terrain\",\n",
" norm=plt.Normalize(vmin=0, vmax=2000),\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's take a look at the improved standard visuals!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# View the data\n",
"from PIL import Image\n",
"import matplotlib.pyplot as plt\n",
"\n",
"# Create a figure to show the image\n",
"fig = plt.figure()\n",
"fig.set_size_inches(2.5, 2.5)\n",
"ax = fig.add_subplot(111)\n",
"ax.set_position([0, 0, 1, 1])\n",
"ax.axis(\"off\")\n",
"\n",
"# Show the image\n",
"im = Image.open(join(root_png_cmap, \"9\", \"273\", \"182.png\"))\n",
"ax.imshow(im)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
-"### reading tiled raster data with zoom levels\n",
+"### Reading tiled raster data with zoom levels\n",
"\n",
"With [DataCatalog.get_rasterdataset](https://deltares.github.io/hydromt/latest/_generated/hydromt.data_catalog.DataCatalog.get_rasterdataset.html) a raster (.vrt) can be retrieved. In the case of a tile database this can be done for a certain zoom level, e.g.:"
]
@@ -112,12 +272,24 @@
"from hydromt import DataCatalog\n",
"\n",
"# Load the yml into a DataCatalog\n",
-"data_catalog = DataCatalog(join(root, f\"{name}.yml\"), logger=logger)\n",
+"data_catalog = DataCatalog(\n",
+"    [join(root, f\"{name}.yml\"), join(root_osm, f\"{name_osm}.yml\")], logger=logger\n",
+")\n",
"\n",
"# View the structure of the DataCatalog\n",
"data_catalog[name]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# For the osm dataset\n",
"data_catalog[name_osm]"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -129,16 +301,29 @@
"da0.raster.shape"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# And for OSM\n",
"da1 = data_catalog.get_rasterdataset(name_osm, zoom_level=11)\n",
"da1.raster.shape"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Request a raster from the DataCatalog based on zoom resolution & unit\n",
-"da = data_catalog.get_rasterdataset(name, zoom_level=(1/600, 'degree'))\n",
+"da = data_catalog.get_rasterdataset(name, zoom_level=(1 / 600, \"degree\"))\n",
+"da = data_catalog.get_rasterdataset(name_osm, zoom_level=(1 / 600, \"degree\"))\n",
"\n",
-"da = data_catalog.get_rasterdataset(name, zoom_level=(1e3, 'meter'))\n"
+"da = data_catalog.get_rasterdataset(name, zoom_level=(1e3, \"degree\"))\n",
+"da = data_catalog.get_rasterdataset(name_osm, zoom_level=(1e4, \"meter\"))"
]
},
{
@@ -199,7 +384,7 @@
"metadata": {},
"outputs": [],
"source": [
-"# if we run the same request again we will use the cached files (and download none) \n",
+"# if we run the same request again we will use the cached files (and download none)\n",
"da0 = data_catalog.get_rasterdataset(name, bbox=[11.6, 45.3, 12.0, 46.0])"
]
}
@@ -220,7 +405,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.10.9"
+"version": "3.11.5"
},
"orig_nbformat": 4,
"vscode": {
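The tile opened in the notebook above lives at ``<root>/9/273/182.png``: slippy-map paths follow the ``{z}/{x}/{y}`` convention. A small sketch of the standard OSM tile-index formula (this is the general slippy-map convention, not code from this PR) shows how a coordinate inside the example data's extent maps to exactly that tile:

```python
import math


def deg2num(lat: float, lon: float, zoom: int) -> tuple[int, int]:
    """Standard OSM slippy-map formula: (x, y) tile indices for a WGS84 point."""
    n = 2**zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y


# The example data covers roughly 11.6-12.0 E, 45.3-46.0 N (northern Italy)
print(deg2num(46.0, 11.97, 9))  # (273, 182): the tile shown in the notebook
```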
1 change: 1 addition & 0 deletions hydromt/data_catalog.py
@@ -428,6 +428,7 @@ def __contains__(self, key: str) -> bool:
"Directly checking for containment is deprecated. "
" Use 'contains_source' instead.",
DeprecationWarning,
stacklevel=2,
)

return self.contains_source(key)
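The added ``stacklevel=2`` makes Python attribute the deprecation warning to the code that calls ``__contains__``, rather than to the ``warnings.warn`` call inside it. A minimal stdlib illustration, independent of hydromt:

```python
import warnings


def contains(key):
    # Stand-in for the deprecated DataCatalog.__contains__ above.
    warnings.warn(
        "Directly checking for containment is deprecated. "
        "Use 'contains_source' instead.",
        DeprecationWarning,
        stacklevel=2,  # report the caller's line, not this warn() call
    )
    return True


def user_code():
    # The warning is attributed to this line thanks to stacklevel=2.
    return contains("merit_hydro")


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    user_code()

print(caught[0].category.__name__)  # DeprecationWarning
```

Without ``stacklevel=2``, the reported filename and line number would point inside ``contains`` itself, which is useless to the user being asked to migrate.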
