Final Report of EDR API Sprint
Summary
TBD
Introduction
A Hackathon/Sprint was organised under the auspices of OGC to progress development of the Environmental Data Retrieval API (EDR-API). It was to be hosted by the US National Weather Service near Washington DC, from 18-20 March 2020. Unfortunately, because of the global health crisis, the physical face-to-face meeting was cancelled. It was decided to hold a virtual Hackathon/Sprint, more or less keeping to US Washington time (Eastern Daylight Time), and to communicate using OGC's GoToMeeting teleconferencing facilities, email and GitHub.
The EDR-API is intended to allow a small amount of data, as required, to be retrieved, or 'sampled', by coordinates from data stores that are too large or impractical to copy and transmit. The data queries are based on common but distinct geometry types (e.g. points/timeseries, polygons, trajectories, etc.).
The API is being developed by the OGC EDR API Standard Working Group, whose charter is at https://github.com/opengeospatial/Environmental-Data-Retrieval-API/blob/master/EnvironmentalDataRetrievalAPI-SWG-Charter.adoc , and the latest version of the candidate standard is at https://github.com/opengeospatial/Environmental-Data-Retrieval-API/blob/master/candidate-standard/EDR-candidate-specification.adoc . Background reading material is at https://github.com/opengeospatial/Environmental-Data-Retrieval-API .
The first draft of the candidate standard was based on the OGC API - Features, Part 1: Core and what is now OGC API - Common, Part 1: Core following a hackathon in London during June 2019. Both of these standards follow OGC current policy on API development and are consistent with OpenAPI Version 3.
Participants
There were about two dozen attendees, from North America, the Far East, and Europe. They represented OGC, government departments, universities and private companies:
* Steve Olson, US NWS
* Shane Mill, US NWS
* Paul Hershberg, US NWS
* Dave Blodgett, USGS
* Jim Kreft, USGS
* Chris Little, UK Met Office
* Mark Burgoyne, UK Met Office
* Pete Trevelyan, UK Met Office
* Peng Yue, Wuhan University
* Boyi ShangGuan, Wuhan University
* Lei Hu, Wuhan University
* Zhang Mingda, Wuhan University
* Jeff Donze, ESRI
* Sudhir Shrestha, ESRI
* Keith Ryden, ESRI
* George Percivall, OGC
* Josh Liebermann, OGC
* Chuck Heazel, HeazelTech LLC
* Clemens Portele, Interactive Instruments
* Tom Kralidis, Meteorological Service of Canada
* Stephan Siemen, ECMWF
Purpose:
The EDR API Sprint was to provide feedback based on the current EDR API candidate specification, and to assess its compatibility with the "OGC API - Features - Part 1: Core" standard https://github.com/opengeospatial/ogcapi-features/tree/master/core/standard and the in-progress "OGC API - Common - Part 1: Core" standard https://github.com/opengeospatial/oapi_common/tree/March-2020-Updates/standard .
Background
Environmental data typically involves huge datastores which are difficult to duplicate and transfer, with the difficulty compounded by rapid information replacement, such as in weather forecasts. Such datastores are often generated by a multitude of data resources, with a large number of interested users from multiple domains.
Much of the data exists in operational environments, where service reliability, availability and scalability, both in terms of volume and numbers of users, are essential requirements.
Many of the existing services have powerful query capabilities; however, some are difficult to scale in volume and numbers of users, making it hard to guarantee service levels. They may also be difficult to secure.
Because some of the existing services have relatively complex APIs, there is a steep learning curve and they may be difficult to integrate into production environments.
EDR API approach:
* Simple query patterns are used to 'sample' a large datastore: the data publisher is responsible for transforming and supplying the data, and the user only defines an area of interest in time and space, using Well Known Text (WKT), making the data simple to consume (a sketch of such a query follows this list).
* Both "feature" and "coverage" data sources can be handled and combined.
* All the logic is in the "collection" itself (self-describing) and in the self-describing queries.
* Previous experiments, prototypes, experiences and production implementations give confidence that this approach avoids large-scale duplication of data, is easier to implement, allows easier access control through modern API management tools, and gives interoperability through simplicity.
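As a minimal illustration of this query pattern, the sketch below requests a time series of one parameter at a point expressed as WKT. The server URL, collection id, parameter name and query path ('position', following the renaming agreed later in this report) are placeholders, not one of the Sprint implementations.

```python
# Minimal sketch of an EDR position/point query; the server, collection id,
# parameter name and output format are placeholder assumptions.
import requests

BASE = "https://example.org/edr"  # hypothetical EDR API root

params = {
    "coords": "POINT(-0.1278 51.5074)",                        # WKT point of interest
    "parametername": "air_temperature",                        # parameter to sample
    "datetime": "2020-03-18T00:00:00Z/2020-03-20T00:00:00Z",   # ISO 8601 time interval
    "f": "CoverageJSON",                                        # requested output format
}

response = requests.get(f"{BASE}/collections/global_forecast/position", params=params)
response.raise_for_status()
print(response.json()["type"])  # e.g. "Coverage" for a CoverageJSON payload
```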
The EDR API is intended to be part of a proposed family of OGC APIs that are complementary to existing OGC Web-based geospatial services.
These new APIs are intended for a different audience, web developers rather than geospatial specialists, and serve as a basic introduction to OGC services, thus lowering the barrier to entry for using OGC web services.
Goals
These were developed in meetings of the EDR API Standard WG and the Met Ocean Domain WG, with other input from OGC staff and Members:
* Build implementations of the EDR API service based on existing data stores, both client- and server-side;
* Develop client-side value-added applications that consume data from prototype implementations of EDR API;
* Develop prototype functionality for EDR API through developing and running code.
The goals were refined by the working groups into these more specific objectives and deliverables:
Objectives:
* Verify and validate requirements and methods for the Query and Filter operations of the EDR API;
* Rapidly prototype other geometry types for the EDR API (partitioned grids or tiles, vertical profiles and/or trajectories/corridors);
* Develop client-side value-added applications that consume data from UK Met Office and US NWS prototype EDR API implementations;
* Make progress on conformance testing of EDR API parts based on queries.
Proposed deliverables:
* Software Code;
* Recommended changes to the EDR API specification;
* Report on level of compatibility with OGC API Features and Common;
* Recommendations for other OGC APIs (Features, Common, etc);
* Report on conformance class testing.
These goals, objectives and deliverables were all discussed by the attendees prior to the meeting and in the opening session of the Sprint.
Use Cases
The Sprint was introduced by a quick review of the EDR API use cases at https://github.com/opengeospatial/Environmental-Data-Retrieval-API/tree/master/use-cases . These included:
* Get Parameters for a Point across a Timeseries
* Obtain or view a forecast time series of a parameter at a point
* Get Unstructured Obs from within a Polygon
* Show Weather Radar data timeseries
* Obtain or view Air Traffic Hazards and Restrictions for an Area
Existing Data and Implementations
Implementations of EDR API:
* US NWS: https://data-api.mdl.nws.noaa.gov/EDR-API
Available data offered (METARs, TAFs, Global Forecast System (GFS), North American Mesoscale model (NAM), National Digital Forecast Database (NDFD))
* UK Met Office: http://labs.metoffice.gov.uk/edr/
Available data offered (METARs, UK Global Model, US GFS, US NDFD, OpenStreetMap, Digital Elevation Models)
Data Sources
* Amazon data sources: https://registry.opendata.aws/?search=tags:gis,earth%20observation,mapping,meteorological,environmental,transportation
Demonstrations
1. Automated aggregation of metadata from collections (3D, 4D, 5D)
This creates collections of parameters (that have common dimensions) from operational data stores, and outputs JSON which is used to convert the original dataset to zarr. Each zarr data store represents a specific collection. With the parameters grouped by common dimensions, more complex queries than the basic EDR API queries can be made. Demo at https://github.com/ShaneMill1/edr-automation .
2. Demonstration of client-side APIs (single and multi-domain feature extractions)
Uses the EDR API to access time series at a point for:
* Observations from a point cloud – latest airfield obs (METARs);
* Gridded forecast current data from US NWS (GFS) using https://data-api.mdl.nws.noaa.gov/EDR-CLIENT-API ;
* Gridded forecast data from the 2-day-old UK Met Office Global Unified Model using http://labs.metoffice.gov.uk/edr/ .
Demo at http://labs.metoffice.gov.uk/map/wotwdemo/ .
Working Methods
A GitHub repository was established at https://github.com/opengeospatial/EDR-API-Sprint . A branch of the EDR API Candidate standard was also created at https://opengeospatial.github.io/EDR-API-Sprint/edr-api.html so that changes could readily be made if required. Attendees also raised Issues in the Sprint repository to indicate their objectives, and any issues encountered. Discussions were then followed on the Issues tabs and tagged according to their content.
Background information was supplied in a Slide presentation https://docs.google.com/presentation/d/1vU8O7dnkima9Vch_T0ebECV5RRjSw-8Kuox15gN2Z4k/edit?usp=sharing and wiki pages at https://github.com/opengeospatial/EDR-API-Sprint/wiki .
Also, at the beginning and end of each working day, briefing sessions were held on GoToMeeting to present work done, and to discuss in more depth any issues and their resolutions. Sessions were chosen to enable the different time zones to take part.
Objectives
Most attendees, either individuals or teams, identified specific objectives that they would like to achieve. These were:
1. Develop new EDR API server and client, Mark Burgoyne (Issue #20)
Develop a simple EDR server to demonstrate Point, Polygon and Items queries against the same Collection of data. This is still work in progress, to track and implement changes to the latest EDR API specification, and also to try out novel ideas. It will also demonstrate an alternative approach in which the proposed item_id is used as a unique identifier for a location (e.g. a METAR id, GeoHash key, what3words address, etc.) which has a set of data assigned to it.
This addressed Issues #04 and #05: Point, timeseries at a point, vertical profile at a point, and 2D polygon.
2. Demonstrate automatic harvesting of metadata from aggregations of data stores to improve search capabilities for use with the EDR API, US NWS (Issue #19 and Issue #14)
Extend the automated aggregation of metadata from Collections (3D, 4D, 5D) by storing the output from a forecast model run as a concatenation of binary GRIB files, then processing it as an xarray dataset with pynio and extracting metadata (parameter ID determined by pynio, long name, level type, dimensions) into a pandas dataframe. The dataframe creates a map between dimension names and dimension values. Parameters that have the same dimensions are grouped. The metadata is output as JSON. The original dataset is converted to zarr using the collection JSON. This approach will enable more complex queries than just the EDR API. The implementation is https://data-api.mdl.nws.noaa.gov/EDR-API/groups/US_Models?outputFormat=application%2Fjson , based on NAM and GFS GRIB data.
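The sketch below illustrates the general shape of that workflow. File names, attribute keys and the JSON layout are illustrative assumptions, not the actual US NWS code (which is linked from the demonstrations section above).

```python
# Sketch of the metadata-aggregation workflow described above: open a concatenated
# GRIB run with the pynio engine, group parameters by shared dimensions, write the
# collection metadata as JSON, and convert each group to a zarr store.
import json
import xarray as xr

# Open a concatenated GRIB output file (file name is an assumption)
ds = xr.open_dataset("gfs_run.grib", engine="pynio")

# Group parameters that share the same set of dimensions
groups = {}
for name, var in ds.data_vars.items():
    key = tuple(sorted(var.dims))
    groups.setdefault(key, []).append({
        "id": name,
        "long_name": var.attrs.get("long_name", ""),   # attribute keys are assumptions
        "level_type": var.attrs.get("level_type", ""),
        "dimensions": list(var.dims),
    })

# Write the collection metadata as JSON
with open("collections.json", "w") as f:
    json.dump({str(k): v for k, v in groups.items()}, f, indent=2)

# Convert each dimension group to its own zarr data store (one store per collection)
for i, (dims, params) in enumerate(groups.items()):
    subset = ds[[p["id"] for p in params]]
    subset.to_zarr(f"collection_{i}.zarr", mode="w")
```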
The aggregation will be extended to other national weather services' data sources, or to other file types such as HDF5, NetCDF, etc.:
* France on AWS at https://registry.opendata.aws/meteo-france-models/
* Canada at https://weather.gc.ca/grib/index_e.html
* The Netherlands at https://data.knmi.nl/datasets?q=grib
Currently, one can search for a keyword, and Collections that have parameters with a long name containing that keyword will be shown as a URI, which will link to the EDR API query endpoint. The links returned need to be ordered in a dictionary.
Additional work would be to:
* match the keywords with other metadata attributes such as the dimension names (e.g. ISBL for isobaric);
* add the ability to search by parameter name as well as dimension name so that a user can search more narrowly;
* utilize the OpenSearch geo and time extensions. The pycsw module offers a Python implementation of OGC CSW as well as the OpenSearch geo and time extensions: http://docs.pycsw.org/en/stable/introduction.html .
Looking at the documentation, I was able to incorporate pycsw into our EDR-API implementation following this approach:
https://docs.pycsw.org/en/latest/api.html
Then, following the steps provided at the link below, I was able to create a compliant blank sqlite database to start from:
http://docs.pycsw.org/en/stable/administration.html#metadata-repository-setup
Finally, you can see the beginning of our implementation at the following endpoint:
https://data-api.mdl.nws.noaa.gov/EDR-API/csw?service=CSW&version=2.0.2&request=GetCapabilities
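As a quick way to exercise that endpoint, a standard CSW client such as OWSLib can be pointed at it. The query below is only a sketch: the search term and record handling are arbitrary examples, not taken from the NWS metadata.

```python
# Sketch of querying the prototype CSW endpoint with OWSLib; the search term
# "temperature" is an arbitrary illustration.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb(
    "https://data-api.mdl.nws.noaa.gov/EDR-API/csw", version="2.0.2"
)

# Full-text-style search across the AnyText queryable
query = PropertyIsLike("csw:AnyText", "%temperature%")
csw.getrecords2(constraints=[query], maxrecords=10)

for identifier, record in csw.records.items():
    print(identifier, record.title)
```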
My next steps will be to connect the dots between how metadata is created in the aggregation-of-collections software and how that metadata is incorporated into these services.
Let's continue the discussion on what/how the formal metadata would look to serve as part of a CSW instance (Dublin Core, ISO, etc.). This will also give us an opportunity to investigate the OGC API - Records work (disclosure: I'm part of this SWG and working on an implementation). This is now an issue in the EDR SWG GitHub: opengeospatial/Environmental-Data-Retrieval-API#40 .
Outcomes:
* Successfully used aggregation software to ingest GFS 1 Deg, 1/2 Deg, 1/4 Deg, NAM 32km, 12km, and Meteo-France's Arpege 1/2 Deg data.
* Identified memory problem when creating zarr datastores for higher resolution data.
Observations:
* Data should be WGS84 for this implementation; therefore NAM data would need to be converted to this CRS.
Future work:
* test other national weather services' GRIB data;
* optimize the code to solve the memory issue.
3. Implement Trajectory queries against typhoon/hurricane data. Wuhan University (Issue #03)
This is the same as Issue #09, Objective: Trajectory, 2D, 3D or 4D. The trajectory query was successfully implemented in all the 2D, 3D and 4D cases, using the EDR API specification proposal to use Well Known Text (WKT) Linestring formats.
4. Explore putting EDR API on ESRI REST API image server, UK Met Office, ESRI (Issue #02 and Issue #17)
This is ongoing work. There do not seem to be any fundamental architectural problems with possible approaches for the proposed use cases. It is expected that the queries supported would be for points/timeseries/vertical profiles, and cube/tiles.
The main aim of this work is to show how the EDR "Pattern" may be "mapped" onto a proprietary REST API, such as ESRI's ArcGIS Image Server. The output from this work will be a document outlining the issues that are brought to light, and a very simple prototype consisting of a simple proxy server that uses the EDR API and translates into the Image Server REST API.
5. Use EDR API to retrieve feature orientated observations from hydrological network data stores, USGS (Issue #01)
6. Conformance testing, ECMWF (Issue #16)
There was general support for this topic, in particular, for the returned content of the payload from a specific query, but no work was done.
OGC API Features has a conformance test suite, written to be modular, and presumably, some of this will be shared with an OGC API Common test suite.
Conformance testing of the returned payload is a bit harder.
* XML has validating XML schemas and Schematron rules.
* GeoJSON has a clear standard to test against. For the payload, JSON schema is nowhere near as rigorous as XML schema.
* CoverageJSON is well defined in the original work done by Jon Blower and Maik Riechert at Reading University, and the process to agree an OGC standard for the structure of CoverageJSON has started. See the repo at https://github.com/opengeospatial/CoverageJSON.
7. Implement and demonstrate a STAC (Spatio-Temporal Asset Catalog) for accessing meteorological real-time data stores in pygeoapi, Meteorological Service of Canada (Issue #21)
This is ongoing work that started after the Sprint days.
STAC provides a common language to describe a range of geospatial information, so it can more easily be indexed and discovered. A 'spatiotemporal asset' is any file that represents information about the earth captured in a certain space and time.
While STAC work has been primarily focused on EO imagery, there is value in investigating STAC capability for real-time Numerical Weather Prediction (NWP) data offerings. This would allow for arbitrary hierarchy based on NWP workflow (model runs, forecast hours, forecast variables, etc.) as well as simple, file-based discovery/traversal of same.
Questions
* How would NWP look as a static catalog of (in our case) GRIB2 files?
* How would a given STAC implementation for MetOcean compare to OGC API - Records?
* Would a MetOcean STAC profile be valuable?
Results
* implemented a STAC catalog for a (very small) subset of our Global Deterministic Prediction System (GDPS)
* code: https://github.com/geopython/pygeoapi/tree/stac2
* deployment (a sketch of this catalog hierarchy follows below):
  * root catalog: http://52.170.144.218:8000/stac/nwp/catalog.json
  * model run: http://52.170.144.218:8000/stac/nwp/00/catalog.json
  * model run/forecast hour: http://52.170.144.218:8000/stac/nwp/00/000/catalog.json
  * data: http://52.170.144.218:8000/stac/nwp/00/000/CMC_glb_DEPR_ISBL_750_latlon.15x.15_2020040100_P000.grib2.json
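A minimal sketch of building that hierarchy (root catalog, model run, forecast hour, GRIB2 item) with the pystac library is shown below. It is not the pygeoapi-based implementation used in the demonstration, and the geometry, asset href and media type are illustrative assumptions.

```python
# Sketch of the NWP catalog hierarchy above built with pystac; ids, geometry,
# hrefs and the single asset are illustrative assumptions, not the GDPS deployment.
from datetime import datetime, timezone
import pystac

root = pystac.Catalog(id="nwp", description="GDPS NWP model output (subset)")
run_00 = pystac.Catalog(id="00", description="Model run 00Z")
hour_000 = pystac.Catalog(id="000", description="Forecast hour 000")

item = pystac.Item(
    id="CMC_glb_DEPR_ISBL_750_latlon.15x.15_2020040100_P000",
    geometry={
        "type": "Polygon",
        "coordinates": [[[-180, -90], [180, -90], [180, 90], [-180, 90], [-180, -90]]],
    },
    bbox=[-180, -90, 180, 90],
    datetime=datetime(2020, 4, 1, 0, 0, tzinfo=timezone.utc),
    properties={},  # GRIB2 metadata would be carried here as item properties
)
item.add_asset(
    "data",
    pystac.Asset(
        href="CMC_glb_DEPR_ISBL_750_latlon.15x.15_2020040100_P000.grib2",
        media_type="application/x-grib2",
    ),
)

# Assemble root -> model run -> forecast hour -> item, then write the static catalog
hour_000.add_item(item)
run_00.add_child(hour_000)
root.add_child(run_00)
root.normalize_and_save("stac/nwp", catalog_type=pystac.CatalogType.SELF_CONTAINED)
```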
Observations
* successfully tested with stac-validator
* data properties are basically GRIB2 metadata
* links are initially minimal: we could have links back to related EDR workflows (or OGC API - Coverages, OGC API - Processes)
* search is not in scope for a STAC static catalog (more a matter for OGC API - Records and the STAC API)
Future Work
* investigate integration with OGC API - Records and STAC Catalog
  * GDPS is collection-level discovery metadata in the scope of OGC API - Records
  * searching within GDPS would be a link from the OGC API - Records document/search result to the STAC API (essentially searching model runs/forecast hours of data)
8. The remaining Issues #04 - #10 were generic objectives covering the full scope of data query patterns of the EDR API, but divided up according to expected difficulty:
* Point, timeseries at a point, and vertical profile at a point
* Polygon and tile (2D)
* Polygon in 3D or 4D
* Polygons in 3D and 4D
* Tile/Cube in 3D or 4D
* Trajectory, 2D, 3D or 4D
* Corridor, 3D or 4D
Issues arising and resolved
Issue #03 and #09 Trajectories: which time specification?
The EDR API specification uses ISO 8601 style notation to specify dates and times for query parameters. This is very convenient and understandable for users. However, the Trajectory query specifies time as a number of seconds since the Unix epoch. This is specified in the Well Known Text (WKT) standard for defining lines/trajectories by using LINESTRING. This makes time into a proper coordinate, enabling easier software calculations, though the units are not easily comprehended by users.
It was agreed to keep the two different approaches, as both had merits and disadvantages, and to elicit wider feedback from the OGC Members, the OAB and the public. A mitigation may be to incorporate an easy-to-use service to convert between the two different time representations.
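A sketch of converting between the two time representations, and of writing a trajectory as a WKT LINESTRING with epoch seconds as the measure coordinate, is shown below; the waypoints are invented for illustration.

```python
# Sketch of converting between ISO 8601 timestamps and Unix epoch seconds,
# and of building a WKT LINESTRING M trajectory; the waypoints are invented.
from datetime import datetime, timezone

def iso_to_epoch(iso: str) -> int:
    """ISO 8601 UTC timestamp ('Z' suffix) -> seconds since the Unix epoch."""
    return int(datetime.fromisoformat(iso.replace("Z", "+00:00")).timestamp())

def epoch_to_iso(seconds: int) -> str:
    """Seconds since the Unix epoch -> ISO 8601 UTC timestamp."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Hypothetical 2D + time trajectory: (lon, lat, ISO time) waypoints
waypoints = [
    (-3.47, 50.72, "2020-03-18T00:00:00Z"),
    (-1.25, 51.75, "2020-03-18T03:00:00Z"),
    (0.12, 52.20, "2020-03-18T06:00:00Z"),
]

coords = ", ".join(f"{lon} {lat} {iso_to_epoch(t)}" for lon, lat, t in waypoints)
wkt = f"LINESTRING M ({coords})"
print(wkt)
# LINESTRING M (-3.47 50.72 1584489600, -1.25 51.75 1584500400, 0.12 52.2 1584511200)
```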
Issue #04 Point, timeseries at a point, and vertical profile at a point: which combinations?
Would the current EDR API specification allow both a time series and a vertical profile at a point simultaneously, i.e. a 2D array of values would be returned? At present, the Point query only allows a time series at a point or a vertical profile at a point, but not both. It was agreed to stay with these minimal cases. The general 2D vertical timeseries is not widely used outside of meteorology, where it is known as a Hovmöller diagram.
Another consequence of these discussions was that Point was re-named Position, and Polygon re-named as Area.
Issues #08 and #06 and #05 Cube/Tile, Polygon in 3D or 4D, Polygon/Tile in 2D: which bounding box styles?
Should bounding boxes for a query, in 4D, be specified by ranges of values or by coordinates of the corners? The consensus was that ranges are more natural for time and vertical coordinates. This is then consistent with how 2D polygons are specified in WKT.
There was no disagreement that there will be no polyhedra or polytopes (complex multidimensional volumes), only 'extruded' 2D polygons. An extruded 2D polygon can be called a 'prism'.
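For illustration, a cube-style query in which the vertical and time extents are expressed as ranges might look like the sketch below; the server, collection, parameter names and level values are placeholder assumptions.

```python
# Sketch of a cube query in which the vertical and time extents are ranges;
# the server, collection and parameter names are placeholders, not a real service.
import requests

BASE = "https://example.org/edr"  # hypothetical EDR API root

params = {
    "bbox": "-10,45,5,60",                                      # 2D horizontal extent
    "z": "850/500",                                               # vertical range (e.g. hPa levels)
    "datetime": "2020-03-18T00:00:00Z/2020-03-20T00:00:00Z",     # time range
    "parametername": "air_temperature",
    "f": "CoverageJSON",
}

response = requests.get(f"{BASE}/collections/global_forecast/cube", params=params)
response.raise_for_status()
```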
Issue #0? What are the EDR API resource types?
This was the usual semantic question. It was agreed that the initial resource was a persistent, dense data store. The queries against it were Discrete Sampling Geometries sampling the data store, and were transient, but could be made a persistent resource if required by another service. [The Research Data Alliance, RDA, recommends a query store in its Best Practice Recommendations for citing dynamic data.] The returned data payload is also a transient resource, which could also be made persistent.
It was agreed to add words to this effect in the EDR API candidate standard.
Issue #18 Streaming of EDR API response media types
If the data returned in response to a query is small, there is no need for streaming of the results. It may be necessary for large responses, but that is starting to be outside the scope of the EDR API. It was agreed that the EDR API specification would not mention streaming, and it would be an implementation decision as to whether streaming is supported. This decision would obviously be influenced by the choice of supported media type, which may or may not support streaming.
Issue #12 Items view of EDR resources and Issue #15 JSON-Schema for /collections/{collectionID}/items
The EDR API provides no mechanism for the user to discover available location identifiers (such as ICAO ids) in the metadata. The identifiers are available in the query results, but not in any of the available metadata outputs. What would the JSON Schema be for these items? E.g. collections/metar/raw/items?id=KIAD&parametername=icao_id&time=2020-01-31T00:00:00Z/2020-02-01T04:00:00Z .
If consistent with Features, then collections/metar/raw/items returns the list of items (paged, if there are many) and that the query would look like: collections/metar/raw/items/KIAD?parametername=icao_id&datetime=2020-01-31T00:00:00Z/2020-02-01T04:00:00Z .
Currently the EDR API "Items" resource specifies CoverageJSON rather than a GeoJSON FeatureCollection with valid query parameters for each item in the collection. Can lists of available parameters be exposed in GeoJSON?
{
  "type": "FeatureCollection",
  "crs": {
    "type": "EPSG",
    "properties": {
      "code": 4326,
      "coordinate_order": [ 1, 0 ]
    }
  },
  "features": [
    {
      "type": "Feature",
      "id": "123",
      "geometry": {
        "type": "Point",
        "coordinates": [ -105.683442, 36.740017 ]
      },
      "properties": {
        "datetime": "2018-02-12T00:00:00Z/2018-03-18T12:31:12Z",
        "parametername": [ "param object 1", "param object 2" ],
        "label": "Something like a site name to use on a link",
        "uri": "https://feature_identifier"
      }
    }
  ]
}
This validates against the GeoJSON schema:
* id is what goes in /collection/{collectionID}/items/{itemID};
* datetime follows Features;
* parametername follows the draft spec naming convention;
* label is needed because there has to be a list/link label in the client;
* uri could be @id, as in JSON-LD, and it should really be a linked-data feature ID which 303-redirects if it identifies a real-world sampling feature.
If these extensions are put in the GeoJSON FeatureCollection schema, the software sf/gdal outputs:
> sf::read_sf("~/Documents/active_code/EDR-API-Sprint/items/test.geojson")
Simple feature collection with 1 feature and 5 fields
geometry type: POINT
dimension: XY
bbox: xmin: -105.6834 ymin: 36.74002 xmax: -105.6834 ymax: 36.74002
epsg (SRID): 4326
proj4string: +proj=longlat +datum=WGS84 +no_defs
# A tibble: 1 x 6
id datetime label uri parametername geometry
<chr> <dttm> <chr> <chr> <list> <POINT [°]>
1 123 2018-02-11 18:00:00 Something like a si… https://feat… <chr [2]> (-105.6834 36.74002)
The properties schema is here:
properties:
  type: object
  title: The Properties Schema
  description: An explanation about the purpose of this instance.
  default: {}
  example:
    - datetime: 2018-02-12T00:00:00Z/2018-03-18T12:31:12Z
      label: Something like a site name to use on a link
      parametername:
        - param object 1
        - param object 2
      uri: https://feature_identifier
  required:
    - datetime
    - parametername
    - label
    - uri
  properties:
    datetime:
      type: string
      title: The Datetime Schema
      description: An explanation about the purpose of this instance.
      default: ''
      example:
        - 2018-02-12T00:00:00Z/2018-03-18T12:31:12Z
    parametername:
      type: array
      title: The Parametername Schema
      description: An explanation about the purpose of this instance.
      default: []
      items:
        type: string
        title: The Items Schema
        description: An explanation about the purpose of this instance.
        default: ''
        example:
          - param object 1
          - param object 2
    label:
      type: string
      title: The Label Schema
      description: An explanation about the purpose of this instance.
      default: ''
      example:
        - Something like a site name to use on a link
    uri:
      type: string
      title: The Uri Schema
      description: An explanation about the purpose of this instance.
      default: ''
      example:
        - https://feature_identifier
This schema is a first draft and will need to be cleaned up.
Issues outstanding
Issue #11 Groups versus Collections
The EDR API specification found a need to have Groups of Collections in the API path. In the wider OGC, there is now a discussion of whether APIs could have Collections of Collections. The Sprint agreed to stay with Groups until the wider issue is resolved.
Issue #10 Corridors, 3D or 4D
Corridors were originally envisaged as a volume defined by a surface of constant distance from a line trajectory. The idea of the bottom of a corridor volume being delineated by the earth's surface (or some other surface) was raised. It was agreed to tackle this interesting, practical and difficult problem later.
Other Issues
Observers, from outside the Sprint, have raised some substantive questions, including about interpretation and implementation of vertical coordinates in the EDR API. These will be raised and addressed in the EDR API Standard WG.
Recommendations
??
Open Topics for further discussion
* Do we need another Sprint?
* What Metadata frameworks should be used, recommended or mandated?
* Security considerations?
* Other geometry types need prototyping and implementing.
* How should the pub/sub approach be accommodated?
* How to integrate with other APIs (features/coordinates, maps)?
* How to align with decision-impact groups like SmartCity and others?
Annexes?