Merge pull request #4 from osm2vectortiles/master

Get a clean slate
pull/354/head
stirringhalo 2016-06-17 09:41:26 -04:00 committed by GitHub
commit d8216e7ade
33 changed files with 1410 additions and 896 deletions

2
.gitignore vendored
View file

@ -4,6 +4,8 @@ __pycache__/
*$py.class
# Data files
import/
export/
*.mbtiles
*.pbf
*.gz

View file

@ -1,22 +1,16 @@
sudo: required
language: bash
language: python
python:
- "3.4"
services:
- docker
before_install:
- wget -nc -P "$TRAVIS_BUILD_DIR/import" "http://download.geofabrik.de/europe/liechtenstein-160401.osm.pbf"
- docker-compose pull import-external export import-osm postgis
- wget -nc -P "$TRAVIS_BUILD_DIR/import" "http://download.geofabrik.de/europe/albania-160601.osm.pbf"
- wget -nc -O "$TRAVIS_BUILD_DIR/export/planet.mbtiles" "https://osm2vectortiles-downloads.os.zhdk.cloud.switch.ch/v2.0/planet_z0-z5.mbtiles"
- docker-compose pull import-external export import-osm postgis rabbitmq merge-jobs
# Avoid building really expensive images
- make fast
install: "pip install -r ./tools/integration-test/requirements.txt"
script:
# Test import
- docker-compose up -d postgis
- sleep 10
- docker-compose run import-external
- docker-compose run import-osm
- docker-compose run import-sql
# Test export
- docker-compose run export
# Test changed tiles
- docker-compose run update-osm-diff
- docker-compose run import-osm-diff
- docker-compose run changed-tiles
- py.test -xv ./tools/integration-test/integration_test.py

46
CONTRIBUTORS.md 100644
View file

@ -0,0 +1,46 @@
OSM2VectorTiles Contributors (sorted historically)
=================================================
## Individuals
- **[Lukas Martinelli](https://github.com/lukasmartinelli)**
- Maintainer and main contributor
- **[Manuel Roth](https://github.com/mroth)**
- Maintainer and main contributor
- **[Petr Pridal](https://github.com/klokan)**
- Technical advisor
- **[Stefan Keller](https://github.com/sfkeller)**
- Bachelor and study thesis supervisor
- **[Imre Samu](https://github.com/ImreSamu)**
- QA scripts and control
- OSM ID database optimization
- Automatic PostGIS tuning
- Technical advisor
- **[Hannes Junnila](https://github.com/hannesj)**
- Support for additional tags focused on public transport
- Support road area polygons
- **[Zsolt Ero](https://github.com/hyperknot)**
- Find compatible Node versions for Docker images
- **[stirringhalo](https://github.com/stirringhalo)**
- Improve documentation
- Integrate polygon splitting
- Research reasons for non-performant planet regions
## Organizations
- **[Klokan Technologies](https://www.klokantech.com/)**
- Sponsoring rendering infrastructure
- Initial design of OSM2VectorTiles website
- Distributing "OSM2VectorTiles" USB stick
- **[HSR University of Applied Sciences](https://geometalab.hsr.ch/)**
- University involved in maintaining project
- Supervising Study and Bachelor thesis around OSM2VectorTiles
- Sponsoring S3 download store

View file

@ -8,32 +8,26 @@ This is the easiest way to switch to OSM thanks to [MapBox](https://github.c
![Create vector tiles from OpenStreetMap data](http://osm2vectortiles.org/img/home-banner-icons.png)
## Docs
The following tutorials are targeted at users of the OSM2VectorTiles project.
- **[Getting Started](http://osm2vectortiles.org/docs/getting-started/):** Quickly get started using OSM2VectorTiles to display maps in your browser. This tutorial explains how to serve downloaded Vector Tiles and use them in your browser.
- **[Create new Mapbox GL style](http://osm2vectortiles.org/docs/create-map-with-mapbox-studio/):** Design beautiful maps with the new Mapbox Studio and use them together with osm2vectortiles.
If you want to adapt the OSM2VectorTiles workflow to create vector tiles yourself, the detailed usage guide
will get you started.
- **[Detailed Usage Guide](/USAGE.md)**: Create your own planet scale vector tiles with a distributed workflow using the OSM2VectorTiles components.
## Downloads
Download the entire world or individual city and country extracts from http://osm2vectortiles.org/downloads.
## Documentation
## Develop
- Getting started
- [Display map with Mapnik and Tessera](http://osm2vectortiles.org/docs/start)
- [Display map with Mapbox GL](http://osm2vectortiles.org/docs/display-map-with-mapbox-gl)
- Create your own custom basemap
- [Create a style with Mapbox Studio Classic](http://osm2vectortiles.org/docs/create-map-with-mapbox-studio-classic)
- [Create a style with new Mapbox Studio](http://osm2vectortiles.org/docs/create-map-with-mapbox-studio)
- Hosting
- [Serve raster tiles with Docker](http://osm2vectortiles.org/docs/serve-raster-tiles-docker)
- [Serve vector tiles](http://osm2vectortiles.org/docs/serve-vector-tiles)
- [Use public CDN](http://osm2vectortiles.org/docs/use-public-cdn)
- Vector Tiles
- [Create your own vector tiles](http://osm2vectortiles.org/docs/own-vector-tiles)
- [Layer Reference](http://osm2vectortiles.org/docs/layer-reference)
- [Create your own extract](http://osm2vectortiles.org/docs/extracts)
- Data Source
- [Data Sources of OSM Vector Tiles](http://osm2vectortiles.org/docs/data-sources)
- Import and Export
- [Import and Export Process](http://osm2vectortiles.org/docs/import-export-process)
- [Database Schema and Layers](http://osm2vectortiles.org/docs/database-schema/)
- [Imposm Mapping Schema](http://osm2vectortiles.org/docs/imposm-schema)
If you want to hack on OSM2VectorTiles yourself or are interested in running or adapting the workflow,
take a look at the [detailed usage guide](USAGE.md), which explains the workflow and the components.
## License
@ -41,7 +35,7 @@ The project is under the MIT license while the data downloads use the [Open Data
## Contribute
The purpose of this project is to make OSM data more accessible to anybody. Any feedback or improvement is greatly appreciated. So feel free to submit a pull request or file a bug. You can also post feedback as GitHub issue.
The purpose of this project is to make OSM data more accessible to anybody. Any feedback or improvement is greatly appreciated. So feel free to submit a pull request or file a bug. You can also post feedback as GitHub issue. A list of current contributors can be found in the [CONTRIBUTORS](/CONTRIBUTORS.md) file.
You can help us to improve the documentation by editing the Markdown files and creating a pull request.
The documentation is based on GitHub pages and is in the `gh-pages` branch.

210
USAGE.md 100644
View file

@ -0,0 +1,210 @@
# Usage Documentation
If you've gotten this far, you've already explored the [documentation](http://osm2vectortiles.org/docs/) and likely have imported
a smaller extract of the planet. If you're looking to adapt OSM2VectorTiles for your own purposes
or run the process yourself, this usage documentation is for you. Many thanks to @stirringhalo for much of the usage documentation.
## Requirements
The entire project is structured into components, implemented as Docker containers,
that work together. Ensure you meet the prerequisites for running the
OSM2VectorTiles workflow.
- Install [Docker](https://docs.docker.com/engine/installation/)
- Install [Docker Compose](https://docs.docker.com/compose/install/)
- Set up an S3 bucket (or compatible object storage)
We assume a single-machine setup for this purpose and later go into detail
on how to run the workflow in a distributed manner.
**Hardware Requirements:**
You can render small areas with OSM2VectorTiles on your local machine.
However, to run the workflow at global scale you need significant infrastructure.
- 500 GB disk space (150 GB PostGIS, 30 GB planet dump, 50 GB imposm3 cache, 50 GB final MBTiles)
- 16 GB+ RAM recommended (up to 50 GB RAM for PostGIS)
- 8+ CPU cores (up to 16-40 cores) for rendering vector tiles and PostGIS calculations
## High-level Procedure
The architecture of the project is structured into the import phase (ETL process),
the changed tiles detection phase and the export phase (render vector tiles).
If you run the distributed workflow yourself, the following steps take place.
A detailed explanation follows afterwards.
1. Import external data sources into PostGIS
2. Import OSM PBF into PostGIS
3. Create a low level global extract from z0 to z8
4. Generate jobs to render entire planet and submit them to message queue (RabbitMQ)
5. Start rendering processes to work through submitted jobs
6. Merge job results together into previous low level extract resulting in the final planet MBTiles
7. Create country and city extracts from MBTiles
![Workflow structured into components](/src/etl_components.png)
### Component Overview
Documentation for each component can be found in the respective source directory.
**Import Components**
- **[import-external](/src/import-external)**: Import all data that is not directly from OpenStreetMap (NaturalEarth, OpenStreetMapData, custom data files)
- **[import-osm](/src/import-osm)**: Import OpenStreetMap planet files into PostGIS using imposm3
- **[import-sql](/src/import-sql)**: Provision and generate SQL used in the different layers. Contains most of the SQL logic.
**Export Components**
- **[export-worker](/src/export-worker)**: Responsible for rendering vector tiles using the vector data source **[osm2vectortiles.tm2source](/osm2vectortiles.tm2source)**. Exports can be run together with a message queue like RabbitMQ or standalone for smaller extracts where it is not necessary to divide the work into several parts.
- **[merge-jobs](/src/merge-jobs)**: Merges results of distributed rendering together into a single planet MBTiles file.
**Changed Tile Detection Components**
- **[changed-tiles](/src/changed-tiles)**: Calculate list of changed tiles
- **[generate-jobs](/src/generate-jobs)**: Responsible for creating JSON jobs for rendering the planet initially or jobs for updating the planet.
- **[update-osm-diff](/src/import-osm)**: Download diffs from OpenStreetMap based on imported planet file.
- **[import-osm-diff](/src/import-osm)**: Import OpenStreetMap diff file created by **update-osm-diff**.
- **[merge-osm-diff](/src/import-osm)**: Merge latest diff file into the old planet file.
## Processing Steps
### Prepare
1. Clone repository `git clone https://github.com/osm2vectortiles/osm2vectortiles.git && cd osm2vectortiles`.
2. If you want to build the containers yourself execute `make fast` or `make`. Otherwise you will use the public prebuilt Docker images by default.
3. Start and initialize the database: `docker-compose up -d postgis`.
4. Import external data sources: `docker-compose run import-external`.
### Import OSM
**Download planet file or extract** from [Planet OSM](http://planet.osm.org/) or [Geofabrik](https://www.geofabrik.de/data/download.html) and store it in `import` folder.
```bash
wget http://planet.osm.org/pbf/planet-latest.osm.pbf
```
**Import OSM** into PostGIS. Since the import happens in diff mode, this can take up to 14 hours for the full planet.
```bash
docker-compose run import-osm
```
**Provision SQL** needed to render the different layers.
```bash
docker-compose run import-sql
```
### Create Extract using Local Worker
You are now able to use a single local worker to render an extract. This can be used to create an extract
for a specific region or to generate low level vector tiles.
1. First **choose a specific bounding box** with http://tools.geofabrik.de/calc/ or use the global bounding box.
2. **Run the local export** which will store the resulting MBTiles in `export/tiles.mbtiles`
```bash
# Create low level extract from z0 to z8 for entire planet
docker-compose run \
-e BBOX="-180, -85, 180, 85" \
-e MIN_ZOOM="0" \
-e MAX_ZOOM="8" \
export
# Create extract for Albania from z10 to z14
docker-compose run \
-e BBOX="19.6875,40.97989806962015,20.390625,41.50857729743933" \
-e MIN_ZOOM="10" \
-e MAX_ZOOM="14" \
export
```
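Before kicking off an export it can help to estimate how many tiles a bounding box covers at the requested zoom range. The sketch below uses the standard Web Mercator (slippy map) tile formulas; it is an illustration, not part of the project's tooling:

```python
import math

def tile_range(lon_min, lat_min, lon_max, lat_max, zoom):
    """Inclusive (x, y) tile ranges covering a bbox at one zoom level,
    using standard Web Mercator (slippy map) tile math."""
    def to_tile(lon, lat):
        n = 2 ** zoom
        x = int((lon + 180.0) / 360.0 * n)
        lat_rad = math.radians(lat)
        y = int((1.0 - math.log(math.tan(lat_rad) + 1 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
        # Clamp to valid tile indices at this zoom
        return min(max(x, 0), n - 1), min(max(y, 0), n - 1)

    x1, y1 = to_tile(lon_min, lat_max)  # top-left tile
    x2, y2 = to_tile(lon_max, lat_min)  # bottom-right tile
    return (x1, x2), (y1, y2)

def count_tiles(bbox, min_zoom, max_zoom):
    """Total number of tiles for a bbox across a zoom range."""
    total = 0
    for z in range(min_zoom, max_zoom + 1):
        (x1, x2), (y1, y2) = tile_range(*bbox, z)
        total += (x2 - x1 + 1) * (y2 - y1 + 1)
    return total

# Albania extract from the example above, z10 to z14
print(count_tiles((19.6875, 40.97989806962015, 20.390625, 41.50857729743933), 10, 14))
```

For the planet-wide bbox the low level z0-z8 extract works out to 87,381 tiles, which is why it renders quickly compared with a full z0-z14 planet export.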
### Distributed Planet Export
You need to distribute the jobs to multiple workers for rendering the entire planet.
To work with the message queues and jobs we recommend using [pipecat](https://github.com/lukasmartinelli/pipecat).
**Start up message queue server**. The message queue server will track the jobs and results.
```bash
docker-compose up -d rabbitmq
```
**Divide the planet into jobs** from z8 down to z14 and publish the jobs to RabbitMQ.
To render the entire planet choose the top-level tile `0/0/0` and a job zoom level of `8`.
```bash
docker-compose run \
-e TILE_X=0 -e TILE_Y=0 -e TILE_Z=0 \
-e JOB_ZOOM=8 \
generate-jobs
```
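Conceptually, generating jobs splits the chosen top-level tile into all of its descendant tiles at the job zoom; each descendant becomes one render job covering that tile and everything beneath it. A minimal sketch of the pyramid split (illustrative only, not the actual generate-jobs code):

```python
def split_tile(x, y, z, job_zoom):
    """Enumerate all descendant tiles of (x, y, z) at job_zoom.
    A tile at zoom z covers a 2^dz x 2^dz block of tiles at zoom z + dz."""
    dz = job_zoom - z
    for dx in range(2 ** dz):
        for dy in range(2 ** dz):
            yield (x * 2 ** dz + dx, y * 2 ** dz + dy, job_zoom)

jobs = list(split_tile(0, 0, 0, 8))
print(len(jobs))  # 4^8 = 65536 jobs for the whole planet
```

Splitting at zoom 8 keeps individual jobs small enough to retry cheaply while still amortizing per-job overhead over each z8-z14 subpyramid.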
**Scale up the workers** to render the jobs. Make sure `BUCKET_NAME`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` and `AWS_S3_HOST` are configured correctly in order for the workers to upload their results to S3.
```bash
docker-compose scale export-worker=4
```
Watch the progress in the RabbitMQ management interface. Check the externally exposed Docker port that maps to the management interface's port `15672`.
### Merge MBTiles
Please take a look at the component documentation of **[merge-jobs](/src/merge-jobs)**.
If you are using a public S3 URL, merging the job results is fairly straightforward.
1. Ensure you have `export/planet.mbtiles` file present to merge the jobs into. Reuse a low level zoom extract generated earlier or download an existing low level zoom extract from http://osm2vectortiles.org/downloads/.
2. **Merge jobs** into planet file
```bash
docker-compose run merge-jobs
```
### Apply Diff Updates
Updates are performed on a rolling basis, where diffs are applied.
At this stage we assume you have successfully imported the PBF into the database
and rendered the planet once.
**Download the latest OSM changelogs**. If you are working with the full planet, remove the `OSM_UPDATE_BASEURL` from the `environment` section of `update-osm-diff`. If you are using a custom extract from Geofabrik, you can specify a custom update URL there.
Download the latest changelogs **since the last change date of the planet.pbf**.
```bash
docker-compose run update-osm-diff
```
Now **import the downloaded OSM diffs** in `export/latest.osc.gz` into the database. This may take up to half a day.
```bash
docker-compose run import-osm-diff
```
After that you have successfully applied the diff updates to the database and you can either rerender the entire planet or just the tiles that have changed.
After importing the diffs **you can reapply the diffs to the original PBF** file to keep it up to date.
```bash
docker-compose run merge-osm-diff
```
### Render Diff Updates
**Calculate the changed tiles** since the last diff import. This will store the changed tiles in `export/tiles.txt`.
```bash
docker-compose run changed-tiles
```
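For batching, each changed tile can be mapped to its ancestor at the job zoom so that nearby changes collapse into a single render job. The sketch below assumes one `z/x/y` entry per line in `tiles.txt`; that format is a guess for illustration, not confirmed by the project docs:

```python
def parent_at(z, x, y, job_zoom):
    """Ancestor tile of (z, x, y) at job_zoom (assumes z >= job_zoom)."""
    dz = z - job_zoom
    return (job_zoom, x >> dz, y >> dz)

def batch_changed_tiles(lines, job_zoom=8):
    """Collapse changed tiles into the distinct job tiles that cover them.
    Assumes one 'z/x/y' entry per line (hypothetical tiles.txt format)."""
    jobs = set()
    for line in lines:
        z, x, y = map(int, line.strip().split('/'))
        if z >= job_zoom:
            jobs.add(parent_at(z, x, y, job_zoom))
    return sorted(jobs)

# Three changed tiles in the same area collapse into one z8 job
print(batch_changed_tiles(["14/8633/5721", "14/8632/5721", "10/539/357"]))  # [(8, 134, 89)]
```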
**Create batch jobs** from the large text file and publish them to RabbitMQ.
```bash
docker-compose run generate-diff-jobs
```
**Now schedule the workers** again (similar to scheduling the entire planet) and **merge the results**.
```bash
docker-compose scale export-worker=4
docker-compose run merge-jobs
```

View file

@ -41,7 +41,12 @@ update-osm-diff:
volumes:
- ./import:/data/import
environment:
OSM_UPDATE_BASEURL: "http://download.geofabrik.de/europe/switzerland-updates/"
OSM_UPDATE_BASEURL: "http://download.geofabrik.de/europe/liechtenstein-updates/"
merge-osm-diff:
image: "osm2vectortiles/import-osm"
command: ./import-mergediffs.sh
volumes:
- ./import:/data/import
import-external:
image: "osm2vectortiles/import-external"
links:
@ -71,13 +76,21 @@ generate-jobs:
image: "osm2vectortiles/generate-jobs"
volumes:
- ./export:/data/export
links:
- rabbitmq:rabbitmq
generate-diff-jobs:
image: "osm2vectortiles/generate-jobs"
command: ./generate_list_jobs.sh
volumes:
- ./export:/data/export
links:
- rabbitmq:rabbitmq
merge-jobs:
image: "osm2vectortiles/merge-jobs"
volumes:
- ./export:/data/export
links:
- rabbitmq:rabbitmq
- mock-s3:mock-s3
export-worker:
image: "osm2vectortiles/export"
command: ./export-worker.sh
@ -86,11 +99,11 @@ export-worker:
links:
- postgis:db
- rabbitmq:rabbitmq
- mock-s3:mock-s3
environment:
AWS_ACCESS_KEY_ID: "${AWS_ACCESS_KEY_ID}"
AWS_SECRET_ACCESS_KEY: "${AWS_SECRET_ACCESS_KEY}"
AWS_REGION: "eu-central-1"
AWS_S3_HOST: "${AWS_S3_HOST}"
BUCKET_NAME: "osm2vectortiles-testing"
export:
image: "osm2vectortiles/export"
command: ./export-local.sh
@ -124,6 +137,7 @@ import-sql:
- postgis:db
environment:
SQL_CREATE_INDIZES: 'false'
SQL_SPLIT_POLYGON: 'true'
mapbox-studio:
image: "osm2vectortiles/mapbox-studio"
volumes:
@ -135,25 +149,21 @@ mapbox-studio:
rabbitmq:
image: "osm2vectortiles/rabbitmq:management"
ports:
- "15672:15672"
- "5672:5672"
- "15672"
- "5672"
volumes_from:
- rabbitdata
environment:
RABBITMQ_DEFAULT_USER: "osm"
RABBITMQ_DEFAULT_PASS: "osm"
RABBITMQ_HEARTBEAT: "0"
RABBITMQ_HEARTBEAT: "600"
create-extracts:
image: "osm2vectortiles/create-extracts"
volumes:
- ./export:/data/export
environment:
S3_ACCESS_KEY: "${S3_ACCESS_KEY}"
S3_SECRET_KEY: "${S3_SECRET_KEY}"
mock-s3:
image: "ianblenke/mock-s3"
ports:
- "8080"
S3_ACCESS_KEY: "${AWS_ACCESS_KEY_ID}"
S3_SECRET_KEY: "${AWS_SECRET_ACCESS_KEY}"
compare-visual:
image: "osm2vectortiles/compare-visual"
ports:

View file

@ -1,30 +1,12 @@
# OSM2VectorTiles tm2source Project
# OSM2VectorTiles TM2Source Project
[Mapbox Streets v7](https://www.mapbox.com/developers/vector-tiles/mapbox-streets-v7/) compatible data source.
## Requirements
See [osm2vectortiles documentation](https://github.com/geometalab/osm2vectortiles) for details.
## Layers
Because this data source is compatible with Mapbox Streets v5 all layers your can find in Mapbox Streets v5 are also available in this source.
For more information, please check out the documentation of [Mapbox Streets v5](https://www.mapbox.com/developers/vector-tiles/mapbox-streets-v5/)
Because this data source is compatible with Mapbox Streets v7 all layers you can find in Mapbox Streets v7 are also available in this source.
For more information, please check out the documentation of [Mapbox Streets v7](https://www.mapbox.com/vector-tiles/mapbox-streets-v7/)
## Editing
If you want to edit this data source, you need some OSM data on your local machine. Follow the documentation of the [osm2vectortiles project](https://github.com/geometalab/osm2vectortiles) to set everything up.
- Install the latest [Mapbox Studio Classic](https://www.mapbox.com/mapbox-studio-classic/)
- Clone this repository and edit the data.yml file with connection information for your postgis database.
```yaml
host: <your host>
port: <your port>
dbname: <your dbname>
password: <your password>
```
- Add this repository as a data source in Mapbox Studio Classic. Now, you should see your data as "x-ray" outlines.
To see the data in style:
- Open any style in Mapbox Studio Classic and change source to this repository under layers.
If you want to edit this data source, you need some OSM data on your local machine. Follow the documentation on [our website](http://osm2vectortiles.org/docs/) to set everything up.

View file

@ -7,7 +7,7 @@ angola Angola -4.2888889 11.3609793 -18.1389449 24.18212
anguilla Anguilla 18.8951194 -63.7391991 17.9609378 -62.6125448
argentina Argentina -21.6811679 -73.6603073 -55.285076 -53.5374514
armenia Armenia 41.400712 43.3471395 38.7404775 46.7333087
australia Australia -9.090427 72.1460938 -55.4228174 168.3249543
australia Australia -9.0882 112.2363 -44.3945 155.2172
austria Austria 49.1205264 9.4307487 46.2722761 17.260776
azerbaijan Azerbaijan 42.0502947 44.6633701 38.2929551 51.1090302
bahrain Bahrain 26.7872444 50.1697989 25.435 51.0233693
@ -66,7 +66,7 @@ faroe_islands Faroe Islands 62.5476162 -7.8983833 61.1880991 -6.0413261
federated_states_of_micronesia Federated States of Micronesia 10.391 137.1234512 0.727 163.3364054
fiji Fiji -12.1613865 -180.0999999 -21.3286516 180.1
finland Finland 70.1922939 18.9832098 59.3541578 31.6867044
france France 51.368318 -178.4873748 -50.3187168 172.4057152
france France 51.2684 -5.4662 41.2632 9.8679
gabon Gabon 2.4182171 8.4002246 -4.201226 14.639444
georgia Georgia 43.6864294 39.7844803 40.9552922 46.8365373
germany Germany 55.199161 5.7663153 47.1701114 15.1419319
@ -97,7 +97,7 @@ jersey Jersey 49.5605 -2.6591666 48.7721667 -1.7333332
jordan Jordan 33.4751558 34.7844372 29.083401 39.4012981
kazakhstan Kazakhstan 55.5804002 46.392161 40.4686476 87.4156316
kenya Kenya 4.72 33.8098987 -4.9995203 41.999578
kiribati Kiribati 5 -174.8433549 -11.7459999 177.1479136
kiribati Kiribati 7.9484 -179.1646 -7.0517 -164.1645
kuwait Kuwait 30.2038082 46.4526837 28.4138452 49.1046809
kyrgyzstan Kyrgyzstan 43.3667971 69.1649523 39.0728437 80.3295793
laos Laos 22.602872 99.9843247 13.8096752 107.7349989
@ -131,7 +131,7 @@ myanmar Myanmar 28.647835 92.0719423 9.4375 101.2700796
namibia Namibia -16.8634854 11.4280384 -29.0694499 25.3617476
nauru Nauru -0.2029999 166.6099864 -0.8529999 167.2597301
nepal Nepal 30.546945 79.9586109 26.2477172 88.3015257
new_zealand New Zealand -28.9303302 -179.1591529 -52.9213686 179.4643594
new_zealand New Zealand -33.5349 165.8236 -48.4206 179.5842
nicaragua Nicaragua 15.1331183 -87.933972 10.6084923 -82.5227022
niger Niger 23.617178 0.0689653 11.593756 16.096667
nigeria Nigeria 13.985645 2.576932 3.9690959 14.777982
@ -158,7 +158,7 @@ russian_federation Russian Federation 82.1586232 -180.0999999 41.0858711 180.1
rwanda Rwanda -0.9474509 28.7617546 -2.9389803 30.9990738
sao_tome_and_principe Sao Tome and Principe 2.0257601 6.160642 -0.3135136 7.7704783
sahrawi_arab_democratic_republic Sahrawi Arab Democratic Republic 27.7666834 -15.1405655 21.2370952 -8.5663889
saint_helena/ascension_and_tristan_da_cunha Saint Helena/Ascension and Tristan da Cunha -7.5899999 -14.7226944 -40.6699999 -5.3234152
saint_helena_ascension_and_tristan_da_cunha Saint Helena/Ascension and Tristan da Cunha -15.9 -5.79 -16.04 -5.63
saint_kitts_and_nevis Saint Kitts and Nevis 17.7158146 -63.1511289 16.795 -62.2303518
saint_lucia Saint Lucia 14.3725 -61.3853866 13.408 -60.5669362
saint_vincent_and_the_grenadines Saint Vincent and the Grenadines 13.683 -61.765747 12.4166548 -60.8094145
@ -178,7 +178,7 @@ south_africa South Africa -22.02503 16.2335213 -47.2788334 38.3898954
south_georgia_and_the_south_sandwich_islands South Georgia and the South Sandwich Islands -53.3531685 -42.2349052 -59.7839999 -25.7468302
south_korea South Korea 38.7234602 124.254847 32.77788 132.2483256
south_sudan South Sudan 12.336389 23.347778 3.38898 36.048997
spain Spain 44.0933088 -18.4936844 27.3335426 4.6918885
spain Spain 43.9934 -9.9123 35.892 4.5919
sri_lanka Sri Lanka 10.135 79.2741141 5.619 82.1810141
sudan Sudan 22.324918 21.7145046 9.247221 39.1576252
suriname Suriname 6.325 -58.1708329 1.7312802 -53.7433357
@ -191,7 +191,7 @@ tanzania Tanzania -0.8820299 29.2269773 -11.8612539 40.7584071
thailand Thailand 20.5648337 97.2438072 5.512851 105.7370925
the_bahamas The Bahamas 27.5734551 -80.800194 20.6059846 -72.347752
the_gambia The Gambia 13.9253137 -17.1288253 12.961 -13.6977779
the_netherlands The Netherlands 53.8253321 -70.3695875 11.677 7.3274985
the_netherlands The Netherlands 53.7254 2.9394 50.7264 7.2275
togo Togo 11.2395355 -0.2439718 5.826547 1.9025
tokelau Tokelau -8.232863 -172.8213672 -9.7442498 -170.8797585
tonga Tonga -15.2655721 -179.4951979 -24.2625705 -173.4295457
@ -200,12 +200,12 @@ tunisia Tunisia 37.8612052 7.4219807 30.14238 11.9801133
turkey Turkey 42.397 25.5212891 35.7076804 44.9176638
turkmenistan Turkmenistan 42.8975571 52.235076 35.0355776 66.784303
turks_and_caicos_islands Turks and Caicos Islands 22.2630989 -72.7799045 20.8553418 -70.764359
tuvalu Tuvalu -5.336961 -180.0999999 -11.0939388 180.1
tuvalu Tuvalu -5.1524 175.3417 -11.1881 180.835
uganda Uganda 4.3340766 29.4727424 -1.5787899 35.100308
ukraine Ukraine 52.4791473 22.037059 44.084598 40.3275801
united_arab_emirates United Arab Emirates 26.2517219 51.3160714 22.5316214 56.7024458
united_kingdom United Kingdom 61.161 -14.1155169 49.574 2.1919117
united_states_of_america United States of America 71.7048217 -180.0999999 -14.8608357 180.1
united_states_of_america United States of America 49.4325 -125.3321 23.8991 -65.7421
uruguay Uruguay -29.9853439 -58.5948437 -35.8847311 -52.9755832
uzbekistan Uzbekistan 45.690118 55.8985781 37.0772144 73.2397362
vanuatu Vanuatu -12.7713776 166.2355255 -20.5627424 170.549982


View file

@ -21,15 +21,15 @@ function main() {
echo 'Skip upload since no S3_ACCESS_KEY was found.'
fi
# Generate patch sources first
python create_extracts.py zoom-level "$WORLD_MBTILES" \
--max-zoom=5 --target-dir="$EXTRACT_DIR" $upload_flag
python create_extracts.py zoom-level "$WORLD_MBTILES" \
--max-zoom=8 --target-dir="$EXTRACT_DIR" $upload_flag
# Generate patch sources first but do not upload them
python -u create_extracts.py zoom-level "$WORLD_MBTILES" \
--max-zoom=5 --target-dir="$EXTRACT_DIR"
python -u create_extracts.py zoom-level "$WORLD_MBTILES" \
--max-zoom=8 --target-dir="$EXTRACT_DIR"
python create_extracts.py bbox "$WORLD_MBTILES" "$CITIES_TSV" \
python -u create_extracts.py bbox "$WORLD_MBTILES" "$CITIES_TSV" \
--patch-from="$PATCH_SRC" --target-dir="$EXTRACT_DIR" $upload_flag
python create_extracts.py bbox "$WORLD_MBTILES" "$COUNTRIES_TSV" \
python -u create_extracts.py bbox "$WORLD_MBTILES" "$COUNTRIES_TSV" \
--patch-from="$PATCH_SRC" --target-dir="$EXTRACT_DIR" $upload_flag
}

View file

@ -17,17 +17,16 @@ Options:
--target-dir=<target-dir> Target directory to put extracts in [default: ./]
"""
import json
import shutil
import subprocess
import mbutil
import sqlite3
import csv
import os.path
from collections import namedtuple
from multiprocessing.dummy import Pool as ProcessPool
from docopt import docopt
ATTRIBUTION = '<a href="http://www.openstreetmap.org/about/" target="_blank">&copy; OpenStreetMap contributors</a>'
VERSION = '1.5'
VERSION = '2.0'
class Extract(object):
@ -54,7 +53,7 @@ class Extract(object):
def center(self):
center_lon = (self.min_lon + self.max_lon) / 2.0
center_lat = (self.min_lat + self.max_lat) / 2.0
return '{},{},{}'.format(center_lat, center_lon, self.center_zoom)
return '{},{},{}'.format(center_lon, center_lat, self.center_zoom)
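The hunk above swaps the coordinate order: the MBTiles `center` metadata value is expected as `longitude,latitude,zoom`, not latitude first. A standalone sketch of the corrected behavior (an illustration, not the project's class):

```python
def center_value(min_lon, min_lat, max_lon, max_lat, center_zoom):
    """Build the MBTiles 'center' metadata string as lon,lat,zoom."""
    center_lon = (min_lon + max_lon) / 2.0
    center_lat = (min_lat + max_lat) / 2.0
    return '{},{},{}'.format(center_lon, center_lat, center_zoom)

print(center_value(19.6875, 40.98, 20.390625, 41.51, 10))
```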
def metadata(self, extract_file):
return {
@ -83,6 +82,7 @@ def create_extract(extract, source_file, extract_file):
'--bounds={}'.format(extract.bounds()),
'--minzoom', str(extract.min_zoom),
'--maxzoom', str(extract.max_zoom),
'--timeout=200000',
source, sink
]
@ -94,7 +94,7 @@ def update_metadata(mbtiles_file, metadata):
Update metadata key value pairs inside the MBTiles file
with the provided metadata
"""
conn = mbutil.mbtiles_connect(mbtiles_file)
conn = sqlite3.connect(mbtiles_file)
def upsert_entry(key, value):
conn.execute("DELETE FROM metadata WHERE name='{}'".format(key))
@ -103,19 +103,8 @@ def update_metadata(mbtiles_file, metadata):
for key, value in metadata.items():
upsert_entry(key, value)
def patch_mbtiles(source_file, target_file):
conn = mbutil.mbtiles_connect(target_file)
conn.executescript(
"""
PRAGMA journal_mode=PERSIST;
PRAGMA page_size=80000;
PRAGMA synchronous=OFF;
ATTACH DATABASE '{}' AS source;
REPLACE INTO map SELECT * FROM source.map;
REPLACE INTO images SELECT * FROM source.images;
""".format(source_file)
)
conn.commit()
conn.close()
def parse_extracts(tsv_file):
@ -144,7 +133,7 @@ def upload_mbtiles(mbtiles_file):
access_key = os.environ['S3_ACCESS_KEY']
secret_key = os.environ['S3_SECRET_KEY']
bucket_name = os.getenv('S3_BUCKET_NAME', 'osm2vectortiles-downloads')
prefix = os.getenv('S3_PREFIX', '{}/{}/'.format(VERSION, 'extracts'))
prefix = os.getenv('S3_PREFIX', 'v{}/{}/'.format(VERSION, 'extracts'))
subprocess.check_call([
's3cmd',
@@ -166,23 +155,25 @@ if __name__ == '__main__':
source_file = args['<source_file>']
def process_extract(extract):
patch_src = args['--patch-from']
extract_file = os.path.join(target_dir, extract.extract + '.mbtiles')
print('Create extract {}'.format(extract_file))
# Instead of patching copy over the patch source as target and
# write directly to it (since that works concurrently).
patch_src = args['--patch-from']
if patch_src:
print('Use patch from {} as base'.format(patch_src))
shutil.copyfile(patch_src, extract_file)
create_extract(extract, source_file, extract_file)
print('Update metadata {}'.format(extract_file))
update_metadata(extract_file, extract.metadata(extract_file))
if patch_src:
print('Patch from {}'.format(patch_src))
patch_mbtiles(patch_src, extract_file)
if upload:
print('Upload file {}'.format(extract_file))
upload_mbtiles(extract_file)
if args['bbox']:
process_count = int(args['--concurrency'])
extracts = list(parse_extracts(args['<tsv_file>']))

View file

@@ -1,2 +1 @@
-e git://github.com/mapbox/mbutil.git@master#egg=mbutil
docopt==0.6.2

Binary file not shown.

After

Width:  |  Height:  |  Size: 73 KiB

View file

@@ -5,12 +5,11 @@ set -o nounset
source utils.sh
readonly QUEUE_NAME=${QUEUE_NAME:-osm2vectortiles_jobs}
readonly BUCKET_NAME=${BUCKET_NAME:-osm2vectortiles-jobs}
readonly RABBITMQ_URI=${RABBITMQ_URI:-"amqp://osm:osm@rabbitmq:5672/?blocked_connection_timeout=1200&heartbeat=0"}
function export_remote_mbtiles() {
exec python export_remote.py "$RABBITMQ_URI" \
exec python -u export_remote.py "$RABBITMQ_URI" \
--tm2source="$DEST_PROJECT_DIR" \
--bucket="$BUCKET_NAME"
}

View file

@@ -19,9 +19,10 @@ import time
import sys
import os
import os.path
import functools
import json
import humanize
import pika
from humanize import naturaltime, naturalsize
from boto.s3.connection import S3Connection, OrdinaryCallingFormat
from mbtoolbox.optimize import find_optimizable_tiles, all_descendant_tiles
from mbtoolbox.mbtiles import MBTiles
@@ -42,12 +43,10 @@ def s3_url(host, port, bucket_name, file_name):
def connect_s3(host, port, bucket_name):
# import boto
# boto.set_stream_logger('paws')
is_secure = port == 443
conn = S3Connection(
os.getenv('AWS_ACCESS_KEY_ID', 'dummy'),
os.getenv('AWS_SECRET_ACCESS_KEY', 'dummy'),
os.environ['AWS_ACCESS_KEY_ID'],
os.environ['AWS_SECRET_ACCESS_KEY'],
is_secure=is_secure,
port=port,
host=host,
@@ -106,85 +105,132 @@ def optimize_mbtiles(mbtiles_file, mask_level=8):
mbtiles = MBTiles(mbtiles_file, 'tms')
for tile in find_optimizable_tiles(mbtiles, mask_level, 'tms'):
tiles = all_descendant_tiles(x=tile.x, y=tile.y, zoom=tile.z, max_zoom=14)
tiles = all_descendant_tiles(x=tile.x, y=tile.y,
zoom=tile.z, max_zoom=14)
mbtiles.remove_tiles(tiles)
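`find_optimizable_tiles` locates tiles above the mask level whose subpyramid is redundant, and `all_descendant_tiles` enumerates every child of such a tile down to `max_zoom` so they can be removed. A hedged sketch of that enumeration (the real mbtoolbox helper may also include the parent tile itself or order results differently):

```python
def all_descendant_tiles(x, y, zoom, max_zoom):
    # Each tile has four children at the next zoom: (2x, 2y) .. (2x+1, 2y+1)
    tiles = []
    frontier = [(x, y, zoom)]
    while frontier:
        tx, ty, tz = frontier.pop()
        if tz >= max_zoom:
            continue
        children = [(2 * tx + dx, 2 * ty + dy, tz + 1)
                    for dx in (0, 1) for dy in (0, 1)]
        tiles.extend(children)
        frontier.extend(children)
    return tiles

# A z8 tile has 4 children at z9 and 16 grandchildren at z10
print(len(all_descendant_tiles(0, 0, 8, 10)))  # -> 20
```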
def render_pyramid(msg, source, sink):
pyramid = msg['pyramid']
tileinfo = pyramid['tile']
print('Render pyramid {}/{} from z{} down to z{}'.format(
tileinfo['x'],
tileinfo['y'],
tileinfo['min_zoom'],
tileinfo['max_zoom'],
))
return render_pyramid_command(
source, sink,
bounds=create_tilelive_bbox(pyramid['bounds']),
min_zoom=tileinfo['min_zoom'],
max_zoom=tileinfo['max_zoom']
)
def render_list(msg, source, sink):
list_file = '/tmp/tiles.txt'
with open(list_file, 'w') as fh:
write_list_file(fh, msg['tiles'])
print('Render {} tiles from list job'.format(
len(msg['tiles']),
))
return render_tile_list_command(
source, sink,
list_file=list_file,
)
def timing(f, *args, **kwargs):
start = time.time()
ret = f(*args, **kwargs)
end = time.time()
return ret, end - start
def handle_message(tm2source, bucket, s3_url, body):
msg = json.loads(body.decode('utf-8'))
task_id = msg['id']
mbtiles_file = task_id + '.mbtiles'
source = 'tmsource://' + os.path.abspath(tm2source)
sink = 'mbtiles://' + os.path.abspath(mbtiles_file)
tilelive_cmd = []
if msg['type'] == 'pyramid':
tilelive_cmd = render_pyramid(msg, source, sink)
elif msg['type'] == 'list':
tilelive_cmd = render_list(msg, source, sink)
else:
raise ValueError("Message must be either of type pyramid or list")
_, render_time = timing(subprocess.check_call, tilelive_cmd, timeout=5*60)
print('Render MBTiles: {}'.format(naturaltime(render_time)))
_, optimize_time = timing(optimize_mbtiles, mbtiles_file)
print('Optimize MBTiles: {}'.format(naturaltime(optimize_time)))
_, upload_time = timing(upload_mbtiles, bucket, mbtiles_file)
print('Upload MBTiles : {}'.format(naturaltime(upload_time)))
download_link = s3_url(mbtiles_file)
print('Uploaded {} to {}'.format(
naturalsize(os.path.getsize(mbtiles_file)),
download_link
))
os.remove(mbtiles_file)
return create_result_message(task_id, download_link, msg)
def export_remote(tm2source, rabbitmq_url, queue_name, result_queue_name,
failed_queue_name, render_scheme, bucket_name):
host = os.getenv('AWS_S3_HOST', 'mock-s3')
port = int(os.getenv('AWS_S3_PORT', 8080))
if 'AWS_S3_HOST' not in os.environ:
sys.stderr.write('You need to specify the AWS_S3_HOST')
sys.exit(1)
host = os.environ['AWS_S3_HOST']
port = int(os.getenv('AWS_S3_PORT', 443))
print('Connect with S3 bucket {} at {}:{}'.format(
bucket_name, host, port
))
bucket = connect_s3(host, port, bucket_name)
connection = pika.BlockingConnection(pika.URLParameters(rabbitmq_url))
channel = connection.channel()
channel.basic_qos(prefetch_count=1)
channel.confirm_delivery()
configure_rabbitmq(channel)
print('Connect with RabbitMQ server {}'.format(rabbitmq_url))
def callback(ch, method, properties, body):
msg = json.loads(body.decode('utf-8'))
task_id = msg['id']
mbtiles_file = task_id + '.mbtiles'
while True:
method_frame, header_frame, body = channel.basic_get(queue_name)
source = 'tmsource://' + os.path.abspath(tm2source)
sink = 'mbtiles://' + os.path.abspath(mbtiles_file)
tilelive_cmd = []
if msg['type'] == 'pyramid':
pyramid = msg['pyramid']
tileinfo = pyramid['tile']
tilelive_cmd = render_pyramid_command(
source, sink,
bounds=create_tilelive_bbox(pyramid['bounds']),
min_zoom=tileinfo['min_zoom'],
max_zoom=tileinfo['max_zoom']
)
elif msg['type'] == 'list':
list_file = '/tmp/tiles.txt'
with open(list_file, 'w') as fh:
write_list_file(fh)
tilelive_cmd = render_tile_list_command(
source, sink,
list_file=list_file,
)
else:
raise ValueError("Message must be either of type pyramid or list")
# Consumer should stop if there are no more messages to receive
if not body:
channel.stop_consuming()
print('No message received - stop consuming')
break
try:
start = time.time()
subprocess.check_call(tilelive_cmd, timeout=5*60)
end = time.time()
print('Rendering time: {}'.format(humanize.naturaltime(end - start)))
print('Optimize MBTiles file size')
optimize_mbtiles(mbtiles_file)
upload_mbtiles(bucket, mbtiles_file)
os.remove(mbtiles_file)
print('Upload mbtiles {}'.format(mbtiles_file))
download_link = s3_url(host, port, bucket_name, mbtiles_file)
result_msg = create_result_message(task_id, download_link, msg)
result_msg = handle_message(
tm2source, bucket,
functools.partial(s3_url, host, port, bucket_name),
body
)
durable_publish(channel, result_queue_name,
body=json.dumps(result_msg))
channel.basic_ack(delivery_tag=method.delivery_tag)
except (subprocess.CalledProcessError, subprocess.TimeoutExpired) as e:
channel.basic_ack(delivery_tag=method_frame.delivery_tag)
except:
durable_publish(channel, failed_queue_name, body=body)
channel.basic_ack(delivery_tag=method.delivery_tag)
channel.basic_ack(delivery_tag=method_frame.delivery_tag)
channel.stop_consuming()
time.sleep(5) # Give RabbitMQ some time
raise e
channel.basic_consume(callback, queue=queue_name)
try:
channel.start_consuming()
except KeyboardInterrupt:
channel.stop_consuming()
raise
connection.close()

View file

@@ -1,2 +1,12 @@
FROM python:3.4-onbuild
CMD ["./generate_jobs.py"]
FROM python:3.4
RUN wget -O /usr/bin/pipecat https://github.com/lukasmartinelli/pipecat/releases/download/v0.2/pipecat_linux_amd64 \
&& chmod +x /usr/bin/pipecat
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app/
RUN pip install --no-cache-dir -r requirements.txt
COPY . /usr/src/app
CMD ["./generate_world_jobs.sh"]

View file

@@ -35,8 +35,7 @@ def quad_tree(tx, ty, zoom):
if __name__ == '__main__':
args = docopt(__doc__, version='0.1')
writer = csv.writer(sys.stdout, delimiter='\t')
writer = csv.writer(sys.stdout, delimiter=' ')
with open(args['<list_file>'], "r") as file_handle:
for line in file_handle:
z, x, y = line.split('/')
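Sorting tiles by quadkey groups spatially nearby tiles together, which is why `generate_list_jobs.sh` pipes this script's output through `sort -k2`. A sketch of the standard quadkey computation that `quad_tree` implements, one base-4 digit per zoom level by interleaving the x and y bits (TMS/XYZ y-axis flipping is ignored here, so the real script's keys may differ):

```python
def quad_tree(tx, ty, zoom):
    # Build one base-4 digit per zoom level, most significant bit first
    quad_key = ''
    for i in range(zoom, 0, -1):
        digit = 0
        mask = 1 << (i - 1)
        if tx & mask:
            digit += 1
        if ty & mask:
            digit += 2
        quad_key += str(digit)
    return quad_key

print(quad_tree(3, 5, 3))  # -> 213
```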

View file

@@ -0,0 +1,21 @@
#!/bin/bash
set -o errexit
set -o pipefail
set -o nounset
readonly AMQP_URI=${AMQP_URI:-"amqp://osm:osm@rabbitmq:5672/"}
readonly EXPORT_DATA_DIR=${EXPORT_DATA_DIR:-"/data/export"}
function generate_list_jobs() {
local unsorted_tiles="$EXPORT_DATA_DIR/tiles.txt"
local sorted_tiles="$EXPORT_DATA_DIR/tiles.sorted.txt"
local batched_jobs="$EXPORT_DATA_DIR/batched_jobs.json"
local jobs_queue="jobs"
python calculate_quad_key.py "$unsorted_tiles" | sort -k2 | cut -d ' ' -f1 > $sorted_tiles
python generate_jobs.py list "$sorted_tiles" --batch-size=1000 > $batched_jobs
pipecat publish --amqpuri="$AMQP_URI" "$jobs_queue" < $batched_jobs
}
generate_list_jobs

View file

@@ -0,0 +1,21 @@
#!/bin/bash
set -o errexit
set -o pipefail
set -o nounset
readonly AMQP_URI=${AMQP_URI:-"amqp://osm:osm@rabbitmq:5672/"}
readonly EXPORT_DATA_DIR=${EXPORT_DATA_DIR:-"/data/export"}
readonly TILE_X=${TILE_X:-"0"}
readonly TILE_Y=${TILE_Y:-"0"}
readonly TILE_Z=${TILE_Z:-"0"}
readonly JOB_ZOOM=${JOB_ZOOM:-"8"}
function generate_world_jobs() {
local jobs_file="$EXPORT_DATA_DIR/world_jobs.txt"
local jobs_queue="jobs"
python generate_jobs.py pyramid "$TILE_X" "$TILE_Y" "$TILE_Z" --job-zoom="$JOB_ZOOM" > $jobs_file
pipecat publish --amqpuri="$AMQP_URI" "$jobs_queue" < $jobs_file
}
generate_world_jobs

View file

@@ -38,9 +38,6 @@ function import_pbf() {
echo "Create osm_water_point table with precalculated centroids"
create_osm_water_point_table
echo "Split very large landuse polygons"
create_osm_landuse_split_polygon_table
echo "Update osm_place_polygon with point geometry"
update_points
@@ -64,10 +61,6 @@ function update_scaleranks() {
exec_sql_file "update_scaleranks.sql"
}
function create_osm_landuse_split_polygon_table() {
exec_sql_file "landuse_split_polygon_table.sql"
}
function create_osm_water_point_table() {
exec_sql_file "water_point_table.sql"
}
@@ -178,9 +171,6 @@ function import_pbf_diffs() {
echo "Create osm_water_point table with precalculated centroids"
create_osm_water_point_table
echo "Split very large landuse polygons"
create_osm_landuse_split_polygon_table
echo "Update osm_place_polygon with point geometry"
update_points

View file

@@ -367,6 +367,7 @@ tables:
- recreation_ground
- sports_centre
- pitch
- track
natural:
- glacier
- sand
@@ -666,14 +667,14 @@ tables:
- name: type
type: mapping_value
mapping:
station:
- subway
- light_rail
railway:
- station
- halt
- tram_stop
- subway_entrance
station:
- subway
- light_rail
type: point
water_polygon:
fields:

View file

@@ -13,7 +13,8 @@ ENV SQL_FUNCTIONS_FILE=/usr/src/app/functions.sql \
SQL_TRIGGERS_FILE=/usr/src/app/triggers.sql \
SQL_XYZ_EXTENT_FILE=/usr/src/app/xyz_extent.sql \
SQL_INDIZES_FILE=/usr/src/app/indizes.sql \
SQL_LAYERS_DIR=/usr/src/app/layers/
SQL_LAYERS_DIR=/usr/src/app/layers/ \
SQL_SPLIT_POLYGON_FILE=/usr/src/app/landuse_split_polygon_table.sql
COPY . /usr/src/app
# Generate class functions

View file

@@ -43,6 +43,7 @@ system:
- athletics
- chess
- pitch
- track
rock:
- rock
- bare_rock

View file

@@ -6,10 +6,10 @@ system:
rail-metro:
- stop
- subway
- tram_stop
rail-light:
- light_rail
- halt
- tram_stop
entrance:
- subway_entrance

View file

@@ -103,7 +103,6 @@ BEGIN
WHEN class = 'track' THEN road_type_value(class, tracktype)
WHEN class = 'service' THEN road_type_value(class, service)
WHEN class = 'golf' THEN 'golf'
WHEN class IN ('major_rail', 'minor_rail') THEN 'rail'
WHEN class = 'mtb' THEN 'mountain_bike'
WHEN class = 'aerialway' AND type IN ('gondola', 'mixed_lift', 'chair_lift') THEN road_type_value(class, type)
WHEN class = 'aerialway' AND type = 'cable_car' THEN 'aerialway:cablecar'

View file

@@ -6,6 +6,7 @@ set -o nounset
readonly SQL_FUNCTIONS_FILE=${IMPORT_DATA_DIR:-/usr/src/app/functions.sql}
readonly SQL_LAYERS_DIR=${IMPORT_DATA_DIR:-/usr/src/app/layers/}
readonly SQL_CREATE_INDIZES=${SQL_CREATE_INDIZES:-false}
readonly SQL_SPLIT_POLYGON_FILE=${SQL_SPLIT_POLYGON_FILE:-/usr/src/app/landuse_split_polygon_table.sql}
readonly DB_HOST=$DB_PORT_5432_TCP_ADDR
readonly OSM_DB=${OSM_DB:-osm}
@@ -30,6 +31,14 @@ function main() {
echo "Creating generated functions in $OSM_DB"
exec_sql_file "$SQL_GENERATED_FILE"
echo "Creating triggers in $OSM_DB"
if [ "$SQL_SPLIT_POLYGON" = true ] ; then
echo "Split polygons for $OSM_DB"
exec_sql_file "${SQL_SPLIT_POLYGON_FILE}"
else
echo "Omitting splitting polygon for $OSM_DB"
fi
exec_sql_file "$SQL_TRIGGERS_FILE"
echo "Creating layers in $OSM_DB"
exec_sql_file "${SQL_LAYERS_DIR}admin.sql"

View file

@@ -71,9 +71,11 @@ def merge_results(rabbitmq_url, merge_target, result_queue_name):
channel = connection.channel()
channel.basic_qos(prefetch_count=3)
channel.confirm_delivery()
print('Connect with RabbitMQ server {}'.format(rabbitmq_url))
def callback(ch, method, properties, body):
msg = json.loads(body.decode('utf-8'))
print('Download {}'.format(msg['url']))
merge_source = download_mbtiles(msg['url'])
action = functools.partial(merge_mbtiles, merge_source, merge_target)
diff_size = compare_file_after_action(merge_target, action)

View file

@@ -5,10 +5,10 @@ set -o nounset
readonly EXPORT_DIR=${EXPORT_DIR:-/data/export}
readonly MERGE_TARGET=${MERGE_TARGET:-"$EXPORT_DIR/planet.mbtiles"}
readonly RABBITMQ_URI=${RABBITMQ_URI:-"amqp://osm:osm@rabbitmq:5672/"}
readonly RABBITMQ_URI=${RABBITMQ_URI:-"amqp://osm:osm@rabbitmq:5672/?blocked_connection_timeout=1200&heartbeat=0"}
function export_remote_mbtiles() {
exec python merge-jobs.py "$RABBITMQ_URI" \
exec python -u merge-jobs.py "$RABBITMQ_URI" \
--merge-target="$MERGE_TARGET"
}

View file

@@ -0,0 +1,9 @@
# integration-test
Integration test for the entire OSM2VectorTiles workflow.
Its primary purpose is to verify that everything works together,
not to verify correctness.
Take a look at the `.travis.yml` file to see how it is called.
The setup requires a working S3 endpoint to test the distributed
workflow.

View file

@@ -0,0 +1,173 @@
#!/usr/bin/env python
""""
Integration test for entire OSM2VectorTiles workflow.
Primary purpose is to verify that everything works together
not to verify correctness.
"""
import os
import time
import subprocess
import pytest
from mbtoolbox.verify import list_required_tiles, missing_tiles
from mbtoolbox.mbtiles import MBTiles
PARENT_PROJECT_DIR = os.path.join(os.path.realpath(__file__), '../../../')
PROJECT_DIR = os.path.abspath(os.getenv('PROJECT_DIR', PARENT_PROJECT_DIR))
ALBANIA_BBOX = '19.6875,40.97989806962015,20.390625,41.50857729743933'
ALBANIA_TIRANA_TILE = (284, 191, 9)
BUCKET = os.getenv('BUCKET', 'osm2vectortiles-testing')
AWS_S3_HOST = os.getenv('AWS_S3_HOST', 'os.zhdk.cloud.switch.ch')
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
class DockerCompose(object):
def __init__(self, project_dir=PROJECT_DIR):
self.project_dir = project_dir
def compose(self, args):
subprocess.check_call(['docker-compose'] + args, cwd=self.project_dir)
def run(self, args):
self.compose(['run'] + args)
def scale(self, container, count):
self.compose(['scale', '{}={}'.format(container, count)])
def up(self, container):
self.compose(['up', '-d', container])
def stop(self, container):
self.compose(['stop', container])
def remove_all(self):
self.compose(['stop'])
self.compose(['rm', '-v', '--force'])
@pytest.mark.run(order=1)
def test_postgis_startup():
print(PROJECT_DIR)
dc = DockerCompose()
dc.remove_all()
dc.up('postgis')
# PostGIS can take a long time to get ready
time.sleep(10)
@pytest.mark.run(order=2)
def test_import_external():
dc = DockerCompose()
dc.run(['import-external'])
@pytest.mark.run(order=3)
def test_import_osm():
dc = DockerCompose()
dc.run(['import-osm'])
@pytest.mark.run(order=4)
def test_import_sql():
dc = DockerCompose()
dc.run(['import-sql'])
@pytest.mark.run(order=5)
def test_local_export():
"Test export of local Liechtenstein bbox and verify all tiles are present"
dc = DockerCompose()
def export_bbox(bbox, min_zoom, max_zoom):
dc.run([
'-e', 'BBOX={}'.format(bbox),
'-e', 'MIN_ZOOM={}'.format(min_zoom),
'-e', 'MAX_ZOOM={}'.format(max_zoom),
'export'
])
tile_x, tile_y, tile_z = ALBANIA_TIRANA_TILE
export_bbox(ALBANIA_BBOX, tile_z, 14)
# There are missing tiles on z14 because
# Albania does not have data at some places
exported_mbtiles = os.path.join(PROJECT_DIR, 'export/tiles.mbtiles')
tiles = find_missing_tiles(exported_mbtiles, tile_x, tile_y, tile_z, 13)
assert tiles == []
def find_missing_tiles(mbtiles_file, x, y, min_z, max_z):
mbtiles = MBTiles(mbtiles_file, 'tms')
required_tiles = list_required_tiles(x, y, min_z, max_z)
return list(missing_tiles(mbtiles, required_tiles))
@pytest.mark.run(order=6)
def test_distributed_worker():
dc = DockerCompose()
def schedule_tile_jobs(x, y, z, job_zoom):
dc.run([
'-e', 'TILE_X={}'.format(x),
'-e', 'TILE_Y={}'.format(y),
'-e', 'TILE_Z={}'.format(z),
'-e', 'JOB_ZOOM={}'.format(job_zoom),
'generate-jobs'
])
dc.up('rabbitmq')
time.sleep(10)
tile_x, tile_y, tile_z = ALBANIA_TIRANA_TILE
job_zoom = tile_z + 1
schedule_tile_jobs(tile_x, tile_y, tile_z, job_zoom)
dc.run([
'-e', 'BUCKET_NAME={}'.format(BUCKET),
'-e', 'AWS_ACCESS_KEY_ID={}'.format(AWS_ACCESS_KEY_ID),
'-e', 'AWS_SECRET_ACCESS_KEY={}'.format(AWS_SECRET_ACCESS_KEY),
'-e', 'AWS_S3_HOST={}'.format(AWS_S3_HOST),
'export-worker'
])
# Give time to merge jobs together
dc.up('merge-jobs')
time.sleep(20)
dc.stop('merge-jobs')
# Merge jobs will merge all results into the existing planet.mbtiles
# if MBTiles contains all the Albania tiles at job zoom level
# the export was successful
exported_mbtiles = os.path.join(PROJECT_DIR, 'export/planet.mbtiles')
print('Checking {} for missing tiles'.format(exported_mbtiles))
tiles = find_missing_tiles(exported_mbtiles, tile_x, tile_y, tile_z, 13)
assert [t for t in tiles if t.z > tile_z] == []
@pytest.mark.run(order=7)
def test_diff_update():
dc = DockerCompose()
# Pull the latest diffs
baseurl = 'http://download.geofabrik.de/europe/albania-updates/'
dc.run(['-e', 'OSM_UPDATE_BASEURL={}'.format(baseurl), 'update-osm-diff'])
# Import diffs and calculate the changed tiles
dc.run(['import-osm-diff'])
dc.run(['changed-tiles'])
# Read and verify that at least one tile is marked dirty
tile_file = os.path.join(PROJECT_DIR, 'export/tiles.txt')
print('Checking {} for changed tiles'.format(tile_file))
num_lines = sum(1 for line in open(tile_file))
assert num_lines > 0
@pytest.mark.run(order=8)
def test_diff_jobs():
dc = DockerCompose()
# Schedule changed tiles as jobs
dc.run(['generate-diff-jobs'])

View file

@@ -0,0 +1,3 @@
pytest==2.9.1
pytest-ordering==0.4
-e git+https://github.com/lukasmartinelli/mbtoolbox.git@#egg=mbtoolbox

View file

@@ -10,18 +10,18 @@ BEGIN
IF numpars = 0 THEN
fullcode = ' select null ';
ELSIF numpars = 1 THEN
fullcode = ' select count( distinct osm_id ) as alluid from ' || tablelist[1] || ' where osm_id <> 0';
fullcode = ' select count( distinct id ) as alluid from ' || tablelist[1] || ' where id <> 0';
ELSE
FOREACH table_name IN ARRAY tablelist
LOOP
IF i = 0 THEN
code = code || ' select osm_id from ' || table_name;
code = code || ' select id from ' || table_name;
ELSE
code = code || ' union select osm_id from ' || table_name;
code = code || ' union select id from ' || table_name;
END IF ;
i = i + 1;
END LOOP;
fullcode = ' select count( distinct osm_id ) as alluid from (' || code || ') as sq where osm_id <> 0';
fullcode = ' select count( distinct id ) as alluid from (' || code || ') as sq where id <> 0';
END IF;
EXECUTE fullcode INTO feature_count;
RETURN feature_count;
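For two or more tables, the function above builds and executes a query of the form `select count(distinct id) as alluid from (select id from t1 union select id from t2) as sq where id <> 0`. The same query shape can be exercised against SQLite (table names and data here are illustrative, not the real osm2pgsql tables):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE osm_point (id INTEGER)')
conn.execute('CREATE TABLE osm_polygon (id INTEGER)')
conn.executemany('INSERT INTO osm_point VALUES (?)', [(1,), (2,), (0,)])
conn.executemany('INSERT INTO osm_polygon VALUES (?)', [(2,), (3,)])

tables = ['osm_point', 'osm_polygon']
# Mirror the dynamically built query: id 0 is ignored and
# ids shared across tables are counted once
union = ' union '.join('select id from ' + t for t in tables)
query = 'select count(distinct id) from (' + union + ') as sq where id <> 0'
count = conn.execute(query).fetchone()[0]
print(count)  # -> 3
```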