Irwan Fathurrahman 2025-05-23 01:44:22 +00:00 committed by GitHub
commit efc30133e7
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
9 changed files with 171 additions and 16 deletions

View file

@@ -18,6 +18,13 @@ build:
@echo "------------------------------------------------------------------"
@docker-compose build
release-multiarch:
@echo
@echo "------------------------------------------------------------------"
@echo "Building images for production mode with release tag"
@echo "------------------------------------------------------------------"
docker buildx bake -f docker-compose.release.yml --set *.platform=linux/amd64,linux/arm64 --no-cache --push
redeploy:
@echo
@echo "------------------------------------------------------------------"

View file

@@ -0,0 +1,13 @@
version: '3.4'
services:
imposm:
build: docker-imposm
image: meomancer/docker-osm:imposm-1.0.0
osmupdate:
build: docker-osmupdate
image: meomancer/docker-osm:osmupdate-1.0.0
osmenrich:
build: docker-osmenrich
image: meomancer/docker-osm:osmenrich-1.0.0

View file

@@ -1,4 +1,4 @@
FROM golang:1.19
FROM golang:1.21
MAINTAINER Etienne Trimaille <etienne.trimaille@gmail.com>
RUN apt-get update && \
@@ -9,7 +9,7 @@ RUN go install github.com/omniscale/imposm3/cmd/imposm@latest
WORKDIR /home
ADD requirements.txt .
RUN pip3 install -r requirements.txt
RUN pip install --break-system-packages -r requirements.txt
ADD importer.py .

View file

@@ -0,0 +1,32 @@
# Docker ImpOSM3
> Version 1.0.0
This image takes care of the initial load of the selected region
(e.g. the planet, or a country such as Malawi) into your database. It will then
apply, at a regular interval (default is 2 minutes), any diff that arrives
in the /home/import_queue folder to the PostGIS OSM database. The diffs
are fetched by a separate container (see the osm_update container).
The container will look for an OSM file (*.pbf) and its state file
(*.state.txt) in BASE_PBF.
With -e, you can add some settings:
```bash
- TIME = 120, seconds between 2 executions of the script
- POSTGRES_USER = docker, default user
- POSTGRES_PASS = docker, default password
- POSTGRES_HOST = db
- POSTGRES_PORT = 5432
- SETTINGS = settings, folder for settings (with *.json and *.sql)
- CACHE = cache, folder for caching
- BASE_PBF = base_pbf, folder for the OSM file
- IMPORT_DONE = import_done, folder for diffs which have been imported
- IMPORT_QUEUE = import_queue, folder for diffs which haven't been imported yet
- SRID = 4326, it can be 3857
- OPTIMIZE = false, check [Imposm](http://imposm.org/docs/imposm3/latest/tutorial.html#optimize)
- DBSCHEMA_PRODUCTION = public, check [Imposm](http://imposm.org/docs/imposm3/latest/tutorial.html#deploy-production-tables)
- DBSCHEMA_IMPORT = import, check [Imposm](http://imposm.org/docs/imposm3/latest/tutorial.html#deploy-production-tables)
- DBSCHEMA_BACKUP = backup, check [Imposm](http://imposm.org/docs/imposm3/latest/tutorial.html#deploy-production-tables)
```
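A minimal run sketch combining these settings; the host folder layout and the `/home/...` mount points below are illustrative assumptions rather than values taken from this repository, and it assumes a PostGIS container is reachable as `db` on the same Docker network:
```bash
# build the image from this folder, then run it with the default settings
docker build -t imposm .
docker run -d \
  -e POSTGRES_USER=docker -e POSTGRES_PASS=docker \
  -e POSTGRES_HOST=db -e POSTGRES_PORT=5432 \
  -e TIME=120 -e SRID=4326 \
  -v $(pwd)/settings:/home/settings \
  -v $(pwd)/base_pbf:/home/base_pbf \
  -v $(pwd)/cache:/home/cache \
  -v $(pwd)/import_queue:/home/import_queue \
  -v $(pwd)/import_done:/home/import_done \
  imposm
```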

View file

@@ -1,11 +1,18 @@
FROM python:3
FROM python:3.11
MAINTAINER Irwan Fathurrahman <meomancer@gmail.com>
# Install system dependencies needed to build psycopg2
RUN apt-get update && apt-get install -y \
build-essential \
libpq-dev \
python3-dev \
&& rm -rf /var/lib/apt/lists/*
ADD requirements.txt /home/requirements.txt
RUN pip3 install -r /home/requirements.txt
RUN pip3 install --no-binary=psycopg2-binary -r /home/requirements.txt
ADD enrich.py /home/
WORKDIR /home
CMD ["python3", "-u", "/home/enrich.py"]

View file

@@ -427,7 +427,10 @@ class Enrich(object):
self.info('%s' % e)
return content
def process_empty_changeset_from_table(self, table_name, table_columns, osm_id_column, osm_type):
def process_empty_changeset_from_table(
self, table_name, table_columns,
osm_id_column, osm_type, extra_where=None
):
""" Processing all data from table
:param table_name: Table source
@@ -444,6 +447,9 @@ class Enrich(object):
:param osm_id_column: Column name of osm_id
:type osm_id_column: str
:param extra_where: Other where for query
:type extra_where: str
"""
# noinspection PyUnboundLocalVariable
connection = self.create_connection()
@@ -451,8 +457,15 @@ class Enrich(object):
row_batch = {}
osm_ids = []
try:
check_sql = ''' select * from %s."%s" WHERE "changeset_timestamp"
IS NULL AND "osm_id" IS NOT NULL ORDER BY "osm_id" ''' % (self.default['DBSCHEMA_PRODUCTION'], table_name)
check_sql = f'''
select * from {self.default['DBSCHEMA_PRODUCTION']}.{table_name} WHERE "changeset_timestamp"
IS NULL AND "{osm_id_column}" IS NOT NULL
'''
if extra_where:
check_sql += f' AND {extra_where} '
check_sql += f''' ORDER BY "{osm_id_column}"'''
cursor.execute(check_sql)
row = True
while row:
@@ -461,10 +474,15 @@ class Enrich(object):
if row:
row = dict(zip(table_columns, row))
row_batch['%s' % row[osm_id_column]] = row
try:
osm_ids.append(f'{abs(row[osm_id_column])}')
except:
osm_ids.append('%s' % row[osm_id_column])
if len(osm_ids) == 30:
if len(osm_ids) == 20:
self.update_osm_enrich_from_api_in_batch(
osm_ids, osm_type, row_batch, table_name, osm_id_column)
osm_ids, osm_type, row_batch, table_name,
osm_id_column
)
row_batch = {}
osm_ids = []
@@ -484,9 +502,22 @@ class Enrich(object):
osm_type = table_data['osm_type']
columns = table_data['columns']
if osm_id_columnn is not None:
if osm_type == 'way':
self.info('Checking data from table %s with type way' % table)
self.process_empty_changeset_from_table(
table, columns, osm_id_columnn, 'way',
extra_where=f'"{osm_id_columnn}" > 0'
)
self.info('Checking data from table %s with type relation' % table)
self.process_empty_changeset_from_table(
table, columns, osm_id_columnn, 'relation',
extra_where=f'"{osm_id_columnn}" < 0'
)
else:
self.info('Checking data from table %s' % table)
self.process_empty_changeset_from_table(
table, columns, osm_id_columnn, osm_type)
table, columns, osm_id_columnn, osm_type
)
else:
self.info('Does not know osm_id column for %s.' % table)
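Imposm3 imports relations with negated ids, so the sign of the osm_id column distinguishes ways from relations; that is why the new code splits the empty-changeset check into a way branch (positive ids) and a relation branch (negative ids) via `extra_where`. As a rough illustration, the SQL built by the way branch would look like the query below, where the schema `public`, the table `osm_buildings` and the database `gis` are hypothetical placeholders:
```bash
# illustrative only: roughly the query process_empty_changeset_from_table builds for the way branch
psql -h db -U docker -d gis -c \
  'select * from public.osm_buildings WHERE "changeset_timestamp" IS NULL AND "osm_id" IS NOT NULL AND "osm_id" > 0 ORDER BY "osm_id"'
```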

View file

@@ -1,4 +1,7 @@
# Docker-osmenrich
# Docker OSMENRICH
> Version 1.0.0
Docker osm-enrich is an extension for docker-osm that retrieves the changeset metadata of the OSM data.
It fetches this data from the OSM API and also picks up update data from the files generated by docker-osmupdate.

View file

@@ -0,0 +1,48 @@
# Docker OSM Update
> Version 1.0.0
This docker image, when run, will regularly fetch any new diff file for all the
changes that have happened in the world over the update interval.
You can also specify a custom URL for fetching the diff if you wish to retrieve
regional diffs rather than the global one.
You can specify a polygonal area for the diff so that it will only apply features
from the diff that fall within that area. For example providing a polygon of the
borders of Malawi will result in only Malawi features being extracted from the diff.
**Note:** the diff retrieved and options specified here are not related to the
initial base map used - so for example if your initial base map is for Malawi and
you specify a diff area in Botswana, updated features in Botswana will be applied
to your base map which only includes features from Malawi. For this reason, take
care to ensure that your diff area coincides with the region covered by your
original base map.
Once the diff has been downloaded, it is placed into /home/import_queue where
it will be picked up by the long running imposm3 container, which will apply
the diff to the database.
You should have 3 folders: osm_pbf, import_queue, import_done.
Put a state file in base-pbf like this one:
http://download.openstreetmap.fr/extracts/africa/south_africa.state.txt
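A small preparation sketch, assuming the hyphenated folder names used in the run command below:
```bash
# create the exchange folders and seed base-pbf with a state file
mkdir -p base-pbf import-queue import-done
curl -o base-pbf/south_africa.state.txt \
  http://download.openstreetmap.fr/extracts/africa/south_africa.state.txt
```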
```bash
docker build -t osmupdate .
docker run -v $(pwd)/import-queue/:/home/import-queue -v $(pwd)/base-pbf/:/home/base-pbf -v $(pwd)/import-done/:/home/import-done -d osmupdate
```
With -e, you can add some settings:
```bash
- MAX_DAYS = 100, the maximum time range to assemble a cumulated changefile.
- DIFF = sporadic, osmupdate uses a combination of minutely, hourly and daily changefiles. This value can be minute, hour, day or sporadic.
- MAX_MERGE = 7, the maximum number of changefiles processed in parallel.
- COMPRESSION_LEVEL = 1, gzip compression level, between 1 (low compression but fast) and 9 (high compression but slow).
- BASE_URL = http://planet.openstreetmap.org/replication/, change this to a custom URL to fetch regional diff updates.
- IMPORT_QUEUE = import_queue
- IMPORT_DONE = import_done
- OSM_PBF = osm_pbf
- TIME = 120, seconds between two executions of the script
```
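For example, the run command above could be extended with a few of these settings; the regional BASE_URL shown here is purely illustrative:
```bash
docker run -d \
  -e MAX_DAYS=100 -e DIFF=sporadic -e MAX_MERGE=7 -e TIME=120 \
  -e BASE_URL=http://download.openstreetmap.fr/replication/africa/south_africa/ \
  -v $(pwd)/import-queue/:/home/import-queue \
  -v $(pwd)/base-pbf/:/home/base-pbf \
  -v $(pwd)/import-done/:/home/import-done \
  osmupdate
```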

View file

@@ -300,6 +300,20 @@ With -e, you can add some settings :
You can adjust these preferences in the ```docker-compose.yml``` file provided
in this repository.
### Docker OSM Enrich
Docker osm-enrich is an extension for docker-osm that retrieves the changeset metadata of the OSM data.
It fetches this data from the OSM API and also picks up update data from the files generated by docker-osmupdate:
- the data is new (changeset is null): get it from docker-osm
- the data already exists but the recent changeset needs to be checked: get the data from the file generated by osmupdate and update it in the database
osmenrich will create the following new fields:
- changeset_id
- changeset_timestamp
- changeset_version
- changeset_user
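Once osmenrich has processed a table, these columns can be checked directly in the database. A quick inspection sketch, where the table name `osm_buildings` and the database name `gis` are assumptions for illustration:
```bash
psql -h localhost -U docker -d gis -c \
  'SELECT osm_id, changeset_id, changeset_version, changeset_timestamp, changeset_user FROM osm_buildings LIMIT 5;'
```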
# PostGIS
For environment variables associated with `docker-postgis` refer to [docker postgis repository](https://github.com/kartoza/docker-postgis)