Compare commits

...

67 Commits

Author SHA1 Message Date
Piero Toffanin ae6726e536
Merge pull request #1760 from pierotofy/fastcut
Skip feathered raster generation when possible
2024-05-17 15:51:32 -04:00
Piero Toffanin 6da366f806 Windows fix 2024-05-17 15:23:10 -04:00
Piero Toffanin e4e27c21f2 Skip feathered raster generation when possible 2024-05-17 14:55:26 -04:00
Piero Toffanin f9136f7a0d
Merge pull request #1758 from idimitrovski/master
Support for DJI Mavic 2 Zoom srt files
2024-05-10 11:24:52 -04:00
idimitrovski a2d9eccad5 Support for DJI Mavic 2 Zoom srt files 2024-05-10 09:29:37 +02:00
Piero Toffanin 424d9e28a0
Merge pull request #1756 from andrewharvey/patch-1
Fix PoissonRecon failed with n threads log message
2024-04-18 11:53:46 -04:00
Andrew Harvey a0fbd71d41
Fix PoissonRecon failed with n threads log message
The message was reporting failure with n threads and retrying with n // 2; however, a few lines up, threads had already been set to n // 2, representing the next thread count to try.
2024-04-18 15:35:53 +10:00
Piero Toffanin 6084d1dca0
Merge pull request #1754 from pierotofy/minviews
Use min views filter = 1
2024-04-11 23:18:26 -04:00
Piero Toffanin aef4182cf9 More solid OpenMVS clustering fallback 2024-04-11 14:18:20 -04:00
Piero Toffanin 6c0fe6e79d Bump version 2024-04-10 22:07:36 -04:00
Piero Toffanin 17dfc7599a Update pc-filter value to 5 2024-04-10 13:48:11 -04:00
Piero Toffanin a70e7445ad Update default feature-type, pc-filter values 2024-04-10 12:26:34 -04:00
Piero Toffanin 981bf88b48 Use min views filter = 1 2024-04-10 11:13:58 -04:00
Piero Toffanin ad63392e1a
Merge pull request #1752 from pierotofy/geobom
Fix BOM encoding bug with geo files
2024-04-02 12:53:35 -04:00
Piero Toffanin 77f8ffc8cd Fix BOM encoding bug with geo files 2024-04-02 12:46:20 -04:00
Piero Toffanin 4d7cf32a8c
Merge pull request #1751 from smathermather/fish-aye
replace fisheye with fisheye_opencv but keep API the same until 4.0
2024-03-11 23:32:19 -04:00
Stephen Mather 5a439c0ab6 replace fisheye with fisheye_opencv but keep API the same until 4.0 2024-03-11 22:56:48 -04:00
Piero Toffanin ffcda0dc57
Merge pull request #1749 from smathermather/increase-default-GPS-Accuracy
increase default GPS-Accuracy to 3m
2024-03-08 22:16:44 -05:00
Stephen Mather 2c6fd1dd9f
increase default GPS-Accuracy to 3m 2024-03-08 22:13:54 -05:00
Sylvain POULAIN cb3229a3d4
Add Mavic 3 rolling shutter, not enterprise version (#1747)
* Add Mavic 3 rolling shutter

* M3
2024-02-12 09:50:22 -05:00
Piero Toffanin fc9c94880f
Merge pull request #1746 from kielnino/set-extensionsused
GLTF - obj2glb - Set extensionsUsed in all cases to be consistent with the GLTF standard
2024-02-09 10:23:22 -05:00
kielnino b204a2eb98
set extensionsUsed in all cases 2024-02-09 15:06:02 +01:00
Piero Toffanin d9f77bea54
Merge pull request #1744 from kielnino/remove-unuses-mvs_tmp_dir
Update comment on mvs_tmp_dir
2024-02-01 09:14:23 -05:00
kielnino 10947ecddf
clarify usage of tmp directory 2024-02-01 12:02:06 +01:00
kielnino f7c7044823
remove unused mvs_tmp_dir 2024-02-01 09:25:10 +01:00
Piero Toffanin ae50133886
Merge pull request #1742 from pierotofy/eptclass
Classify point cloud before generating derivative outputs
2024-01-25 12:56:36 -05:00
Piero Toffanin 9fd3bf3edd Improve SRT parser to handle abs_alt altitude reference 2024-01-23 22:24:38 +00:00
Piero Toffanin fb85b754fb Classify point cloud before generating derivative outputs 2024-01-23 17:03:36 -05:00
Piero Toffanin 30f89c068c
Merge pull request #1739 from pierotofy/smartband
Fix build
2024-01-15 19:41:20 -05:00
Piero Toffanin 260b4ef864 Manually install numpy 2024-01-15 16:21:08 -05:00
Piero Toffanin fb5d88366e
Merge pull request #1738 from pierotofy/smartband
Ignore multispectral band groups that are missing images
2024-01-15 11:38:53 -05:00
Piero Toffanin f793627402 Ignore multispectral band groups that are missing images 2024-01-15 09:51:17 -05:00
Piero Toffanin 9183218f1b Bump version 2024-01-12 00:20:35 -05:00
Piero Toffanin 1283df206e
Merge pull request #1732 from OpenDroneMap/nolocalseam
Deprecate texturing-skip-local-seam-leveling
2023-12-11 15:25:51 -05:00
Piero Toffanin 76a061b86a Deprecate texturing-skip-local-seam-leveling 2023-12-11 14:57:21 -05:00
Piero Toffanin 32d933027e
Merge pull request #1731 from pierotofy/median
C++ median smoothing filter
2023-12-08 11:34:05 -05:00
Piero Toffanin a29280157e Add radius parameter 2023-12-07 16:12:15 -05:00
Piero Toffanin 704c285b8f Remove eigen dep 2023-12-07 15:58:12 -05:00
Piero Toffanin 5674e68e9f Median filtering using fastrasterfilter 2023-12-07 18:49:43 +00:00
Piero Toffanin d419d9f038
Merge pull request #1729 from pierotofy/corridor
dem2mesh improvements
2023-12-06 19:34:47 -05:00
Piero Toffanin b3ae35f5e5 Update dem2mesh 2023-12-06 13:52:24 -05:00
Piero Toffanin 18d4d31be7 Fix pc2dem.py 2023-12-06 12:34:07 -05:00
Piero Toffanin 16ccd277ec
Merge pull request #1728 from pierotofy/corridor
Improved DEM generation efficiency
2023-12-05 14:06:54 -05:00
Piero Toffanin 7048868f28 Improved DEM generation efficiency 2023-12-05 14:01:15 -05:00
Piero Toffanin b14ffd919a Remove need for one intermediate raster 2023-12-05 12:26:14 -05:00
Piero Toffanin 4d1d0350a5
Update issue-triage.yml 2023-12-05 10:12:06 -05:00
Piero Toffanin 7261c29efc Respect max_tile_size parameter 2023-12-04 22:49:06 -05:00
Piero Toffanin 2ccad6ee9d Fix renderdem bounds calculation 2023-12-04 22:26:04 -05:00
Piero Toffanin 6acf9835e5 Update issue-triage.yml 2023-11-29 23:34:52 -05:00
Piero Toffanin 5b5df3aaf7 Add issue-trage.yml 2023-11-29 23:33:36 -05:00
Piero Toffanin 26cc9fbf93
Merge pull request #1725 from pierotofy/renderdem
Render DEM tiles using RenderDEM
2023-11-28 13:10:35 -05:00
Piero Toffanin b08f955963 Use URL 2023-11-28 11:33:09 -05:00
Piero Toffanin d028873f63 Use PDAL fork 2023-11-28 11:24:50 -05:00
Piero Toffanin 2d2b809530 Set maxTiles check only in absence of georeferenced photos 2023-11-28 00:43:11 -05:00
Piero Toffanin 7e05a5b04e Minor fix 2023-11-27 16:34:08 -05:00
Piero Toffanin e0ab6ae7ed Bump version 2023-11-27 16:25:11 -05:00
Piero Toffanin eceae8d2e4 Render DEM tiles using RenderDEM 2023-11-27 16:20:21 -05:00
Piero Toffanin 55570385c1
Merge pull request #1720 from pierotofy/autorerun
Feat: Auto rerun-from
2023-11-13 13:42:04 -05:00
Piero Toffanin eed840c9bb Always auto-rerun from beginning with split 2023-11-13 13:40:55 -05:00
Piero Toffanin 8376f24f08 Remove duplicate stmt 2023-11-08 11:40:22 -05:00
Piero Toffanin 6d70a4f0be Fix processopts slice 2023-11-08 11:14:15 -05:00
Piero Toffanin 6df5e0b711 Feat: Auto rerun-from 2023-11-08 11:07:20 -05:00
Piero Toffanin 5d9564fda3
Merge pull request #1717 from pierotofy/fcp
Pin Eigen 3.4
2023-11-05 15:52:28 -05:00
Piero Toffanin eccb203d7a Pin eigen34 2023-11-05 15:45:12 -05:00
Piero Toffanin 2df4afaecf
Merge pull request #1716 from pierotofy/fcp
Fix fast_floor in FPC Filter, Invalid PLY file (expected 'property uint8 views')
2023-11-04 20:22:44 -04:00
Piero Toffanin e5ed68846e Fix OpenMVS subscene logic 2023-11-04 20:00:57 -04:00
Piero Toffanin 7cf71628f3 Fix fast_floor in FPC Filter 2023-11-04 13:32:40 -04:00
31 changed files with 485 additions and 351 deletions

View file

@ -0,0 +1,33 @@
name: Issue Triage
on:
issues:
types:
- opened
jobs:
issue_triage:
runs-on: ubuntu-latest
permissions:
issues: write
steps:
- uses: pierotofy/issuewhiz@v1
with:
ghToken: ${{ secrets.GITHUB_TOKEN }}
openAI: ${{ secrets.OPENAI_TOKEN }}
filter: |
- "#"
variables: |
- Q: "A question about using a software or seeking guidance on doing something?"
- B: "Reporting an issue or a software bug?"
- P: "Describes an issue with processing a set of images or a particular dataset?"
- D: "Contains a link to a dataset or images?"
- E: "Contains a suggestion for an improvement or a feature request?"
- SC: "Describes an issue related to compiling or building source code?"
logic: |
- 'Q and (not B) and (not P) and (not E) and (not SC) and not (title_lowercase ~= ".*bug: .+")': [comment: "Could we move this conversation over to the forum at https://community.opendronemap.org? The forum is the right place to ask questions (we try to keep the GitHub issue tracker for feature requests and bugs only). Thank you!", close: true, stop: true]
- "B and (not P) and (not E) and (not SC)": [label: "software fault", stop: true]
- "P and D": [label: "possible software fault", stop: true]
- "P and (not D) and (not SC) and (not E)": [comment: "Thanks for the report, but it looks like you didn't include a copy of your dataset for us to reproduce this issue? Please make sure to follow our [issue guidelines](https://github.com/OpenDroneMap/ODM/blob/master/docs/issue_template.md) :pray: ", close: true, stop: true]
- "E": [label: enhancement, stop: true]
- "SC": [label: "possible software fault"]
signature: "p.s. I'm just an automated script, not a human being."
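
The rules above are ordered, short-circuiting boolean expressions over the classifier variables (Q, B, P, D, E, SC) and the issue title. As a rough illustration only (issuewhiz's actual evaluator is not part of this diff, and the ~= operator is assumed to behave like a regex match), the logic corresponds to this Python sketch:

import re

def triage(q, b, p, d, e, sc, title_lowercase):
    # First matching rule wins; rules mirror the YAML logic above, in order.
    if q and not (b or p or e or sc) and not re.match(r".*bug: .+", title_lowercase):
        return "comment + close: redirect the question to the forum"
    if b and not (p or e or sc):
        return 'label: "software fault"'
    if p and d:
        return 'label: "possible software fault"'
    if p and not (d or sc or e):
        return "comment + close: ask for a dataset"
    if e:
        return "label: enhancement"
    if sc:
        return 'label: "possible software fault"'
    return "no action"

print(triage(False, True, False, False, False, False, "crash on start"))  # label: "software fault"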

View file

@ -179,6 +179,7 @@ set(custom_libs OpenSfM
Obj2Tiles
OpenPointClass
ExifTool
RenderDEM
)
externalproject_add(mve
@ -222,7 +223,7 @@ externalproject_add(poissonrecon
externalproject_add(dem2mesh
GIT_REPOSITORY https://github.com/OpenDroneMap/dem2mesh.git
GIT_TAG 313
GIT_TAG 334
PREFIX ${SB_BINARY_DIR}/dem2mesh
SOURCE_DIR ${SB_SOURCE_DIR}/dem2mesh
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
@ -250,6 +251,15 @@ externalproject_add(odm_orthophoto
${WIN32_CMAKE_ARGS} ${WIN32_GDAL_ARGS}
)
externalproject_add(fastrasterfilter
GIT_REPOSITORY https://github.com/OpenDroneMap/FastRasterFilter.git
GIT_TAG main
PREFIX ${SB_BINARY_DIR}/fastrasterfilter
SOURCE_DIR ${SB_SOURCE_DIR}/fastrasterfilter
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS} ${WIN32_GDAL_ARGS}
)
externalproject_add(lastools
GIT_REPOSITORY https://github.com/OpenDroneMap/LAStools.git
GIT_TAG 250

View file

@ -8,7 +8,7 @@ ExternalProject_Add(${_proj_name}
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/OpenDroneMap/FPCFilter
GIT_TAG 320
GIT_TAG 331
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------

View file

@ -14,7 +14,7 @@ externalproject_add(vcg
externalproject_add(eigen34
GIT_REPOSITORY https://gitlab.com/libeigen/eigen.git
GIT_TAG 3.4
GIT_TAG 7176ae16238ded7fb5ed30a7f5215825b3abd134
UPDATE_COMMAND ""
SOURCE_DIR ${SB_SOURCE_DIR}/eigen34
CONFIGURE_COMMAND ""

View file

@ -16,7 +16,7 @@ ExternalProject_Add(${_proj_name}
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
URL https://github.com/PDAL/PDAL/archive/refs/tags/2.4.3.zip
URL https://github.com/OpenDroneMap/PDAL/archive/refs/heads/333.zip
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------

View file

@ -0,0 +1,30 @@
set(_proj_name renderdem)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
ExternalProject_Add(${_proj_name}
DEPENDS pdal
PREFIX ${_SB_BINARY_DIR}
TMP_DIR ${_SB_BINARY_DIR}/tmp
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/OpenDroneMap/RenderDEM
GIT_TAG main
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CMAKE_ARGS
-DPDAL_DIR=${SB_INSTALL_DIR}/lib/cmake/PDAL
-DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------
INSTALL_DIR ${SB_INSTALL_DIR}
#--Output logging-------------
LOG_DOWNLOAD OFF
LOG_CONFIGURE OFF
LOG_BUILD OFF
)

View file

@ -1 +1 @@
3.3.0
3.5.1

View file

@ -127,6 +127,9 @@ installreqs() {
installdepsfromsnapcraft build openmvs
set -e
# edt requires numpy to build
pip install --ignore-installed numpy==1.23.1
pip install --ignore-installed -r requirements.txt
#if [ ! -z "$GPU_INSTALL" ]; then
#fi

View file

@ -51,6 +51,5 @@ commands.create_dem(args.point_cloud,
outdir=outdir,
resolution=args.resolution,
decimation=1,
max_workers=multiprocessing.cpu_count(),
keep_unfilled_copy=False
max_workers=multiprocessing.cpu_count()
)

View file

@ -0,0 +1,76 @@
from opendm import log
from shlex import _find_unsafe
import json
import os
def double_quote(s):
"""Return a shell-escaped version of the string *s*."""
if not s:
return '""'
if _find_unsafe(s) is None:
return s
# use double quotes, and prefix double quotes with a \
# the string $"b is then quoted as "$\"b"
return '"' + s.replace('"', '\\\"') + '"'
def args_to_dict(args):
args_dict = vars(args)
result = {}
for k in sorted(args_dict.keys()):
# Skip _is_set keys
if k.endswith("_is_set"):
continue
# Don't leak token
if k == 'sm_cluster' and args_dict[k] is not None:
result[k] = True
else:
result[k] = args_dict[k]
return result
def save_opts(opts_json, args):
try:
with open(opts_json, "w", encoding='utf-8') as f:
f.write(json.dumps(args_to_dict(args)))
except Exception as e:
log.ODM_WARNING("Cannot save options to %s: %s" % (opts_json, str(e)))
def compare_args(opts_json, args, rerun_stages):
if not os.path.isfile(opts_json):
return {}
try:
diff = {}
with open(opts_json, "r", encoding="utf-8") as f:
prev_args = json.loads(f.read())
cur_args = args_to_dict(args)
for opt in cur_args:
cur_value = cur_args[opt]
prev_value = prev_args.get(opt, None)
stage = rerun_stages.get(opt, None)
if stage is not None and cur_value != prev_value:
diff[opt] = prev_value
return diff
except:
return {}
def find_rerun_stage(opts_json, args, rerun_stages, processopts):
# Find the proper rerun stage if one is not explicitly set
if not ('rerun_is_set' in args or 'rerun_from_is_set' in args or 'rerun_all_is_set' in args):
args_diff = compare_args(opts_json, args, rerun_stages)
if args_diff:
if 'split_is_set' in args:
return processopts[processopts.index('dataset'):], args_diff
try:
stage_idxs = [processopts.index(rerun_stages[opt]) for opt in args_diff.keys() if rerun_stages[opt] is not None]
return processopts[min(stage_idxs):], args_diff
except ValueError as e:
print(str(e))
return None, {}
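
Together these helpers implement the auto rerun-from feature: a successful run saves its options to options.json (see run.py below); on the next invocation the saved options are diffed against the current ones, and each changed option is mapped through rerun_stages (defined in config.py below) to the earliest stage it affects. A hypothetical usage sketch, with the opts.json contents and option values invented for illustration:

from argparse import Namespace

processopts = ['dataset', 'split', 'merge', 'opensfm', 'openmvs', 'odm_filterpoints',
               'odm_meshing', 'mvs_texturing', 'odm_georeferencing',
               'odm_dem', 'odm_orthophoto', 'odm_report', 'odm_postprocess']

# Previous run saved {"mesh_size": 200000} to opts.json; the user now passes
# --mesh-size 300000, and no explicit --rerun* flag is set.
args = Namespace(mesh_size=300000)
stages, diff = find_rerun_stage("opts.json", args, {'mesh_size': 'odm_meshing'}, processopts)
# stages[0] == 'odm_meshing', diff == {'mesh_size': 200000};
# run.py then sets args.rerun_from = stages, so processing resumes at odm_meshing.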

View file

@ -13,6 +13,100 @@ processopts = ['dataset', 'split', 'merge', 'opensfm', 'openmvs', 'odm_filterpoi
'odm_meshing', 'mvs_texturing', 'odm_georeferencing',
'odm_dem', 'odm_orthophoto', 'odm_report', 'odm_postprocess']
rerun_stages = {
'3d_tiles': 'odm_postprocess',
'align': 'odm_georeferencing',
'auto_boundary': 'odm_filterpoints',
'auto_boundary_distance': 'odm_filterpoints',
'bg_removal': 'dataset',
'boundary': 'odm_filterpoints',
'build_overviews': 'odm_orthophoto',
'camera_lens': 'dataset',
'cameras': 'dataset',
'cog': 'odm_dem',
'copy_to': 'odm_postprocess',
'crop': 'odm_georeferencing',
'dem_decimation': 'odm_dem',
'dem_euclidean_map': 'odm_dem',
'dem_gapfill_steps': 'odm_dem',
'dem_resolution': 'odm_dem',
'dsm': 'odm_dem',
'dtm': 'odm_dem',
'end_with': None,
'fast_orthophoto': 'odm_filterpoints',
'feature_quality': 'opensfm',
'feature_type': 'opensfm',
'force_gps': 'opensfm',
'gcp': 'dataset',
'geo': 'dataset',
'gltf': 'mvs_texturing',
'gps_accuracy': 'dataset',
'help': None,
'ignore_gsd': 'opensfm',
'matcher_neighbors': 'opensfm',
'matcher_order': 'opensfm',
'matcher_type': 'opensfm',
'max_concurrency': None,
'merge': 'Merge',
'mesh_octree_depth': 'odm_meshing',
'mesh_size': 'odm_meshing',
'min_num_features': 'opensfm',
'name': None,
'no_gpu': None,
'optimize_disk_space': None,
'orthophoto_compression': 'odm_orthophoto',
'orthophoto_cutline': 'odm_orthophoto',
'orthophoto_kmz': 'odm_orthophoto',
'orthophoto_no_tiled': 'odm_orthophoto',
'orthophoto_png': 'odm_orthophoto',
'orthophoto_resolution': 'odm_orthophoto',
'pc_classify': 'odm_georeferencing',
'pc_copc': 'odm_georeferencing',
'pc_csv': 'odm_georeferencing',
'pc_ept': 'odm_georeferencing',
'pc_filter': 'openmvs',
'pc_las': 'odm_georeferencing',
'pc_quality': 'opensfm',
'pc_rectify': 'odm_georeferencing',
'pc_sample': 'odm_filterpoints',
'pc_skip_geometric': 'openmvs',
'primary_band': 'dataset',
'project_path': None,
'radiometric_calibration': 'opensfm',
'rerun': None,
'rerun_all': None,
'rerun_from': None,
'rolling_shutter': 'opensfm',
'rolling_shutter_readout': 'opensfm',
'sfm_algorithm': 'opensfm',
'sfm_no_partial': 'opensfm',
'skip_3dmodel': 'odm_meshing',
'skip_band_alignment': 'opensfm',
'skip_orthophoto': 'odm_orthophoto',
'skip_report': 'odm_report',
'sky_removal': 'dataset',
'sm_cluster': 'split',
'sm_no_align': 'split',
'smrf_scalar': 'odm_dem',
'smrf_slope': 'odm_dem',
'smrf_threshold': 'odm_dem',
'smrf_window': 'odm_dem',
'split': 'split',
'split_image_groups': 'split',
'split_overlap': 'split',
'texturing_keep_unseen_faces': 'mvs_texturing',
'texturing_single_material': 'mvs_texturing',
'texturing_skip_global_seam_leveling': 'mvs_texturing',
'tiles': 'odm_dem',
'use_3dmesh': 'mvs_texturing',
'use_exif': 'dataset',
'use_fixed_camera_params': 'opensfm',
'use_hybrid_bundle_adjustment': 'opensfm',
'version': None,
'video_limit': 'dataset',
'video_resolution': 'dataset',
}
with open(os.path.join(context.root_path, 'VERSION')) as version_file:
__version__ = version_file.read().strip()
@ -123,7 +217,7 @@ def config(argv=None, parser=None):
parser.add_argument('--feature-type',
metavar='<string>',
action=StoreValue,
default='sift',
default='dspsift',
choices=['akaze', 'dspsift', 'hahog', 'orb', 'sift'],
help=('Choose the algorithm for extracting keypoints and computing descriptors. '
'Can be one of: %(choices)s. Default: '
@ -391,7 +485,7 @@ def config(argv=None, parser=None):
metavar='<positive float>',
action=StoreValue,
type=float,
default=2.5,
default=5,
help='Filters the point cloud by removing points that deviate more than N standard deviations from the local mean. Set to 0 to disable filtering. '
'Default: %(default)s')
@ -448,12 +542,6 @@ def config(argv=None, parser=None):
default=False,
help=('Skip normalization of colors across all images. Useful when processing radiometric data. Default: %(default)s'))
parser.add_argument('--texturing-skip-local-seam-leveling',
action=StoreTrue,
nargs=0,
default=False,
help='Skip the blending of colors near seams. Default: %(default)s')
parser.add_argument('--texturing-keep-unseen-faces',
action=StoreTrue,
nargs=0,
@ -750,7 +838,7 @@ def config(argv=None, parser=None):
type=float,
action=StoreValue,
metavar='<positive float>',
default=10,
default=3,
help='Set a value in meters for the GPS Dilution of Precision (DOP) '
'information for all images. If your images are tagged '
'with high precision GPS information (RTK), this value will be automatically '
@ -792,7 +880,7 @@ def config(argv=None, parser=None):
'Default: %(default)s'))
args, unknown = parser.parse_known_args(argv)
DEPRECATED = ["--verbose", "--debug", "--time", "--resize-to", "--depthmap-resolution", "--pc-geometric", "--texturing-data-term", "--texturing-outlier-removal-type", "--texturing-tone-mapping"]
DEPRECATED = ["--verbose", "--debug", "--time", "--resize-to", "--depthmap-resolution", "--pc-geometric", "--texturing-data-term", "--texturing-outlier-removal-type", "--texturing-tone-mapping", "--texturing-skip-local-seam-leveling"]
unknown_e = [p for p in unknown if p not in DEPRECATED]
if len(unknown_e) > 0:
raise parser.error("unrecognized arguments: %s" % " ".join(unknown_e))

View file

@ -5,23 +5,17 @@ import numpy
import math
import time
import shutil
import functools
import threading
import glob
import re
from joblib import delayed, Parallel
from opendm.system import run
from opendm import point_cloud
from opendm import io
from opendm import system
from opendm.concurrency import get_max_memory, parallel_map, get_total_memory
from scipy import ndimage
from datetime import datetime
from opendm.vendor.gdal_fillnodata import main as gdal_fillnodata
from opendm import log
try:
import Queue as queue
except:
import queue
import threading
from .ground_rectification.rectify import run_rectification
from . import pdal
@ -69,119 +63,51 @@ error = None
def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56'], gapfill=True,
outdir='', resolution=0.1, max_workers=1, max_tile_size=4096,
decimation=None, keep_unfilled_copy=False,
decimation=None, with_euclidean_map=False,
apply_smoothing=True, max_tiles=None):
""" Create DEM from multiple radii, and optionally gapfill """
global error
error = None
start = datetime.now()
if not os.path.exists(outdir):
log.ODM_INFO("Creating %s" % outdir)
os.mkdir(outdir)
extent = point_cloud.get_extent(input_point_cloud)
log.ODM_INFO("Point cloud bounds are [minx: %s, maxx: %s] [miny: %s, maxy: %s]" % (extent['minx'], extent['maxx'], extent['miny'], extent['maxy']))
ext_width = extent['maxx'] - extent['minx']
ext_height = extent['maxy'] - extent['miny']
w, h = (int(math.ceil(ext_width / float(resolution))),
int(math.ceil(ext_height / float(resolution))))
# Set a floor, no matter the resolution parameter
# (sometimes a wrongly estimated scale of the model can cause the resolution
# to be set unrealistically low, causing errors)
RES_FLOOR = 64
if w < RES_FLOOR and h < RES_FLOOR:
prev_w, prev_h = w, h
if w >= h:
w, h = (RES_FLOOR, int(math.ceil(ext_height / ext_width * RES_FLOOR)))
else:
w, h = (int(math.ceil(ext_width / ext_height * RES_FLOOR)), RES_FLOOR)
floor_ratio = prev_w / float(w)
resolution *= floor_ratio
radiuses = [str(float(r) * floor_ratio) for r in radiuses]
log.ODM_WARNING("Really low resolution DEM requested %s will set floor at %s pixels. Resolution changed to %s. The scale of this reconstruction might be off." % ((prev_w, prev_h), RES_FLOOR, resolution))
final_dem_pixels = w * h
num_splits = int(max(1, math.ceil(math.log(math.ceil(final_dem_pixels / float(max_tile_size * max_tile_size)))/math.log(2))))
num_tiles = num_splits * num_splits
log.ODM_INFO("DEM resolution is %s, max tile size is %s, will split DEM generation into %s tiles" % ((h, w), max_tile_size, num_tiles))
tile_bounds_width = ext_width / float(num_splits)
tile_bounds_height = ext_height / float(num_splits)
tiles = []
for r in radiuses:
minx = extent['minx']
for x in range(num_splits):
miny = extent['miny']
if x == num_splits - 1:
maxx = extent['maxx']
else:
maxx = minx + tile_bounds_width
for y in range(num_splits):
if y == num_splits - 1:
maxy = extent['maxy']
else:
maxy = miny + tile_bounds_height
filename = os.path.join(os.path.abspath(outdir), '%s_r%s_x%s_y%s.tif' % (dem_type, r, x, y))
tiles.append({
'radius': r,
'bounds': {
'minx': minx,
'maxx': maxx,
'miny': miny,
'maxy': maxy
},
'filename': filename
})
miny = maxy
minx = maxx
# Safety check
if max_tiles is not None:
if len(tiles) > max_tiles and (final_dem_pixels * 4) > get_total_memory():
raise system.ExitException("Max tiles limit exceeded (%s). This is a strong indicator that the reconstruction failed and we would probably run out of memory trying to process this" % max_tiles)
# Sort tiles by increasing radius
tiles.sort(key=lambda t: float(t['radius']), reverse=True)
def process_tile(q):
log.ODM_INFO("Generating %s (%s, radius: %s, resolution: %s)" % (q['filename'], output_type, q['radius'], resolution))
d = pdal.json_gdal_base(q['filename'], output_type, q['radius'], resolution, q['bounds'])
if dem_type == 'dtm':
d = pdal.json_add_classification_filter(d, 2)
if decimation is not None:
d = pdal.json_add_decimation_filter(d, decimation)
pdal.json_add_readers(d, [input_point_cloud])
pdal.run_pipeline(d)
parallel_map(process_tile, tiles, max_workers)
kwargs = {
'input': input_point_cloud,
'outdir': outdir,
'outputType': output_type,
'radiuses': ",".join(map(str, radiuses)),
'resolution': resolution,
'maxTiles': 0 if max_tiles is None else max_tiles,
'decimation': 1 if decimation is None else decimation,
'classification': 2 if dem_type == 'dtm' else -1,
'tileSize': max_tile_size
}
system.run('renderdem "{input}" '
'--outdir "{outdir}" '
'--output-type {outputType} '
'--radiuses {radiuses} '
'--resolution {resolution} '
'--max-tiles {maxTiles} '
'--decimation {decimation} '
'--classification {classification} '
'--tile-size {tileSize} '
'--force '.format(**kwargs), env_vars={'OMP_NUM_THREADS': max_workers})
output_file = "%s.tif" % dem_type
output_path = os.path.abspath(os.path.join(outdir, output_file))
# Verify tile results
for t in tiles:
if not os.path.exists(t['filename']):
raise Exception("Error creating %s, %s failed to be created" % (output_file, t['filename']))
# Fetch tiles
tiles = []
for p in glob.glob(os.path.join(os.path.abspath(outdir), "*.tif")):
filename = os.path.basename(p)
m = re.match("^r([\d\.]+)_x\d+_y\d+\.tif", filename)
if m is not None:
tiles.append({'filename': p, 'radius': float(m.group(1))})
if len(tiles) == 0:
raise system.ExitException("No DEM tiles were generated, something went wrong")
log.ODM_INFO("Generated %s tiles" % len(tiles))
# Sort tiles by decreasing radius
tiles.sort(key=lambda t: float(t['radius']), reverse=True)
# Create virtual raster
tiles_vrt_path = os.path.abspath(os.path.join(outdir, "tiles.vrt"))
@ -193,7 +119,6 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
run('gdalbuildvrt -input_file_list "%s" "%s" ' % (tiles_file_list, tiles_vrt_path))
merged_vrt_path = os.path.abspath(os.path.join(outdir, "merged.vrt"))
geotiff_tmp_path = os.path.abspath(os.path.join(outdir, 'tiles.tmp.tif'))
geotiff_small_path = os.path.abspath(os.path.join(outdir, 'tiles.small.tif'))
geotiff_small_filled_path = os.path.abspath(os.path.join(outdir, 'tiles.small_filled.tif'))
geotiff_path = os.path.abspath(os.path.join(outdir, 'tiles.tif'))
@ -205,7 +130,6 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
'tiles_vrt': tiles_vrt_path,
'merged_vrt': merged_vrt_path,
'geotiff': geotiff_path,
'geotiff_tmp': geotiff_tmp_path,
'geotiff_small': geotiff_small_path,
'geotiff_small_filled': geotiff_small_filled_path
}
@ -214,31 +138,27 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
# Sometimes, for some reason gdal_fillnodata.py
# behaves strangely when reading data directly from a .VRT
# so we need to convert to GeoTIFF first.
# Scale to 10% size
run('gdal_translate '
'-co NUM_THREADS={threads} '
'-co BIGTIFF=IF_SAFER '
'-co COMPRESS=DEFLATE '
'--config GDAL_CACHEMAX {max_memory}% '
'"{tiles_vrt}" "{geotiff_tmp}"'.format(**kwargs))
# Scale to 10% size
run('gdal_translate '
'-co NUM_THREADS={threads} '
'-co BIGTIFF=IF_SAFER '
'--config GDAL_CACHEMAX {max_memory}% '
'-outsize 10% 0 '
'"{geotiff_tmp}" "{geotiff_small}"'.format(**kwargs))
'-outsize 10% 0 '
'"{tiles_vrt}" "{geotiff_small}"'.format(**kwargs))
# Fill scaled
gdal_fillnodata(['.',
'-co', 'NUM_THREADS=%s' % kwargs['threads'],
'-co', 'BIGTIFF=IF_SAFER',
'-co', 'COMPRESS=DEFLATE',
'--config', 'GDAL_CACHE_MAX', str(kwargs['max_memory']) + '%',
'-b', '1',
'-of', 'GTiff',
kwargs['geotiff_small'], kwargs['geotiff_small_filled']])
# Merge filled scaled DEM with unfilled DEM using bilinear interpolation
run('gdalbuildvrt -resolution highest -r bilinear "%s" "%s" "%s"' % (merged_vrt_path, geotiff_small_filled_path, geotiff_tmp_path))
run('gdalbuildvrt -resolution highest -r bilinear "%s" "%s" "%s"' % (merged_vrt_path, geotiff_small_filled_path, tiles_vrt_path))
run('gdal_translate '
'-co NUM_THREADS={threads} '
'-co TILED=YES '
@ -261,14 +181,14 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
else:
os.replace(geotiff_path, output_path)
if os.path.exists(geotiff_tmp_path):
if not keep_unfilled_copy:
os.remove(geotiff_tmp_path)
else:
os.replace(geotiff_tmp_path, io.related_file_path(output_path, postfix=".unfilled"))
if os.path.exists(tiles_vrt_path):
if with_euclidean_map:
emap_path = io.related_file_path(output_path, postfix=".euclideand")
compute_euclidean_map(tiles_vrt_path, emap_path, overwrite=True)
for cleanup_file in [tiles_vrt_path, tiles_file_list, merged_vrt_path, geotiff_small_path, geotiff_small_filled_path]:
if os.path.exists(cleanup_file): os.remove(cleanup_file)
for t in tiles:
if os.path.exists(t['filename']): os.remove(t['filename'])
@ -284,12 +204,20 @@ def compute_euclidean_map(geotiff_path, output_path, overwrite=False):
with rasterio.open(geotiff_path) as f:
nodata = f.nodatavals[0]
if not os.path.exists(output_path) or overwrite:
if not os.path.isfile(output_path) or overwrite:
if os.path.isfile(output_path):
os.remove(output_path)
log.ODM_INFO("Computing euclidean distance: %s" % output_path)
if gdal_proximity is not None:
try:
gdal_proximity(['gdal_proximity.py', geotiff_path, output_path, '-values', str(nodata)])
gdal_proximity(['gdal_proximity.py',
geotiff_path, output_path, '-values', str(nodata),
'-co', 'TILED=YES',
'-co', 'BIGTIFF=IF_SAFER',
'-co', 'COMPRESS=DEFLATE',
])
except Exception as e:
log.ODM_WARNING("Cannot compute euclidean distance: %s" % str(e))
@ -305,106 +233,31 @@ def compute_euclidean_map(geotiff_path, output_path, overwrite=False):
return output_path
def median_smoothing(geotiff_path, output_path, smoothing_iterations=1, window_size=512, num_workers=1):
def median_smoothing(geotiff_path, output_path, window_size=512, num_workers=1, radius=4):
""" Apply median smoothing """
start = datetime.now()
if not os.path.exists(geotiff_path):
raise Exception('File %s does not exist!' % geotiff_path)
# Prepare temporary files
folder_path, output_filename = os.path.split(output_path)
basename, ext = os.path.splitext(output_filename)
output_dirty_in = os.path.join(folder_path, "{}.dirty_1{}".format(basename, ext))
output_dirty_out = os.path.join(folder_path, "{}.dirty_2{}".format(basename, ext))
log.ODM_INFO('Starting smoothing...')
with rasterio.open(geotiff_path, num_threads=num_workers) as img, rasterio.open(output_dirty_in, "w+", BIGTIFF="IF_SAFER", num_threads=num_workers, **img.profile) as imgout, rasterio.open(output_dirty_out, "w+", BIGTIFF="IF_SAFER", num_threads=num_workers, **img.profile) as imgout2:
nodata = img.nodatavals[0]
dtype = img.dtypes[0]
shape = img.shape
for i in range(smoothing_iterations):
log.ODM_INFO("Smoothing iteration %s" % str(i + 1))
rows, cols = numpy.meshgrid(numpy.arange(0, shape[0], window_size), numpy.arange(0, shape[1], window_size))
rows = rows.flatten()
cols = cols.flatten()
rows_end = numpy.minimum(rows + window_size, shape[0])
cols_end= numpy.minimum(cols + window_size, shape[1])
windows = numpy.dstack((rows, cols, rows_end, cols_end)).reshape(-1, 4)
filter = functools.partial(ndimage.median_filter, size=9, output=dtype, mode='nearest')
# We cannot read/write to the same file from multiple threads without causing race conditions.
# To safely read/write from multiple threads, we use a lock to protect the DatasetReader/Writer.
read_lock = threading.Lock()
write_lock = threading.Lock()
# threading backend and GIL released filter are important for memory efficiency and multi-core performance
Parallel(n_jobs=num_workers, backend='threading')(delayed(window_filter_2d)(img, imgout, nodata , window, 9, filter, read_lock, write_lock) for window in windows)
# Between each iteration we swap the input and output temporary files
#img_in, img_out = img_out, img_in
if (i == 0):
img = imgout
imgout = imgout2
else:
img, imgout = imgout, img
# If the number of iterations was even, we need to swap temporary files
if (smoothing_iterations % 2 != 0):
output_dirty_in, output_dirty_out = output_dirty_out, output_dirty_in
# Cleaning temporary files
if os.path.exists(output_dirty_out):
os.replace(output_dirty_out, output_path)
if os.path.exists(output_dirty_in):
os.remove(output_dirty_in)
kwargs = {
'input': geotiff_path,
'output': output_path,
'window': window_size,
'radius': radius,
}
system.run('fastrasterfilter "{input}" '
'--output "{output}" '
'--window-size {window} '
'--radius {radius} '
'--co TILED=YES '
'--co BIGTIFF=IF_SAFER '
'--co COMPRESS=DEFLATE '.format(**kwargs), env_vars={'OMP_NUM_THREADS': num_workers})
log.ODM_INFO('Completed smoothing to create %s in %s' % (output_path, datetime.now() - start))
return output_path
def window_filter_2d(img, imgout, nodata, window, kernel_size, filter, read_lock, write_lock):
"""
Apply a filter to dem within a window, expects to work with kernal based filters
:param img: path to the geotiff to filter
:param imgout: path to write the giltered geotiff to. It can be the same as img to do the modification in place.
:param window: the window to apply the filter, should be a list contains row start, col_start, row_end, col_end
:param kernel_size: the size of the kernel for the filter, works with odd numbers, need to test if it works with even numbers
:param filter: the filter function which takes a 2d array as input and filter results as output.
:param read_lock: threading lock for the read operation
:param write_lock: threading lock for the write operation
"""
shape = img.shape[:2]
if window[0] < 0 or window[1] < 0 or window[2] > shape[0] or window[3] > shape[1]:
raise Exception('Window is out of bounds')
expanded_window = [ max(0, window[0] - kernel_size // 2), max(0, window[1] - kernel_size // 2), min(shape[0], window[2] + kernel_size // 2), min(shape[1], window[3] + kernel_size // 2) ]
# Read input window
width = expanded_window[3] - expanded_window[1]
height = expanded_window[2] - expanded_window[0]
rasterio_window = rasterio.windows.Window(col_off=expanded_window[1], row_off=expanded_window[0], width=width, height=height)
with read_lock:
win_arr = img.read(indexes=1, window=rasterio_window)
# Should have a better way to handle nodata, similar to the way the filter algorithms handle the border (reflection, nearest, interpolation, etc).
# For now will follow the old approach to guarantee identical outputs
nodata_locs = win_arr == nodata
win_arr = filter(win_arr)
win_arr[nodata_locs] = nodata
win_arr = win_arr[window[0] - expanded_window[0] : window[2] - expanded_window[0], window[1] - expanded_window[1] : window[3] - expanded_window[1]]
# Write output window
width = window[3] - window[1]
height = window[2] - window[0]
rasterio_window = rasterio.windows.Window(col_off=window[1], row_off=window[0], width=width, height=height)
with write_lock:
imgout.write(win_arr, indexes=1, window=rasterio_window)
def get_dem_radius_steps(stats_file, steps, resolution, multiplier = 1.0):
radius_steps = [point_cloud.get_spacing(stats_file, resolution) * multiplier]
for _ in range(steps - 1):
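
Net effect: instead of computing tile bounds in Python and launching one PDAL pipeline per tile, create_dem now makes a single call to the external renderdem binary (parallelized via OMP_NUM_THREADS) and merges the resulting tiles with GDAL; the unfilled-copy option is replaced by an on-demand euclidean map. A call under the new signature might look like this (paths and values are illustrative):

from opendm.dem import commands

commands.create_dem("odm_georeferencing/odm_georeferenced_model.laz",
                    "dsm",                      # dem_type: 'dsm' or 'dtm'
                    output_type="max",
                    radiuses=["0.56", "0.80"],  # merged from largest to smallest radius
                    gapfill=True,
                    outdir="odm_dem",
                    resolution=0.05,            # meters per pixel
                    max_workers=4,              # forwarded as OMP_NUM_THREADS to renderdem
                    with_euclidean_map=False,   # replaces the old keep_unfilled_copy flag
                    apply_smoothing=True,
                    max_tiles=None)             # None disables the tile-count safety check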

View file

@ -12,6 +12,9 @@ class GeoFile:
with open(self.geo_path, 'r') as f:
contents = f.read().strip()
# Strip eventual BOM characters
contents = contents.replace('\ufeff', '')
lines = list(map(str.strip, contents.split('\n')))
if lines:
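
For context: a geo file saved as UTF-8 with a byte order mark starts with the invisible '\ufeff' character, which previously ended up glued to the first token and broke parsing. A minimal illustration with made-up file contents:

contents = "\ufeffEPSG:4326\nimg_0001.jpg 16.50 45.10 120.0"  # hypothetical geo file text
contents = contents.replace('\ufeff', '')                     # same fix as above
assert contents.split('\n')[0] == "EPSG:4326"                 # header parses cleanly again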

View file

@ -279,9 +279,10 @@ def obj2glb(input_obj, output_glb, rtc=(None, None), draco_compression=True, _in
)
gltf.extensionsRequired = ['KHR_materials_unlit']
gltf.extensionsUsed = ['KHR_materials_unlit']
if rtc != (None, None) and len(rtc) >= 2:
gltf.extensionsUsed = ['CESIUM_RTC', 'KHR_materials_unlit']
gltf.extensionsUsed.append('CESIUM_RTC')
gltf.extensions = {
'CESIUM_RTC': {
'center': [float(rtc[0]), float(rtc[1]), 0.0]

View file

@ -7,7 +7,7 @@ import dateutil.parser
import shutil
import multiprocessing
from opendm.loghelpers import double_quote, args_to_dict
from opendm.arghelpers import double_quote, args_to_dict
from vmem import virtual_memory
if sys.platform == 'win32' or os.getenv('no_ansiesc'):

View file

@ -1,28 +0,0 @@
from shlex import _find_unsafe
def double_quote(s):
"""Return a shell-escaped version of the string *s*."""
if not s:
return '""'
if _find_unsafe(s) is None:
return s
# use double quotes, and prefix double quotes with a \
# the string $"b is then quoted as "$\"b"
return '"' + s.replace('"', '\\\"') + '"'
def args_to_dict(args):
args_dict = vars(args)
result = {}
for k in sorted(args_dict.keys()):
# Skip _is_set keys
if k.endswith("_is_set"):
continue
# Don't leak token
if k == 'sm_cluster' and args_dict[k] is not None:
result[k] = True
else:
result[k] = args_dict[k]
return result

View file

@ -187,7 +187,7 @@ def screened_poisson_reconstruction(inPointCloud, outMesh, depth = 8, samples =
if threads < 1:
break
else:
log.ODM_WARNING("PoissonRecon failed with %s threads, let's retry with %s..." % (threads, threads // 2))
log.ODM_WARNING("PoissonRecon failed with %s threads, let's retry with %s..." % (threads * 2, threads))
# Cleanup and reduce vertex count if necessary
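
For context: in the surrounding retry loop (only partially shown), threads has already been halved by the time this warning is emitted, so the attempt that just failed used threads * 2. A runnable sketch with a hypothetical try_poisson stand-in, and print standing in for log.ODM_WARNING:

def try_poisson(threads):
    return threads <= 2  # hypothetical stand-in: pretend only small thread counts succeed

threads = 8
while not try_poisson(threads):
    threads //= 2        # 'threads' now holds the *next* count to try
    if threads < 1:
        break
    # ...so the count that just failed is threads * 2, which the fixed message reports
    print("PoissonRecon failed with %s threads, let's retry with %s..." % (threads * 2, threads))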

View file

@ -64,7 +64,6 @@ class OSFMContext:
"Check that the images have enough overlap, "
"that there are enough recognizable features "
"and that the images are in focus. "
"You could also try to increase the --min-num-features parameter."
"The program will now exit.")
if rolling_shutter_correct:
@ -794,3 +793,12 @@ def get_all_submodel_paths(submodels_path, *all_paths):
result.append([os.path.join(submodels_path, f, ap) for ap in all_paths])
return result
def is_submodel(opensfm_root):
# A bit hackish, but works without introducing additional markers / flags
# Look at the path of the opensfm directory and see if "submodel_" is part of it
parts = os.path.abspath(opensfm_root).split(os.path.sep)
return (len(parts) >= 2 and parts[-2][:9] == "submodel_") or \
os.path.isfile(os.path.join(opensfm_root, "split_merge_stop_at_reconstruction.txt")) or \
os.path.isfile(os.path.join(opensfm_root, "features", "empty"))

View file

@ -430,7 +430,8 @@ class ODM_Photo:
camera_projection = camera_projection.lower()
# Parrot Sequoia's "fisheye" model maps to "fisheye_opencv"
if camera_projection == "fisheye" and self.camera_make.lower() == "parrot" and self.camera_model.lower() == "sequoia":
# or better yet, replace all fisheye with fisheye_opencv, but wait to change API signature
if camera_projection == "fisheye":
camera_projection = "fisheye_opencv"
if camera_projection in projections:

View file

@ -9,6 +9,8 @@ from opendm.concurrency import parallel_map
from opendm.utils import double_quote
from opendm.boundary import as_polygon, as_geojson
from opendm.dem.pdal import run_pipeline
from opendm.opc import classify
from opendm.dem import commands
def ply_info(input_ply):
if not os.path.exists(input_ply):
@ -274,6 +276,32 @@ def merge_ply(input_point_cloud_files, output_file, dims=None):
system.run(' '.join(cmd))
def post_point_cloud_steps(args, tree, rerun=False):
# Classify and rectify before generating derivate files
if args.pc_classify:
pc_classify_marker = os.path.join(tree.odm_georeferencing, 'pc_classify_done.txt')
if not io.file_exists(pc_classify_marker) or rerun:
log.ODM_INFO("Classifying {} using Simple Morphological Filter (1/2)".format(tree.odm_georeferencing_model_laz))
commands.classify(tree.odm_georeferencing_model_laz,
args.smrf_scalar,
args.smrf_slope,
args.smrf_threshold,
args.smrf_window
)
log.ODM_INFO("Classifying {} using OpenPointClass (2/2)".format(tree.odm_georeferencing_model_laz))
classify(tree.odm_georeferencing_model_laz, args.max_concurrency)
with open(pc_classify_marker, 'w') as f:
f.write('Classify: smrf\n')
f.write('Scalar: {}\n'.format(args.smrf_scalar))
f.write('Slope: {}\n'.format(args.smrf_slope))
f.write('Threshold: {}\n'.format(args.smrf_threshold))
f.write('Window: {}\n'.format(args.smrf_window))
if args.pc_rectify:
commands.rectify(tree.odm_georeferencing_model_laz)
# XYZ point cloud output
if args.pc_csv:
log.ODM_INFO("Creating CSV file (XYZ format)")

View file

@ -19,6 +19,7 @@ RS_DATABASE = {
'dji fc220': 64, # DJI Mavic Pro (Platinum)
'hasselblad l1d-20c': lambda p: 47 if p.get_capture_megapixels() < 17 else 56, # DJI Mavic 2 Pro (at 16:10 => 16.8MP 47ms, at 3:2 => 19.9MP 56ms. 4:3 has 17.7MP with same image height as 3:2 which can be concluded as same sensor readout)
'hasselblad l2d-20c': 16.6, # DJI Mavic 3 (not enterprise version)
'dji fc3582': lambda p: 26 if p.get_capture_megapixels() < 48 else 60, # DJI Mini 3 pro (at 48MP readout is 60ms, at 12MP it's 26ms)

View file

@ -13,6 +13,7 @@ from opendm import log
from opendm import io
from opendm import system
from opendm import context
from opendm import multispectral
from opendm.progress import progressbc
from opendm.photo import ODM_Photo
@ -45,19 +46,51 @@ class ODM_Reconstruction(object):
band_photos[p.band_name].append(p)
bands_count = len(band_photos)
if bands_count >= 2 and bands_count <= 8:
# Band name with the minimum number of photos
max_band_name = None
max_photos = -1
for band_name in band_photos:
if len(band_photos[band_name]) > max_photos:
max_band_name = band_name
max_photos = len(band_photos[band_name])
if bands_count >= 2 and bands_count <= 10:
# Validate that all bands have the same number of images,
# otherwise this is not a multi-camera setup
img_per_band = len(band_photos[p.band_name])
for band in band_photos:
if len(band_photos[band]) != img_per_band:
log.ODM_ERROR("Multi-camera setup detected, but band \"%s\" (identified from \"%s\") has only %s images (instead of %s), perhaps images are missing or are corrupted. Please include all necessary files to process all bands and try again." % (band, band_photos[band][0].filename, len(band_photos[band]), img_per_band))
raise RuntimeError("Invalid multi-camera images")
img_per_band = len(band_photos[max_band_name])
mc = []
for band_name in band_indexes:
mc.append({'name': band_name, 'photos': band_photos[band_name]})
filter_missing = False
for band in band_photos:
if len(band_photos[band]) < img_per_band:
log.ODM_WARNING("Multi-camera setup detected, but band \"%s\" (identified from \"%s\") has only %s images (instead of %s), perhaps images are missing or are corrupted." % (band, band_photos[band][0].filename, len(band_photos[band]), len(band_photos[max_band_name])))
filter_missing = True
if filter_missing:
# Calculate files to ignore
_, p2s = multispectral.compute_band_maps(mc, max_band_name)
max_files_per_band = 0
for filename in p2s:
max_files_per_band = max(max_files_per_band, len(p2s[filename]))
for filename in p2s:
if len(p2s[filename]) < max_files_per_band:
photos_to_remove = p2s[filename] + [p for p in self.photos if p.filename == filename]
for photo in photos_to_remove:
log.ODM_WARNING("Excluding %s" % photo.filename)
self.photos = [p for p in self.photos if p != photo]
for i in range(len(mc)):
mc[i]['photos'] = [p for p in mc[i]['photos'] if p != photo]
log.ODM_INFO("New image count: %s" % len(self.photos))
# We enforce a normalized band order for all bands that we can identify
# and rely on the manufacturer's band_indexes as a fallback for all others
normalized_band_order = {
@ -94,7 +127,7 @@ class ODM_Reconstruction(object):
for c, d in enumerate(mc):
log.ODM_INFO(f"Band {c + 1}: {d['name']}")
return mc
return None
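
The new behavior also widens the accepted band count from 2..8 to 2..10 and downgrades a missing-image mismatch from a hard error to a warning plus filtering: the band with the most photos becomes the reference, compute_band_maps pairs each reference image with its secondary-band shots, and any incomplete capture group is removed from every band. A hypothetical illustration of the filtering criterion (filenames invented):

# p2s maps each reference-band filename to its secondary-band photos
p2s = {
    "IMG_0001_1.tif": ["IMG_0001_2.tif", "IMG_0001_3.tif"],
    "IMG_0002_1.tif": ["IMG_0002_2.tif"],  # band 3 shot missing or corrupted
}
max_files_per_band = max(len(s) for s in p2s.values())
incomplete = [f for f, s in p2s.items() if len(s) < max_files_per_band]
print(incomplete)  # ['IMG_0002_1.tif'] -> its whole capture group is excluded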

View file

@ -4,7 +4,7 @@ import json
from opendm import log
from opendm.photo import find_largest_photo_dims
from osgeo import gdal
from opendm.loghelpers import double_quote
from opendm.arghelpers import double_quote
class NumpyEncoder(json.JSONEncoder):
def default(self, obj):

View file

@ -54,8 +54,10 @@ class SrtFileParser:
if not self.gps_data:
for d in self.data:
lat, lon, alt = d.get('latitude'), d.get('longitude'), d.get('altitude')
if alt is None:
alt = 0
tm = d.get('start')
if lat is not None and lon is not None:
if self.ll_to_utm is None:
self.ll_to_utm, self.utm_to_ll = location.utm_transformers_from_ll(lon, lat)
@ -127,6 +129,20 @@ class SrtFileParser:
# 00:00:35,000 --> 00:00:36,000
# F/6.3, SS 60, ISO 100, EV 0, RTK (120.083799, 30.213635, 28), HOME (120.084146, 30.214243, 103.55m), D 75.36m, H 76.19m, H.S 0.30m/s, V.S 0.00m/s, F.PRY (-5.3°, 2.1°, 28.3°), G.PRY (-40.0°, 0.0°, 28.2°)
# DJI Unknown Model #1
# 1
# 00:00:00,000 --> 00:00:00,033
# <font size="28">SrtCnt : 1, DiffTime : 33ms
# 2024-01-18 10:23:26.397
# [iso : 150] [shutter : 1/5000.0] [fnum : 170] [ev : 0] [ct : 5023] [color_md : default] [focal_len : 240] [dzoom_ratio: 10000, delta:0],[latitude: -22.724555] [longitude: -47.602414] [rel_alt: 0.300 abs_alt: 549.679] </font>
# DJI Mavic 2 Zoom
# 1
# 00:00:00,000 --> 00:00:00,041
# <font size="36">FrameCnt : 1, DiffTime : 41ms
# 2023-07-15 11:55:16,320,933
# [iso : 100] [shutter : 1/400.0] [fnum : 280] [ev : 0] [ct : 5818] [color_md : default] [focal_len : 240] [latitude : 0.000000] [longtitude : 0.000000] [altitude: 0.000000] </font>
with open(self.filename, 'r') as f:
iso = None
@ -197,12 +213,14 @@ class SrtFileParser:
latitude = match_single([
("latitude: ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
("latitude : ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
("GPS \([\d\.\-]+,? ([\d\.\-]+),? [\d\.\-]+\)", lambda v: float(v) if v != 0 else None),
("RTK \([-+]?\d+\.\d+, (-?\d+\.\d+), -?\d+\)", lambda v: float(v) if v != 0 else None),
], line)
longitude = match_single([
("longitude: ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
("longtitude : ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
("GPS \(([\d\.\-]+),? [\d\.\-]+,? [\d\.\-]+\)", lambda v: float(v) if v != 0 else None),
("RTK \((-?\d+\.\d+), [-+]?\d+\.\d+, -?\d+\)", lambda v: float(v) if v != 0 else None),
], line)
@ -211,4 +229,5 @@ class SrtFileParser:
("altitude: ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
("GPS \([\d\.\-]+,? [\d\.\-]+,? ([\d\.\-]+)\)", lambda v: float(v) if v != 0 else None),
("RTK \([-+]?\d+\.\d+, [-+]?\d+\.\d+, (-?\d+)\)", lambda v: float(v) if v != 0 else None),
("abs_alt: ([\d\.\-]+)", lambda v: float(v) if v != 0 else None),
], line)
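
Two additions here: the 'abs_alt' pattern picks up the absolute-altitude reference in the newer DJI format, and the 'longtitude :' pattern deliberately matches the Mavic 2 Zoom firmware's misspelling. A quick check against a made-up composite sample line (the real example lines above carry 0.0 coordinates, which the parser treats as missing):

import re

line = "[latitude : -22.724555] [longtitude : -47.602414] [rel_alt: 0.300 abs_alt: 549.679]"
lat = re.search(r"latitude : ([\d\.\-]+)", line)
lon = re.search(r"longtitude : ([\d\.\-]+)", line)  # DJI's misspelling, matched on purpose
alt = re.search(r"abs_alt: ([\d\.\-]+)", line)      # absolute altitude reference
print(lat.group(1), lon.group(1), alt.group(1))     # -22.724555 -47.602414 549.679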

run.py
View file

@ -13,7 +13,7 @@ from opendm import system
from opendm import io
from opendm.progress import progressbc
from opendm.utils import get_processing_results_paths, rm_r
from opendm.loghelpers import args_to_dict
from opendm.arghelpers import args_to_dict, save_opts, compare_args, find_rerun_stage
from stages.odm_app import ODMApp
@ -29,20 +29,26 @@ if __name__ == '__main__':
log.ODM_INFO('Initializing ODM %s - %s' % (odm_version(), system.now()))
progressbc.set_project_name(args.name)
args.project_path = os.path.join(args.project_path, args.name)
if not io.dir_exists(args.project_path):
log.ODM_ERROR('Directory %s does not exist.' % args.name)
exit(1)
opts_json = os.path.join(args.project_path, "options.json")
auto_rerun_stage, opts_diff = find_rerun_stage(opts_json, args, config.rerun_stages, config.processopts)
if auto_rerun_stage is not None and len(auto_rerun_stage) > 0:
log.ODM_INFO("Rerunning from: %s" % auto_rerun_stage[0])
args.rerun_from = auto_rerun_stage
# Print args
args_dict = args_to_dict(args)
log.ODM_INFO('==============')
for k in args_dict.keys():
log.ODM_INFO('%s: %s' % (k, args_dict[k]))
log.ODM_INFO('%s: %s%s' % (k, args_dict[k], ' [changed]' if k in opts_diff else ''))
log.ODM_INFO('==============')
progressbc.set_project_name(args.name)
# Add project dir if doesn't exist
args.project_path = os.path.join(args.project_path, args.name)
if not io.dir_exists(args.project_path):
log.ODM_WARNING('Directory %s does not exist. Creating it now.' % args.name)
system.mkdir_p(os.path.abspath(args.project_path))
# If user asks to rerun everything, delete all of the existing progress directories.
if args.rerun_all:
@ -57,6 +63,9 @@ if __name__ == '__main__':
app = ODMApp(args)
retcode = app.execute()
if retcode == 0:
save_opts(opts_json, args)
# Do not show ASCII art for local submodels runs
if retcode == 0 and not "submodels" in args.project_path:

View file

@ -81,14 +81,11 @@ class ODMMvsTexStage(types.ODM_Stage):
# Format arguments to fit Mvs-Texturing app
skipGlobalSeamLeveling = ""
skipLocalSeamLeveling = ""
keepUnseenFaces = ""
nadir = ""
if args.texturing_skip_global_seam_leveling:
skipGlobalSeamLeveling = "--skip_global_seam_leveling"
if args.texturing_skip_local_seam_leveling:
skipLocalSeamLeveling = "--skip_local_seam_leveling"
if args.texturing_keep_unseen_faces:
keepUnseenFaces = "--keep_unseen_faces"
if (r['nadir']):
@ -102,7 +99,6 @@ class ODMMvsTexStage(types.ODM_Stage):
'dataTerm': 'gmi',
'outlierRemovalType': 'gauss_clamping',
'skipGlobalSeamLeveling': skipGlobalSeamLeveling,
'skipLocalSeamLeveling': skipLocalSeamLeveling,
'keepUnseenFaces': keepUnseenFaces,
'toneMapping': 'none',
'nadirMode': nadir,
@ -114,7 +110,7 @@ class ODMMvsTexStage(types.ODM_Stage):
mvs_tmp_dir = os.path.join(r['out_dir'], 'tmp')
# Make sure tmp directory is empty
# mvstex creates a tmp directory, so make sure it is empty
if io.dir_exists(mvs_tmp_dir):
log.ODM_INFO("Removing old tmp directory {}".format(mvs_tmp_dir))
shutil.rmtree(mvs_tmp_dir)
@ -125,7 +121,6 @@ class ODMMvsTexStage(types.ODM_Stage):
'-t {toneMapping} '
'{intermediate} '
'{skipGlobalSeamLeveling} '
'{skipLocalSeamLeveling} '
'{keepUnseenFaces} '
'{nadirMode} '
'{labelingFile} '

View file

@ -27,6 +27,7 @@ class ODMApp:
Initializes the application and defines the ODM application pipeline stages
"""
json_log_paths = [os.path.join(args.project_path, "log.json")]
if args.copy_to:
json_log_paths.append(args.copy_to)

View file

@ -12,7 +12,6 @@ from opendm.cropper import Cropper
from opendm import pseudogeo
from opendm.tiles.tiler import generate_dem_tiles
from opendm.cogeo import convert_to_cogeo
from opendm.opc import classify
class ODMDEMStage(types.ODM_Stage):
def process(self, args, outputs):
@ -35,7 +34,6 @@ class ODMDEMStage(types.ODM_Stage):
ignore_resolution=ignore_resolution and args.ignore_gsd,
has_gcp=reconstruction.has_gcp())
log.ODM_INFO('Classify: ' + str(args.pc_classify))
log.ODM_INFO('Create DSM: ' + str(args.dsm))
log.ODM_INFO('Create DTM: ' + str(args.dtm))
log.ODM_INFO('DEM input file {0} found: {1}'.format(dem_input, str(pc_model_found)))
@ -45,34 +43,9 @@ class ODMDEMStage(types.ODM_Stage):
if not io.dir_exists(odm_dem_root):
system.mkdir_p(odm_dem_root)
if args.pc_classify and pc_model_found:
pc_classify_marker = os.path.join(odm_dem_root, 'pc_classify_done.txt')
if not io.file_exists(pc_classify_marker) or self.rerun():
log.ODM_INFO("Classifying {} using Simple Morphological Filter (1/2)".format(dem_input))
commands.classify(dem_input,
args.smrf_scalar,
args.smrf_slope,
args.smrf_threshold,
args.smrf_window
)
log.ODM_INFO("Classifying {} using OpenPointClass (2/2)".format(dem_input))
classify(dem_input, args.max_concurrency)
with open(pc_classify_marker, 'w') as f:
f.write('Classify: smrf\n')
f.write('Scalar: {}\n'.format(args.smrf_scalar))
f.write('Slope: {}\n'.format(args.smrf_slope))
f.write('Threshold: {}\n'.format(args.smrf_threshold))
f.write('Window: {}\n'.format(args.smrf_window))
progress = 20
self.update_progress(progress)
if args.pc_rectify:
commands.rectify(dem_input)
# Do we need to process anything here?
if (args.dsm or args.dtm) and pc_model_found:
dsm_output_filename = os.path.join(odm_dem_root, 'dsm.tif')
@ -100,8 +73,8 @@ class ODMDEMStage(types.ODM_Stage):
resolution=resolution / 100.0,
decimation=args.dem_decimation,
max_workers=args.max_concurrency,
keep_unfilled_copy=args.dem_euclidean_map,
max_tiles=math.ceil(len(reconstruction.photos) / 2)
with_euclidean_map=args.dem_euclidean_map,
max_tiles=None if reconstruction.has_geotagged_photos() else math.ceil(len(reconstruction.photos) / 2)
)
dem_geotiff_path = os.path.join(odm_dem_root, "{}.tif".format(product))
@ -111,17 +84,6 @@ class ODMDEMStage(types.ODM_Stage):
# Crop DEM
Cropper.crop(bounds_file_path, dem_geotiff_path, utils.get_dem_vars(args), keep_original=not args.optimize_disk_space)
if args.dem_euclidean_map:
unfilled_dem_path = io.related_file_path(dem_geotiff_path, postfix=".unfilled")
if args.crop > 0 or args.boundary:
# Crop unfilled DEM
Cropper.crop(bounds_file_path, unfilled_dem_path, utils.get_dem_vars(args), keep_original=not args.optimize_disk_space)
commands.compute_euclidean_map(unfilled_dem_path,
io.related_file_path(dem_geotiff_path, postfix=".euclideand"),
overwrite=True)
if pseudo_georeference:
pseudogeo.add_pseudo_georeferencing(dem_geotiff_path)
@ -131,7 +93,7 @@ class ODMDEMStage(types.ODM_Stage):
if args.cog:
convert_to_cogeo(dem_geotiff_path, max_workers=args.max_concurrency)
progress += 30
progress += 40
self.update_progress(progress)
else:
log.ODM_WARNING('Found existing outputs in: %s' % odm_dem_root)

View file

@ -60,7 +60,7 @@ class ODMeshingStage(types.ODM_Stage):
available_cores=args.max_concurrency,
method='poisson' if args.fast_orthophoto else 'gridded',
smooth_dsm=True,
max_tiles=math.ceil(len(reconstruction.photos) / 2))
max_tiles=None if reconstruction.has_geotagged_photos() else math.ceil(len(reconstruction.photos) / 2))
else:
log.ODM_WARNING('Found a valid ODM 2.5D Mesh file in: %s' %
tree.odm_25dmesh)

View file

@ -7,6 +7,7 @@ from opendm import context
from opendm import types
from opendm import gsd
from opendm import orthophoto
from opendm.osfm import is_submodel
from opendm.concurrency import get_max_memory_mb
from opendm.cutline import compute_cutline
from opendm.utils import double_quote
@ -114,6 +115,7 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
# Cutline computation, before cropping
# We want to use the full orthophoto, not the cropped one.
submodel_run = is_submodel(tree.opensfm)
if args.orthophoto_cutline:
cutline_file = os.path.join(tree.odm_orthophoto, "cutline.gpkg")
@ -122,15 +124,18 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
cutline_file,
args.max_concurrency,
scale=0.25)
orthophoto.compute_mask_raster(tree.odm_orthophoto_tif, cutline_file,
os.path.join(tree.odm_orthophoto, "odm_orthophoto_cut.tif"),
blend_distance=20, only_max_coords_feature=True)
if submodel_run:
orthophoto.compute_mask_raster(tree.odm_orthophoto_tif, cutline_file,
os.path.join(tree.odm_orthophoto, "odm_orthophoto_cut.tif"),
blend_distance=20, only_max_coords_feature=True)
else:
log.ODM_INFO("Not a submodel run, skipping mask raster generation")
orthophoto.post_orthophoto_steps(args, bounds_file_path, tree.odm_orthophoto_tif, tree.orthophoto_tiles, resolution)
# Generate feathered orthophoto also
if args.orthophoto_cutline:
if args.orthophoto_cutline and submodel_run:
orthophoto.feather_raster(tree.odm_orthophoto_tif,
os.path.join(tree.odm_orthophoto, "odm_orthophoto_feathered.tif"),
blend_distance=20

View file

@ -65,7 +65,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
filter_point_th = -20
config = [
" --resolution-level %s" % int(resolution_level),
"--resolution-level %s" % int(resolution_level),
'--dense-config-file "%s"' % densify_ini_file,
"--max-resolution %s" % int(outputs['undist_image_max_size']),
"--max-threads %s" % args.max_concurrency,
@ -79,7 +79,6 @@ class ODMOpenMVSStage(types.ODM_Stage):
gpu_config = []
use_gpu = has_gpu(args)
if use_gpu:
#gpu_config.append("--cuda-device -3")
gpu_config.append("--cuda-device -1")
else:
gpu_config.append("--cuda-device -2")
@ -95,12 +94,13 @@ class ODMOpenMVSStage(types.ODM_Stage):
extra_config.append("--ignore-mask-label 0")
with open(densify_ini_file, 'w+') as f:
f.write("Optimize = 7\n")
f.write("Optimize = 7\nMin Views Filter = 1\n")
def run_densify():
system.run('"%s" "%s" %s' % (context.omvs_densify_path,
openmvs_scene_file,
' '.join(config + gpu_config + extra_config)))
try:
run_densify()
except system.SubprocessException as e:
@ -110,7 +110,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
log.ODM_WARNING("OpenMVS failed with GPU, is your graphics card driver up to date? Falling back to CPU.")
gpu_config = ["--cuda-device -2"]
run_densify()
elif (e.errorCode == 137 or e.errorCode == 3221226505) and not pc_tile:
elif (e.errorCode == 137 or e.errorCode == 143 or e.errorCode == 3221226505) and not pc_tile:
log.ODM_WARNING("OpenMVS ran out of memory, we're going to turn on tiling to see if we can process this.")
pc_tile = True
config.append("--fusion-mode 1")
@ -127,10 +127,10 @@ class ODMOpenMVSStage(types.ODM_Stage):
subscene_densify_ini_file = os.path.join(tree.openmvs, 'subscene-config.ini')
with open(subscene_densify_ini_file, 'w+') as f:
f.write("Optimize = 0\n")
f.write("Optimize = 0\nEstimation Geometric Iters = 0\nMin Views Filter = 1\n")
config = [
"--sub-scene-area 660000",
"--sub-scene-area 660000", # 8000
"--max-threads %s" % args.max_concurrency,
'-w "%s"' % depthmaps_dir,
"-v 0",
@ -161,9 +161,13 @@ class ODMOpenMVSStage(types.ODM_Stage):
config = [
'--resolution-level %s' % int(resolution_level),
'--max-resolution %s' % int(outputs['undist_image_max_size']),
"--sub-resolution-levels %s" % subres_levels,
'--dense-config-file "%s"' % subscene_densify_ini_file,
'--number-views-fuse %s' % number_views_fuse,
'--max-threads %s' % args.max_concurrency,
'--archive-type 3',
'--postprocess-dmaps 0',
'--geometric-iters 0',
'-w "%s"' % depthmaps_dir,
'-v 0',
]
@ -179,7 +183,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
else:
# Filter
if args.pc_filter > 0:
system.run('"%s" "%s" --filter-point-cloud %s -v 0 %s' % (context.omvs_densify_path, scene_dense_mvs, filter_point_th, ' '.join(gpu_config)))
system.run('"%s" "%s" --filter-point-cloud %s -v 0 --archive-type 3 %s' % (context.omvs_densify_path, scene_dense_mvs, filter_point_th, ' '.join(gpu_config)))
else:
# Just rename
log.ODM_INFO("Skipped filtering, %s --> %s" % (scene_ply_unfiltered, scene_ply))
@ -219,7 +223,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
try:
system.run('"%s" %s' % (context.omvs_densify_path, ' '.join(config + gpu_config + extra_config)))
except system.SubprocessException as e:
if e.errorCode == 137 or e.errorCode == 3221226505:
if e.errorCode == 137 or e.errorCode == 143 or e.errorCode == 3221226505:
log.ODM_WARNING("OpenMVS filtering ran out of memory, visibility checks will be skipped.")
skip_filtering()
else:
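
The exit-code checks gain 143 alongside the existing 137 and 3221226505: 137 is 128+SIGKILL (typically the Linux OOM killer), 143 is 128+SIGTERM, and 3221226505 is the Windows status 0xC0000409. A self-contained sketch of the fallback pattern, with hypothetical stand-ins for the stage's run_densify and exception type:

class SubprocessException(Exception):  # stand-in for opendm.system's exception
    def __init__(self, code):
        self.errorCode = code

def run_densify(config):
    if "--fusion-mode 1" not in config:
        raise SubprocessException(137)  # simulate being OOM-killed on the first try
    print("densify succeeded with tiling")

OOM_LIKE_CODES = (137, 143, 3221226505)  # 128+SIGKILL, 128+SIGTERM, Windows 0xC0000409

config, pc_tile = [], False
try:
    run_densify(config)
except SubprocessException as e:
    if e.errorCode in OOM_LIKE_CODES and not pc_tile:
        pc_tile = True
        config.append("--fusion-mode 1")  # turn on tiling to reduce peak memory
        run_densify(config)
    else:
        raise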