# OpenDroneMap

## What is it?

OpenDroneMap is an open source toolkit for processing aerial drone imagery. Typical drones carry simple point-and-shoot cameras, so drone images, while taken from a different perspective, are like any other point-and-shoot pictures, i.e. non-metric imagery. OpenDroneMap turns those simple images into three-dimensional geographic data that can be used in combination with other geographic datasets.


In short, OpenDroneMap is a toolchain for processing raw civilian UAS imagery into useful geographic products. What kind of products?

1. Point Clouds
2. Digital Surface Models
3. Textured Digital Surface Models
4. Orthorectified Imagery
5. Classified Point Clouds
6. Digital Elevation Models
7. etc.

So far, it does Point Clouds, Digital Surface Models, Textured Digital Surface Models, and Orthorectified Imagery.


Users' mailing list: http://lists.osgeo.org/cgi-bin/mailman/listinfo/opendronemap-users


Developer's mailing list: http://lists.osgeo.org/cgi-bin/mailman/listinfo/opendronemap-dev


Overview video: https://www.youtube.com/watch?v=0UctfoeNB_Y


## Developers


Help improve our software!


Join the chat at https://gitter.im/OpenDroneMap/OpenDroneMap

1. Try to keep commits clean and simple
2. Submit a pull request with detailed changes and test results

## Steps to get OpenDroneMap running


(Requires Ubuntu 14.04 or later, see https://github.com/OpenDroneMap/odm_vagrant for running on Windows in a VM)


Support for Ubuntu 12.04 is currently BROKEN following the addition of OpenSfM and Ceres-Solver. We are working to restore it.


### Building OpenDroneMap using git

```
cd path/to/odm/dir
git clone https://github.com/OpenDroneMap/OpenDroneMap.git .
export PYTHONPATH=$PYTHONPATH:`pwd`/SuperBuild/install/lib/python2.7/dist-packages:`pwd`/SuperBuild/src/opensfm
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:`pwd`/SuperBuild/install/lib
bash configure.sh
mkdir build && cd build && cmake .. && make && cd ..
```
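The two `export` lines above only last for the current shell session. An optional convenience (a sketch; `ODM_DIR` is a placeholder you must point at your actual checkout) is to append the same setup to your `~/.bashrc` so new shells pick it up:

```shell
# Path to your OpenDroneMap checkout -- an assumed example location
ODM_DIR=path/to/odm/dir

# Append the environment setup to ~/.bashrc; the escaped \$ keeps
# PYTHONPATH/LD_LIBRARY_PATH expansion for when .bashrc is sourced
cat >> ~/.bashrc <<EOF
export PYTHONPATH=\$PYTHONPATH:$ODM_DIR/SuperBuild/install/lib/python2.7/dist-packages:$ODM_DIR/SuperBuild/src/opensfm
export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:$ODM_DIR/SuperBuild/install/lib
EOF
```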

For Ubuntu 15.10 users, this will help you get running:

```
sudo apt-get install python-xmltodict
sudo ln -s /usr/lib/x86_64-linux-gnu/libproj.so.9 /usr/lib/libproj.so
```

## Running OpenDroneMap


First you need a set of images, which may or may not be georeferenced. There are two ways OpenDroneMap can understand geographic coordinates: the images can be geotagged in their EXIF data (the default), or you can create a GCP (ground control point) file, a process detailed in the wiki.
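For reference, a GCP file is a plain text file whose first line names the coordinate system and whose remaining lines each map a geographic coordinate to a pixel position in one image. The layout sketched below (first a datum/projection line, then `geo_x geo_y geo_z pixel_x pixel_y image_name` per point) should be verified against the wiki; every value here is invented for illustration:

```
WGS84 UTM 16N
544256.7 5320919.9 5.0 3044 2622 img-1001.jpg
544157.3 5320899.2 5.1 4193 1552 img-1002.jpg
```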

- -

Create a project folder and place your images in an "images" directory:

```
|-- /path/to/project/
    |-- images/
        |-- img-1234.jpg
        |-- ...
```
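The layout above can be created in one step (the `/tmp/project` path below is only an example; use any location you like):

```shell
PROJECT=/tmp/project            # example project location; use your own
mkdir -p "$PROJECT/images"      # create the expected directory layout
# then copy your photos in, e.g.:
# cp /path/to/your/photos/*.JPG "$PROJECT/images/"
```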

Example data can be cloned from https://github.com/OpenDroneMap/odm_data


Then run:

```
python run.py --project-path /path/to/project
```

There are many options for tuning your project. See the wiki or run `python run.py -h`.


When the process finishes, the results will be organized as follows:

```
|-- images/
    |-- img-1234.jpg
    |-- ...
|-- images_resize/
    |-- img-1234.jpg
    |-- ...
|-- opensfm/
    |-- not much useful in here
|-- pmvs/
    |-- recon0/
        |-- models/
            |-- option-0000.ply         # Dense point cloud
|-- odm_meshing/
    |-- odm_mesh.ply                    # A 3D mesh
    |-- odm_meshing_log.txt             # Output of the meshing task. May point out errors.
|-- odm_texturing/
    |-- odm_textured_model.obj          # Textured mesh
    |-- odm_textured_model_geo.obj      # Georeferenced textured mesh
    |-- texture_N.jpg                   # Associated texture images used by the model
|-- odm_georeferencing/
    |-- odm_georeferenced_model.ply     # A georeferenced dense point cloud
    |-- odm_georeferenced_model.ply.laz # LAZ format point cloud
    |-- odm_georeferenced_model.csv     # XYZ format point cloud
    |-- odm_georeferencing_log.txt      # Georeferencing log
    |-- odm_georeferencing_utm_log.txt  # Log for the extract_utm portion
|-- odm_orthophoto/
    |-- odm_orthophoto.png              # Orthophoto image (no coordinates)
    |-- odm_orthophoto.tif              # Orthophoto GeoTiff
    |-- odm_orthophoto_log.txt          # Log file
    |-- gdal_translate_log.txt          # Log for georeferencing the png file
```
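A quick sanity check that the main products were produced (a sketch; the project path is an example, and directory names may vary by version, so adjust to your layout):

```shell
PROJECT=/path/to/project        # example; point this at your real project
found=0
for f in odm_meshing/odm_mesh.ply \
         odm_texturing/odm_textured_model.obj \
         odm_georeferencing/odm_georeferenced_model.ply \
         odm_orthophoto/odm_orthophoto.tif; do
  if [ -e "$PROJECT/$f" ]; then
    echo "found:   $f"; found=$((found + 1))
  else
    echo "missing: $f"
  fi
done
echo "$found of 4 products present"
```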

## Viewing your results

Any file ending in `.obj` or `.ply` can be opened and viewed in MeshLab or similar software. That includes `pmvs/recon0/models/option-0000.ply`, `odm_meshing/odm_mesh.ply`, `odm_texturing/odm_textured_model[_geo].obj`, and `odm_georeferencing/odm_georeferenced_model.ply`. Below is an example textured mesh:


You can also view the orthophoto GeoTIFF in QGIS or other mapping software:


## Using Docker


You can build and run OpenDroneMap in a Docker container:

```
export IMAGES=/absolute/path/to/your/project
docker build -t opendronemap:latest .
docker run -v $IMAGES:/images opendronemap:latest
```

Replace `/absolute/path/to/your/project` with an absolute path to the directory containing your project (where the images are). To pass custom parameters to the `run.py` script, simply append them as arguments to the `docker run` command.



Now that texturing is in the code base, you can view the full textured meshes in MeshLab. Open MeshLab, choose File > Import Mesh, and select your textured mesh from a location similar to: `reconstruction-with-image-size-1200-results/odm_texturing/odm_textured_model.obj`


Long term, the aim is for the toolchain to optionally push its products to a variety of online data repositories: high-resolution aerials to OpenAerialMap, point clouds to OpenTopography, and digital elevation models to an emerging global repository (yet to be named...). That would leave only digital surface model meshes and UV-textured meshes without a global repository home.


## Documentation


For documentation, please take a look at our wiki.
