Understanding the map stitching process is important for maximizing your chances of generating successful maps. In short, each photo taken from the drone contains ‘features’ such as crop rows, trees, buildings, trails left by equipment, or anything that is distinctly recognizable in the visual space. As the aircraft takes continuous photos during a mission, it captures multiple photos of each distinct feature from multiple angles. A mathematical process identifies and matches these features and aligns them on top of each other.
Consider a puzzle in which every piece has straight edges, the pieces overlap by an unknown amount, and each piece is seen from a different perspective, so every piece has to be warped by an unknown amount to make it fit. Now include the fact that vertical structures will look different from each side! This would be impossible for a human to solve.
Examples of Good Stitching:
Below are examples of two consecutive photos captured during a flight over a canyon where you can see common features outlined in red, yellow, and blue.
Note the presence of all 3 features in both photos - this is ideal for photogrammetry stitching!
What Can Go Wrong in Processing?
Photogrammetry stitching can be considered a black-box process: imagery is imported, but the mathematical process driving the stitching is complex and difficult to predict, or even to understand precisely. Below are the main issues customers face when submitting their drone imagery for processing.
1. Holes: Unstitched regions within an orthomosaic map and/or 3D model.
How to Resolve: Ensure your data capture covers the entire region of interest; this will minimize most holes. Also follow the Flight Planning recommendations, which further reduce the possibility of holes.
2. Motion Blur: Motion blur in drone imagery is caused by aircraft movement or vibration during the exposure: either the shutter speed isn't fast enough, or the pilot is flying too fast.
How to Resolve: The best way to reduce motion blur is to use a faster shutter speed. Flying slower or at a higher altitude will help as well.
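As a rough rule of thumb, motion blur can be estimated from flight speed, shutter speed, and ground sampling distance (GSD): blur in pixels equals the ground distance travelled during the exposure divided by the GSD. The camera numbers below are hypothetical, but the relationship is general.

```python
def gsd_cm_per_px(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance: ground centimetres covered by one pixel."""
    return (altitude_m * 100 * sensor_width_mm) / (focal_mm * image_width_px)

def blur_px(speed_m_s, shutter_s, gsd_cm):
    """Ground distance travelled during the exposure, expressed in pixels."""
    return (speed_m_s * 100 * shutter_s) / gsd_cm

# Hypothetical mission: 80 m altitude, 8.8 mm focal length, 13.2 mm sensor
# width, 5472 px wide image, flying 10 m/s with a 1/240 s shutter.
gsd = gsd_cm_per_px(80, 8.8, 13.2, 5472)
print(round(gsd, 2), "cm/px")                       # ~2.19 cm/px
print(round(blur_px(10, 1 / 240, gsd), 2), "px")    # ~1.9 px of smear
```

Note how doubling the shutter speed or halving the flight speed halves the blur, which is why both fixes appear above; flying higher helps because it increases the GSD.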
Below is an example of what motion blur looks like:
*Notice the distortion in the bottom right corner of the field.
3. Unfocused Camera: Similar to motion blur, an unfocused camera produces soft imagery that can degrade map quality.
How to resolve: Make sure that autofocus is on and that there is no dust or debris on the lens. You can also adjust the camera settings manually from the DJI GO/FLY app.
4. Vignetting on Images: Vignetting (darkening toward the corners of an image) is caused by a lack of light when the original image was captured.
How to resolve: Re-flying the mission with less cloud cover can help. Check the lens for dust or particles that may be causing the dark images. The dark corners of this picture are evidence of vignetting.
5. Insufficient Image Overlap: Insufficient image overlap can cause holes or distortion in your processed map.
How to resolve: The higher the image overlap, the easier it is for our software to process your dataset; flying at a higher altitude also increases the overlap between consecutive photos. High overlap produces greater map detail over a smaller total area. You can adjust your overlap settings under Advanced Settings in your flight plan. Keep in mind that the amount of overlap between photos will affect the map quality and size, as seen below:
6. Non-Nadir Photos (e.g. Photos Taken During Turns/Horizontal Images): If the horizon appears in your dataset, distances within the map will be distorted, because DroneDeploy will try to stitch the faraway areas rather than the area of interest immediately below. Though beautiful, the image below would hurt the quality of your map.
How to Resolve: Capture your images at an oblique (45-degree) or nadir (90-degree top-down) angle.
7. Images Captured at Low Altitude: Taking images at a low altitude shrinks the ground area covered by each image, reducing the overlap between images and making them difficult to stitch together. This can result in blurry maps.
Note: Make sure to always obey your local/national altitude restriction regulations.
How to Resolve: Increase your altitude and overlap settings within your flight plan.
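To see why low altitude hurts, consider a fixed distance between shutter triggers: as altitude drops, each photo's ground footprint shrinks, so the same spacing leaves less and less overlap. The camera numbers below are hypothetical.

```python
def overlap_fraction(altitude_m, focal_mm, sensor_dim_mm, spacing_m):
    """Frontlap between consecutive photos taken spacing_m apart."""
    footprint = altitude_m * sensor_dim_mm / focal_mm  # ground size of one photo
    return max(0.0, 1 - spacing_m / footprint)

# Hypothetical camera (8.8 mm focal length, 8.8 mm sensor) triggering every 20 m.
for alt in (120, 80, 40, 25):
    frontlap = overlap_fraction(alt, 8.8, 8.8, 20)
    print(alt, "m altitude ->", round(frontlap * 100), "% frontlap")
```

Under these assumptions, dropping from 120 m to 25 m cuts the frontlap from roughly 83% to 20%, which is well below what reliable stitching needs.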
8. Homogenous Imagery (e.g. Full Crop Cover in Fields): A great example of homogenous imagery is a field with full crop cover. Because such imagery has few distinguishable features and its patterns are repetitive and hard to tell apart, the images can be difficult to stitch together.
How to Resolve: Using an ND filter can help when flying over a homogeneous area. It can also help to fly when the sun is not directly over the area, so that a few shadows break up the scene. Flying higher is also a quick fix, as it gives your drone's camera more chances to capture the same distinctive features in multiple images.
Below are two images of a cornfield taken from the same dataset. Both images look almost identical, which may cause issues with image processing.
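A quick way to sanity-check a dataset for this problem is to measure how much the pixel intensities vary: a very low standard deviation suggests homogeneous, hard-to-stitch imagery. The threshold and the toy grayscale "images" below are illustrative assumptions, not DroneDeploy's actual check.

```python
import statistics

def is_homogeneous(pixels, min_stdev=10.0):
    """Flag an image whose grayscale intensities barely vary.

    min_stdev is an illustrative threshold, not a calibrated value.
    """
    return statistics.pstdev(pixels) < min_stdev

full_crop = [118, 120, 119, 121, 120, 118, 119, 120]  # uniform green canopy
mixed_field = [40, 200, 90, 180, 60, 220, 110, 30]    # rows, soil, shadows
print(is_homogeneous(full_crop))    # True  -> likely to stitch poorly
print(is_homogeneous(mixed_field))  # False -> plenty of distinct features
```

Real checks would run per-tile over full-resolution images, but the principle holds: the flatter the histogram of a scene, the fewer features there are to match.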
9. Over-Exposed or Generally Colorless Features: Because stitching is based on connecting like points, as in a puzzle, you can imagine that over-exposed or largely colorless imagery will be difficult to stitch. Features like snow cover, reflective roofs, or solid-white objects can therefore cause connectivity issues when stitching.
How to Resolve: Like glare on moving water, bright over-exposed subjects provide little detail to match. We recommend using an ND filter to control image brightness and flying when the sun is not directly above the reflective surface.
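Over-exposure can be spotted from an image histogram: if a large fraction of pixels sit at or near the sensor's maximum value, there is little detail left to match. The 8-bit cutoff of 250 and the sample pixel values below are illustrative assumptions.

```python
def overexposed_fraction(pixels, cutoff=250):
    """Fraction of 8-bit grayscale pixels at or above the near-white cutoff."""
    return sum(p >= cutoff for p in pixels) / len(pixels)

snowfield = [255, 254, 255, 252, 251, 255, 249, 255]  # mostly blown-out whites
cornfield = [118, 96, 140, 80, 160, 110, 130, 100]    # well-exposed mid-tones
print(overexposed_fraction(snowfield))  # most pixels blown out
print(overexposed_fraction(cornfield))  # none
```

An ND filter or a lower exposure setting shifts the histogram away from the cutoff, restoring the texture the matcher needs.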