Friday, May 12, 2017

UWEC Priory Property Navigation

On Wednesday, May 10th, the Field Methods class met at the UWEC Priory, a university property used for housing that includes a large tract of forest. Working in separate groups, the class used Bad Elf GPS units connected via Bluetooth to an iPhone, in conjunction with the navigation maps made earlier in the semester (the making of which can be seen in an earlier post below), to navigate to different points on the property and mark them. The maps display UTM northings and eastings in a grid format, and given the coordinates, the GPS unit, the map, and a compass, each point could be located.
Point 1

Point 2

Point 3

Point 4

Point 5
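As a side note, the legwork of figuring out how far and in what direction to walk between two grid coordinates is simple plane trigonometry, since UTM coordinates are in meters. A minimal sketch (with made-up coordinates, and ignoring the declination and grid convergence corrections a real compass course would need):

```python
import math

def range_and_bearing(e1, n1, e2, n2):
    """Distance (m) and grid azimuth (degrees) from one UTM
    easting/northing pair to another. Uses grid north; a real
    compass course would also need magnetic declination and
    grid convergence applied."""
    de, dn = e2 - e1, n2 - n1
    dist = math.hypot(de, dn)
    azimuth = math.degrees(math.atan2(de, dn)) % 360  # clockwise from north
    return dist, azimuth

# Hypothetical grid coordinates, for illustration only
d, a = range_and_bearing(617000, 4971000, 617250, 4971400)
print("walk %.0f m at %.0f degrees" % (d, a))
```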

Saturday, April 29, 2017

Survey of Point Features with Survey Grade GPS

Introduction and Study Area:

On Wednesday, April 26th, 2017, at 3:15 PM, the class worked at the south side community gardens near Eau Claire South Middle School, collecting point soil data. The fenced garden is just to the southeast of the intersection of Hester St and Mitchell Ave. TDR (time domain reflectometry), pH, and temperature readings were collected and recorded at each point. Measurements were taken on independent instruments, then recorded on the survey-grade, dual-frequency Topcon GPS unit when the sub-meter-accuracy point was taken at the location of sample collection.

This process mimicked data collection for soil monitoring on a farm, a useful practice because of the crop health diagnostic utility of the soil information collected. While remote sensing can tell you which crops are doing well and which are doing poorly, soil monitoring can diagnose the factors causing the health condition. The types of soil data that were collected in this survey and their uses in crop diagnostics are described below. This is not a comprehensive list of useful measurements, but rather what the class collected for practice in this assignment.

Time Domain Reflectometry: This is a technique for measuring moisture content. The strong relationship between the permittivity of a substance and its water content is exploited to get a reading. Using two probes, a permittivity measurement is taken, and a moisture content is then calculated by the device from that measurement. Adequate moisture is a necessity for good crop health.
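The conversion the device performs internally is not magic; a commonly cited empirical calibration is the Topp et al. (1980) equation, sketched below. The FieldScout meter's exact calibration may differ, so treat this as illustrative:

```python
def topp_vwc(epsilon_a):
    """Volumetric water content (m^3/m^3) from apparent dielectric
    permittivity, using the empirical Topp et al. (1980) calibration.
    The FieldScout meter applies its own (possibly different)
    calibration internally."""
    return (-5.3e-2
            + 2.92e-2 * epsilon_a
            - 5.5e-4 * epsilon_a ** 2
            + 4.3e-6 * epsilon_a ** 3)

# A permittivity reading of 15 works out to roughly 28% moisture
print(round(topp_vwc(15.0), 3))  # 0.276
```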

Temperature: Different crops require different minimum soil temperatures. Tomatoes, cucumbers, and snap peas thrive in soils of at least 16 degrees Celsius, while watermelon, peppers, squash, okra, cantaloupe, and sweet potatoes thrive at soil temperatures around 20 degrees Celsius.

pH: This is a measure of acidity or basicity: the negative base-10 logarithm of the hydrogen (hydronium) ion concentration. The acidity or basicity of the soil can affect crop yield. Figure 1 shows the relationship between pH and yield for several crops.
Figure 1


Methods:

After splitting into groups, each group collected information on the soil at the same marked points, which the first group had designated with flags. Group members took turns with each instrument, collecting each type of data, while an assisting group member recorded the measurements in a field notebook. A Spectrum Technologies FieldScout TDR 200 was used to take multiple TDR readings at each flagged point (Figure 2). These were then averaged to obtain a more accurate value, which was recorded with the point data. Probe measurements were also taken with a thermometer in the same manner (Figure 3). pH measurements were taken with a probe and a mixture of soil and distilled water in a sample container. After the three measurements were taken, they were entered into a new point's attribute fields on the GPS unit. The unit was then held as vertical as possible and the point location recorded with a tap on the appropriate button. The GPS unit took 30 coordinate measurements and averaged them to get accurate coordinates. GPS use is depicted in Figure 4.

Figure 2

Figure 3

Figure 4
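The per-point averaging described above is trivial to reproduce if the notebook readings are typed up afterwards; a quick sketch with hypothetical values:

```python
import statistics

# Hypothetical notebook readings: several TDR values per flagged point
readings = {
    "flag_1": [22.4, 23.1, 22.8],
    "flag_2": [18.9, 19.4, 19.1],
}

# Average the repeat readings before entering them on the GPS unit
for flag, values in sorted(readings.items()):
    print(flag, round(statistics.mean(values), 1))
```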



Sources:

https://www.gardeningknowhow.com/garden-how-to/soil-fertilizers/determining-soil-temperature.htm

http://soilquality.org/indicators/soil_ph.html

Tuesday, April 25, 2017

Arc Collector Part 2

Introduction:

In this assignment, ArcCatalog, ArcMap, ArcGIS Online, and the Arc Collector iOS application on an iPhone were used to design an original data collection scheme. ArcCatalog was used to create a file geodatabase with domains made specifically for the project, and also to create the point feature class with the fields, some using the previously defined domains, that were to be filled in during data collection. ArcMap was then used to create a map and publish the service, with its geodatabase and feature class, to ArcGIS Online. ArcGIS Online was in turn used to add the feature class to a new map, which was saved and could then be downloaded onto the Arc Collector app for data collection, before the collected data was downloaded back onto the PC and mapped in ArcMap.

The data collected described potholes and cracks in the street. A research question (a necessity for this assignment) that this data would help answer is: which potholes and cracks should be filled first?

A few points must be made about proper study design. First, the units for each question should be consistent and stated in the form that appears in Arc Collector. This helps all data collectors collect consistent data. Second, descriptions of each field should be added to help data collectors stay consistent. For example, in this data collection scheme, the distance from the curb was to be measured from the center of the pothole to the nearest side of the road. With only one data collector in this project, these steps were less important, but if details about any field are at all ambiguous to anyone in a larger group-collected study, data collection errors can occur, as seen in the last Arc Collector project. Third, fields cannot simply be set up in the feature class. Feature class fields must sometimes use domains that have been predefined in the settings of the containing geodatabase. This is especially necessary when there are multiple coded options to choose from for a field. The options are coded in the domain settings of the geodatabase, then the domain is chosen when the field is created in the feature class. Domains are also convenient when multiple feature classes need some of the same fields: instead of entering the options for a field multiple times, they can be entered once and then chosen for different fields. Finally, a notes field should always be created so that miscellaneous information can be added that may come in handy later.

Study Area:

The data was collected on a few square blocks of the student neighborhood in Eau Claire. The 300 through 400 blocks of Niagara St, Broadway St, and Hudson St, as well as the sections of 3rd Ave and 4th Ave between these streets, were studied. This area is shown below in Figure 1.
Figure 1
Methods:

After opening ArcCatalog and connecting to the folder where the project was to be stored, a new file geodatabase with a project-specific name was created by right-clicking in the folder, highlighting the New option, then clicking File Geodatabase. The domains were then set up by right-clicking on the geodatabase, clicking Properties, then Domains. The domains shown in Figure 2 were created. Crack or Hole was a coded text domain; its setup is shown highlighted in the figure. The remaining domains were not strictly necessary, because they were used by only one feature class and were not coded, but they were created anyway so that they could be reused if another feature class were ever made. These were all float field types due to the need for decimal places.
Figure 2
After the database was configured, the feature class was created. Right-clicking on the geodatabase, hovering over New, and clicking Feature Class opened the feature class creation wizard. The point option was chosen, a name and alias were created, the WGS_1984_Web_Mercator_Auxiliary_Sphere projected coordinate system was chosen, and the fields were created; the rest was left at the defaults. The fields are shown in Figure 3. The matching domains were chosen for all fields besides notes, which was not linked to a domain.
Figure 3
Notice that the distance from the curb field has been given a descriptive name that reminds a data collector that the measurement is to be taken from the closest curb to the center of the pothole.
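The same geodatabase, domain, and feature class setup done here in the ArcCatalog interface can also be scripted with arcpy. The sketch below approximates this project's schema; the paths and field names are assumptions for illustration, not the exact files used:

```python
import arcpy

# Assumed folder and names for illustration; not the project's exact files
gdb = arcpy.CreateFileGDB_management(r"C:\gis\collector", "road_flaws").getOutput(0)

# Coded text domain separating the two feature types
arcpy.CreateDomain_management(gdb, "FlawType", "Crack or hole", "TEXT", "CODED")
for code in ("Crack", "Pothole"):
    arcpy.AddCodedValueToDomain_management(gdb, "FlawType", code, code)

# Point feature class in Web Mercator, as used for the Collector map
sr = arcpy.SpatialReference(3857)  # WGS 1984 Web Mercator (auxiliary sphere)
fc = arcpy.CreateFeatureclass_management(gdb, "flaws", "POINT",
                                         spatial_reference=sr).getOutput(0)

arcpy.AddField_management(fc, "flaw_type", "TEXT", field_length=20,
                          field_domain="FlawType")
for measure in ("depth_cm", "length_m", "width_m", "dist_from_curb_m"):
    arcpy.AddField_management(fc, measure, "FLOAT")
arcpy.AddField_management(fc, "notes", "TEXT", field_length=50)
```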

Next, the feature class was brought into ArcMap. The layer was laid over a basemap using the Add Basemap function, and a test point was placed after starting editing on the Editor toolbar. The point's fields were then filled in in the window that appeared after adding the point. After this verified that the fields worked as desired and in full functionality (decimal points accepted), the point was deleted. The basemap was then removed so as not to use any more credits than necessary on ArcGIS Online, and the service was published. Sign In was selected under the File menu and a personal enterprise login was used, then Share As Service was selected under the File menu. Publish a Service was chosen, then the university connection. After naming the service, specific settings could be edited: parameter settings were left at their defaults, tiled mapping was turned off under Capabilities, all operations were checked under Feature Access, and an item description and sharing options were added. The service was now shared.

Logging into ArcGIS Online in a browser, a new map was made using the Map tab. The uploaded service was then added to the map using the Add Data button, and the map was shared. The map could now be opened in Arc Collector. After logging in and opening the map, data was added in Arc Collector by tapping the plus sign at the top of the map. The phone's position when the plus was tapped is the position recorded for the point. The page that appears after tapping the plus is shown in Figure 4. All fields were measured, filled in, and then submitted. Before going out, the map was downloaded to the phone so that a data connection would not be needed to load basemaps, and all data points were stored on the phone and synced later over Wi-Fi instead of the cellular data connection. This practice also saved battery life.

Back in the computer lab, the map was clicked on in the My Content section of ArcGIS Online, and the Open in ArcGIS Desktop button was clicked. From there, graduated symbol maps were made for every field collected.

Results and Discussion:

Because cracks are lines, it could have made sense to create a second, line feature class for them. However, because the data was only for locating features, not representing their spatial form (point, line, or polygon), this was not necessary. A simple Crack or Hole field with a linked coded domain separated the two types of features being collected. Next, because the text string field for notes only allows 50 characters, it may be necessary to make multiple notes fields. Another issue that arose was that when collecting data on cracks, which mostly ran across the entire road, the distance recorded for distance from curb was usually zero. This did not fit the rule made at the beginning that the distance from the curb would be measured to the center of the feature. The purpose of this value was to map which features were farther from the curb and therefore more likely to be driven over, as people drive all over the middle of the road on these side streets. These values had to be edited to represent the center of the road, at 5 meters. This was done easily in ArcGIS Online.

The resulting maps are shown below:

Cracks by Depth:

One way to choose which road flaws to fix first is by depth. A greater depth means a greater driving disturbance, and possibly deeper structural damage to the road, so these may be the ones that should be fixed first.
Figure 4


Road Flaws by Distance to Curb:

This map shows the road flaws by the distance from the center of the flaw to the nearest curb. One way of deciding which flaws to fix first could be to find the ones that are farthest out into the middle of the road, where people tend to drive on these side streets when there is no oncoming traffic and cars are parked on both sides.

Figure 5
Road Flaws by Length:

The length in this data is the extent of the flaw along the direction of the road. A longer flaw may mean a greater disturbance, so this could be another way to decide which flaws to focus on first.

Figure 6

Road Flaws by Width:

Width is seemingly a last resort for suggesting which flaws to fix first, but it is important because it denotes which cracks cross the entire street. These are the biggest symbols on the map.

Figure 7
Conclusion:

Arc Collector, in combination with the mobile and desktop platforms and other software elements used, is a great option for collecting data in the field. Because it is integrated with other ESRI software, like ArcGIS Desktop and ArcGIS Online, the process of creating a geodatabase and feature class, uploading it to ArcGIS Online, then bringing it back down to the desktop (or leaving it in the cloud) for mapping after data collection is fairly easy and seamless.






Tuesday, April 11, 2017

Arc Collector Part 1

Introduction:

A modern smartphone has an amazing amount of computing power, so it does not make sense to buy another standalone device that also includes a highly accurate GPS just to collect data. For very basic data collection, a phone and its own GPS unit can be used with the ArcCollector software, and if a project requires higher spatial accuracy, a Bluetooth GPS unit compatible with ArcCollector can be used. This setup is even more powerful considering that the phone is not limited to its local storage: unlimited amounts of data can be uploaded in the moment via a cellular data connection to an online server, and even viewed there in real time by others collecting data in the field, or by those watching from afar who might give real-time feedback.

In this project, the goal was to collect microclimate data across the entire campus. A tutorial was given on how to create a geodatabase, feature classes, and fields that could be populated in the field using ArcCollector. After each student connected to the project created by professor Joseph Hupy on the university's ESRI server, we logged into ArcCollector with our personal enterprise accounts and went out to collect data for our assigned zones. Two students were assigned to each zone. Data was collected on Wednesday, March 29th between 3:30 and 5:00 PM. The fields that were populated were temperature, dew point, wind chill, wind direction, wind speed, time, and group number.
Figure 1

Figure 2



Methods:

A map backed by a file geodatabase, with a point feature class and a group zone polygon feature class, was created in ArcMap, uploaded to ArcGIS Online, and supplied to everyone in the class. After being added to the group containing the project on ArcGIS Online, everyone went through the steps described below to add their own data points to the point feature class. The data was then copied to the My Content page of our own ArcGIS Online accounts so that individual maps could be made.

Each group coordinated to collect a set of points that covered the entire group zone. Each member collected about 20 points. The ArcCollector app was downloaded to a smartphone, and after logging into a personal enterprise account, the shared group map was opened. New points were added by tapping the boxed button shown in Figure 3. Then, the boxed microgrp feature class shown in Figure 4 was selected for editing. Finally, every field was selected and edited to include accurate information. The fields are shown in Figure 5.
Figure 3

Figure 4

Figure 5
The meanings of each field name are shown below:

GPR: Group (Zone) Number
TP: Temperature
DP: Dew Point
WC: Wind Chill
WS: Wind Speed
WD: Wind Direction
Notes: Any extra notes about a data point. Interfering factors such as vents were noted here mostly.
Time: This was the time recorded as military time without any symbols. For example 1:20 PM would be recorded as 1320.

All weather-related fields were measured using a Kestrel 3000 pocket weather meter. This device is pictured in Figure 6; the symbols at the bottom of the meter screen, which indicate the different readings, were deciphered in the field using the manual found online. A screenshot of the symbol key is shown in Figure 7.

Figure 6


Figure 7
After collection, the data was ready to map. The data, copied into the personal content folder in the ArcGIS Online interface, was clicked on, then the Open in ArcGIS Desktop button was clicked. With the data in ArcMap, basic overview maps (shown in Figures 1 and 2) could be made very easily. Next, to get locally stored data files to work with instead of cloud-hosted ones, the two layers (the group zones and the points with the weather data) were exported to a locally stored file geodatabase. The exported point feature class was then used with the IDW interpolation tool to create heat maps for the temperature, wind chill, and dew point fields, using their respective fields. These maps can be seen in Figures 10-12. All maps were classified with equal intervals and five classes. Figure 8 shows the IDW interpolation tool used to make the heat maps. The wind speed and direction map was made by going into the symbology settings for the point feature class, changing the setting to graduated symbols on the WS (wind speed) value, then clicking Advanced, then Rotation, to choose the WD (wind direction) field for rotation, and finally choosing an arrow symbol that points down so that the direction displays correctly on the map. The symbology settings are shown in Figure 9.

Figure 8

Figure 9
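For what it's worth, the IDW step could also be scripted rather than run from the toolbox. A minimal arcpy sketch, assuming a local geodatabase and point feature class name (both hypothetical) and the field names listed above:

```python
import arcpy
from arcpy.sa import Idw

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\gis\microclimate.gdb"  # assumed local geodatabase

# One interpolated surface per weather field, using the tool's default
# power of 2 and default cell size
for field in ("TP", "DP", "WC"):
    surface = Idw("micro_points", field)  # assumed point feature class name
    surface.save("idw_" + field)
```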


Results and Discussion:

Figure 10
The temperature map above (Figure 10) shows slightly cooler temperatures on upper campus. This may have something to do with the higher wind speeds there.
Figure 11
The dew point map above (Figure 11) shows higher dew points at the creek and near the river, as would be expected from the higher moisture content of the air there. Points along the river other than those on the bridge were not shown to have high dew points, but this is because the IDW interpolation was doing the best it could with the points it had, filling the areas between high dew points with high values. Lower moisture content in the air was detected on upper campus, where there was no running water and dry air could blow in.
Figure 12
Figure 12, showing wind speeds and directions, reveals higher wind speeds on upper campus, on the bridge, and near parking lots, as expected. It also shows more variation in wind direction around buildings and trees.

Several points can be made regarding the quality of the data. One is that new, estimated values had to be entered into data points after returning to the lab, because several people used the wrong setting to record dew point. This could have been prevented by having everyone use a key to the symbols on the weather meter screen, or better yet taping one to each meter. Because of this mistake, the dew point data should not be trusted. Another point is that some people recorded times incorrectly; these had to be fixed later for standardization. A final point is that many of the fields would not record more than two significant digits, even if more were typed in. This creates a problem, with many records rounded down because the third and fourth significant digits were never considered by the application. For example, 49.9 would be recorded as 49 instead of 50 or 49.9.

Conclusion:

ArcCollector, properly set up with correct field attributes and with a fully charged battery, can be an amazing tool for amassing data. Although there were troubles with the data in this exercise, that is okay, because this was an exercise for learning.

Sources:

http://www.nkhome.com/pdfs/K3000_Instructions_7.23.10_WEB.pdf



Tuesday, March 28, 2017

Low-tech Distance Azimuth Survey Practice

Introduction

Sometimes technology fails. For times when GPS and other technology have failed or cannot be used, low-tech options need to be understood for supplemental implementation. One of these low-tech options is the distance azimuth survey. At its most basic, this is the collection of a single coordinate point, a distance, an azimuth (bearing), and data about the object being located.

We practiced this method on Putnam Drive, behind the Phillips science building and the Davies student center on the UW - Eau Claire campus. We measured the distance to ten trees from each of three points, each point's GPS coordinates, the azimuth to each tree from the point, and each tree's circumference. Though GPS coordinates may not be available for the initial point in situations requiring this type of survey, a landmark could be chosen from which to measure distances and bearings, and that landmark could be pinpointed later in georeferenced or orthorectified aerial imagery to find its coordinates.
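The geometry behind the method is worth spelling out: each located object is just the origin coordinate displaced by the measured distance along the measured bearing. A minimal sketch in projected (UTM) coordinates, with hypothetical numbers:

```python
import math

def locate(x0, y0, distance_m, azimuth_deg):
    """Coordinates of a surveyed object from a known origin (in UTM
    meters), a taped distance, and an azimuth measured clockwise
    from north."""
    az = math.radians(azimuth_deg)
    return x0 + distance_m * math.sin(az), y0 + distance_m * math.cos(az)

# Hypothetical tree 14.2 m away at a bearing of 215 degrees
x, y = locate(617205.0, 4958341.0, 14.2, 215.0)
print(round(x, 1), round(y, 1))
```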

Methods:

As stated before, we went as a class to Putnam Drive to survey in three groups of about six each. Trading jobs periodically, each member of each group did every job for practice. At one of the three survey points, a sighting compass (the type looked through with one eye to read the azimuth while the other eye fixes on the target) and a tape measure were used. At another, a compass and a two-piece distance measurement device (one unit showing the reading, and a second unit that the first ranged against) were used. At the third survey point, a laser gun was used that gave both distance and azimuth readings. At all survey points, a small tape measure was used to measure the circumference of each tree. Circumference was measured in centimeters, distance to the tree in meters, and azimuth in degrees.

After collecting this data in the field on paper, the table was entered into Microsoft Excel. The table created for this project is shown in Figure 1. One important detail in creating this table is that capitals and spaces should not be used in the column titles; ESRI software will choke on the data later if they are used. Also note that the GPS coordinates observed at the three survey points are stored in the x and y fields.
Figure 1
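For reference, a table following those naming rules might look like this (hypothetical values; lowercase headers, no spaces):

```
x,y,distance,azimuth,circumference
617205.0,4958341.0,14.2,215,96
617205.0,4958341.0,9.8,112,58
```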
After creating this table, the data could be brought into a new ArcMap map. The table contained within the Excel file was added to the table of contents. In ArcMap, a new file geodatabase was created and set as the default geodatabase for the map document. This can be seen in Figure 2.
Figure 2
The data was then converted into lines using the Bearing Distance To Line tool, run with the parameters seen below in Figure 3. All tools used in this project were found using the search function in ArcMap; this function can be seen alongside the tool in Figure 4.
Figure 3
Points were then created from those lines using the Feature Vertices To Points tool, seen in Figure 4.
Figure 4
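Both geoprocessing steps can also be run from Python. A sketch assuming a hypothetical geodatabase, table, and field names matching the conventions in Figure 1 (Feature Vertices To Points requires an Advanced license):

```python
import arcpy

arcpy.env.workspace = r"C:\gis\survey.gdb"  # assumed project geodatabase

# Build lines from the origin coordinates plus the distance and azimuth fields
arcpy.BearingDistanceToLine_management(
    "survey_table", "tree_lines",
    x_field="x", y_field="y",
    distance_field="distance", distance_units="METERS",
    bearing_field="azimuth", bearing_units="DEGREES",
    spatial_reference=arcpy.SpatialReference(26915))  # NAD83 / UTM zone 15N

# Drop a point at the far end of each line, i.e. at each tree
# (this tool requires an Advanced license)
arcpy.FeatureVerticesToPoints_management("tree_lines", "tree_points", "END")
```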
After creating these points, the final maps displaying the data were made; they can be seen below under Results in Figure 5. They were created in ArcMap by using a data frame for each map and adding the layers needed in each separate data frame (added by selecting the Insert menu, then Data Frame). Basemaps were added using the Add Data button's down arrow, then Add Basemap, and finally selecting the topographic map in the resulting window. Text was added via the Draw toolbar, and north arrows, scale bars, and titles were added using the Insert menu.

Results:

The resulting maps are shown below in Figure 5. Clearly seen in the top locator map are the three separate points the surveys were conducted from. Below that map, the three separate areas surveyed are shown at a larger scale.
Figure 5
In reflecting on the accuracy of the data, there are a few concerns. The first is that, after adding the basemap, it was apparent that the initial GPS coordinates marking the points from which the other points were measured (using distance and azimuth) were not accurate. These errors could be attributed either to poor GPS data or to an inaccurate basemap, but it is far more likely that the error came from the GPS data. This is speculated because the basemaps were created by ESRI, from whom high accuracy is expected, and because the data was collected at the base of a steep hill and under dense tree cover, where low GPS accuracy is a known issue. Also, the inaccuracy of the GPS coordinates moved the points to both sides of the road from which points one and two were collected; if the issue were bad digitization by ESRI, the points would likely have fallen on only one side of Putnam Drive, indicating a slight north-south inaccuracy in digitization. Another reason this GPS data could be wrong is that the coordinates were recorded by hand from the GPS unit; the unit may simply not have updated fast enough at the new location, or a user could have written down a wrong digit. This GPS error would not be a problem if a marked landmark or something similar were used instead, possibly one visible in aerial imagery whose coordinates could be found later. In a situation in which you had no GPS unit, you would have no GPS coordinates at all!

Another concern is simple user error in reading the compass bearing. The sighting compass, which one looks through using double vision, is especially concerning; it is slightly difficult to get a correct reading. Instructions worth reading before attempting to use one of these compasses can be found here. This was not a problem, however, when using the laser distance measuring instrument, which displayed a bearing at the same time as the distance to the object being pointed at, with the click of a button.

Conclusion:

In a pinch, a distance azimuth survey works! In situations in which GPS technology is not available or cannot be used, and especially if one has had practice with the available equipment, moderately accurate data can be collected.





Wednesday, March 8, 2017

Pix4D Image Processing

Introduction:

Pix4D is software that can be used to create point clouds and orthomosaic images from UAS aerial imagery. The software uses advanced algorithms to take overlapping images from the data set and build a three-dimensional model of the area observed.

Pix4D Basics:

What is the overlap needed for Pix4D to process imagery?
One necessity for processing UAS imagery in Pix4D is adequate image overlap. The recommended overlap in the general case is 75% frontal overlap and at least 60% side overlap. Pix4D also recommends that a regular grid pattern be used (as shown in Figure 1) and that a constant height be maintained during data collection.
Figure 1
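To make those overlap figures concrete, the flight-line spacing and photo interval they imply can be back-calculated from the camera footprint. The camera numbers below are illustrative assumptions, not from this exercise:

```python
# Back-of-envelope flight planning from the overlap targets above.
# All camera numbers are illustrative assumptions.
height_m = 75.0                        # flight height above ground
sensor_w_mm, sensor_h_mm = 13.2, 8.8   # a typical 1-inch sensor
focal_mm = 8.8

footprint_w = height_m * sensor_w_mm / focal_mm  # across-track footprint (m)
footprint_h = height_m * sensor_h_mm / focal_mm  # along-track footprint (m)

side, frontal = 0.60, 0.75
line_spacing = footprint_w * (1 - side)     # spacing between flight lines (m)
trigger_dist = footprint_h * (1 - frontal)  # distance between exposures (m)

print(round(line_spacing, 1), round(trigger_dist, 1))  # 45.0 18.8
```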


What if the user is flying over sand/snow, or uniform fields?
These conditions make it much more difficult for the software to find matching points in overlapping images when processing complex geometry and large uniform areas. For these conditions, a minimum of 85% frontal overlap and 70% side overlap should be used. Also recommended is adjusting the sensor's exposure settings to get as much contrast as possible.

What is rapid check?
Rapid check is like regular initial processing but doesn't produce as good of an initial image. It is meant to be faster, serving only to ensure there is enough overlap for full processing later.

Can Pix4D process multiple flights? What does the pilot need to maintain if so?
It can, but special care must be taken. The flight patterns of the flights must overlap sufficiently, the flights must be taken in very similar visual conditions, and the spatial resolution (flight height) must be the same.
Figure 2


Can Pix4D process oblique images? What type of data do you need if so?
Pix4D can process oblique imagery for three-dimensional models, but cannot create an orthomosaic in this mode. The software needs imagery at three different heights above the object being modeled, with each rise in elevation corresponding to a decrease in camera angle. This is shown in Figure 3.
Figure 3

Are GCPs necessary for Pix4D? When are they highly recommended?
Ground control points are not strictly necessary, just as georeferencing is not, but they should be used in high-precision georeferencing applications and when making an orthomosaic. Situations where GCPs should be used include city reconstruction and reconstruction aided by mixed nadir and oblique imagery.

What is the quality report?
The quality report gives final quality information after data processing. It provides statistics and other information that help the user determine the quality and adequacy of the images created for their specific use.

Methods:

Figure 4
After opening Pix4D Mapper and clicking "New Project," a new project was made with a descriptive title including the name of the site, date, platform, and altitude, and was saved to my personal folder (Figure 4). Next, the images were added to the project (Figure 5). These came from the supplied "Litchfield" folder, which contained folders for two overlapping flights. Due to an error not yet fixed in Pix4D, the shutter of the camera model detected by Pix4D is set to global shutter when it is in fact a rolling shutter. This needed to be changed in the "Edit Camera Model" window so that the settings looked as shown in Figure 6. Clicking on, the output coordinate system settings were left unchanged and the "3D Maps" processing options template was chosen. Processing steps 2 and 3 were then deselected as shown in Figure 7. The "Processing Options" button, also seen in Figure 7, was then clicked, and under the third processing step's tab the Triangulation method option was selected before initial processing was started.
Figure 5

Figure 6
Figure 7
After initial processing completed and its resulting quality report was examined (Figures 8-15), processing steps two and three were selected, "Initial Processing" was deselected, and final processing was run (Figure 16). In the quality report, good overlap was seen everywhere but the edges, where there was understandably less overlap. All images were used by the software.
Figure 8

Figure 9

Figure 10

Figure 11

Figure 12

Figure 13

Figure 14

Figure 15

Figure 16
After processing steps two and three finished, experimentation with the resulting DSM display options could begin. Turning the tie points, point cloud, and triangle mesh on and off individually, the various views were examined. A view of the triangle mesh is seen below in Figure 17.
Figure 17
Another way to display the resulting DSM was a flythrough video animation. Using the button highlighted in Figure 18, clicking the user recorded views button shown in Figure 19, recording individual points, and then using the parameters shown in Figures 20-21, I rendered a video. The video is shown below these figures.

Figure 18
Figure 19

Figure 20

Figure 21
Results:

After creating the video (shown below), maps could be created. These are shown in Figures 22-23.


Figure 22 (vertical unit is meters)

Figure 23
These two maps show the data in different ways and can be cross referenced for clarification about certain areas of the map.

In discussing the mapped data, a few faults were found; however, they are far from serious enough to keep the resulting three-dimensional images from being used to represent the mine. First, the data is not of high enough quality to successfully recreate the cars, tractors, and other machinery at the site. This can be attributed to the orientation of, and distance from, these objects at which the images were taken. If recreating these complex geometric objects were the goal, oblique imagery from multiple angles and heights would have to be taken for each object, and even then errors may occur. An example (a tractor) is shown below in Figure 24.
Figure 24
Another point of discussion is that no ground control points were taken or used, and thus the resulting image is not an orthomosaic but simply a georeferenced image. To dramatically increase data quality, GCPs should be used.

One final point of discussion is that the maps produced include data in the south and northwest portions of the image mosaic that the software tried to interpolate but that should not have been processed. This could be cut out with the Clip tool in ArcGIS Pro or ArcMap.
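As a sketch of that cleanup step, the raster Clip tool can be run from Python against a hand-drawn area-of-interest polygon; all paths and names here are hypothetical:

```python
import arcpy

# All paths are hypothetical: clip the exported surface to a hand-drawn
# area-of-interest polygon, discarding the badly interpolated margins
arcpy.Clip_management(r"C:\gis\mine\dsm.tif", "#",
                      r"C:\gis\mine\dsm_clip.tif",
                      in_template_dataset=r"C:\gis\mine\aoi.shp",
                      clipping_geometry="ClippingGeometry")
```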

Conclusion:

In conclusion, Pix4D is extremely powerful yet fairly simple to use, given adequate knowledge and understanding of the data being used. For example, the sensor's technical characteristics need to be known for fine tuning, and a properly planned and executed flight with adequate frontal and side overlap is required. In the end, data is produced that can be used both by the Pix4D software itself and by other applications, such as ESRI ArcGIS or CAD applications. This data can be used to create maps of color-ramp-symbolized DSMs, ArcScene scenes, or even processed into hillshade rasters and other products.