GIF Lidar Workshop Agenda
gif.berkeley.edu/documents/lidar/giflidarworkshop2010.pdf

TRANSCRIPT
http://gif.berkeley.edu | Geospatial Innovation Facility, College of Natural Resources, UC Berkeley
Geospatial Innovation Facility Workshop: Introduction to Lidar data and analysis
April 5, 2010
GIF Lidar Workshop Agenda
PART 1– Introduction to LiDAR
• LiDAR basics
• Types
PART 2– LiDAR analysis
• DEM & Terrain products
• Vegetation products
• Applications and Effects on RS
PART 3– Next steps
• Future of LiDAR?
• Resources
Lidar Workshop: Part 1
Introduction to Lidar
•Lidar basics
•Types of Lidar
•Lidar data
•Application examples
Introduction to Lidar

Lidar = Light Detection and Ranging
Lidar Data:
1. Range. The time it takes a pulse of light to travel to a target and back to the sensor is converted to a distance, and from there to an elevation above sea level.
R = ½(tc)
• R = range
• t = time
• c = speed of light
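The range equation can be sketched directly; the only subtlety is halving the round-trip time, since the pulse travels to the target and back (the example travel time is illustrative):

```python
# Range equation R = 1/2 * t * c: the pulse travels out and back,
# so only half the round-trip time counts toward the one-way range.
C = 299_792_458.0  # speed of light, m/s

def pulse_range(round_trip_time_s: float) -> float:
    """Range in meters from a round-trip pulse travel time in seconds."""
    return 0.5 * round_trip_time_s * C

# A return arriving ~6.67 microseconds after emission is ~1000 m away.
print(round(pulse_range(6.67e-6)))  # 1000
```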
2. Intensity. The recorded strength (peak amplitude) of the returned laser pulse.
Image modified from Lefsky et al. 2004 with tree graphic from globalforestscience.org.
Lidar for Data Capture

Advantages:
• Collect large data sets quickly and economically (100 sq mi to 1,000+ sq mi)
• Sensor can be flown at night
• Acquisition in all seasons, leaf‐on and leaf‐off
• Can be collected during cloudy conditions IF clouds are above the aircraft
• Very accurate

Disadvantages:
• Cannot sense through rain, thick clouds, haze, wind-blown dust, or smoke
• Limited in thick forest or dense vegetation
• Surface materials may absorb the laser
• Large data file sizes
• Small project areas not economical
Lidar Components

Laser & receiver
•Records time to target
•Laser wavelengths can differ (e.g. 1064 nm)
•100 kHz systems available today
IMU (Inertial Measurement Unit)
•Gyroscopes and accelerometers
•Records roll, pitch, yaw of aircraft
GPS
•Differentially corrected
•Provides cm accuracy of aircraft
•Allows cm accuracy of laser pulse
On board computer
•Records data: laser distance and intensity; IMU info; GPS info
• Converts into XYZ
•On‐board display
Scanning mechanism
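To illustrate how the on-board computer combines GPS position, attitude, and laser range into ground XYZ, here is a deliberately simplified sketch: it uses only scan angle and heading, whereas real systems apply full roll/pitch/yaw rotations and lever-arm offsets. All function and parameter names are illustrative:

```python
import math

def laser_ground_point(gps_xyz, rng, scan_angle_deg, heading_deg=0.0):
    """Simplified geocoding of one laser return.

    gps_xyz: aircraft position (x, y, z) from differential GPS
    rng: laser range in meters
    scan_angle_deg: across-track beam angle from nadir
    heading_deg: aircraft heading (IMU roll/pitch corrections omitted)
    """
    x0, y0, z0 = gps_xyz
    scan = math.radians(scan_angle_deg)
    head = math.radians(heading_deg)
    # Across-track offset and vertical drop of the tilted beam.
    across = rng * math.sin(scan)
    down = rng * math.cos(scan)
    # Rotate the across-track offset by the aircraft heading.
    return (x0 + across * math.cos(head),
            y0 + across * math.sin(head),
            z0 - down)
```

At nadir (scan angle 0), the return lands directly below the aircraft at a height of GPS altitude minus range.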
Lidar Differences

Airborne or ground
Type of scanning mechanism
Single, multiple, or waveform returns
Footprint size
Posting density
Application: atmospheric / terrestrial / bathymetric
Airborne Scanning Lidar

The integration of a device such as an oscillating mirror allows for the development of scanning lidar. Common scanning mechanisms: oscillating mirror, Palmer scanner, rotating polygon, fiber optic array.
Ground‐based Lidar

Ground‐based lidar systems are hemispherical scanning laser range finders that fire millions of laser pulses and record detailed structural information at ranges of up to 200 m. Data can be used to derive:
• canopy height
• basal area and stem density
• vertical foliage distribution
• leaf area index
Sources: www.ensisjv.com/; Omasa, et al. 2008.
Lidar Returns

Discrete return
• Single
• Multiple
Waveform return
Discrete‐return lidar devices record major peaks that represent discrete objects in the path of the laser illumination.
Waveform‐recording systems capture the entire signal trace for later processing.
Lidar Specifications

Flying altitude (500‐1500 m AGL)
Laser pulse:
• Pulse rate frequency, measured in kHz (as high as 167,000 pulses per second)
• Laser is coherent light, but does spread
• Beam divergence, measured in milliradians
Scan:
1. Scan angle, measured in degrees
2. Scan frequency, measured in Hz
Footprint Size & Density
[Figure: grid of 0.5 m laser footprints at 1 m posting spacing]
The laser footprint is approximately circular on the ground.
Footprint size is a function of:
• Beam divergence
• Flying height/speed
• Scan angle

Posting density is a function of:
• Flying height/speed
• Scan angle
• Scanning frequency
• Pulse repetition
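These relationships can be put into rough formulas. A back-of-the-envelope sketch, assuming an oscillating-mirror scanner; the functions and parameter values are illustrative, not a vendor formula:

```python
import math

def footprint_diameter(height_m, divergence_mrad):
    """Footprint grows linearly with flying height and beam divergence
    (a 0.5 mrad beam at 1000 m AGL gives a ~0.5 m footprint)."""
    return height_m * divergence_mrad / 1000.0

def point_spacing(height_m, speed_mps, half_scan_deg, pulse_hz, scan_hz):
    """Rough along-track and across-track point spacing in meters.

    Swath width follows from flying height and scan angle; points per
    scan line follow from pulse rate divided by scan frequency.
    """
    swath = 2.0 * height_m * math.tan(math.radians(half_scan_deg))
    pts_per_line = pulse_hz / scan_hz
    across = swath / pts_per_line
    along = speed_mps / scan_hz
    return along, across
```

For example, flying at 1000 m and 60 m/s with a ±20 degree scan, a 100 kHz pulse rate, and a 50 Hz scan frequency gives roughly 1.2 m along-track by 0.36 m across-track spacing.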
Terrestrial Lidar

Lasers for terrestrial applications generally have wavelengths in the range of 900–1064 nanometers, where vegetation reflectance is high.

One drawback of working in this range of wavelengths is absorption by clouds, which impedes the use of these devices during overcast conditions.
Bathymetric Lidar

Bathymetric lidar systems typically use a blue‐green laser centered on 532 nm and a raster scanning mechanism to acquire lidar data for measuring bathymetry.
Source: NASA Experimental Advanced Airborne Research Lidar (EAARL)
Lidar Data
Mass points: x, y, z, intensity
Lidar Point Density
Visualization of our high density Lidar product from NCALM: Mass points: x, y, z, intensity
Lidar Workshop: Part 2
PART 2– LiDAR analysis
• DEM & Terrain products
• Vegetation products
Lidar Data Products

DEM – Digital Elevation Model
• elevation points over a contiguous area
DTM – Digital Terrain Model
• elevation information about bare‐earth surface without the influence of vegetation or man‐made features
DSM – Digital Surface Model
• elevation information about all features in the landscape, including vegetation, buildings and other structures
CHM – Canopy Height Model
• Height information about vegetation features with elevation removed
[Figures: Digital Elevation Model, Canopy Height Model, Digital Terrain Model, Digital Surface Model]
Lidar Data Analysis Flow

[Flow diagram] Lidar data (mass points) are separated into ground and above‐ground returns. Ground returns are interpolated into the DTM, from which slope and aspect are derived. Above‐ground returns yield the DSM and, by differencing, the CHM. From the CHM, individual trees are delineated, giving canopy cover, tree height, crown size, species, DBH, and height to live crown. Field data (from FFEH plots) and the lidar height profile support these estimates.
Digital Elevation Model (DEM)

Available for the Sugar Pine area at 1 m resolution.
This can be sampled at coarser resolutions as needed.
Aspect and slope products can be made from these layers.
Vegetation Analysis

There are many ways to analyze lidar data for vegetation information. In general, we can divide these methods into two types:
1. analyzing the canopy height model (i.e., the last‐return data);
2. analyzing the height profile through the full range of lidar returns.
Canopy size, tree height, spacing, canopy cover
Vegetation Analysis (continued)

[Figure: lidar height profile annotated with the 10th, 45th, and 95th percentile heights of the returns]
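Percentile heights like these are standard lidar vegetation metrics. A minimal sketch using a nearest-rank percentile; the return heights are made-up illustration values:

```python
def height_percentile(heights, pct):
    """Nearest-rank percentile of a list of return heights (meters)."""
    ranked = sorted(heights)
    k = max(0, min(len(ranked) - 1, round(pct / 100.0 * len(ranked)) - 1))
    return ranked[k]

# Hypothetical return heights within one plot, in meters.
returns = [0.2, 1.1, 3.4, 7.8, 12.5, 15.0, 18.2, 21.7, 24.9, 26.3]
print(height_percentile(returns, 95))  # 26.3
print(height_percentile(returns, 10))  # 0.2
```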
Individual Tree Level Analysis

The canopy height model (CHM) can be used to derive many forest-related variables.
[Flow diagram: lidar data (mass points) are filtered into ground and vegetation returns, interpolated into the DTM and DSM, differenced into the CHM, and segmented (watershed segmentation) into individual trees, yielding tree height, canopy cover, crown size, species, and DBH.]
Treetop Detection

We also investigated a range of techniques to find treetops:
• e.g. local maxima approach, shape matching
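A minimal version of the local-maxima approach: flag CHM cells that are higher than every neighbor within a window. The toy grid below is illustrative; real implementations typically vary the window with expected crown size:

```python
def local_maxima(chm, window=1):
    """Return (row, col, height) for cells strictly higher than all
    neighbors within `window` cells: a minimal treetop detector."""
    rows, cols = len(chm), len(chm[0])
    tops = []
    for r in range(rows):
        for c in range(cols):
            h = chm[r][c]
            neighbors = [chm[rr][cc]
                         for rr in range(max(0, r - window), min(rows, r + window + 1))
                         for cc in range(max(0, c - window), min(cols, c + window + 1))
                         if (rr, cc) != (r, c)]
            if h > 0 and all(h > n for n in neighbors):
                tops.append((r, c, h))
    return tops

# Toy 4x4 CHM with two "trees" of height 9 m and 7 m.
chm = [[0, 0, 0, 0],
       [0, 9, 0, 0],
       [0, 0, 0, 7],
       [0, 0, 0, 0]]
print(local_maxima(chm))  # [(1, 1, 9), (2, 3, 7)]
```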
Individual Tree Detection

Treetops together with segmentation approaches allow us to automatically delineate trees across the landscape.
Sugar Pine lidar products
Lidar Intensity

Lidar light has a very small bandwidth (2–5 nm); most multispectral remote sensing bandwidths are much wider, typically 50–100 nm.
The intensity value is not directly comparable to the reflectance from the same object in optical remote sensing due to scattering and other factors.
There are no guidelines on the interpretation of Lidar intensity images.
But intensity still can be useful for analysis.
Lidar Applications
From Lefsky et al. 2002:
Only a few application areas have been rigorously evaluated, and many other applications are feasible.
Developments in lidar remote sensing are occurring so rapidly that it is difficult to predict which applications will be dominant in 5 years.
Current ecological applications of lidar remote sensing:
• ground topography,
• 3D structure and function of vegetation canopies,
• forest stand structure attributes.
Lidar Applications: Vegetation Canopies

From Vierling et al. 2008. Lidar: shedding new light on habitat characterization and modeling. Frontiers in Ecology and the Environment.
Lidar can provide fine‐grained information about the 3‐D structure of ecosystems across broad spatial extents.
This data can be used to investigate:
• Animal‐habitat relationships;
• Fire behavior;
• Carbon and energy balance;
• Infiltration, canopy capture and sublimation.
Lidar‐derived vertical distribution plots showing the percentage of laser pulse hits that occurred within a particular height classification. This kind of analysis helps us more fully understand the 3‐D structure of vegetation.
[Map legend: Percent Canopy, Low: 0 to High: 100]
Lidar Applications: Forest Structure

From Lucas et al. 2009. Retrieving forest biomass through integration of CASI and Lidar data. IJRS.

Lucas et al. (2009) used CASI and lidar data to map biomass in a complex Australian forest.

[Figure panels: leaf, branch, trunk, and total biomass]
Lidar Applications: Powerline monitoring
• Power line route planning
• Detection of clearance violations
• Vegetation encroachment, to minimize potential flash-over and bushfire risks
• Tower location planning and catenary curves of lines
• Calculating overhead heights for through and high vehicle traffic analysis
Why National Lidar is Needed

Beaufort County, North Carolina: modeling flood inundation if sea level rises 1 m.

[Maps: inundation with <= 1 m sea-level rise, classed as certain or uncertain, calculated using a 30‐m DEM versus 3‐m lidar data]

Source: ASPRS Special Session on National Lidar. Moderator: Greg Snyder, USGS Land Remote Sensing Program.
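The comparison above can be reduced to a per-cell threshold test, a simple "bathtub" sketch that ignores hydrologic connectivity; the DEM values are made up. The point is that a finer, lidar-derived DEM changes which cells cross the threshold:

```python
def inundated_cells(dem, sea_level_rise_m):
    """Cells at or below the new sea level in a row-major elevation grid."""
    return [(r, c) for r, row in enumerate(dem)
            for c, z in enumerate(row) if z <= sea_level_rise_m]

# Hypothetical 2x2 elevation grid, meters above current sea level.
coarse = [[0.4, 1.6],
          [2.0, 0.9]]
print(inundated_cells(coarse, 1.0))  # [(0, 0), (1, 1)]
```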
Effects of LiDAR on RS: DEM/DSM extraction
Complexity of the problem has been essentially eliminated
Accuracy significantly improved
Lidar Workshop: Part 3
PART 3– Upcoming research in LiDAR
– LiDAR resources
Why National Lidar is Needed

Key national applications:
• Updating USGS NED
• Floodplain mapping and risk assessment
•Emergency response, public safety, critical infrastructure
•Urban infrastructure planning and development
•Ecosystem structure, function and health
•Habitat assessment and management
• Water management

Source: ASPRS Special Session on National Lidar. Moderator: Greg Snyder, USGS Land Remote Sensing Program.
http://desdyni.jpl.nasa.gov/
NASA DESDynI Mission
• Coordinated mission addressing
• earth surface deformation,
• ecosystem structure, and
• the dynamics of ice
• 8‐day revisit frequency
• Combines two sensors:
1. an L‐band Interferometric Synthetic Aperture Radar (InSAR) system with 35 m spatial resolution and multiple polarization
2. a multiple beam Lidar operating in the infrared with 25 m spatial resolution and 1 m vertical accuracy
More info here: http://desdyni.jpl.nasa.gov/files/DESDynI%20Overview_final.pdf
DESDynI
Ecosystem measurements allow:
• estimation of forest height with meter-level accuracy
•estimation of three‐dimensional forest structure
•direct measurements of live above ground woody biomass (carbon stock) and structural attributes such as volume and basal area.
Status of USGS Lidar Advisory Committee
Began October 2008.

Mission: Increase availability and use of consistent, accurate, timely, and contiguous national Lidar data.

USGS disciplines are coordinating to advance:
– Data availability, standards, and applications
– The Center for Lidar Information Coordination and Knowledge (CLICK)
– A national Lidar program concept
– Stakeholder interaction (NSGIC, AASG, MAPPS, ASPRS, FGDC, IFTN, NDEP)
http://lidar.cr.usgs.gov/
University‐based research

LAStools
http://www.cs.unc.edu/~isenburg/lastools/

Simple tools for LiDAR data management: converting, viewing, and compressing LIDAR data in LAS format. No analyses. Executables, API, and source code (lastools.zip):

• txt2las.exe
• las2txt.exe
• lasinfo.exe
• las2shp.exe
• shp2las.exe
• las2tin.exe
• las2dem.exe
• las2iso.exe
• lasmerge.exe
• las2las.exe
• lasthin.exe
• lasview.exe
• laszip.exe
LAStools: example
Terrasolid
http://www.terrasolid.fi

• Software developed in Finland
• Based on Bentley's MicroStation
• Very good management / analyses
• Expensive
• Converting, viewing, and compressing LIDAR data in LAS format
libLAS
http://liblas.org/

libLAS is a C/C++ library for reading and writing ASPRS LAS versions 1.0, 1.1, and 1.2 data: converting, viewing, and compressing LIDAR data in LAS format. The site also has great information on software and resources.
LiDAR news: http://lidarnews.com/

General updates on LiDAR in science and otherwise. Forum‐based. Lists upcoming conferences, events, and more.
Studio Clouds from Alice Labs
http://www.alicelabs.com/

• Based on either 3ds Max or Maya
• Very promising
• Learning curve?
Topographic Laser Ranging and Scanning, by J. Shan & C. Toth (editors)

• Introduction to theories and principles
• Airborne and terrestrial laser scanners
• Waveform analysis
• Management of data
• Filtering
• Forest inventory topics
• Feature extraction
• Much, much more…
Appendix
Lidar Applications: Topography

Photogrammetric techniques have long been used to collect topographic information from stereo imagery.
From Dietrich and Perron, 2006. The search for a topographic signature of life. Nature:
The influence of life on topography is a topic that has remained largely unexplored. Erosion laws that explicitly include biotic effects are needed to explore how intrinsically small‐scale processes can influence the form of entire landscapes, and to determine whether these processes create a distinctive topography.
Dietrich et al. are using lidar and radar to compare topographic landscapes on Earth and Mars.
NEXTMap Terrain On Demand
http://www.terrainondemand.com/
NEXTMap
• Affordable high‐resolution digital geometries and imagery data
• Interferometric Synthetic Aperture Radar (IFSAR) technology produces:
– digital elevation models at 5 m to 25 m resolution,
– 1m orthorectified radar images, and
– other value‐added products (contours, TINs)
• NEXTMap California program, based on IFSAR mapping of California in 2004‐2005.
You will be working with the following datasets, which are supplied:
flightLine.las: Sample of raw data from the sensor
Clip_LiDAR.las: Small sample of multiple flight lines combined into a single file (to increase the density of points)
Clip_LiDAR_above.las: Contains only above-ground points from Clip_LiDAR.las
Clip_LiDAR_grd.las: Contains only ground points from Clip_LiDAR.las
Naip_2005.tif: Imagery collected by the National Agriculture Imagery Program over the same area as Clip_LiDAR.las
Because LiDAR analysis is still in its infancy, there is not a single tool out there that can "do it all." Various software packages complement each other in different areas of functionality; for example, PointVue is excellent for visualization but does not offer many analysis options. In the following exercises we will use PointVue LE, the LiDAR Tools plugin for ENVI, and ArcGIS.

Visualizing data

We will now look at some characteristics of the raw data using PointVue:

1. Open PointVue from the Start menu.
2. From the main menu, choose File / Open, navigate to your working directory, and open flightLine.las. This is the raw data out of the sensor. The program opens with the rotating tool selected, so you can immediately drag the screen with your mouse to look at the data from different perspectives (hint: use the wheel for zooming).

It's easy to get carried away with all the spinning and lose your point of reference. Don't worry; you can always go back to the original view through the menu: View / Top.

Look around the data. Does it contain any obvious artifacts? Use the spinning controls and your mouse to look underneath the point cloud. You will notice outlier points far away from the majority of the data; these are erroneous data called "low points." It may help to change the displayed point size to make these points more visible: from the main menu, choose View / Settings / General tab / Point Size. Try 2 or 3.

Let's change the depth exaggeration to really see the 3-D nature of the point-cloud data: from the main menu, choose View / Settings / General tab / Depth Exaggeration; change it to 5 and click Apply.
Recall that there are a number of different LiDAR hardware systems. Can you figure out what type of sensor collected this data? (e.g. Palmer scanner, oscillating mirror, fiber scanner, rotating polygon).
To figure out the type of sensor used to collect this data, use PointVue to zoom in to the edges of the flightline dataset to the level of individual points.

Basic .las file information

PointVue also allows us to easily obtain some basic information about LiDAR files. This information, including sensor type, projection, etc., is inherently kept within .las files but may be more or less complete depending on the supplier of the data. To view this information, choose Tools / File Information from the main menu. Note that we can use this information to find out the dimensions of the data (X, Y, Z) as well as how many points comprise the point cloud.

Let's use the above information to calculate the average density of the working file. First, obtain the X and Y dimensions of the file by calculating the difference between the respective maximum and minimum values. Note that the units in the FileInfo screen are reported in the projection of the file (UTM / NAD83, Zone 10 N, i.e., meters). Divide the total number of points by the area of the point cloud to find the number of points per square meter. This is one of the most basic and common ways to describe LiDAR data.

Now open Clip_LiDAR.las in PointVue. This is a small clip of the final product delivered by the supplier. Right away you should notice that the point cloud appears more "solid" than the previous file. This is because Clip_LiDAR.las is a combination of a few overlapping flight lines to increase the density of the data. Recalculate the density of this dataset using the steps above.

LiDAR intensity

In addition to the X, Y, and Z position of points, LiDAR sensors are capable of recording the intensity of the returns. We can look at the intensity directly or in combination with the height (rainbow-color) information. Let's explore some of the advantages of having LiDAR intensity data.
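The average-density calculation described above (total points divided by the bounding-box area from the file header) can be sketched in a few lines; the point count and UTM extents below are hypothetical, not the values from flightLine.las:

```python
def point_density(num_points, min_x, max_x, min_y, max_y):
    """Average points per square meter from .las header extents
    (coordinates assumed to be in meters, e.g. UTM)."""
    area = (max_x - min_x) * (max_y - min_y)
    return num_points / area

# E.g. 4.5 million points over a hypothetical 1500 m x 1500 m tile:
print(point_density(4_500_000, 500_000, 501_500, 4_200_000, 4_201_500))  # 2.0
```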
With Clip_LiDAR.las still open, make sure that PointVue’s viewing perspective is from “Top.” By quick visual inspection, are there any roads in this dataset?
Click the “Apply Intensity” button (black-red gradient) to add the intensity in the view. Notice that the road is now clearly visible. Explore the dataset and notice how the view changes when you apply or deactivate the intensity overlay.
Analyzing the data

In this portion of the lab we will try to extract trees from clip_LiDAR.las, one of the "basic" uses of LiDAR. First, let's create a digital surface model (DSM), a digital elevation model (DEM), and a canopy height model (CHM).

We can think of the DSM as an interpolated surface that would result if we spread a malleable sheet over a forest or a city. It would include the outer shapes of trees, buildings, cars, and anything else above the ground. The DEM, conversely, simulates how the earth's surface would appear if we removed all above-ground objects, including trees, buildings, etc. A CHM is the difference between the DSM and the DEM; it's like the DSM without the effect of the topography.

We will use a LiDAR plug-in for ENVI to generate these surfaces. ENVI is a comprehensive software package developed for analyses of remotely sensed imagery. In particular, it has extensive tools for analysis and visualization of hyperspectral data. Like most (if not all) other major remote sensing / GIS software suites, it was not designed specifically for LiDAR analysis, and for this reason we will use a separate plug-in. The plug-in was pre-installed on the GIF machines, but you can freely download it from the ITT website.

Generating DSM

First, we will create a DSM. Open "ENVI + IDL" from the Start menu. From the main ENVI menu, click LiDAR / Rasterize LiDAR Data, and choose clip_LiDAR_above.las. clip_LiDAR_above.las is a subset of points from the original clip_LiDAR.las file classified as "above ground." Notice that the working directory also contains clip_LiDAR_grd.las, which contains only points that were classified as "ground points." The classification is typically performed by the data provider or by software such as TerraScan, both beyond the scope of this lab. The data can then be separated into multiple files based on classes, pulse return numbers, etc., using a free set of command-line tools known as lastools.
Using lastools is slightly more technical, so please ask in lab if you're interested in further details.

Make sure the following options are selected:
• Select return number: All Returns
• Enter raster spacing [m]: 0.25
• Projection: UTM, NAD83, Zone 11 N
• Select products: [1] Max, [2] min, and [3] mean elevations, [4] slope, [5] aspect, [8] intensity, and [9] Bare Earth Model

Create a "results" folder and choose a sensible name for the output (e.g., "DSM_025m.img"). When the algorithm finishes successfully, an "Available Bands List" window will appear. This window contains all files that are currently open and can be viewed within ENVI. Notice that the image you just generated appears in the list of files and has a subset of layers or "bands"
(e.g. Maximum Elevation, Minimum Elevation, etc.). Select the Maximum Elevation band and click the Load Band button below; the interpolated surface will appear in a new window. Double-click inside the “Image window” or select Tools / Cursor Location to view the value under the cursor. Note that the units are meters. Let’s make the visualization more interpretable.
1. Select Tools / Color Mapping / ENVI Color Tables, scroll down and select “Rainbow + white,”
2. Stretch the histogram to the inner 98% of the data by clicking Enhance / [Scroll] Linear 2%.
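Stepping back to the rasterization itself: ENVI's tool is more sophisticated, but the core idea behind the Maximum Elevation band, binning points to a grid and keeping the highest z per cell, can be sketched as follows. The toy points and grid parameters are illustrative, and empty cells are simply left as nodata rather than interpolated:

```python
def rasterize_max(points, min_x, min_y, spacing, ncols, nrows, nodata=-9999.0):
    """Bin (x, y, z) points into a row-major grid, keeping the maximum z
    per cell: the 'Maximum Elevation' product in miniature."""
    grid = [[nodata] * ncols for _ in range(nrows)]
    for x, y, z in points:
        c = int((x - min_x) / spacing)
        r = int((y - min_y) / spacing)
        if 0 <= r < nrows and 0 <= c < ncols:
            grid[r][c] = max(grid[r][c], z)
    return grid

# Three hypothetical returns binned at 0.25 m spacing into a 2x2 grid.
pts = [(0.1, 0.1, 5.0), (0.2, 0.1, 8.0), (0.3, 0.3, 2.0)]
grid = rasterize_max(pts, 0.0, 0.0, 0.25, 2, 2)
```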
Take a look at the other “bands,” such as Minimum Elevation and Intensity. Notice how they differ. Maximum Elevation looks as though it may be a good approximation for canopy height. However, note that we can see inherent artifacts from LiDAR data within the canopies. We can “fix” that by applying a spatial median filter…
1. Choose Filter / Convolution and Morphology.
2. In the Convolutions Tool, choose Convolutions / Median.
3. Apply a kernel size of 5.
4. Click Apply to File and select DSM_025m.img. Name the output DSM_025m_median5.img.

When ENVI finishes filtering the data, the output will appear in the Available Bands List window and a new Display Window will automatically open. Let's compare "cross-sections" of the filtered and unfiltered rasters.
1. In the window with the original LiDAR raster, choose Tools / Profiles / X Profile.
2. Do the same for the image window that opened after the convolution operation.
3. From one of the image windows, choose Tools / Link / Link Displays.
This is a good way to make sure that your data is in the ballpark and that it does not contain any astronomical outliers. Note that the tree profiles now look much less noisy. Also note that as you move the cursor, the “Cursor Location / Value” window now shows heights for the two datasets and the two Display windows show identical locations. How do the values from the filtered and unfiltered images compare over tree tops? Use Enhance / [Zoom] Linear to easily find the tree tops. Generating DEM
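The median convolution applied above can be sketched in pure Python. Edge cells here use a truncated window, which is one of several possible edge policies and is an assumption, not necessarily what ENVI does:

```python
def median_filter(grid, k=5):
    """k x k median filter over a row-major grid: smooths single-cell
    spikes and pits while preserving edges better than a mean filter."""
    half = k // 2
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Collect the (possibly truncated, near edges) window values.
            window = [grid[rr][cc]
                      for rr in range(max(0, r - half), min(rows, r + half + 1))
                      for cc in range(max(0, c - half), min(cols, c + half + 1))]
            window.sort()
            out[r][c] = window[len(window) // 2]
    return out
```

A single 99 m spike in an otherwise flat 1 m surface disappears entirely after a 3 x 3 median pass, which is why the filtered tree profiles look much less noisy.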
1. Go to LiDAR / Rasterize LiDAR Data and choose clip_LiDAR_grd.las.
2. Choose the correct raster spacing and projection (same parameters as above).
3. Select the [2] Minimum Elevation and [9] Digital Surface Model outputs.
4. Click OK.
How does the DEM compare to the DSM from above? Compare the X profiles of the two rasters as described in the section above.

Generating CHM

We will now create the CHM. As mentioned above, the CHM is the difference between the DSM and the DEM (i.e., CHM = DSM – DEM). Matrix algebra is made really simple in ENVI, so this difference can be calculated in just a few steps. From the main ENVI menu, go to Basic Tools / Band Math. Type the following expression: "b1 – b2" (this stands for "band 1 minus band 2").
Click "Add to List." Select the equation from the list and click OK. While "B1 – [undefined]" is selected, choose the Maximum Elevation band from the DSM_025m_median5.img image. Then select B2 and choose the Minimum Elevation band from the DEM we just generated. Name the output "CHM_025m" and click OK. The CHM is calculated and will appear in the Available Bands List.

Take a look at the output. Change the Color Table to differentiate the elevations. Compare the CHM to the DSM and/or DEM by linking all of the Display windows and looking at the X profiles. Notice the actual data values underneath the cursor.

Extraction of trees

We will be using an IDL program written by the Popescu research group from Texas. This is an experimental program written for research purposes and is described in a few journal articles. Because the program has not been developed into commercial-grade software, there are still a number of caveats we must be aware of for it to function as intended. For example, the program expects ENVI-type rasters but does not recognize files with the .img extension. Instead, each image must be a set of two files: the image file with no extension, and a corresponding header file (e.g., "CHM_025m" and "CHM_025m.hdr"). In our case, I suggest the following steps:
1. In the results directory, copy and paste the "CHM_025m.img" file within the same folder.
2. Rename "Copy of CHM_025m.img" to "CHM_025m". The program will now be able to read this file.

Go to the TreeVaW directory and double-click init_treevaw.sav to start the program.
1. Choose "CHM_025m" for the Image File.
2. The output will be a comma-delimited ASCII file, so it would make sense to name it "CHM25_TVoutput.txt".
3. Go to Settings / Set Parameters.
4. You can experiment with the parameters to obtain optimal results. For now, let's choose a Min. Exp. Crown Width of 5. The parameters are explained in the Help file (Help / About TreeVaW).
5. Hit "Save" and then "Execute." Let's analyze the results in ArcGIS…

Analysis in ArcGIS

From the Start menu, open ArcGIS. In ArcGIS, add DEM_025m.img. Right-click the added layer and choose Properties. Go to the Symbology tab and set the following properties:
• Stretched Band: Maximum Elevation
• Stretch type: Minimum-Maximum
• Edit High/Low Values: HIGH: 1700, LOW: 1450
• Color Ramp: Precipitation (right-click and uncheck "Graphic")
• Invert: Yes

Now add CHM_025m.img. We can apply the same symbology by importing it from the previous layer: in the Symbology properties, click "Import" and select the DSM_025m.img layer. Change the Min/Max values (0–60 m works well).

Finally, we can add the tree detection results. Remember that they were output in ASCII format, so we will need to add their coordinates semi-manually:
1. Add XY Data…
2. Import the CHM25_TVoutput.txt file from TreeVaW.
3. Project it to UTM NAD83 Zone 11 N.
4. Export the imported layer as a shapefile.

Let's compare the results. How do the modeled trees compare to the CHM?

Q12. What kind of errors do you notice in the results?
Q13. Use the measuring tool to compare the actual tree crown widths to the attribute table from the modeled results.
Q14. Do you notice any interesting relationship between height and crown radius in the results?

Add the NAIP imagery (supplied in your data directory) and compare it to the CHM.

Q15. How do the two datasets compare / align?

Save your work (the next step might kill Arc). Add the clip_LiDAR.las file. Zoom in and analyze the point density (you can probably do this by visual inspection).

Q16. How does the point density compare in bare vs. vegetated areas? What causes the difference?