
Posted on 01-Feb-2016


DESCRIPTION

Radiometric Correction and Image Enhancement (PowerPoint presentation transcript): radiometric correction (noise removal, atmospheric correction, seasonal compensation), image reduction and magnification, and image enhancement (radiometric enhancement by contrast stretching; spatial enhancement by filtering and edge enhancement).

TRANSCRIPT

  • Radiometric Correction and Image Enhancement (outline)

    Radiometric correction: noise removal, atmospheric correction, seasonal compensation

    Image reduction and magnification

    Image enhancement: radiometric enhancement (contrast stretching) and spatial enhancement (filtering, edge enhancement)

  • Radiometric Correction: the repair or adjustment of pixel intensity (DN) values. Three types: noise removal, atmospheric correction, and seasonal compensation.

  • Noise Removal: noise is the result of sensor malfunction during the recording or transmission of data, and manifests itself as inaccurate grey-level readings or missing data. Line drop occurs when a detector fails to function for part of a scan, much as a camera flash briefly blinds your retina. The result is a line, or partial line, with higher DN values; it is fixed with a masked averaging (low pass) filter (see below). Striping occurs when a sensor goes out of adjustment (improper calibration), producing a striping pattern in which every nth line contains erroneous data. The problem can be fixed with de-striping algorithms.

  • [Figure: line drop — before and after repair]

  • Atmospheric Correction: corrects for atmospheric scattering and absorption effects and restores digital numbers (DNs) to ground reflectance values.

  • Seasonal Compensation: compensation for differences in sun elevation. In temporal studies with images acquired at different times of the year, it is important to adjust for differences in brightness associated with sun elevation. The adjustment is made by dividing each image pixel by the sine of the solar elevation for that scene:

    New DN = DN of pixel (x, y) / sin(sun elevation)

    [Figure: the same scene in winter and summer]
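The division above can be sketched in a few lines. This is a minimal illustration assuming NumPy; the function name `seasonal_compensation` is ours, not from the slides:

```python
import numpy as np

def seasonal_compensation(dn, sun_elevation_deg):
    """Divide each DN by the sine of the scene's solar elevation.

    Implements the slide's formula: new DN = DN / sin(sun elevation).
    `dn` is a NumPy array of pixel values; elevation is in degrees.
    """
    return dn / np.sin(np.radians(sun_elevation_deg))

# A winter scene (low sun) is brightened more than a summer scene (high sun):
winter = seasonal_compensation(np.array([100.0]), 30.0)  # divide by sin 30° = 0.5
summer = seasonal_compensation(np.array([100.0]), 60.0)  # divide by sin 60° ≈ 0.866
```

After compensation, the two scenes can be compared on a common radiometric footing.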

  • Image Reduction

    Repeated integer reductions build a pyramidal structure for fast display of an image.

  • [Figure: integer image reduction — Atlanta downtown area]
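Integer reduction simply keeps every m-th row and column. A minimal NumPy sketch (the helper name `integer_reduce` is illustrative):

```python
import numpy as np

def integer_reduce(image, m=2):
    """m-fold integer image reduction: keep every m-th row and column.

    Applying this repeatedly with m=2 yields the levels of the
    pyramidal structure used for fast display.
    """
    return image[::m, ::m]

image = np.arange(16).reshape(4, 4)  # toy 4 x 4 "scene" of DNs
half = integer_reduce(image, 2)      # 2 x 2 reduced image
```

Note that simple subsampling discards the skipped DNs outright; averaging neighborhoods before reduction would preserve more radiometric information.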

  • Image Magnification

    (Or Image expansion)

  • [Figures: successive image magnifications — Atlanta downtown]
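The simplest magnification is pixel replication, the inverse of integer reduction: each DN is duplicated into an n x n block. A hedged NumPy sketch (`integer_magnify` is our name for it):

```python
import numpy as np

def integer_magnify(image, n=2):
    """Magnify by pixel replication: each DN becomes an n x n block."""
    return np.repeat(np.repeat(image, n, axis=0), n, axis=1)

small = np.array([[1, 2],
                  [3, 4]])
big = integer_magnify(small, 2)  # 4 x 4; each value fills a 2 x 2 block
```

Replication preserves the original DNs exactly but gives a blocky result; interpolation (e.g., bilinear) trades that fidelity for smoother display.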

  • Contrast Stretching: most satellite sensors are designed to accommodate a wide range of illumination conditions, from dark boreal forest to highly reflective desert regions. However, the pixel values in most individual scenes occupy only a small part of that range, which results in low display contrast. A contrast enhancement expands the range of displayed pixel values and increases image contrast.

  • Linear Contrast Stretch: grey-level values are expanded uniformly to the full range of an 8-bit display device (0-255).
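The min-max linear stretch maps the scene's minimum DN to 0 and its maximum to 255. A minimal NumPy sketch (`linear_stretch` is an illustrative name):

```python
import numpy as np

def linear_stretch(image):
    """Min-max linear contrast stretch to the full 8-bit range 0-255."""
    lo, hi = image.min(), image.max()
    return ((image - lo) / (hi - lo) * 255.0).astype(np.uint8)

dull = np.array([[60, 70],
                 [80, 90]])          # DNs crowded into a narrow range
bright = linear_stretch(dull)        # expanded uniformly to 0..255
```

Because the mapping is uniform, a single extreme outlier can compress the rest of the scene — the motivation for the standard-deviation stretch below.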

  • Histogram Equalization StretchGrey level values are assigned to display levels on the basis of their frequency of occurrence.
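Assignment by frequency of occurrence means mapping each grey level through the normalized cumulative histogram (CDF), so frequent DNs are spread over more display levels. A sketch assuming NumPy and 8-bit input (`hist_equalize` is our name):

```python
import numpy as np

def hist_equalize(image, levels=256):
    """Histogram-equalization stretch: display levels assigned by frequency.

    Builds the cumulative histogram of the DNs, normalizes it to 0..1,
    and uses it as a lookup table into the 0..levels-1 display range.
    """
    hist = np.bincount(image.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to 0..1
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[image]

image = np.array([[10, 10, 10],
                  [10, 50, 200]], dtype=np.uint8)
equalized = hist_equalize(image)   # the frequent DN 10 is pushed well up the range
```

The dominant value (10, two thirds of the pixels) lands high in the display range, so the many pixels sharing it gain separation from the rest.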

  • Standard Deviation Contrast Stretch

  • Common Symmetric and Skewed Distributions in Remotely Sensed Data

  • [Figure: min-max contrast stretch vs. +1 standard deviation contrast stretch]

  • [Figure: contrast stretch of Charleston, SC Landsat Thematic Mapper band 4 data — original, minimum-maximum, and +1 standard deviation]
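The ±1 standard-deviation stretch shown in the Charleston comparison maps mean − σ .. mean + σ to 0..255 and clips everything outside, so a skewed tail does not compress the display. A hedged NumPy sketch (`stddev_stretch` is an illustrative name):

```python
import numpy as np

def stddev_stretch(image, n_sigma=1.0):
    """±n standard-deviation contrast stretch.

    Maps mean - n*sigma .. mean + n*sigma linearly to 0..255 and clips
    values outside that range, so a few outliers in a skewed histogram
    do not compress the displayed contrast as a min-max stretch would.
    """
    mean, sigma = image.mean(), image.std()
    lo, hi = mean - n_sigma * sigma, mean + n_sigma * sigma
    stretched = (image - lo) / (hi - lo) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

skewed = np.array([10.0, 12.0, 11.0, 13.0, 250.0])  # one bright outlier
out = stddev_stretch(skewed)                        # outlier clipped to 255
```

With a min-max stretch the four dark pixels would all map near 0; here they keep usable separation while the outlier saturates.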

  • Grey Level Thresholding: feature extraction based on a range (min, max) of grey-level values. Either visual inspection of the image DNs or a histogram can be used to determine the minimum and maximum values for the threshold.

    [Figure: TM band 4, and DNs 1-40 extracted from TM band 4]
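Thresholding reduces to a boolean mask over the chosen DN range. A minimal NumPy sketch using the slide's 1-40 range (the function name is ours):

```python
import numpy as np

def threshold_extract(band, dn_min, dn_max):
    """Grey-level thresholding: True where dn_min <= DN <= dn_max.

    Returns a boolean mask selecting the feature of interest
    (e.g., water in TM band 4, where it is very dark).
    """
    return (band >= dn_min) & (band <= dn_max)

band4 = np.array([[0, 25, 80],
                  [40, 41, 10]])
water = threshold_extract(band4, 1, 40)  # True only for DNs 1..40
```

The mask can then be used to display the extracted feature or to zero out everything else.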

  • Spatial Enhancement: modification of pixel values based on the values of surrounding pixels, used to adjust the spatial frequency content of an image.

  • Spatial Frequency: the difference between the highest and lowest values of a contiguous set of pixels, or the number of changes in brightness value per unit of distance for any particular part of an image (Jensen, 1986).

    Zero: a radiometrically flat image in which every pixel has the same value (DN).
    Low: an image consisting of a smoothly varying grey-scale across the image.
    High: an image consisting of a greatly varying grey-scale across the image.
    Highest: an image consisting of a checkerboard of black and white pixels.

  • Spatial Filtering: the altering of pixel values based upon spatial characteristics for the purpose of image enhancement. This process is also known as convolution filtering. Two broad classes: low pass filters and high pass filters.

  • Image Filtering Kernel (Neighborhood): a matrix, defined in pixel dimensions, which moves over an image grid one pixel at a time, performing logical, mathematical, or algebraic functions designed to change the radiometric values (DNs) in an image for some particular purpose.

    [Figure: a 3 x 3 filter kernel over a 9-pixel neighborhood, highlighting the pixel to be filtered]

    The pixels used in the filter function are shown in blue and black. The filter moves left to right and top to bottom across the image in one-pixel increments.

  • Low Pass Filtering: designed to emphasize low spatial frequency. Useful for showing long periodic fluctuations (trends). Examples: average, median, and mode filters.

    3 x 3 averaging filter: all the pixels in the neighborhood are weighted 1 (their original values), added together, and divided by the number of pixels in the neighborhood (9). The center pixel's DN value is changed to that result.

    Before filter:          After filter:
    100  25 200             100  25 200
      1   1 150               1  81 150
    100   2 150             100   2 150

    2 + 1 + 200 + 1 + 100 + 150 + 100 + 150 + 25 = 729
    729 / 9 = 81
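The averaging arithmetic for that single neighborhood can be checked directly (a NumPy sketch of the slide's worked example):

```python
import numpy as np

# The 3 x 3 neighborhood from the slide; the pixel being filtered
# is the center, with DN = 1.
window = np.array([[100,  25, 200],
                   [  1,   1, 150],
                   [100,   2, 150]])

# Averaging (low pass) filter: every weight is 1, so the new center
# value is simply the sum of the nine DNs divided by 9.
new_center = window.sum() // window.size   # 729 // 9
```

In a full image, this computation is repeated as the kernel slides over every pixel.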

  • High Pass Filtering: designed to emphasize high spatial frequency by emphasizing abrupt local changes in grey-level values between pixels. Example: edge detection filters.

    3 x 3 edge filter: the weights in the neighborhood are summed (ΣW). Next, each pixel DN is multiplied by its weight and the products are summed (ΣWDN). Finally, ΣWDN is divided by ΣW to find the new value for the center pixel: V = ΣWDN / ΣW (where V = output pixel value).

    Before filter:          After filter:
    50 50 50                50  50 50
    50 75 50                50 100 50
    50 50 50                50  50 50

    ΣW = (-1) + (-1) + (-1) + (-1) + (16) + (-1) + (-1) + (-1) + (-1) = 8
    ΣWDN = (-50) + (-50) + (-50) + (-50) + 1200 + (-50) + (-50) + (-50) + (-50) = 800
    ΣWDN / ΣW = 800 / 8 = 100
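The same worked example in NumPy, making the weighted sum explicit:

```python
import numpy as np

# Neighborhood from the slide: uniform 50s with a brighter center of 75.
window = np.array([[50, 50, 50],
                   [50, 75, 50],
                   [50, 50, 50]])

# Edge (high pass) kernel from the slide: -1 everywhere, 16 at the center.
kernel = np.array([[-1, -1, -1],
                   [-1, 16, -1],
                   [-1, -1, -1]])

sum_w = kernel.sum()                # ΣW  = 16 - 8 = 8
sum_wdn = (kernel * window).sum()   # ΣWDN = 16*75 - 8*50 = 800
new_center = sum_wdn // sum_w       # V = ΣWDN / ΣW = 100
```

The mildly brighter center (75 vs. 50) is exaggerated to 100, which is exactly the edge-sharpening effect the filter is designed for.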

  • Spatial Filtering to Enhance Low- and High-Frequency Detail and Edges: a characteristic of remotely sensed images is a parameter called spatial frequency, defined as the number of changes in brightness value per unit distance for any particular part of an image.

  • Spatial frequency in remotely sensed imagery may be enhanced or subdued using spatial convolution filtering, based primarily on the use of convolution masks.

  • Spatial Convolution Filtering: a linear spatial filter is a filter for which the brightness value BV(i,j,out) at location (i, j) in the output image is a function of some weighted average (linear combination) of brightness values located in a particular spatial pattern around the (i, j) location in the input image. The process of evaluating the weighted neighboring pixel values is called convolution filtering.

  • The size of the neighborhood convolution mask, or kernel (n), is usually 3 x 3, 5 x 5, 7 x 7, or 9 x 9. We will constrain our discussion to 3 x 3 convolution masks with nine coefficients, ci, defined at the following locations:

                    c1 c2 c3
    Mask template = c4 c5 c6
                    c7 c8 c9

    For example, the low-frequency (averaging) mask sets every coefficient to 1:

    1 1 1
    1 1 1
    1 1 1

  • The coefficients, ci, in the mask are multiplied by the corresponding individual brightness values (BVi) in the input image:

                    c1 x BV1   c2 x BV2   c3 x BV3
    Mask template = c4 x BV4   c5 x BV5   c6 x BV6
                    c7 x BV7   c8 x BV8   c9 x BV9

    The primary input pixel under investigation at any one time is BV5.

  • Various Convolution Mask Kernels

  • Spatial Convolution Filtering: Low Frequency Filter

    1 1 1
    1 1 1
    1 1 1

  • Low Pass Filter

  • Spatial Convolution Filtering: Minimum or Maximum FiltersOperating on one pixel at a time, these filters examine the brightness values of adjacent pixels in a user-specified radius (e.g., 3 x 3 pixels) and replace the brightness value of the current pixel with the minimum or maximum brightness value encountered, respectively.
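The minimum/maximum operation described above can be sketched directly (a NumPy illustration; borders are left unchanged here for brevity, and `minmax_filter` is our name):

```python
import numpy as np

def minmax_filter(image, use_max=False):
    """Minimum (or maximum) filter over 3 x 3 neighborhoods.

    Each interior pixel is replaced by the smallest (or largest) DN
    found in the 3 x 3 window around it.
    """
    out = image.copy()
    pick = np.max if use_max else np.min
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            out[i, j] = pick(image[i - 1:i + 2, j - 1:j + 2])
    return out

image = np.array([[1, 2, 3],
                  [4, 5, 6],
                  [7, 8, 9]])
low = minmax_filter(image)                 # center becomes the window minimum
high = minmax_filter(image, use_max=True)  # center becomes the window maximum
```

Unlike the averaging filter, these are nonlinear: no weighted sum reproduces them, which is why they are described separately from the convolution masks.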

  • Spatial Convolution Filtering: High Frequency Filter: high-pass filtering is applied to imagery to remove the slowly varying components and enhance the high-frequency local variations. One high-frequency filter (HFF5,out) is computed by subtracting the output of the low-frequency filter (LFF5,out) from twice the value of the original central pixel, BV5:

    HFF5,out = (2 x BV5) - LFF5,out
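Applied to the low-pass neighborhood worked earlier (center 1, average 81), the relation gives a strongly negative response at that dark pixel — a NumPy check of the formula:

```python
import numpy as np

# The same 3 x 3 neighborhood used in the low-pass example.
window = np.array([[100,  25, 200],
                   [  1,   1, 150],
                   [100,   2, 150]])

bv5 = window[1, 1]                 # original central pixel value, BV5 = 1
lff = window.sum() / window.size   # low-frequency (average) output, 729 / 9
hff = 2 * bv5 - lff                # HFF5,out = 2*BV5 - LFF5,out
```

The negative result marks a pixel much darker than its surroundings; such values are typically rescaled or offset before display.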

  • Spatial Convolution Filtering: Unequal-Weighted Smoothing Filters

    0.25 0.50 0.25        1 1 1
    0.50 1.00 0.50        1 2 1
    0.25 0.50 0.25        1 1 1

  • Spatial Convolution Filtering: Edge EnhancementFor many remote sensing Earth science applications, the most valuable information that may be derived from an image is contained in the edges surrounding various objects of interest. Edge enhancement delineates these edges. Edges may be enhanced using either linear or nonlinear edge enhancement techniques.

  • Spatial Convolution Filtering: Directional First-Difference Linear Edge Enhancement: the result of the subtraction can be either negative or positive, so a constant K (usually 127) is added to make all values positive and centered between 0 and 255.

  • Spatial Convolution Filtering: High-Pass Filters that Sharpen Edges

    -1 -1 -1         1 -2  1
    -1  9 -1        -2  5 -2
    -1 -1 -1         1 -2  1

  • Spatial Convolution Filtering: Edge Enhancement Using Laplacian Convolution MasksThe Laplacian is a second derivative (as opposed to the gradient which is a first derivative) and is invariant to rotation, meaning that it is insensitive to the direction in which the discontinuities (point, line, and edges) run.

  • Spatial Convolution Filtering: Laplacian Convolution Masks

     0 -1  0        -1 -1 -1         1 -2  1
    -1  4 -1        -1  8 -1        -2  4 -2
     0 -1  0        -1 -1 -1         1 -2  1
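The rotation-invariant behaviour of the Laplacian can be verified on two tiny patches: a flat area (zero response) and a point discontinuity (strong response regardless of direction). A NumPy sketch; `convolve3x3` is an illustrative helper that filters interior pixels only:

```python
import numpy as np

def convolve3x3(image, kernel):
    """Apply a 3 x 3 convolution mask to the interior pixels of an image."""
    out = np.zeros_like(image, dtype=float)
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            out[i, j] = (kernel * image[i - 1:i + 2, j - 1:j + 2]).sum()
    return out

# The 4-neighbor Laplacian mask from the slide.
laplacian = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]])

flat = np.full((3, 3), 50)                 # radiometrically flat: response 0
spike = np.array([[50, 50, 50],
                  [50, 90, 50],
                  [50, 50, 50]])           # point discontinuity: strong response
```

Because the mask is symmetric, the response to the spike is the same no matter which way the patch is rotated, unlike the directional first-difference filters.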
