What is remote sensing?

obtaining info about something without physically touching it; getting info about things we can't physically reach; an art & a science



Wikipedia: the acquisition of information about an object/phenomenon without making physical contact with the object, in contrast to on-site observation; using aerial sensor technologies to detect and classify objects on Earth

In what way is remote sensing a science? In what way is it an art?

art because different people can interpret the same image differently; there is a level of user-driven interpretation and input that differentiates one analyst from another; interpretations differ not just because of individual preference, but because that preference has been developed over that individual's entire lifetime;





science bc of understanding physical targets and energy/wavelengths (example: if interested in objects with chlorophyll, you can develop a sensor that targets that, and there is a science behind the engineering of that sensor and behind atmospheric modelling); held together by rules and the scientific method; uses math & logic, physical sciences, biological sciences, social sciences; 4 developmental stages of science (RS is in stage 2, the rise of the field)

What are in situ measurements and how are they used in remote sensing?

in situ = on the ground,



used to compare images captured from air vs what you see on the ground;



ground reference data; used to train classifications (background knowledge to go back and look at the recorded image and say that's exactly where that road was, or tree, etc.)

How can in situ measurements be collected?

using instruments, transducers (a device that converts variations in a physical quantity, such as pressure or brightness, into an electrical signal, or vice versa);



thermometers, terrestrial laser scanner, ocular estimation of vegetation cover using a Daubenmire frame, spectrometers, etc.




should be calibrated in two ways:


-geometrically and radiometrically (to % reflectance) so that RS data can be obtained on diff dates and still be comparable to one another


-calibrated with what is on the ground in terms of biophysical or cultural (land cover) characteristics

What types of bias might be introduced during in situ data collection?

user bias (bias in interpreting the data collected), subjectivity; user error and instrument error; uncalibrated instruments, etc.


Why is it a misnomer to refer to in situ data as ground truth data instead of ground reference data?


Because there are many biases and errors that can happen on the ground; it may not be perfect fact, but should rather be used as a reference point for comparison.



recognizing that in situ data may contain error

What are some of the limitations and advantages associated with remote sensing?

Advantages: studying large areas simultaneously; can save money, access to areas we can’t physically get to in person/unobtrusive; see in different spectral wavelengths, not just what the human eye can see; can remove user bias from on-ground sampling; provide fundamental biophysical information; 3 D estimations, multitemporal observations




Limitations: really expensive; for areas that are really dynamic, we can't get much information from just a few images that cost a lot of money; atmospheric interference & noise can alter the image; easy to oversell (you can convince people RS provides the answers, but when it comes down to it, it may not provide all the answers you need); good for some things, but not all things; little depth penetration (only captures the surface, not what's going on inside the soil, for example); active sensors (like LiDAR, RADAR, and SONAR) can be obtrusive; mechanical error (can become uncalibrated)


Describe in your own words, using examples, the remote sensing process.


think about imagery



hypothesis, in situ data collection, remote sensor data collected passively or actively, in situ and RS data processed/converted, accuracy assessment and analysis



statement of the problem: form hypothesis, select appropriate logic (inductive/deductive/technological), select appropriate model (deterministic, empirical, stochastic, etc.)



data collection- in situ, collateral data (maps), RS



data-to-information conversion: Analog/visual image processing; digital image processing, test hypothesis



information presentation: image metadata (sources, processing lineage), accuracy assessment, analog->digital, stats, graphs

Remote sensing industry trends

aging workforce, which means younger workers are needed


definitely growing quickly;



a growing need for higher spatial resolution data; a growing need for hyperspectral data;



collecting more and more “real time” data;



greatest shortfall- multilingual and math skills;



aerial platforms account for the most data collected




finer resolution data are needed more than coarse

Spatial, spectral, temporal and radiometric resolution


Resolution: resolving power, the ability of an optical system to distinguish between signals that are spatially near or spectrally similar



Spectral: the number and size of spectral regions the sensor records data in (blue, green, red, infrared, microwave)



Spatial: size of field of view



Temporal: how often sensors acquire data



Radiometric: the sensitivity of detectors to small diffs in electromagnetic energy

What are the four components of a remote sensing system?

target, energy source, sensor, transmission path

The electromagnetic spectrum (EMS)

know wavelengths & colors!! just like quiz
A continuum of energy, travels at speed of light
the range of all possible frequencies of electromagnetic radiation
within the visible: violet/blue (shortest) --> red (longest)
shortest wls: gamma rays (highest frequency, most energetic)
longest wls: microwaves/radio
order (short to long): gamma, X-ray, UV, visible, infrared, microwaves, radio



400 nm = violet
475 nm = blue
510 nm = green
570 nm = yellow
590 nm = orange
650 nm = red


Active versus passive remote sensing:

Passive: radiation that emanates naturally from an object; ex: sun, Earth's surface
generally needs daylight
hyperspectral, multispectral



Active: system that emits energy; provides its own energy source; can be used day or night; requires lots of energy; ex: RADAR, LiDAR, Synthetic Aperture Radar (SAR)

Regions of the electromagnetic spectrum used in remote sensing (UV, Visible, NIR, SWIR):

UV: .3-.4um


VIS:.4-.7um;


NIR: .7-1.3 um; dominated by reflected solar energy (not emitted energy);


SWIR: 1.3-3um

Conversion of wavelengths from nanometers (nm) to micrometers (µm) and vice versa

1 nm = .001 um


so, 1 um = 1000 nm


10 nm = .01 um


100 nm = .1 um


1000 nm = 1 um


1500 nm = 1.5 um


etc...
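A minimal Python sketch of this conversion (not from the course; the example wavelengths are just illustrations):

# 1 um = 1000 nm, so divide nm by 1000 (and multiply to go back).
def nm_to_um(nm):
    return nm / 1000.0

def um_to_nm(um):
    return um * 1000.0

print(nm_to_um(1550))   # 1.55 um (SWIR)
print(nm_to_um(532))    # 0.532 um (visible green)
print(um_to_nm(0.35))   # 350 nm (UV)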

The relationship between frequency and wavelengths:

inverse relationship
frequency is inversely proportional to wavelength
so, the shorter the wavelength, the higher the frequency (and vice versa)
frequency-# wls that pass a point per unit of time
wavelength- distance from any point on one cycle/wave to the same position on the next one
a wave that completes one cycle every second has a frequency of one Hz
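A short Python sketch of the inverse relationship via c = wavelength x frequency (the 0.55 um and 1 cm wavelengths are just example values):

C = 3.0e8  # speed of light, m/s

def frequency_hz(wavelength_m):
    # frequency and wavelength are inversely related: frequency = c / wavelength
    return C / wavelength_m

print(frequency_hz(0.55e-6))   # green light, ~5.5e14 Hz
print(frequency_hz(0.01))      # 1 cm microwave, 3.0e10 Hz (much lower)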

How are photons and wavelengths related to the detection of electromagnetic radiation?

electromagnetic energy can only be detected as it interacts with matter



a photodetector: photons interact with it and produce an electrical signal that varies in strength, proportional to the # of photons



measured by two fluctuating fields: electric & magnetic



wave concept- explains how electromagnetic energy moves, but energy can only be detected as it interacts with matter



electromagnetic energy behaves as if it consists of many individual bodies (photons) that have particle-like properties such as energy and momentum



the interaction of photons with a light-sensitive photodetector produces an electrical signal that varies in strength, proportional to the number of photons


What is the relationship b/w an object’s temperature, the amount of NRG exiting the object, and the dominant wavelength at which the NRG is emitted?


blackbody: a theoretical substance that absorbs and radiates energy at the max possible rate per unit area at each wavelength for a given temp; the perfect absorber and emitter



total emitted radiation is proportional to the fourth power of its absolute temp (Stefan-Boltzmann Law)



greater temp = greater amount of radiant energy emitted from object



as temp increases its dominant wl gets shorter


Wien's Displacement Law determines a blackbody's dominant wl
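A minimal Python sketch of both laws (standard constants; the 6000 K and 300 K temperatures are rough stand-ins for the sun and Earth):

SIGMA = 5.6697e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_K = 2898.0     # Wien's constant, um*K

def total_exitance(temp_k):
    # Stefan-Boltzmann: total emitted radiation rises with the 4th power of temp
    return SIGMA * temp_k ** 4

def dominant_wavelength_um(temp_k):
    # Wien's displacement law: hotter objects peak at shorter wavelengths
    return WIEN_K / temp_k

print(dominant_wavelength_um(6000))   # sun: ~0.48 um (visible)
print(dominant_wavelength_um(300))    # Earth: ~9.7 um (thermal IR)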

How does electromagnetic energy interact with the atmosphere and earth?


Atmosphere: Scattering (Rayleigh, Mie, Non-Selective)


Earth: Reflectance, absorption, transmission

Atmosphere: Scattering (Rayleigh, Mie, Non-Selective)


Rayleigh - wavelength dependent, mostly in upper 4.5 km of atmosphere


Scattering by particles that are smaller than the wavelength of visible and near infrared radiation...more scattering at smaller wavelengths....blue sky and red sunsets



Mie: wavelength dependent, longer wl than Rayleigh, lower 4.5 km of atmosphere


Particles in the atmosphere are ~equal in size to the wavelength of the scattered radiation... influences longer wavelengths...dust, pollen, smoke...smog is reddish brown



Non-Selective: lower portions of atmosphere


all wavelengths equally affected...particles are larger than the wavelengths...most common...water droplets in clouds...gray haze


Earth: Reflectance, absorption, transmission


Absorption- atmosphere is preventing the transmission of radiation


Ozone (O3), oxygen (O2), water vapor (H2O), and carbon dioxide (CO2) are responsible for most of the solar radiation absorption that occurs


absorbed and re-radiated at longer wavelengths


atmospheric windows



Reflectance- occurs when a ray of light is re-directed as it strikes a nontransparent surface


The nature of the reflection depends on sizes of surface irregularities in relation to the wavelength of the radiation considered


- Specular Reflection: reflection off a smooth object (e.g., a smooth body of water, mirrors)


- Diffuse Reflection: reflection off rough objects (clothes, roadways, etc.)



Transmission: occurs when radiation is neither reflected nor absorbed, but passes through a substance without significant weakening.


How are remote sensing image data organized and formatted?


Organization:
Header files- samples (columns), lines (rows), bands, data type (byte, integer- whole numbers, floating point- decimals), data order


Formats: ways of storing 24-bit images (RGB images)



BSQ (Band Sequential Format): convenient for color image output bc three bands can be assigned to RGB; requires that all data for a single band covering the entire scene be written as one file. Thus, if an analyst wanted to extract the area in the center of a scene in four bands, it would be necessary to read into this location in four separate files to extract the desired info; a method for encoding the actual pixel values of an image in a file; a very simple format; favors spatial information



BIP (Band Interleaved by Pixel): convenient for subsets and classification (max likelihood) bc multi-band data are required pixel by pixel for the multi-variable processing; one of the earliest digital formats used for satellite data; treats the pixel as the separate storage unit; brightness values for each pixel are stored one after another; practical to use if all bands in an image are to be used; favors spectral info



BIL (Band Interleaved by Line): often used in processes that use the whole image data (as is BSQ); stored by lines, each line is represented in all bands before the next line is recorded; similar to BIP, it is useful if all bands of the image are to be used in the analysis. If some bands are not of interest, the format is inefficient when the data are on tape, since it is necessary to read serially past unwanted data; easy to access both spatial and spectral info
none of these formats compress data
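A small numpy sketch (a made-up 3-band, 2x2 cube) showing how BSQ, BIP, and BIL order the same pixel values; only the interleave differs:

import numpy as np

bands, lines, samples = 3, 2, 2
cube = np.arange(bands * lines * samples).reshape(bands, lines, samples)

bsq = cube.flatten()                      # whole band 1, then band 2, then band 3
bip = cube.transpose(1, 2, 0).flatten()   # all bands for pixel 1, then pixel 2, ...
bil = cube.transpose(1, 0, 2).flatten()   # line 1 in every band, then line 2, ...

print("BSQ:", bsq)
print("BIP:", bip)
print("BIL:", bil)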

What is full width half max?

a measure of the width of a distribution, defines band center



(bandwidth: width of the filter/band at 50% peak response); a parameter commonly used to describe the width of a “bump” on a curve or function. it is given by the distance between points on the curve at which the function reaches half its max value



remember that graph
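A quick numpy sketch of FWHM on an assumed Gaussian band response (the 550 nm center and 10 nm sigma are invented):

import numpy as np

wavelengths = np.linspace(500, 600, 1001)   # nm, 0.1 nm steps
center, sigma = 550.0, 10.0
response = np.exp(-0.5 * ((wavelengths - center) / sigma) ** 2)

# FWHM = width of the band where the response is at least half of its peak
above_half = wavelengths[response >= 0.5 * response.max()]
fwhm = above_half.max() - above_half.min()
print(fwhm)   # ~23.5 nm (analytically 2.355 * sigma)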


What is radiance and how is it measured:

radiant energy- total energy radiated in all directions towards or away from a surface (Joules);



radiance: the total energy in a certain direction per unit area and solid angle. describes what the sensor measures; the amount of radiation coming from an area



radiant flux- energy per unit time in all directions (J/s, i.e., W); the power received, emitted, reflected, or transmitted by a system at a given time in the form of EM radiation



radiance of a surface in a given direction is the radiant flux emitted, reflected, transmitted, or received by that surface per unit solid angle per unit projected area; a directional quantity; indicates how much of the power emitted, reflected, transmitted, or received by a surface will be received by an optical system looking at that surface from some angle of view; conserved along lines through empty space; related concept: albedo



spectral radiance- a function of frequency (Hz);



radiant flux density emitted from a unit area, per unit solid angle, per unit wavelength (W / (m^2 · sr · um))




don't need to know formulas but know in what unit and what those units mean



measured in Joules, Watts


expressed in per unit wavelength



steradian--- this is what she wants


defines the angle over which reflectance from the earth's surface is measured


unit for a solid angle of a sphere



the concept of radiance leaving a specific projected source area on the ground, in a specific direction, and within a specific solid angle. this is the most precise radiometric measurement used in RS


Supervised and unsupervised classifications (know what they are & examples of each):


Supervised: you define the spectral class- user defined training pixels, pixels where both the spectral values and the class is known
works well with good ground data
excellent feel for data and classification
works well for large, homogenous areas
fairly quick
BUT
% of total class variance goes unknown
areas with spectral indistinctness and variation are challenging (hard to identify endmembers)
examples: min. distance, max. likelihood, parallelepiped


spectral signatures are developed from specified locations in the image (training sites) (defined by user)




Unsupervised: a program defines the spectral class- no extraneous data is used, classes are determined purely on difference in spectral values;


basically, like-pixels are clumped together and automatically labeled as the same classification- so two green things would be labeled as vegetation when one might be a green car. or vice versa- too many classifications in heterogeneous pictures-- labeling too many different objects and the user will have to manually go through and combine classifications to make the map more meaningful.


isodata, k-means
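A rough Python sketch of the contrast, using scikit-learn; the pixel values and training labels are random stand-ins, not real imagery:

import numpy as np
from sklearn.neighbors import NearestCentroid   # minimum distance to class means
from sklearn.cluster import KMeans

pixels = np.random.rand(1000, 4)    # 1000 pixels x 4 bands

# Supervised: user-defined training pixels with known classes.
train_pixels = np.random.rand(100, 4)
train_labels = np.random.randint(0, 3, size=100)
supervised_map = NearestCentroid().fit(train_pixels, train_labels).predict(pixels)

# Unsupervised: clusters come purely from spectral similarity; the analyst
# labels (or merges) the clusters afterwards, e.g. "vegetation" vs "water".
unsupervised_map = KMeans(n_clusters=3, n_init=10).fit_predict(pixels)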

Mixed pixel problem

a pixel whose digital number represents the average of several spectral classes within the area that it covers on the ground, each emitted or reflected by a different type of material. this is common along edges of features;
think transition zone from water to ground

Hard versus fuzzy classifications:


hard- each pixel belongs to the one class it most clearly resembles, sharper; best for homogeneous areas (croplands, water bodies); dependent upon spatial scale and the variance of classes; low classification accuracy in heterogeneous areas; ISODATA, parallelepiped, centroid/k-means, max likelihood, neural networks



fuzzy- softer, each pixel can belong to more than one class and has membership grades for each class; heterogeneous areas relative to spatial scale (residential areas, mixed forests), gradients (forest->savanna, clear->turbid water); linear mixture modeling


Training versus validation data


Training data- areas of an image that have a known identity; identity can be based on: analyst knowledge, field observations/data, maps and digital imagery
Characteristics- need to choose at least 100 pixels for each category; homogeneity of the training area (ideally, but difficult to achieve)
used to discover correlations/potential relationships between data
for relatively homogenous data



validation data: an independent set of reference data, withheld from training, used to check the accuracy of the classification; keeping it separate from the training data avoids an overly optimistic accuracy estimate (like in your GIS project)

Pixel unmixing and how it relates to hyperspectral data


pixel unmixing is commonly performed by employing a least squares (LS) error criterion, making it sensitive to outliers. As an alternative, the least median of squares (LMedS) method has been proposed; it is not only extremely robust but also efficient


hyperspectral = imaging spectrometers; hyperspectral data are often used to determine what materials are present in a scene. materials of interest could include roadways, water bodies, etc. Trivially, each pixel of a hyperspectral image could be compared to a material database to determine the type of material making up the pixel. However, many hyperspectral imaging platforms have low resolution (>5 m per pixel), causing each pixel to be a mixture of several materials. The process of unmixing one of these 'mixed' pixels is called hyperspectral image unmixing or simply hyperspectral unmixing.



other supervised classifiers: linear spectral unmixing
endmembers (from a spectral library, image derived or field) are used to “unmix” each pixel into its constituents
extract pure spectra from complex composites of spectra that form the image
assumes linear mixture of the classes in each pixel
requirements- need all pure endmembers
typically used in hyperspectral data but also multispectral
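A minimal numpy sketch of linear spectral unmixing by least squares (the endmember spectra and the 0.6/0.3/0.1 fractions are invented for illustration):

import numpy as np

# Columns = endmember spectra (e.g. vegetation, soil, water) over 5 bands.
E = np.array([[0.05, 0.20, 0.02],
              [0.08, 0.25, 0.03],
              [0.45, 0.30, 0.02],
              [0.50, 0.35, 0.01],
              [0.30, 0.40, 0.01]])

# A mixed pixel assumed to be 60% vegetation, 30% soil, 10% water.
mixed_pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]

# Solve pixel = E @ fractions in a least-squares sense.
fractions, *_ = np.linalg.lstsq(E, mixed_pixel, rcond=None)
print(fractions)   # ~[0.6, 0.3, 0.1]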


What is map accuracy assessment?

compare RS derived classification map with reference data (that is believed to accurately reflect the true land cover)
be aware of temporal changes- can alter accuracy assessment
should not be based on the training pixels (bc they are not randomly selected and the classification is not independent of the training pixels; using them would result in an overly optimistic accuracy assessment)
accuracy assessments are summarized using a confusion matrix


Analog versus digital


analog- aerial photographs; any continuous signal for which the time-varying feature (variable) of the signal is a representation of some other time-varying quantity; uses some property of the medium to convey a signal; any information can be conveyed by this; theoretically has infinite resolution
smooth & continuous


more accurate than digital, but suffers more atmospheric interference



digital- a physical signal that is a representation of a sequence of discrete values
any continuous time waveform signal used in digital communication, representing a bit structure- 0s and 1s
can be sent long distances and suffers less interference than analog
radio signal


digital is more easily manipulated



analog is waves of energy (lower frequency than your household microwave). The signal alternates between up and down, but the frequencies can vary as to how high up and down the signal goes, and those differing frequencies give the distinct picture you view. When an analog signal is having problems, you will see a fuzzy picture because the microwaves are not as high or low as they should be.

Digital information is broadcast in 1s and 0s. The signal is much simpler in the fact that it is either on (1) or off (0). Digital TV is made up of thousands upon thousands of blocks of color called PIXELS, and the string of 1s and 0s designated to each pixel determine the color. When a digital broadcast is having problems you will see blocks of black color or scrambled colors called PIXELIZATION because the 1s and 0s are either all 0s or out of order.


wave concept


explains how electromagnetic energy moves, but must interact with matter for it to be detected

atmospheric windows


what doesn't get absorbed; wavelengths that are easily transmitted

incident radiation

incident radiation = absorbed + reflected + transmitted radiation; law of conservation of energy


emittance energy

mainly derived from shortwave energy from the sun that has been absorbed, then re-radiated at longer wavelengths; strongest in the far infrared region; reveals info about thermal properties of materials

How do some sensors we discussed in class compare in terms of these resolutions?

Landsat has lower SPECTRAL resolution (wider bandwidths) than the benchtop field spectrometer, which has the highest spectral resolution. MODIS has greater TEMPORAL resolution than Landsat (MODIS is daily and Landsat is every 16 days). Landsat has higher SPATIAL resolution than MODIS (Landsat has 30 m pixels and MODIS has 250 m pixels). Landsat 8 has greater RADIOMETRIC resolution (12-bit) than Landsat 7 and 5, which record 8-bit intensity information.



in short: Landsat has lower spectral res and wider bandwidths

How would you plot band statistics and apply a band stretch?


know what it means to generate a histogram: a representation of a frequency distribution by means of rectangles whose widths represent class intervals and whose areas are proportional to the corresponding frequencies; a graphical representation of the pixels exposed in your image



histograms allow us to analyze extremely large datasets by reducing them to a single graph that can show primary, secondary, and tertiary peaks in data as well as give a visual representation of the statistical significance of those peaks



you can connect the shape of a histogram with the mean and median of statistical data that you use to create it-- the relationship between mean and median can help predict shape of histogram



how to plot band stats: tools->Stats->compute stats->select input file->hit ok->choose histogram



stretching: altering the distribution of digital numbers


changes the contrast
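A small Python sketch of both steps, assuming numpy and matplotlib; the band here is random stand-in data rather than a real image:

import numpy as np
import matplotlib.pyplot as plt

band = np.random.normal(100, 20, size=(500, 500))   # fake digital numbers

# Histogram of the band's digital numbers
plt.hist(band.ravel(), bins=256)
plt.xlabel("Digital number")
plt.ylabel("Pixel count")
plt.show()

# Simple 2% linear stretch: redistributes the DNs to boost contrast
low, high = np.percentile(band, (2, 98))
stretched = np.clip((band - low) / (high - low), 0, 1)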


How would you apply a filter?

Select a filter to apply-->click convolutions (this is where most kernels can be found)-->edit kernel matrix and save-->restore kernel-->select input file for convolution-->save file-->hit ok and filtering process begins!



filtering improves the readability of the image or extracts desired info from the image
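Outside of ENVI, the same idea can be sketched in Python with SciPy (the band array below is stand-in data, and the 3x3 mean kernel is just one example of a kernel):

import numpy as np
from scipy.ndimage import convolve

band = np.random.rand(200, 200)
kernel = np.full((3, 3), 1.0 / 9.0)                 # low-pass (smoothing) kernel
filtered = convolve(band, kernel, mode="nearest")   # edge/sharpening kernels work the same way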




error matrices

consists of an n x n array/matrix where n represents the number of categories



y-axis is labeled with categories from the image data
x-axis is labeled with the same n categories but from the reference (ground) data



values in the matrix represent number or % of pixels/polygons



the probability (%) that the classifier has labeled an image pixel into its ground truth class (e.g., grass); it is the probability of a reference pixel being correctly classified

user accuracy

user accuracy: determines the reliability of the map as a predictive device- it tells the user the chances that, when going to the area on the ground, the map class will be correct
user’s point of view (think of the accuracy relative to the classified map you use)
commission error = 100% - UA
ex: when the analyst assigned a crop area on the ground to the forest category on the map (the analyst committed an error by assigning that region to the wrong category; the forest class received pixels it wasn't supposed to)

producer's accuracy

tells the analyst that, for an actual landscape represented on the map, a certain amount of that landscape was correctly classified
examined from the analyst's point of view (think of the accuracy relative to the reference data)
omission error = 100% - PA
ex: assignment of forest on the ground to the crop category (forested area on the ground was assigned a crop value on the classified image; the forest class did not receive pixels that it was supposed to)
*a commission error in one category will result as an omission error in another category


overall accuracy

the total classification accuracy
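A compact numpy sketch tying these together (the error matrix values are made up; rows = classified map, columns = reference data, as described above):

import numpy as np

matrix = np.array([[50,  5,  2],
                   [ 4, 40,  6],
                   [ 1,  3, 45]])

overall = np.trace(matrix) / matrix.sum()            # overall accuracy
users = np.diag(matrix) / matrix.sum(axis=1)         # user's accuracy per map class (commission = 100% - UA)
producers = np.diag(matrix) / matrix.sum(axis=0)     # producer's accuracy per reference class (omission = 100% - PA)

print(overall, users, producers)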

How would you calculate scale and building height in an analog photo?

S = Dp / Drw
S = scale of airphoto (nominal only)
Dp = distance between objects on the photo
Drw = Distance between objects in the real world



still need height


(tan(angle) x distance) + eye height



i guess...
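A small Python sketch of both calculations (all of the distances, the 30-degree angle, and the 1.7 m eye height are made-up example numbers):

import math

# Nominal airphoto scale: S = photo distance / real-world distance
photo_distance_cm = 5.0
ground_distance_cm = 250_000.0               # 2.5 km expressed in cm
scale = photo_distance_cm / ground_distance_cm
print(f"scale = 1:{1 / scale:,.0f}")         # 1:50,000

# Height from an observation angle: (tan(angle) x distance) + eye height
angle_deg, distance_m, eye_height_m = 30.0, 40.0, 1.7
height_m = math.tan(math.radians(angle_deg)) * distance_m + eye_height_m
print(round(height_m, 1))                    # ~24.8 m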

digital number

the generic term for pixel values



describes pixel values that have not yet been calibrated into meaningful units


classifications

based on spectral class- group of n- dimensional vectors of a feature, one pixel at a time, work in spectral domain only, very powerful for sorting thru spectral domain
parallelepiped: simplest, quickest classifier, inherently the least accurate
result: unclassified pixels are possible
min. distance (to mean): mean vector of ROI, distance from unknown pixel to mean vector of each of the possible classes; the unknown pixel will be assigned to the nearest class
how is min distance measured? Euclidean Distance (ENVI), City Block Distance, Mahalanobis Distance (ENVI)
ED & CBD are similar in that distance is measured vector-to-mean
Max. Likelihood: generic; classifies a pixel according to the class with the highest probability; assumes stats for each class are normally distributed; uses the variance, covariance, and mean vector of the training class spectra to classify the data; assumes that the spectral responses for a given class fit a normal (Gaussian) distribution; computes the statistical probability of a pixel belonging to each class; computationally complex
Spectral Angle Mapper (SAM): the angle between the vectors of endmembers and image pixels is compared (threshold is set by user); insensitive to image brightness/illumination; one output image for each endmember, with small values = a closer fit
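A minimal numpy sketch of the SAM comparison described above (the pixel spectrum and endmember spectra are invented):

import numpy as np

def spectral_angle(pixel, endmember):
    cos = np.dot(pixel, endmember) / (np.linalg.norm(pixel) * np.linalg.norm(endmember))
    return np.arccos(np.clip(cos, -1.0, 1.0))   # radians; smaller angle = closer match

pixel = np.array([0.10, 0.15, 0.40, 0.45])
endmembers = {"vegetation": np.array([0.08, 0.12, 0.45, 0.50]),
              "soil":       np.array([0.20, 0.25, 0.30, 0.35])}

angles = {name: spectral_angle(pixel, e) for name, e in endmembers.items()}
print(min(angles, key=angles.get))   # class with the smallest angle wins (subject to the user's threshold)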

For each of the following wavelengths, give the region of the electromagnetic spectrum (UV, Vis, NIR, SWIR) and convert the wl to um


1550nm


532nm


350nm


1.55um-SWIR


.532um-vis


.35um-UV

EM energy is detected when it interacts with _.


_ describes how electromagnetic energy travels and _ are related to how em radiation is detected and recorded by a sensor

matter, wavelengths, photons

as the sun's energy travels through the atmosphere, it is altered by particles and gases in the atmosphere through scattering and absorption: t/f

true

Wien's displacement law indicates that as temps increase the peak radiance shifts toward:

shorter wavelengths

incoming shortwave radiation is absorbed by the Earth and re-emitted in the

thermal infrared (Earth's dominant wavelength is about 9.66 um)

which reflectance signature has the lower spectral resolution, Landsat 5 or HyMap?

Landsat 5 has lower

the following systems operate through active RS except:


a) sonar


b) radar


c) lidar


d) multispectral

multispectral

MODIS has a higher temporal resolution than landsat, but a coarser spatial res. t/f

true

t/f: as the temp of an object increases, its dominant wavelength shifts toward longer wavelengths

false (shorter)

all of the following are supervised classifications, except:




spectral angle mapper


isodata


max likelihood


decision trees

isodata

according to Wien's displacement law, Earth's peak wavelength is longer than the sun's peak emittance wavelength (t/f)

true