105 Cards in this Set

  • Front
  • Back
remote sensing
"measurement or acquisition of info of some property of an object or phenomenon, by a recording device that is not in physical or intimate contact with the object or phenomenon under stuy"
-lack of contact with object, sesors, image, image interpretation
simplified info flow
sun-->atmosphere-->target-->atmosphere-->sensor-->image-->interpretation-->application
remote sensing (as a tool)
is a tool or technique similar to mathematics. Using sensors to measure the amount of electromagnetic radiation (EMR) exiting an object or geographic area from a distance, and then extracting valuable information from the data using mathematically and statistically based algorithms, is a scientific activity. It functions in harmony with other spatial data-collection techniques or tools of the mapping sciences, including cartography and geographic information systems.
in situ v. remote sensing
-both attempt to observe and/or measure objects and phenomena
-in-situ: physical contact (interaction), instruments for direct measure, ground reference v. ground truth
sensing
data are collected by a sensor not in contact with the object
-passive: collection of reflected or emitted energy
-active: generates signal and collects backscatter from interaction with surface
how remote is remote?
there is no single standard...there are tons of platforms at different levels (satellites, aircraft, balloons, etc.)
advantages
-different perspective
-obtain data for large areas in a single acquisition - efficient (synoptic and systematic)
-obtain data for inaccessible areas
-doesn't affect/interact w/ phenomena of interest
disadvantages
-accuracy and consistency
-artifacts (processing errors)
-scale related (image is too coarse/detailed...moving between scales: image and in situ data)
-high initial outlays for equipment and training
**remote sensing process
1. statement of problem: identification of data requirements
2. data collection
3. data analysis (image processing)
4. presentation of information (maps, charts, statistics, report, graphs, gis layers)
**scale considerations
-spatial/temporal scales & user needs
-spatial scale (spatial resolution)
-temporal scale (frequency of image-temporal resolution)

-consider: what is application? diff processes operate at diff scales...
resolution
-four components:
1. spatial - the size of the field of view (e.g. 10x10m.)
2. spectral - the number and size of spectral regions the sensor records data in, e.g. blue, green, red, near-infrared, thermal infrared, microwave (radar)
3. Temporal - how often the sensor acquires data, e.g. every 30 days.
4. Radiometric - the sensitivity of detectors to small differences in electromagnetic energy.
-trade-offs: impossible to max all four elements; meteorological satellites acquire image data daily, but at low spatial and spectral resolutions; Landsat TM & ETM+ satellites acquire imagery at higher spatial and spectral resolutions, but lower temporal resolution; newer high spatial resolution sensors
-choice of each resolution element depends on goals and objectives
spatial resolution
-indication of how well a sensor records spatial detail
-refers to the size of the smallest possible feature that can be detected as distinct from its surroundings
-function of platform altitude and IFOV
IFOV
instantaneous field of view
-size of the area that the sensor "sees" at a given moment
ground resolution element (GRE)
smallest area resolvable on the ground
GRE=IFOV*H
(H=platform altitude)
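A minimal numeric sketch of this relationship in Python; the IFOV (2.5 milliradians) and altitude (4,000 m) are illustrative values I assumed, not from the card, and the formula uses the usual small-angle approximation with IFOV in radians:

    # Ground resolution element from IFOV and platform altitude: GRE = IFOV * H
    ifov_mrad = 2.5        # instantaneous field of view in milliradians (assumed value)
    altitude_m = 4000.0    # platform altitude H in metres (assumed value)

    gre_m = (ifov_mrad / 1000.0) * altitude_m
    print(f"Ground resolution element: {gre_m:.1f} m")   # -> 10.0 m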
pixel
-picture element
-for an object to be detected its size should be greater than or equal to 2x the pixel size
grain
smallest object distinguishable on image (detail): spatial resolution, similar to pixel size
extent
area covered by image
grain v. extent trade off
large spatial scale: much detail, small area covered
small spatial scale: less detail, larger area covered
temporal resolution
-ability to obtain repeat coverage for an area
-timing is critical for some applications (crop cycles, catastrophic events)
-aircraft (potentially high)
-satellite (fixed orbit, systematic collection, pointable sensors)
spectral resolution
-the number and dimension of the specific EMR wavelength regions to which sensor is sensitive
-broadband (few, relatively broad bands)
-hyper-spectral (many, relatively narrow bands)
spectral resolution bands (visible)
blue band (450-515 nm)
green band (525-605 nm)
red band (640-690 nm)
near-infrared (750-900 nm)
radiometric resolution
-ability of a sensor to distinguish between objects of similar reflectance
-measured in terms of the number of energy levels discriminated (2^n, where n=number of bits...precision level)...for example, 8-bit data=256 levels of grey=0-255 range (0=black, 255=white)
-affects ability to measure properties of objects
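A quick check of the bits-to-grey-levels relationship in Python; the bit depths shown are just common examples, not values from the card:

    # Number of discriminable levels for a given bit depth: levels = 2**bits
    for bits in (6, 8, 10, 12):
        levels = 2 ** bits
        print(f"{bits}-bit data -> {levels} grey levels (0-{levels - 1})")
    # e.g. 8-bit data -> 256 grey levels (0-255)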
image interpretation - definition
act of examining images for the purpose of identifying and measuring objects and phenomena, and judging their significance
image interpretation tasks
detection, id, measurement, problem-solving
detection
-lowest order
-presence/absence of object or phenomena
-examples: building, water, roads, and vegetation
identification
-more advanced than detection
-labeling or typing of the object/phenomena
-tends to occur simultaneously with detection
-examples: houses, pond, highway, grass/trees
measurement
-quantification of objects/phenomena
-direct physical measurement from the imagery
-examples: inventories (count), length, area and height of objects
problem solving
most complex task...uses info acquired in first three tasks (detection, id, measurement) to put objects in assemblages or associations needed for higher level identification
-w/ experience, recognition becomes more automatic and tasks become less distinct
electromagnetic radiation
is a form of energy emitted and absorbed by charged particles, which exhibits wave-like behavior as it travels through space
particle theory
says that light was composed of corpuscles (particles of matter) which were emitted in all directions from a source
wave theory
energy travels through the atmosphere in a wave form
wavelength
distance between successive crests or troughs, generally measured in micrometers and designated by lambda (the wishbone-shaped Greek symbol, λ)
EMR spectrum - wavelengths
UV (.01-.4 micrometer)
Visible (.4-.7)
Near IR (.7-1.5)
Mid IR (1.5-5.6)
Far IR (5.6-1000)
Thermal IR (3-14)
Microwave & Radio > 1000
Wave properties-frequency
-# of wave forms passing through a given point per unit time
-measured as cycles/second (cycle s^-1) or hertz (Hz)
--the longer the wavelength, the lower the frequency...it's just not going to get those peaks as often
**wavelength & frequency
c=lambda x v
c=speed of light
lambda=wavelength
v=frequency
-relationship between wavelength and frequency is INVERSE
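A small sketch of the wavelength-frequency relationship in Python; the two example wavelengths are illustrative choices for blue and red light, not from the card:

    # c = lambda * v  ->  v = c / lambda
    C = 3.0e8  # speed of light in m/s (approximate)

    for name, wavelength_m in [("blue (450 nm)", 450e-9), ("red (650 nm)", 650e-9)]:
        v = C / wavelength_m
        print(f"{name}: frequency ~ {v:.2e} Hz")
    # longer wavelength -> lower frequency (inverse relationship)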
quantum theory
-wave theory doesn't account for all properties of EMR
-interaction of EMR with matter (absorption v. emission)
-EMR is transferred as discrete particles (photons)
planck's law
energy of photon
Q=hv
Q=energy of a quantum in Joules (J)
h=Planck's constant (verrrrry tiny) (Js)
v=frequency of EMR wave (Hz)
therefore Q is proportional to v
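A minimal sketch of Planck's relation in Python, combining Q = hv with v = c/lambda to show that shorter wavelengths carry more energy per photon; the example wavelengths are illustrative:

    # Q = h * v, with v = c / lambda
    H_PLANCK = 6.626e-34   # Planck's constant in J*s
    C = 3.0e8              # speed of light in m/s (approximate)

    for name, wavelength_m in [("blue (450 nm)", 450e-9), ("red (650 nm)", 650e-9)]:
        q_joules = H_PLANCK * C / wavelength_m
        print(f"{name}: photon energy ~ {q_joules:.2e} J")
    # the blue photon comes out with more energy than the red one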
short wavelengths v. long wavelengths
high energy, high frequency v. low energy, low frequency
-so for example, blue has more energy than red light (due to its shorter wavelength)
-ROYGBIV indicates wavelength order...red is longest, violet is shortest (and therefore highest energy)
energy levels
all objects above -273 degrees Celsius (absolute zero) emit electromagnetic energy
-energy radiated is a function of temperature
**stefan-boltzmann law
describes emitted radiation from a blackbody as a function of temperature (i.e. total energy emitted is proportional to temperature to the fourth power...so hotter objects emit more energy)
M=sigma T^4
M=total emitted radiation
sigma=Stefan-Boltzmann constant
T=temperature
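A minimal sketch of the T^4 dependence in Python; the two temperatures (roughly the Sun and the Earth) are illustrative, not from the card:

    # M = sigma * T**4 : emitted radiation rises with the 4th power of temperature
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant in W m^-2 K^-4

    for name, temp_k in [("Sun (~6000 K)", 6000.0), ("Earth (~300 K)", 300.0)]:
        m = SIGMA * temp_k ** 4
        print(f"{name}: emitted exitance ~ {m:.3e} W/m^2")
    # the hotter body emits vastly more energy per unit area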
blackbody
an ideal radiator: absorbs and re-emits all energy incident upon it
Wien's displacement law
used to identify wavelength of max energy emission
-inversely related to temperature
lambda_max = K/T
K = constant (~2898 micrometer-Kelvins)
T = temperature (K)
---as temperature increases, max wavelength gets shorter----
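A small sketch of Wien's displacement law in Python; the same two illustrative temperatures are used as above:

    # lambda_max = K / T, with K ~ 2898 micrometer-Kelvins
    K_WIEN = 2898.0

    for name, temp_k in [("Sun (~6000 K)", 6000.0), ("Earth (~300 K)", 300.0)]:
        lam_max_um = K_WIEN / temp_k
        print(f"{name}: peak emission at ~ {lam_max_um:.2f} um")
    # hotter object -> shorter peak wavelength (~0.48 um for the Sun, ~9.7 um for the Earth)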
atmospheric interactions
-energy detected by sensor is a function of surface properties and atmospheric influences
-atmosphere affects transmission of EMR by: scattering and absorbing
scattering
-redirection of energy with no change in other properties
-3 primary types: rayleigh (molecular), mie (aerosol), non-selective
rayleigh (molecular) scattering
-radiation interacts with molecules/particles whose diameter is much shorter than the wavelength (less than ~1/10 of the wavelength)
-ex: atmospheric gases such as oxygen and nitrogen
-amount of scattering is inversely proportional to wavelength raised to the 4th power
-accounts for blue sky and red sunset (blue light gets scattered about nine times more than red light...and then at sunset, blue light gets almost completely scattered out along the longer path at the low sun angle and red light dominates)
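A quick sketch of the inverse fourth-power dependence, checking the "blue is scattered roughly nine times more than red" claim in Python; the 400 nm and 700 nm endpoints are my assumed values for "blue" and "red":

    # Rayleigh scattering is proportional to 1 / lambda**4
    blue_nm, red_nm = 400.0, 700.0   # assumed wavelengths for blue and red light

    ratio = (red_nm / blue_nm) ** 4  # relative scattering of blue vs red
    print(f"Blue is scattered ~{ratio:.1f}x more than red")   # -> ~9.4x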
mie (aerosol) scattering
-particles with diameter about equal to wavelength
-dust, pollutants, volcanic eruptions, smoke
-affects longer wavelengths than rayleigh scattering
-occurs in lower 4.5 km of atmosphere
-direction/amount of scattering depends on physical characteristics of the aerosol...sun angle is important
-scattering is also affected by the shape/size and distribution of the particles
why rayleigh in atmosphere?
because particles in atmosphere are smaller than the wavelength of visible light
non-selective scatter
-diameter of particles is much larger than wavelength
-cloud water droplets/ice crystals
-affects all wavelengths, hence "non-selective"
scattering effects
-reduces direct illumination from sun and creates diffuse illumination
-scatter can occur anywhere in information flow (sun->surface->sensor)
-may add to or reduce signal received by sensor
-filters may be used to reduce effects of haze/scatter
path radiance
-radiation scattered into the sensor from the atmosphere
-mostly shorter wavelengths
contrast reduction
atmospheric effects can reduce image contrast
absorption
radiant energy is taken in by matter and converted into other forms of energy
-atmospheric gases are selective absorbers with respect to wavelength (CO2, water vapor, O3...greenhouse gases)
significant absorbers
ozone-ultraviolet region
water vapor-specific bands in infrared
carbon dioxide-thermal infrared region
radiant energy
capacity of radiation within a spectral band to do work (J)
radiant flux
time rate of energy flow onto, off of, or through a surface
-measured in J/s or watts (W)
-characteristics of radiant flux and what happens to it as it interacts with earth's surface is the primary focus of remote sensing
regarding radiant flux...why does time matter?
time matters because of the shutter speed...the platform moves quickly, so you want as much energy (light) as possible to be captured during the short exposure to get a clear, focused picture
surface energy partitioning
radiation at surface is partitioned among: absorption, transmission, and reflection
radiation budget equation
dimensionless ratios of the radiant flux absorbed, transmitted, or reflected to the total incident radiant flux (the three ratios sum to 1)
-varies w/ target and wavelength
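A minimal sketch of the budget check in Python; the three fractions are hypothetical values for some target and wavelength band, chosen only to illustrate the sum-to-one constraint:

    # Radiation budget: absorbed + transmitted + reflected = 1 (dimensionless fractions)
    absorbed, transmitted, reflected = 0.55, 0.10, 0.35   # assumed fractions for one target/band

    total = absorbed + transmitted + reflected
    assert abs(total - 1.0) < 1e-9
    print(f"Budget sums to {total:.2f}")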
irradiance
radiant flux incident upon a surface per unit area of that surface
--energy that actually hits the surface
radiant exitance
radiant flux leaving a surface per unit area of that surface
--comparing irradiance and radiant exitance tells you how much was absorbed, reflected, or transmitted
radiance
radiant flux leaving a specific projected source area, in a given direction, within a specific solid angle
--what we measure, what gets sent to the sensor: it is the only thing we CAN measure (laptop example...half the class can't see the screen if you turn it to the side; water is a natural example)
-the sensor only receives energy within a solid angle, a cone of light that bounces off the object and reaches the sensor
-concept of radiance: picture a cone of light (like a flashlight beam)...that is exactly what the object is radiating and what we can measure
-only the cone! you can only measure the cone
properties affecting target radiance
-target-sensor angle
-target/surface (moisture, micro-roughness, terrain roughness)
-EMR (wavelength, angle of incidence)
-interaction (determines target radiance detected by the sensor)
angle of incidence/observer angles
difference in clarity of rye grass due to different sun angles though camera angle remained the same
reflectance
two types of surfaces: specular and diffuse
specular
the incident energy leaves the surface at an angle equal and opposite to the angle of incidence (associated with smooth surfaces with mirror-like reflections, such as roads)
diffuse
uniform reflection of incident energy in all directions (associated with surface height variation that is large relative to the wavelength)
**spectral signature concept
describes spectral reflectance of a target at different wavelengths of EMR (shows how reflectance from an object varies with wavelength)
-spectral reflectance curve graphs reflectance response as a function of wavelength
-key to separating and identifying objects by selecting optimum wavelength bands
-y axis is reflectance (how strongly each object reflects)
-x axis is wavelength (e.g. blue, green, and red bands)
-white: high reflectance, roughly a straight line through the wavelengths
-dark: flat line at low reflectance
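A toy sketch of using spectral signatures to pick a separating band, in Python; all reflectance values are hypothetical, chosen only to mimic the general shapes of typical curves, and the band with the largest spread between targets is taken as the most useful for separating them:

    # Hypothetical reflectance (0-1) per band for three targets
    signatures = {
        "water":      {"blue": 0.08, "green": 0.06, "red": 0.04, "nir": 0.01},
        "vegetation": {"blue": 0.05, "green": 0.12, "red": 0.06, "nir": 0.50},
        "bare soil":  {"blue": 0.15, "green": 0.20, "red": 0.25, "nir": 0.30},
    }

    for band in ("blue", "green", "red", "nir"):
        values = [sig[band] for sig in signatures.values()]
        print(f"{band}: spread = {max(values) - min(values):.2f}")
    # in this toy data the near-infrared band shows the widest spread, so it separates the targets best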
photographic elements-annotation
-date/time
-firm/series
-coordinates
-scale
-altitude
fiducials
-minimum of 4 markers on photo (placed on center of each side of photo or placed in photo corners) to help find the center of the photo
-the very center of the image is called the principal point
-this is what was directly in line w/ camera lens when it was looking down
nadir
point directly below the aircraft
-if nadir and principal are within 3 degrees, it is a true vertical photo (as opposed to oblique)
flightline of aerial photography
-each of the images taken has 60% overlap with the next as the plane flies over the area being pictured...the 3rd image still contains a small portion of the 1st image
-with 60% overlap, you can locate the principal point of each photo and also find where it falls on the adjacent photo (the conjugate principal point)
-the overlapping helps with stereoscopic images
block of aerial photography
-plane flies along a line (overlap, overlap, overlap), turns, and then gets 20-30% SIDE lap...so this is where you can start covering the entire area, match, and put those things together
-because the sun angle changes while you fly, one area will be darker and one area will be lighter (flying one way v. the other)
-the two co-principals are like "two eyes" that are set apart, like ours, which allows us to see 3-d and explains how we can obtain stereo images from principal points
basic photo geometry - height
altitude of the platform (and camera system) above the terrain (so how high the plane is)
focal length
distance from focal point (lens) to film plane
optical axis
line from the focal point to the center of scene
-for a perfectly vertical photo, the optical axis, principal point, and nadir point will line up
scale - flat v. variable terrain
flat terrain: scale can be determined for entire image
variable terrain: scale will vary across image
-depending on the amount of variation, an average terrain value may be used
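A small sketch of why scale varies with terrain, using the standard vertical-photo scale relation scale = focal length / (flying height minus terrain elevation); the formula is not stated on the card, and all numbers below are illustrative:

    # Photo scale S = f / (H - h): over variable terrain the scale changes with elevation h
    focal_length_ft = 0.5        # ~152 mm lens, expressed in feet (assumed)
    flying_height_ft = 10000.0   # flying height above datum (assumed)

    for terrain_elev_ft in (0.0, 1000.0, 2000.0):
        scale = focal_length_ft / (flying_height_ft - terrain_elev_ft)
        print(f"terrain at {terrain_elev_ft:>6.0f} ft -> scale 1:{1/scale:,.0f}")
    # higher terrain is closer to the camera, so its scale is larger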
if photo was taken from 20,000 feet and the image is at 3,000 feet, where was the height of the lens?
7,000 feet
image displacement
-orthographic projection (map)- true position
-central projection (vertical air photo) - object is shifted from true position and displaced; the nadir point is the only true position; away from nadir, camera increasingly sees object sides
-relief - differences in relative elevation of objects in photo; significant source of image displacement
relief displacement
objects will tend to lean outward, i.e. be radially displaced
-the farther the object is from the principal point, the greater the radial displacement
relief displacement definition
radial distance between an object's image position and its true plan (horizontal) position due to differences in object relief
cause of relief displacement
-all objects on a vertical air photo are positioned as though viewed from the same point
-camera increasingly sees the side of an object the further it is from nadir - objects appear to lean
magnitude of relief displacement
-object height, distance from nadir point, and H
-displacement at nadir = 0
-points higher than datum, lean outward
-points below datum, lean inward
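A small sketch using the common relief-displacement relation d = (r x h) / H, where r is the radial distance from the nadir point to the displaced image point, h is object height, and H is flying height above datum; the card names the variables but not the formula, and all numbers are illustrative:

    # Relief displacement d = r * h / H: grows with object height and distance from nadir
    flying_height_m = 3000.0   # H, flying height above datum (assumed)
    object_height_m = 60.0     # h, height of the object above the datum (assumed)

    for radial_dist_mm in (0.0, 40.0, 80.0):   # r, measured on the photo from the nadir point
        d_mm = radial_dist_mm * object_height_m / flying_height_m
        print(f"r = {radial_dist_mm:>4.0f} mm -> displacement {d_mm:.2f} mm")
    # displacement is zero at nadir and increases radially outward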
stereoscopic viewing
provides 3rd dimension to air photo interpretation (id 3-d form of an object)
-stereopairs (overlapping vertical photographs)
-stereoscopes - used to create a synthetic visual response by forcing each eye to look at a different view of the same terrain
-gives perception of depth (3D)
parallax
-apparent change in relative positions of stationary objects
-caused by change in viewing positions
-air photo: caused by taking photographs of the same object from different positions (relative displacement...which forms the basis for 3-D viewing of successive overlapping photos)
stereoscopic photographic requirements
photos must be taken at the same altitude, along the same flightline
-overlap (60%) and sidelap (20%)
-equipment for viewing in stereo
stereoscopic alignment of air photos
lined up w/ flight line in order of PP, CPP, CPP, PP, in a straight line
-overlap toward center
-amount of separation varies depending on type of photos
-in northern hemisphere, photos are arranged with shadows towards the observer (south at top)
stereoscopic parallax principles
-all objects in the scene at exactly the same height will have an identical amount of x parallax
-x-parallax is directly related to the elevation of the point above the terrain...greater for high points than for lower points
types of platforms
balloons, blimps, helicopters, ground-based, airborne, satellite
ground based
can measure both incoming and outgoing radiation, which makes it possible to calculate reflectance...it does not cover much land and it disturbs the site, but it's cheap and pretty easy
airborne
-most common platform because they were efficient during wartime and a lot more stable than helicopters, faster, cover more land, and doesn't disturb it
-low altitude (doesn't need a pressurized cabin so imagery can be obtained by shooting out the window) v. high altitude (stable, big engines and can acquire imagery for large areas, but you have to cut a hole in the bottom of the plane for the camera)
+++you can acquire data as long as you have everything
+you can control platform variables such as altitude
+time of coverage can be controlled and easy to mobilize

+can get images of private properties
---disadvantages: expensive (primarily cost of aircraft)…and less stable than spacecraft
NAPP
standardized set of cloud-free aerial photographs covering the conterminous US (National Aerial Photography Program)
-color infrared: false color (near infrared, red, green)...these are actual pictures on film, and when developing the pictures you have to choose the 3 bands
-really good spatial resolution
satellite
~numerous programs
~manned and unmanned systems: range is fixed in altitude and inclination (which is very important because they'll burn if they hit the atmosphere)
types of orbits: geosynchronous, equatorial, polar (a geosynchronous orbit over the equator is geostationary)
+++advantages: stable, constant altitude, generally insignificant relief displacement, repetitive coverage****, they have archives and you can go back in time
---disadvantages: expensive to launch, support programs are expensive, can't control time of imaging, atmospheric effects may be greater than for aircraft platform
cameras
-aperture: the bigger the hole to let the light in, the faster the shutter speed can be
-shutter speed: you want it to be fast on an airplane
-with larger frames, there will be a large amount of relief displacement (toward the frame edges you increasingly see the sides of objects rather than seeing them straight down)
-when using a film camera, you can only record 3 parts of the electromagnetic spectrum: true color or false color infrared
range
range for spacecraft is determined by orbit, which is fixed in altitude and inclination
-SUN-SYNCHRONOUS - near-polar, crosses equator at approximately the same local time each day
-GEOSTATIONARY - fixed orbit over equator; primarily meteorological systems
exposure
total amount of light striking film and/or CCD (charge coupled devices)
-film/CCD exposure primarily a function of: scene brightness, diameter of lens opening, exposure time
relative aperture
-diameter of lens opening determined by aperture setting - f/stop
-defined as the ratio of lens focal length (f) to lens opening diameter (d)
-important for speed of the lens
f/stop, diameter, and exposure
focal length generally fixed, so f/stop varies as function of the diameter of the aperture
-inverse relationship:
-smaller f/stop->wider lens opening and higher exposure (faster shutter speed)
-larger f/stop->smaller lens opening and less exposure
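A minimal sketch of the f-stop/aperture relationship in Python; the focal length and the f-stops shown are illustrative values, not from the card:

    # f-stop N = f / d, so aperture diameter d = f / N: smaller N -> wider opening -> more light
    focal_length_mm = 150.0   # assumed lens focal length

    for f_stop in (2.8, 5.6, 11.0):
        diameter_mm = focal_length_mm / f_stop
        print(f"f/{f_stop}: aperture diameter ~ {diameter_mm:.1f} mm")
    # f/2.8 -> ~53.6 mm, f/5.6 -> ~26.8 mm, f/11 -> ~13.6 mm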
aerial cameras-frame camera
-high geometric image quality
-low f/stop-fast shutter speed
-film can only record 3 portions of the electromagnetic spectrum; two major types: true color or color infrared
aerial cameras - digital
-during exposure lens focuses light on bank of detectors
-exposure causes an electrical charge that is related to amount of incident energy
-electrical signal (analog) is converted to a digital brightness value
color theory
saturation: purity of color
primary colors: red, blue, green
color characteristics
hue: dominant wavelength (color)
saturation: purity
lightness: light/dark
additive process
based on light (tv, computer monitors)...red, green, blue
-each color gun's intensity in each pixel is modulated based on the amount of primary color present in the scene...so the more primary colors there are, the lighter the color
subtractive process
based on pigments/dyes
cyan, magenta, yellow...the more primary colors there are, the darker the color
true color photograph
blue green red
false color infrared
green, red, near infrared...vegetation shows up red