Introduction to Remote Sensing
This page presents some concepts related to Remote Sensing and to
images generated by orbital sensors, since one of the main functions of
SPRING is the processing of these images through its enhancement and
classification functions.
The topics presented herein are:
Origins and Evolution of RS.
Definition.
Electromagnetic Spectrum.
Electromagnetic radiation sources.
Atmospheric propagation effects of EMR.
Sensor Systems.
Sensor Types.
See also:
About Digital Images.
Reading an Image with IMPIMA.
Performing an Image Registration.
RADAR Image Processing.
RS Image Processing.
Origins and Evolution of RS
The origins of remote sensing date back to the experiments of
Isaac Newton, who discovered that a ray of white light, when
traversing a prism, unfolds or disperses into a multicolored beam -
a spectrum of colors.
Since then, scientists have been widening the scope of their
studies of this subject. They verified that white light is a
synthesis of different types of light, a kind of vibration composed
of many other vibrations. Later they discovered that each color of
the decomposed spectrum corresponds to a different temperature, and
that red light, when striking the surface of a body, heats it more
than violet light.
Besides visible red, there are radiations that are invisible to the
eye, called infrared waves, rays, or radiation. Soon afterwards, an
experiment by Ritter revealed another type of radiation: the
ultraviolet. Taking their experiments ever further, scientists managed
to prove that light waves are in fact just one of many different types
of electromagnetic wave.
Some authors trace the origins of remote sensing back to the
development of photographic sensors, when aerial photographs were taken
from balloons.
It becomes evident that remote sensing is the fruit of a
multidisciplinary effort that has involved, and still involves,
advances in physics, chemistry, the biosciences and geosciences,
computing, mechanics, etc.
The evolution of remote sensing is
connected to some of the following main events:
- 1822 - Development of the theory of light; Newton's decomposition of white light; use of a primitive camera.
- 1839 - Development of optical equipment; research on new photosensitive substances.
- 1859 - Use of photographic cameras onboard balloons.
- 1903 - Use of aerial photographs for cartographic purposes.
- 1909 - Aerial photographs taken from airplanes.
- 1930 - Systematic coverage of territory for the survey of natural resources.
- 1940 - Development of radiometers sensitive to infrared radiation; use of infrared film in World War II for the detection of camouflage.
- 1944 - First experiments with the use of multispectral cameras.
- 1954 - Development of microwave radiometers; initial tests for the development of side-looking radars.
- 1961 - Development of optical and digital processors; first side-looking radars.
- 1962 - Development of manned and unmanned spacecraft; launch of meteorological satellites; first orbital photography from the MA-6 Mercury mission.
- 1972 - Orbital photography taken from the Gemini Program; other space programs are born, including natural resources satellites: SEASAT, SPOT, ERS, LANDSAT.
- 1983 - Launch of Landsat 4, SIR-A, SIR-B, MOMS.
- 1991 - Launch of ERS-1.
Definition
A definition of remote sensing could be: "it is the use of
sensors to acquire information about objects or phenomena without
direct contact between them".
Sensors: equipment capable of collecting energy from the object and
converting it into a signal that can be recorded and presented in a
form suitable for the extraction of information.
Energy: in most cases refers to electromagnetic energy or radiation.
A more specific concept could be: "it is the set of activities
related to the acquisition and the analysis of data from remote
sensors", where:
Remote Sensors: photographic or opto-electronic systems capable of
detecting and recording, in the form of images or otherwise, the flux
of radiant energy reflected or emitted by distant objects.
A flux of electromagnetic radiation propagating through space can
interact with surfaces and objects, being reflected, absorbed, and
even re-emitted by them. The variations that these interactions produce
in the flux depend strongly on the physicochemical properties of the
elements on the surface. The interactions between radiation and matter
are discussed in more detail further ahead.
Everything in nature is in constant vibration, emitting or modifying
electromagnetic waves (energy) and producing "perturbations" in the
magnetic and gravimetric fields of the Earth. All the instruments that
detect and transform this energy can be classified as sensors: radio,
television, photographic camera, etc.
During the data acquisition phase we can distinguish the following
basic elements of the sensing process: radiant energy, radiation
source, object (target), trajectory, and sensor (optical imaging system
and detector). The following figure presents these elements and
exemplifies the many paths that electromagnetic radiation can take
before reaching the sensor system.
A photographic camera with flash can be given as an example of a
sensor system: "when the camera is activated, the flash is triggered
and emits radiation. The radiation travels to the target and is
reflected by it towards the camera's optical system. The reflected
radiation is focused onto the film plane, which is a photochemical
radiation detector. An image of the radiation pattern is recorded on
the film and is later chemically developed."

Trajectories of radiation
Every time work is done, some type of energy is transferred from one
body to another, or from one place in space to another. Of all possible
forms of energy, one is of special interest to remote sensing, being
the only one that needs no material medium in order to propagate:
radiant or electromagnetic energy. The most familiar and most important
example of radiant energy is solar energy, which propagates through
empty space from the Sun to the Earth.
Electromagnetic Spectrum
Electromagnetic radiation (waves) is characterized by several physical
properties (intensity, wavelength, frequency, energy, polarization,
etc.). Independently of these characteristics, however, every
electromagnetic wave is essentially of the same nature and does not
depend on the existence of a propagation medium (an important property
of this energy transfer process). This independence is easy to
understand from the following figure: the electric field and the
magnetic field are perpendicular to each other, and both oscillate
perpendicular to the direction of wave propagation; the changing
electric field generates the magnetic field, while the changing
magnetic field generates the electric field.
Where: E = Electric Field
M = Magnetic Field
The speed of propagation of an electromagnetic wave in vacuum is the
speed of light (3 x 10^8 m/s). The number of waves that pass through a
point in space in a given time interval defines the frequency (f) of
the radiation. The frequency of the wave is directly proportional to
its velocity of propagation: the higher the velocity of propagation,
the greater the number of waves that pass through a given point in a
given time (t), and therefore the higher the frequency. The velocity of
propagation (v) is constant in a given propagation medium.
The electromagnetic wave can also be characterized by its wavelength
(lambda), which is related to the velocity of propagation and the
frequency by the equation: λ = v / f.
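As a quick numerical illustration (a minimal sketch, not part of the original text; the function names and example values are illustrative only), the relation λ = v / f can be used to convert between frequency and wavelength, with v equal to the speed of light in vacuum:

    # Sketch: converting between frequency and wavelength in vacuum.
    C = 3.0e8  # speed of light in m/s, as quoted above

    def wavelength_m(f_hz):
        """Wavelength (in meters) of an electromagnetic wave of frequency f_hz (in Hz)."""
        return C / f_hz

    def frequency_hz(lam_m):
        """Frequency (in Hz) of an electromagnetic wave of wavelength lam_m (in meters)."""
        return C / lam_m

    # A 10 GHz microwave has a wavelength of about 3 cm.
    print(wavelength_m(10e9))      # 0.03
    # Red light at 0.65 micrometers has a frequency of about 4.6 x 10^14 Hz.
    print(frequency_hz(0.65e-6))   # ~4.6e14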
The range of wavelengths or frequencies in which electromagnetic waves
can be found is unlimited. With present technology we can generate or
detect electromagnetic waves in a wide range of frequencies, extending
from 3 Hz to 3 x 10^8 GHz, or wavelengths ranging from 10^8 meters down
to 0.01 Å (angstrom), or 10^-12 m.
The spectrum is subdivided into ranges, representing regions with
characteristics specific to the physical processes that generate the
energy in each range or to the physical mechanisms that detect this
energy. Depending on the range of the spectrum, we work with energy
(electron-volt), wavelength (micrometer), or frequency (hertz). For
example: in the range of gamma and cosmic rays we use energy; in the
UV (ultraviolet) and IR (infrared) ranges we use wavelength; in the
microwave and radio regions we use frequency. The main ranges of the
spectrum are described below and represented in the following figure:
Electromagnetic Spectrum.
Radio waves: low frequencies and long wavelengths. The
electromagnetic waves in this range are used for communications over
long distances since, besides being little attenuated by the
atmosphere, they are reflected by the ionosphere, which allows their
propagation over long distances.
Microwaves: situated in the range from 1 mm to 30 cm, or
3 x 10^11 Hz (300 GHz) to 10^9 Hz (1 GHz). In this range of wavelengths,
very concentrated beams of electromagnetic radiation can be produced,
as used in radars. The low atmospheric attenuation, with little
blockage by clouds, makes radar a very good means of observation in any
weather condition.
Infrared: of great importance for Remote Sensing. It encompasses
radiation with wavelengths ranging from 0.75 µm to 1.00 mm. IR
radiation is easily absorbed by many substances (a heating effect).
Visible: defined as the radiation capable of producing the sensation
of vision in the normal human eye. It covers a small range of
wavelengths (380 to 750 nm). It is important to Remote Sensing because
images collected in this range generally present a strong correlation
with the visual experience of the interpreter.
Ultraviolet: a wide range of the spectrum (10 to 400 nm).
Photographic film is more sensitive to ultraviolet radiation than to
visible light. It is used for the detection of minerals by luminescence
and of marine pollution. The strong atmospheric attenuation in this
range is a great obstacle to its use.
X-rays: range from 1 Å to 10 nm (1 Å = 10^-10 m). They are mainly
generated by the stopping or deceleration of high-energy electrons.
Since they consist of high-energy photons, X-rays are extremely
penetrating and are thus a powerful tool in research on the structure
of matter.
Gamma rays: the most penetrating of the emissions of radioactive
substances. There is in principle no upper limit to the frequency of
gamma rays, although the highest-frequency range is referred to as
cosmic rays.
* The range most used in Remote Sensing lies between 0.3 µm
and 15.0 µm (known as the optical spectrum), because in this range
optical reflection and refraction components, such as lenses, mirrors,
prisms, etc., can be used to collect and redirect the radiation.
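Because each spectral region is customarily described in a different unit, as noted above, a small conversion sketch may be helpful (an illustration, not part of the original text; the constants are standard physical values and the example wavelengths are only indicative):

    # Sketch: converting a wavelength into frequency and photon energy.
    H = 6.626e-34     # Planck constant, J*s
    C = 3.0e8         # speed of light, m/s
    EV = 1.602e-19    # joules per electron-volt

    def describe(wavelength_m):
        """Print the frequency (Hz) and photon energy (eV) for a given wavelength (m)."""
        f = C / wavelength_m
        e_ev = H * f / EV
        print(f"lambda = {wavelength_m:.2e} m -> f = {f:.2e} Hz, E = {e_ev:.2e} eV")

    describe(0.55e-6)   # visible green light: ~5.5e14 Hz, ~2.3 eV
    describe(10e-6)     # thermal infrared:    ~3.0e13 Hz, ~0.12 eV
    describe(0.03)      # 3 cm microwave:      ~1.0e10 Hz (10 GHz), ~4e-5 eV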
Electromagnetic radiation sources
The sources of electromagnetic radiation (EMR) can be divided into
natural (Sun, Earth, radioactivity) and artificial (radar, laser,
etc.).
The Sun is the most important natural source, for its energy, when
interacting with the many substances on the surface of the Earth,
gives rise to a series of phenomena (reflection, absorption,
transmission, luminescence, heating, etc.) that are investigated by
Remote Sensing.
Any source of electromagnetic energy is characterized by its emission
spectrum, which can be continuous or distributed among discrete ranges.
The Sun, for example, emits radiation continuously distributed from the
X-ray region down to the microwave region, though concentrated in the
0.35 µm - 2.5 µm interval.
Any substance with a temperature above absolute zero (0 K or
-273 °C) emits electromagnetic radiation as a result of its atomic and
molecular oscillations. Such radiation can strike the surface of
another substance, where it may be reflected, absorbed, or transmitted.
In the case of absorption, the energy is generally re-emitted, usually
at a different wavelength.
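As a hedged illustration of this thermal emission (using Wien's displacement law, which is not named in the text above; the function name and temperatures are illustrative), the wavelength at which a body emits most strongly can be estimated from its temperature:

    # Sketch: Wien's displacement law, lambda_max = b / T (illustrative, not from the text).
    B_WIEN = 2.898e-3   # Wien's displacement constant, m*K

    def peak_emission_wavelength_m(temperature_k):
        """Approximate wavelength of maximum emission for a body at the given temperature."""
        return B_WIEN / temperature_k

    print(peak_emission_wavelength_m(5800.0))  # Sun (~5800 K): ~0.5e-6 m, in the visible
    print(peak_emission_wavelength_m(300.0))   # Earth's surface (~300 K): ~9.7e-6 m, thermal IR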
In practice these four processes (emission, absorption, reflection,
and transmission) occur simultaneously, and their relative intensities
characterize the substance under investigation. Depending on its
physical and chemical characteristics, these four processes occur with
different intensities in different regions of the spectrum. This
spectral behavior of the various substances is called their spectral
signature and is used in Remote Sensing to distinguish substances from
one another.
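A minimal sketch of this idea (the reflectance values below are made up purely for illustration and are not measurements from the text): at each wavelength the reflected, absorbed, and transmitted fractions sum to one, and the reflectance curve across wavelengths is the spectral signature used to tell targets apart.

    # Sketch: comparing two illustrative spectral signatures (reflectance per band).
    signatures = {
        # wavelength bands in micrometers -> reflectance (example values only)
        "healthy vegetation": {"0.45-0.52": 0.05, "0.63-0.69": 0.08, "0.76-0.90": 0.45},
        "bare soil":          {"0.45-0.52": 0.12, "0.63-0.69": 0.20, "0.76-0.90": 0.28},
    }

    def most_distinctive_band(a, b):
        """Return the band in which the two signatures differ the most."""
        return max(a, key=lambda band: abs(a[band] - b[band]))

    band = most_distinctive_band(signatures["healthy vegetation"], signatures["bare soil"])
    print(band)  # in this example, the near-infrared band separates the two targets best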
Atmospheric propagation effects of EMR
When data are collected by a remote sensor, be it onboard a satellite
or an airplane, the collected signal is most of the time solar
radiation, which interacts with the atmosphere before striking the
target and interacts with it again on the way back to the sensor. Even
when the measured signal is the radiation emitted by the target itself,
it interacts with the atmosphere before reaching the sensor.
There are regions of the electromagnetic spectrum in which the
atmosphere is opaque, that is, it does not allow the radiation to pass
through. These regions define the "atmospheric absorption bands". The
regions of the spectrum in which the atmosphere is transparent to the
electromagnetic radiation from the Sun are known as "atmospheric
windows".
We should thus always consider the following factors associated with
the atmosphere, since they interfere with Remote Sensing: absorption,
air mass effects, scattering by gaseous molecules or particles in
suspension, refraction, turbulence, radiation emission by the
atmospheric constituents, etc.
We thus conclude that the attenuation of the radiation is due to:
- Absorption: the energy of a beam of electromagnetic radiation is
transformed into other forms of energy. It is a selective attenuation
caused by various atmospheric constituents, such as water vapor,
ozone, carbon monoxide, etc. In many cases it can be ignored because it
is small.
- Scattering: the energy of a collimated beam of electromagnetic
radiation is removed by a change in its direction. The interaction with
the atmosphere through scattering generates a diffuse field of light
that propagates in all directions.
- (a) - Rayleigh scattering: occurs when the diameter of the particles
is much smaller than the wavelength; shorter wavelengths (blue) are
scattered much more strongly than longer ones (red).
- (b) - Mie scattering: occurs when the diameter of the particles is of
the same order as the wavelength.
- (c) - Non-selective scattering: occurs when the diameter of the
particles is much greater than the wavelength. Radiation of different
wavelengths is scattered with the same intensity; the white appearance
of clouds is explained by this process.
- * The attenuation of the radiation explains the reddish color of
dusk: because of the greater length of atmosphere that the radiation
has to cross, the shorter (blue) wavelengths of the light are scattered
out of the beam, letting the red component of the sunlight through (a
numerical sketch is given after these notes).
- ** Because of these attenuation factors, it is important to plan
carefully before the data acquisition and interpretation processes.
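A rough numerical sketch of the dusk effect mentioned above (assuming the Rayleigh 1/λ^4 scattering dependence, which the text does not state explicitly; the function name and wavelength values are illustrative):

    # Sketch: relative Rayleigh scattering of blue vs. red light (intensity ~ 1 / lambda^4).
    def rayleigh_relative(lambda_a_um, lambda_b_um):
        """How many times more strongly wavelength a is scattered than wavelength b."""
        return (lambda_b_um / lambda_a_um) ** 4

    # Blue light (~0.45 um) is scattered roughly four times more than red light (~0.65 um),
    # so the direct sunlight that survives a long atmospheric path looks reddish.
    print(rayleigh_relative(0.45, 0.65))  # ~4.35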
Sensor Systems
As we have seen, every material and natural phenomenon selectively
absorbs, transmits, reflects, and emits electromagnetic radiation. With
present technology it is possible to measure, from a distance and with
reasonable precision, the spectral properties of these materials and
phenomena.
Any sensor system presents the following components needed to
capture the electromagnetic radiation (see the figure below).
System Components.
where: collector = receives the energy through a lens, mirror,
antenna, etc.;
detector = captures the energy in a certain range of the spectrum;
processor = the recorded signal undergoes processing (development,
amplification, etc.) through which the product is obtained;
product = contains the information needed by the user.
Sensor Types
Sensors can be classified as a function of the energy source or as a
function of the type of product they produce.
As a function of the energy source:
A-) PASSIVE: do not have their own source of radiation. They measure
reflected solar radiation or the radiation emitted by the targets; for
example, photographic systems.
B-) ACTIVE: have their own source of electromagnetic radiation and
work in narrow ranges of the spectrum; for example, radars.
As a function of the product type:
A-) Non-imagers: do not provide an image of the sensed surface; for
example, radiometers (output as numbers or graphs) and
spectroradiometers (spectral signature).
They are essential for the acquisition of detailed information about
the spectral behavior of objects on the surface of the Earth.
B-) Imagers: produce as a result an image of the observed surface,
providing information about the spatial variation of its spectral
response.
B.1 - framing systems: acquire the whole image of the scene at a
single instant (e.g., RBV);
B.2 - scanning systems: for example, TM, MSS, SPOT;
B.3 - photographic systems.
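As an illustrative summary of this classification (a sketch only, not part of the original text; the example sensors are those cited above), it can be represented as a small data structure:

    # Sketch: classifying sensors by energy source and product type, as described above.
    from dataclasses import dataclass

    @dataclass
    class Sensor:
        name: str
        energy_source: str      # "passive" or "active"
        product: str            # "imager" or "non-imager"
        imaging_mode: str = ""  # "framing", "scanning" or "photographic" for imagers

    sensors = [
        Sensor("photographic camera", "passive", "imager", "photographic"),
        Sensor("RBV",                 "passive", "imager", "framing"),
        Sensor("LANDSAT TM/MSS",      "passive", "imager", "scanning"),
        Sensor("radiometer",          "passive", "non-imager"),
        Sensor("side-looking radar",  "active",  "imager", "scanning"),
    ]

    # Example query: list only the passive imaging sensors.
    for s in sensors:
        if s.energy_source == "passive" and s.product == "imager":
            print(s.name, "-", s.imaging_mode)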
The non-photographic imagers (scanning systems) came to fill the gap
left by what was then the most used optical sensor: the photographic
camera. The camera, despite its easier operation and lower cost, is
limited in its capture of the spectral response, since photographic
films cover only the range between the near ultraviolet and the near
infrared. This type of sensor is also restricted to the overpass hours
and, because of atmospheric phenomena, does not allow frequent
observation of the ground from high altitudes.
Since the data from these non-photographic sensors are collected in
the form of electrical signals, they can be easily transmitted to
distant stations, where they are processed electronically for
discriminatory analysis.
The table below presents a comparative analysis of the photographic
sensors and scanning imagers.
|                      | Photographic Sensors | Scanning Imagers |
| Geometric resolution | high *               | medium           |
| Spectral resolution  | medium               | high *           |
| Repetitivity         | low                  | high *           |
| Synoptic vision      | low                  | high *           |
| Data recording       | analog               | digital *        |

* greater advantage over the other