Fundamentals of Remote Sensing

What is Remote Sensing?

So, what exactly is remote sensing? For the purposes of this tutorial, we will use the following definition:
"Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information."
In much of remote sensing, the process involves an interaction between incident radiation and the targets of interest. This is exemplified by the use of imaging systems, where the following seven elements are involved. Note, however, that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors.
Remote Sensing Process
1. Energy Source or Illumination (A) - the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.
2. Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
These seven elements comprise the remote sensing process from beginning to end. We will be covering all of these in sequential order throughout the five chapters of this tutorial, building upon the information learned as we go. Enjoy the journey!

Did you know?

Of our five senses (sight, hearing, taste, smell, touch), three may be considered forms of "remote sensing", where the source of information is at some distance. The other two rely on direct contact with the source of information - which are they?

Whiz quiz

Can "remote sensing" employ anything other than electromagnetic radiation?    The answer is ...

Whiz quiz - Answer

While the term 'remote sensing' typically assumes the use of electromagnetic radiation, the more general definition of 'acquiring information at a distance' does not preclude other forms of energy. The use of sound is an obvious alternative; thus you can claim that your telephone conversation is indeed 'remote sensing'.

History of Remote Sensing

Modern remote sensing began in 1858, when Gaspard-Félix Tournachon first took aerial photographs of Paris from a hot air balloon. Remote sensing continued to grow from there; one of the first planned uses occurred during the U.S. Civil War, when messenger pigeons, kites, and unmanned balloons were flown over enemy territory with cameras attached to them.
The first government-organized air photography missions were developed for military surveillance during World Wars I and II, but the practice reached a climax during the Cold War.
Today, small remote sensors or cameras are used by law enforcement and the military on both manned and unmanned platforms to gain information about an area, and remote sensing imagery now includes infrared, conventional air photos, and Doppler radar.
In addition to these tools, satellites were developed during the late 20th century and are still used today to gain information on a global scale and even information about other planets in the solar system. For example, the Magellan probe is a satellite that has used remote sensing technologies to create topographic maps of Venus.

Types of Remote Sensing Data

The types of remote sensing data vary, but each plays a significant role in the ability to analyze an area from a distance.

The first way to gather remote sensing data is through radar. Its most important uses are air traffic control and the detection of storms or other potential disasters. Doppler radar is a common type of radar used to collect meteorological data, and it is also used by law enforcement to monitor traffic and driving speeds. Other types of radar are used to create digital models of elevation.

Another type of remote sensing data comes from lasers. These are often used in conjunction with radar altimeters on satellites to measure things like wind speed and direction and the direction of ocean currents. Altimeters are also useful in seafloor mapping, in that they are capable of measuring bulges of water caused by gravity and the varied seafloor topography. These varied ocean heights can then be measured and analyzed to create seafloor maps.
Also common in remote sensing is LIDAR - Light Detection and Ranging. This is most famously used for weapons ranging but can also be used to measure chemicals in the atmosphere and heights of objects on the ground.
Other types of remote sensing data include stereographic pairs created from multiple air photos (often used to view features in 3-D and/or make topographic maps); radiometers and photometers, which collect emitted radiation such as that captured in infrared photos; and air photo data obtained by Earth-viewing satellites such as those in the Landsat program.

Applications of Remote Sensing

As with its varied types of data, the specific applications of remote sensing are diverse as well. However, remote sensing is mainly conducted for image processing and interpretation. Image processing allows things like air photos and satellite images to be manipulated so that they fit various project uses and/or can be used to create maps. By using image interpretation, an area can be studied without being physically present there. The processing and interpretation of remote sensing images also has specific uses within various fields of study. In geology, for instance, remote sensing can be applied to analyze and map large, remote areas. Remote sensing interpretation also makes it easy for geologists to identify an area's rock types, geomorphology, and changes from natural events such as a flood or landslide.
Remote sensing is also helpful in studying vegetation types. Interpretation of remote sensing images allows physical and biogeographers, ecologists, those studying agriculture, and foresters to easily detect what vegetation is present in certain areas, its growth potential, and sometimes what conditions are conducive to its being there.
Additionally, those studying urban and other land use applications are also concerned with remote sensing because it allows them to easily pick out which land uses are present in an area. This can then be used as data in city planning applications and the study of species habitat, for example.
Finally, remote sensing plays a significant role in GIS. Its images are used as the input data for the raster-based digital elevation models (abbreviated as DEMs) - a common type of data used in GIS. The air photos taken during remote sensing applications are also used during GIS digitizing to create polygons, which are later put into shapefiles to create maps.
Because of its varied applications and its ability to let users collect, interpret, and manipulate data over large, often inaccessible, and sometimes dangerous areas, remote sensing has become a useful tool for all geographers, regardless of their concentration.

Remote Sensing

Use of Remote Sensing in Natural Resource Management

Prepared and presented by D. Lichaa El-Khoury




1. What is Remote Sensing?

For the purposes of this course, we will use the following general definition: “Remote sensing is the technology of measuring the characteristics of an object or surface from a distance.”
In the case of earth resource monitoring, the object or surface is on the land mass of the earth or the sea, and the observing sensor is in the air or space.

In order for an observing sensor to acquire knowledge about a remote object, there must be a flow of information between the object and the observer, and there has to be a carrier of that information. In our case, the carrier is electromagnetic radiation (EMR).



Figure 1: Electromagnetic radiation


Hence, the main elements in the process of data collection in remote sensing are the object to be studied, the observer or sensor, the EMR that passes between these two, and the source of the EMR.

The process of remote sensing involves an interaction between incident radiation and the targets of interest. The figure below shows an imaging system in which the following seven elements are involved. Note, however, that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors.

Figure 2: Electromagnetic Remote Sensing of the Earth Surface


1.      Energy Source or Illumination (A) - the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.
2.      Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
3.      Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
4.      Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
5.      Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
6.      Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.
7.      Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.


2. What are the different remote sensing systems?

So far, we know that remote sensing implies a sensor fixed to a platform (usually a satellite or aircraft), which detects and records radiation reflected or emitted from the earth’s surface.

The sensor mechanisms vary widely, and each has a distinct set of characteristics, but the main points are:

a)        The sensor can be an active system (where the satellite or aircraft provides the source of illumination; this technique is used when no suitable natural source of radiation exists) or a passive system (where the source of the object's illumination is natural and independent of the sensor).

b)        A variety of different parts of the electromagnetic spectrum can be used including:

·        Visible wavelengths and reflected infrared (imaging spectrometer): Remote sensing in the visible and near infrared (VNIR) wavelengths usually falls into the passive category. Here the sun is the source of the irradiance on the object being observed. The sensor collects the solar radiation which is reflected by the object. Active remote sensing occurs at these wavelengths only in the rare case where an aircraft carries a laser as the source of illumination.


Blue, green, and red are the primary colors or wavelengths of the visible spectrum. They are defined as such because no single primary color can be created from the other two, but all other colors can be formed by combining blue, green, and red in various proportions.
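This additive rule can be sketched in a few lines of Python (a hypothetical illustration using 0-255 channel values, not part of the original text):

```python
# Additive mixing of light: combining the primaries channel-by-channel
# (0-255 per channel; the color tuples are illustrative).
red   = (255, 0, 0)
green = (0, 255, 0)
blue  = (0, 0, 255)

def mix(c1, c2):
    """Add two light colors, clipping each channel at 255."""
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

print(mix(red, green))              # (255, 255, 0) -> yellow
print(mix(mix(red, green), blue))   # (255, 255, 255) -> white
```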

·        Thermal sensors: Remote sensing in the thermal-infrared wavelengths usually falls into the passive category, but in this case, the source of the radiation is the object itself. There is no irradiance and the sensor detects radiation which has been emitted by the object.

·        Microwaves (radar): These are used in the active remote sensing systems. The satellite or aircraft carries an antenna which emits a microwave signal. This signal is reflected by the ground and the return signal is detected again by the antenna.

The figure below shows the sections of the electromagnetic spectrum most commonly used in remote sensing.




Figure 3: The electromagnetic spectrum


Each part of the spectrum has different characteristics and gives rather different information about the earth’s surface. In addition, different surface covers (vegetation, water, soil, etc.) absorb and reflect differently in different parts of the spectrum. Different wavebands in the electromagnetic spectrum therefore tend to be useful for different purposes.

c)        The sensor may be sensitive to a single portion of the electromagnetic spectrum (e.g. the visible part of the spectrum, like panchromatic film which is sensitive to the same wavebands as our eyes). Alternatively, it may be able to detect several parts of the spectrum simultaneously. This latter process is called multispectral sensing.

d)        Sensor equipment takes many shapes and forms, such as cameras, scanners, radar, and others.
               
Figure 4: Active versus Passive sensors

 

3. Radiation target interaction


Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface. There are three forms of interaction that can take place when energy strikes, or is incident upon the surface. These are absorption, transmission, and reflection. The total incident energy will interact with the surface in one or more of these three ways. The proportions of each will depend on the wavelength of the energy and the material and condition of the feature.

Absorption occurs when radiation is absorbed into the target, while transmission occurs when radiation passes through a target. Reflection occurs when radiation "bounces" off the target and is redirected. In remote sensing, we are most interested in measuring the radiation reflected from targets.
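The energy balance described above can be sketched in code: every unit of incident energy is either absorbed, transmitted, or reflected. The fractions below are illustrative values for a generic leaf, not measured data:

```python
# Illustrative wavelength-dependent interaction fractions for a generic leaf.
# The three proportions always sum to 1: all incident energy is either
# absorbed, transmitted, or reflected.
interactions = {
    "red":   {"absorbed": 0.85, "transmitted": 0.05, "reflected": 0.10},
    "green": {"absorbed": 0.70, "transmitted": 0.10, "reflected": 0.20},
    "nir":   {"absorbed": 0.10, "transmitted": 0.40, "reflected": 0.50},
}

def reflected_energy(incident, wavelength):
    """Portion of incident energy redirected back toward the sensor."""
    f = interactions[wavelength]
    # The three proportions must account for all incident energy.
    assert abs(f["absorbed"] + f["transmitted"] + f["reflected"] - 1.0) < 1e-9
    return incident * f["reflected"]

print(reflected_energy(100.0, "nir"))    # 50.0
```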

Let's take a look at a couple of examples of targets at the Earth's surface and how energy at the visible and infrared wavelengths interacts with them.

Vegetation:
Chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. Leaves appear "greenest" to us in the summer, when chlorophyll content is at its maximum. In autumn, there is less chlorophyll in the leaves, so there is less absorption and proportionately more reflection of the red wavelengths, making the leaves appear red or yellow (yellow is a combination of red and green wavelengths). The internal structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths. If our eyes were sensitive to near-infrared, trees would appear extremely bright to us at these wavelengths. In fact, measuring and monitoring near-IR (NIR) reflectance is one way that scientists can determine how healthy (or unhealthy) vegetation may be. Vegetation types can also be differentiated with NIR sensors; for example, deciduous trees have a higher NIR reflectance than coniferous trees.

Figure 5: Vegetation reflectance in VNIR
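The red/NIR contrast just described is the basis of a standard vegetation index, the Normalized Difference Vegetation Index (NDVI), which the text does not name explicitly; a minimal sketch, with invented reflectance fractions:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Healthy vegetation reflects strongly in NIR and absorbs red, pushing
    values toward +1; water and stressed vegetation score much lower."""
    return (nir - red) / (nir + red)

# Illustrative reflectance fractions (not measured values):
print(round(ndvi(red=0.05, nir=0.50), 2))   # 0.82 -> healthy canopy
print(round(ndvi(red=0.08, nir=0.03), 2))   # -0.45 -> water
```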

Water:
In the visible region of the spectrum, transmission through water is significant, so both absorption and reflection are low. Absorption by water rises rapidly in the NIR, where both transmission and reflection are low.

Soil:
Soil has very different characteristics in the VNIR. The increase of reflection with wavelength in the visible is consistent with the human eye’s observation that soils can have a red or brown color to them.

We can see from these examples that, depending on the complex make-up of the target that is being looked at, and the wavelengths of radiation involved, we can observe very different responses to the mechanisms of absorption, transmission, and reflection. By measuring the energy that is reflected (or emitted) by targets on the Earth's surface over a variety of different wavelengths, we can build up a spectral response for that object. By comparing the response patterns of different features we may be able to distinguish between them, where we might not be able to, if we only compared them at one wavelength. For example, water and vegetation may reflect somewhat similarly in the visible wavelengths but are almost always separable in the infrared. Spectral response can be quite variable, even for the same target type, and can also vary with time (e.g. "green-ness" of leaves) and location. Knowing where to "look" spectrally and understanding the factors which influence the spectral response of the features of interest are critical to correctly interpreting the interaction of electromagnetic radiation with the surface.
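The idea of separating targets by comparing their responses in more than one band can be sketched as a toy classifier. The "signatures" below are invented reflectance fractions for illustration only:

```python
# Illustrative spectral "signatures" (invented reflectance fractions).
signatures = {
    "vegetation": {"green": 0.15, "nir": 0.50},
    "water":      {"green": 0.10, "nir": 0.02},
}

def classify(pixel):
    """Assign a pixel to the closest signature (Euclidean distance)."""
    def dist(sig):
        return sum((pixel[b] - sig[b]) ** 2 for b in sig) ** 0.5
    return min(signatures, key=lambda name: dist(signatures[name]))

# In green alone the classes are nearly identical (0.15 vs 0.10);
# the NIR band is what actually separates them.
print(classify({"green": 0.12, "nir": 0.45}))   # vegetation
print(classify({"green": 0.12, "nir": 0.03}))   # water
```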



Figure 6: Spectral curve


Figure 7: Coniferous versus deciduous trees spectral curves VNIR


4. How are data recorded?

The detection of electromagnetic energy can be performed either photographically or electronically:

·        The photographic process uses chemical reactions on the surface of light-sensitive film to detect and record energy variations.

·        An image refers to any pictorial representation, regardless of the wavelengths or remote sensing device used to detect and record the electromagnetic energy.

It is important to distinguish between the terms images and photographs in remote sensing.

A photograph refers specifically to images that have been detected as well as recorded on photographic film. The black and white photo below was taken in the visible part of the spectrum. Photos are normally recorded over the wavelength range from 0.3 μm to 0.9 μm - the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs. Therefore, unless we are talking specifically about an image recorded photographically, we use the term image.



A photograph could also be represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number. Indeed, that is exactly what has been done to the photo above. In fact, using the definitions we have just discussed, this is actually a digital image of the original photograph! The photograph was scanned and subdivided into pixels with each pixel assigned a digital number representing its relative brightness. The computer displays each digital value as different brightness levels. Sensors that record electromagnetic energy, electronically record the energy as an array of numbers in digital format right from the start. These two different ways of representing and displaying remote sensing data, either pictorially or digitally, are interchangeable as they convey the same information (although some detail may be lost when converting back and forth).
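The pixel/digital-number idea can be demonstrated directly. The tiny 3x3 "image" below is invented for illustration; each entry is a digital number that a display maps to a brightness level:

```python
# A tiny 3x3 digital image: each pixel holds a digital number (DN, 0-255)
# representing its relative brightness, exactly as described above.
image = [
    [  0,  64, 128],
    [ 64, 128, 192],
    [128, 192, 255],
]

# Render each DN as a character so the brightness levels are visible as text.
shades = " .:-=+*#"          # dark -> bright
for row in image:
    print("".join(shades[dn * len(shades) // 256] for dn in row))
```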


5. On the ground, in the air, in space

So far we have learned some of the fundamental concepts required to understand the remote sensing process. Now we will take a brief look at the characteristics of remote sensing platforms.

Ground-based sensors are often used to record detailed information about the surface which is compared with information collected from aircraft or satellite sensors. In some cases, this can be used to better characterize the target which is being imaged by these other sensors, making it possible to better understand the information in the imagery. Sensors may be placed on a ladder, tall building, cherry picker, crane, etc.

Aerial platforms are primarily fixed-wing aircraft, although helicopters are occasionally used. Aircraft are often used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth's surface at any time.

In space, remote sensing is sometimes conducted from the space shuttle or, more commonly, from satellites. Satellites are objects which revolve around another object - in this case, the Earth. For example, the moon is a natural satellite, whereas man-made satellites include those platforms launched for remote sensing, communication, and telemetry (location and navigation) purposes. Because of their orbits, satellites permit repetitive coverage of the Earth's surface on a continuing basis.


Figure 8: Different remote sensing platforms


6. Aerial photography

The figure below shows how air photographs are obtained. To cover the required area of ground, photos are taken consecutively along flight lines, which are usually parallel to each other across the area.

Figure 9: Obtaining aerial photographs

Two important things are to be noticed here:
1. Adjacent prints overlap with each other:
   - along the flight lines by about 60%
   - between the flight lines by up to 20%

Why?

   - There are no gaps in the coverage.
   - Overlap enables the photos to be viewed stereoscopically.

2. The photographs are usually obtained with the camera pointing vertically down to the ground.

Why?

   - The geometry is simpler than that of oblique photographs.
   - Inherent distortions can be calculated and eliminated.
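The 60% forward overlap has a direct cost in exposures, which a little arithmetic makes concrete. The flight-line length and photo footprint below are hypothetical mission values:

```python
import math

def photos_per_line(line_km, footprint_km, overlap=0.60):
    """Number of exposures needed to cover one flight line.
    With 60% forward overlap, each new photo advances only 40% of a
    footprint beyond the previous one."""
    advance = footprint_km * (1 - overlap)        # fresh ground per exposure
    return 1 + math.ceil((line_km - footprint_km) / advance)

# Hypothetical mission: a 20 km line photographed with a 2 km footprint.
print(photos_per_line(20, 2))        # 24 exposures (vs 10 with no overlap)
```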


7. Satellite imaging systems

Here we will discuss briefly some of the currently orbiting satellites of commercial use.

Landsat 7 (ETM+):

American. It is the seventh of the Landsat series, the first of these satellites having been launched in the early seventies (1972).

Spectral bands & pixel size

blue                                        30x30m
green                                     30x30m
red                                          30x30m
near infrared                       30x30m
2 mid-infrared                     30x30m
thermal infrared                 60x60m
panchromatic                    15x15m

Revisit period: 16 days
Scene size: 185 x 185 Km

Figure 10: Landsat system characteristics
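The scene size and pixel sizes tabulated above translate directly into data volumes; the arithmetic below is illustrative, using the uncontested 30 m multispectral and 15 m panchromatic figures:

```python
# Rough pixel counts for one 185 km x 185 km Landsat 7 scene,
# using the pixel sizes tabulated above (illustrative arithmetic only).
scene_m = 185 * 1000
for band, pixel_m in [("multispectral (30 m)", 30), ("panchromatic (15 m)", 15)]:
    side = scene_m // pixel_m            # pixels along one edge of the scene
    print(f"{band}: {side} x {side} = {side * side:,} pixels")
```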
SPOT:

French, launched in 1986. It is more accurate, with less distortion, than Landsat.

Spectral bands & pixel size

green                                     20x20m
red                                          20x20m
near infrared                       20x20m
panchromatic                    10x10m

Revisit period: 26 days
Scene size: 60x60 Km


IRS-1C:

Indian, launched in 1995. The Ministry of Agriculture has a full set of images covering all of Lebanon.

Spectral bands & pixel size

green                                     23x23m
red                                          23x23m
near infrared                       23x23m
Shortwave infrared           70x70m
panchromatic                    5.8x5.8m

Revisit period: 24 days


IKONOS


A very recent satellite, launched in 1999.

Spectral bands & pixel size

Blue                                        4x4m
green                                     4x4m
red                                          4x4m
near infrared                       4x4m
panchromatic                    1x1m

Revisit period: 3 days
Scene size: 11x11Km

 

KVR (Sovinformsputnik)


Russian, previously used for military purposes. It is launched for special missions, and the film is dropped near Moscow for recovery. The photos are then rasterized and sold as digital panchromatic images with a pixel size of 2x2 m.


8. Image analysis

In order to take advantage of and make good use of remote sensing data, we must be able to extract meaningful information from the imagery.

Much interpretation and identification of targets in remote sensing imagery is performed manually or visually, i.e. by a human interpreter. Recognizing targets is the key to interpretation and information extraction. Observing the differences between targets and their backgrounds involves comparing different targets based on any, or all, of the visual elements of tone, shape, size, pattern, texture, shadow, and association.

If a two-dimensional image can be viewed stereoscopically so as to simulate the third dimension of height, visual interpretation will be much easier.

When remote sensing data are available in digital format, digital processing and analysis may be performed using a computer. Digital processing may be used to enhance data as a prelude to visual interpretation. Digital processing and analysis may also be carried out to automatically identify targets and extract information completely without manual intervention by a human interpreter.

Digital image processing may involve numerous procedures including formatting and correcting of the data, digital enhancement to facilitate better visual interpretation, or even automated classification of targets and features entirely by computer. In order to process remote sensing imagery digitally, the data must be recorded and available in a digital form suitable for storage on a computer tape or disk.

Last but not least, an important element of image analysis is the integration of data. In the early days of analog remote sensing, when the only remote sensing data source was aerial photography, the capability for integration of data from different sources was limited. Today, with most data available in digital format from a wide array of sensors, data integration is a common method used for interpretation and analysis. Data integration fundamentally involves the combining or merging of data from multiple sources in an effort to extract better and/or more information. This may include data that are multitemporal, multiresolution, multisensor, or multi-data type in nature.

Figure 11: Data integration from different sources.
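The basic merging step of data integration can be sketched as stacking co-registered layers from several sensors. The band names and values below are hypothetical, chosen only to illustrate a VIR-plus-radar stack:

```python
# Sketch of data integration: merging co-registered layers from two
# hypothetical sensors into one multi-source stack for a scene.
optical = {"red": [[10, 12], [11, 13]], "nir": [[40, 42], [41, 43]]}   # VIR bands
radar   = {"backscatter": [[-7, -6], [-8, -5]]}                        # SAR layer

def stack(*sources):
    """Combine band dictionaries from several sensors into one layer stack."""
    merged = {}
    for src in sources:
        merged.update(src)
    return merged

layers = stack(optical, radar)
print(sorted(layers))        # ['backscatter', 'nir', 'red']
```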


9. Applications

Natural resource management is a broad field, covering application areas as diverse as monitoring fish stocks and assessing the effects of natural disasters (hazard assessment).

Remote sensing can be used for applications in several different areas, including:

·        Geology and mineral exploration
·        Hazard assessment
·        Oceanography
·        Agriculture and forestry
·        Land degradation
·        Environmental monitoring, etc.

Each sensor was designed with a specific purpose. With optical sensors, the design focuses on the spectral bands to be collected. With radar imaging, the incidence angle and the microwave band used play an important role in defining which applications the sensor is best suited for.

Each application itself has specific demands, for spectral resolution, spatial resolution, and temporal resolution.

Briefly, spectral resolution refers to the width or range of each spectral band being recorded. As an example, panchromatic imagery (sensing a broad range of all visible wavelengths) will not be as sensitive to vegetation stress as a narrow band in the red wavelengths, where chlorophyll strongly absorbs electromagnetic energy.

Spatial resolution refers to the discernible detail in the image. Detailed mapping of wetlands requires far finer spatial resolution than does the regional mapping of physiographic areas.

Temporal resolution refers to the time interval between images. There are applications requiring data repeatedly and often, such as oil spill, forest fire, and sea ice motion monitoring. Some applications only require seasonal imaging (crop identification, forest insect infestation, and wetland monitoring), and some need imaging only once (geology structural mapping). Obviously, the most time-critical applications also demand fast turnaround for image processing and delivery - getting useful imagery quickly into the user's hands.
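Matching an application's spatial and temporal demands against sensor characteristics can be sketched as a simple filter. The numbers are taken loosely from the satellite summaries earlier in this document:

```python
# Sketch: matching an application's spatial and temporal demands against
# sensor characteristics (values taken loosely from the summaries above).
sensors = {
    "Landsat 7": {"pixel_m": 30, "revisit_days": 16},
    "SPOT":      {"pixel_m": 20, "revisit_days": 26},
    "IKONOS":    {"pixel_m": 4,  "revisit_days": 3},
}

def suitable(max_pixel_m, max_revisit_days):
    """Sensors meeting both the spatial and the temporal requirement."""
    return sorted(
        name for name, s in sensors.items()
        if s["pixel_m"] <= max_pixel_m and s["revisit_days"] <= max_revisit_days
    )

# Time-critical monitoring (e.g. oil spills): frequent revisits required.
print(suitable(max_pixel_m=30, max_revisit_days=7))     # ['IKONOS']
# Seasonal crop identification: any of the three would do.
print(suitable(max_pixel_m=30, max_revisit_days=30))
```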

Let us consider a concrete application: the use of remote sensing in forest inventory. Forest inventory is a broad application area covering the gathering of information on species distribution, age, height, density, and site quality.

For species identification, we could use imaging systems or aerial photos.
For the age and height of the trees, radar could be used in combination with the species information assessed at the first stage.
Density is assessed mainly by optical interpretation of aerial photos and/or high-resolution panchromatic images.
Site quality is one of the more difficult things to assess. It is based on topographic position, soil type, and the drainage and moisture regime. The topographic position can be estimated using laser or radar; however, the soil type and the drainage and moisture regime are more profitably collected as ground data.

The use of Remote Sensing in Crop monitoring (real case)

The countries involved in the European Communities (EC) are using remote sensing to help fulfill the requirements and mandate of the EC Agricultural Policy, which is common to all members. The requirements are to delineate, identify, and measure the extent of important crops throughout Europe, and to provide a forecast of production early in the season. Standardized procedures for collecting these data are based on remote sensing technology, developed and defined through the MARS project (Monitoring Agriculture by Remote Sensing).
The project uses many types of remotely sensed data, from low-resolution NOAA-AVHRR to high-resolution radar, and numerous sources of ancillary data. These data are used to classify crop type over a regional scale to conduct regional inventories, assess vegetation condition, estimate potential yield, and finally to predict similar statistics for other areas and compare results. Multisource data such as VIR and radar were introduced into the project to increase classification accuracy. Radar provides very different information than the VIR sensors, particularly on vegetation structure, which proves valuable when attempting to differentiate between crop types.

One of the key applications within this project is the operational use of high-resolution optical and radar data to confirm conditions claimed by a farmer when requesting aid or compensation. The use of remote sensing identifies potential areas of non-compliance or suspicious circumstances, which can then be investigated by other, more direct methods.
As part of the Integrated Administration and Control System (IACS), remote sensing data supports the development and management of databases, which include cadastral information, declared land use, and parcel measurement. This information is considered when applications are received for area subsidies.
This is an example of a truly successful operational crop identification and monitoring application of remote sensing.

Link: http://staff.aub.edu.lb/~webeco/rs%20lectures.htm