
An Introduction to Geomatics

For students of the Introduction to Geomatics course

Prepared by:

Dr. Maher A. El-Hallaq, Associate Professor of Surveying – IUG

Airborne Imagery

Dr. Maher A. El-Hallaq, Associate Professor of Surveying, The Islamic University of Gaza

Part One: Airborne Imagery (Photogrammetry)

Background

• Aerial photography has a history dating back to the mid-1800s, when balloons and even kites were used as camera platforms.
• In 1908, photographs were taken from early aircraft.
• Aerial photography became an accepted technique for collecting mapping and other ground data from the 1930s to the present.
• Photogrammetry is the science of making measurements from aerial photographs.


• Measurements of horizontal distances and elevations form the backbone of this science.
• These capabilities result in the compilation of planimetric maps or orthophoto maps, showing the horizontal locations of both natural and cultural features, and topographic maps, showing spot elevations and contour lines.
• Both black-and-white panchromatic film and color film are used in aerial photography.
• Color film has three emulsions, sensitive to blue, green and red light.

• A more recent development in airborne imagery is digital imagery (around 2000). The most common detector is the Charge-Coupled Device (CCD).

Figure: old versus new aerial imaging systems.

Figure: elements of an aerial mapping camera.

Figure: orthographic versus perspective projection.

Photogrammetric Process

Photogrammetric mapping is achieved through four general processes:
1. Imagery Acquisition
2. Ground Control Acquisition
3. Accurate Adjustment of the Imagery to the Earth
4. Feature Collection

Imagery Acquisition

Imagery Types and Uses:

Imagery Type | General Purposes
Black-and-White Aerial Photography | Topographic and Planimetric Mapping
Natural Color Aerial Photography | Topographic and Planimetric Mapping
Infrared Aerial Photography | Vegetation Analysis, Land Use
Satellite Imagery | Small-Scale Mapping, Vegetation Analysis, Land Use/Land Classification
Microwave | Groundwater


Scale of an aerial photograph
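The scale figure itself is not reproduced here. For context only (a standard photogrammetric relation, not copied from the slide), the scale of a vertical photograph is

\[ S = \frac{f}{H - h} \]

where f is the camera focal length, H the flying height above the datum and h the ground elevation; using the average terrain elevation h_avg gives the average photo scale.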

Flight Plan

Example: flight-plan computation.
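The worked example on the slides is not recoverable from the images. Purely as a sketch of the usual flight-plan arithmetic (photo ground coverage, exposure spacing and flight-line spacing from the chosen forward overlap and sidelap), with all numeric values hypothetical rather than taken from the original example:

    # Flight-plan sketch: photos needed to cover a rectangular block.
    # All input values are hypothetical illustrations, not the slide's example.
    import math

    f = 0.152                 # focal length (m)
    H = 1520.0                # flying height above mean terrain (m)
    photo_side = 0.23         # frame size (m), a 23 cm square format
    forward_overlap = 0.60
    sidelap = 0.30
    block_length = 10000.0    # ground dimensions of the area to map (m)
    block_width = 6000.0

    scale = f / H                                         # photo scale (1:10,000 here)
    ground_coverage = photo_side / scale                  # ground side of one photo (m)
    air_base = ground_coverage * (1 - forward_overlap)    # spacing between exposures
    line_spacing = ground_coverage * (1 - sidelap)        # spacing between flight lines

    photos_per_line = math.ceil(block_length / air_base) + 1   # +1 to cover the ends
    num_lines = math.ceil(block_width / line_spacing) + 1
    print(f"Ground coverage per photo: {ground_coverage:.0f} m")
    print(f"Photos per line: {photos_per_line}, flight lines: {num_lines}, "
          f"total photos: {photos_per_line * num_lines}")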

Ground Control Acquisition

Figure: ground control methods.

Accurate Adjustment of the Imagery to the Earth

The process of adjusting the aerial photography to the earth is critical to the accuracy of the final mapping. Most projects are adjusted using aerotriangulation methods, which require fewer ground control points than conventional adjustment methods. Aerotriangulation is carried out with computer software; the software is very efficient and allows for quality-control checks throughout the process.

Feature Collection

Photogrammetric mapping feature collection can generally be divided into four categories:
1. Topographic Features (DEM and TIN models; see the sketch after this list)
2. Planimetric Features
3. Orthophotography
4. Land Use

These feature types can be collected accurately using stereo imagery and stereo viewing equipment.
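Purely as an illustration of the first category (not the photogrammetric software workflow itself), a TIN is a triangulation of irregularly spaced elevation points. A minimal sketch, assuming SciPy is available:

    # Minimal TIN sketch: triangulate scattered (x, y) points that carry elevations.
    # The points below are made up for illustration.
    import numpy as np
    from scipy.spatial import Delaunay

    points_xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0],
                          [100.0, 100.0], [55.0, 40.0]])
    elevations = np.array([10.2, 11.5, 9.8, 12.1, 14.0])

    tin = Delaunay(points_xy)        # 2-D Delaunay triangulation (the TIN faces)
    for tri in tin.simplices:        # each row holds the point indices of one triangle
        print("triangle", tri, "elevations", elevations[tri])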

Figure: orthophotography (feature collection category 3).

Height Determination

Note: if h_B is not known, it is sufficiently accurate to use the average terrain elevation h_avg instead of h_B, especially if H is large.
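The height-determination figures are not reproduced here. For context only (a standard parallax relation, not copied from the slides), the elevation of a point A can be found from its parallax difference with respect to a point B of known elevation h_B:

\[ h_A = h_B + \frac{\Delta p \,(H - h_B)}{p_a} \]

where H is the flying height above the datum, p_a the parallax of A and \Delta p = p_a - p_b. This matches the note above: when h_B is unknown, h_avg may replace it with little loss of accuracy if H is large.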

Ground Coordinates
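The figures for this slide are not reproduced here. For context only (standard relations for a truly vertical photograph, not copied from the slides), a point A with photo coordinates (x_a, y_a) and elevation h_A has ground coordinates

\[ X_A = \frac{x_a\,(H - h_A)}{f}, \qquad Y_A = \frac{y_a\,(H - h_A)}{f} \]

with the ground origin directly below the exposure station, f the focal length and H the flying height above the datum.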

Thank you

Any questions?

Satellite Imagery

Dr. Maher A. El-Hallaq, Lecturer of Surveying, The Islamic University of Gaza

Part Two: Satellite Imagery (Remote Sensing)

Background

Remote sensing derives from photography, optics and spectrometry; its history is deeply entwined with the domains of the electromagnetic spectrum and aeronautics.

• Remote sensing is a broad research field with a wide range of applications.
• Technically, the term means acquisition of information without being in direct contact with the object that is studied.
• This typically implies detection of some kind of radiation. The detected radiation either emanates from the object itself or is reflected by it.
• The remote sensing principle works with waves of the electromagnetic spectrum.

Electromagnetic Spectrum

The radiation propagates through a vacuum at the speed of light, c, about 300,000 km/second.

Techniques of Remote Sensing

• Passive remote sensing instruments develop images of the ground surface as they detect the natural energy that is either reflected (if the sun is the signal source) or emitted from the observed target area.
• Active remote sensing instruments (for example, radar and lidar) transmit their own electromagnetic waves and then develop images of the earth's surface as the electromagnetic pulses (known as backscatter) are reflected back from the target surface.

Spectrum Sensors

Optical: spectral range in the interval 0.3–15 μm, typical of passive remote sensing, identified by the sensors:
– panchromatic: one band covering the visible range and, in some cases, part of the near infrared;
– multispectral: 2–9 spectral bands;
– super-spectral: 10–16 spectral bands;
– hyperspectral: more than 16 spectral bands.

Increasing the number of bands in general narrows the bandwidth of each band and refines the spectral interval (a short classification sketch follows below).
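A minimal sketch of the band-count taxonomy above, purely as an illustration (the function name and the sample sensors are my own examples, not from the slides):

    # Classify an optical sensor by its number of spectral bands,
    # using the thresholds listed above.
    def sensor_class(num_bands: int) -> str:
        if num_bands == 1:
            return "panchromatic"
        if 2 <= num_bands <= 9:
            return "multispectral"
        if 10 <= num_bands <= 16:
            return "super-spectral"
        return "hyperspectral"

    print(sensor_class(1))    # panchromatic
    print(sensor_class(7))    # multispectral (e.g. a Landsat-TM-like sensor)
    print(sensor_class(224))  # hyperspectral (e.g. an AVIRIS-like sensor)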

Radar: microwaves ranging from 1 mm to 1 m, the typical active remote sensing tool, which can operate with single or multi-polarization and with single or multiple incidence angles, in:
– single frequency;
– multi-frequency.

Technical problems of acquisition and representation are now being solved operationally, while problems remain in how well decision makers and administrators at national, regional, provincial and city level understand the characteristics of these techniques.


Spectrum Signature

• One important way of discriminating objects in a remotely sensed scene is by examining their spectral signatures, i.e., their spectral response.
• Each material has its own spectral signature, consisting of the spectral distributions of its emittance and reflectance.

Figure: spectral signatures of some artificial materials.

Sensor Fusion

• Electromagnetic sensors are designed to detect only radiation in a limited wavelength range, called a band.
• The reason for this is that the emitted energy within a narrow band tells us more about the reflectance of an object than an average over a wide band.
• When the satellite image is received and processed on the ground, bands from several sensors may be combined. This will generally simplify the interpretation of a satellite scene.

• Some object features stand out in one band while other features are spotted in another band.
• The combination of information from several sensors is usually termed sensor fusion.
• The combination or fusion of several bands is similar to the approach used by the human vision system to create colors (a small composite sketch follows below).
• Satellite systems differ from the human vision system in that each satellite has its own set of sensors, constituting a unique set of bands, and will thus require interpreting software specially adapted for it.
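A minimal sketch of the color-composite idea described above: three single-band arrays are stacked into the red, green and blue channels of one image. NumPy and the random placeholder bands are my own assumptions, not tools named in the slides:

    # Build a simple false-color composite by stacking three bands as R, G, B.
    import numpy as np

    rows, cols = 100, 100
    band_nir   = np.random.rand(rows, cols)   # placeholder near-infrared band
    band_red   = np.random.rand(rows, cols)   # placeholder red band
    band_green = np.random.rand(rows, cols)   # placeholder green band

    def stretch(band):
        """Linearly rescale a band to the 0-255 display range."""
        lo, hi = band.min(), band.max()
        return ((band - lo) / (hi - lo) * 255).astype(np.uint8)

    # A classic false-color assignment: NIR -> red, red -> green, green -> blue,
    # which makes healthy vegetation appear bright red in the composite.
    composite = np.dstack([stretch(band_nir), stretch(band_red), stretch(band_green)])
    print(composite.shape)   # (100, 100, 3)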

Image Characteristics

A digital image divides the scene into small, equal-sized and equally shaped areas, called picture elements or pixels, and represents the brightness of each area with a numeric value or digital number.

Image Resolution

• Image resolution defines the ability of a sensor to distinguish between spatial characteristics of objects on the earth's surface.
• It can change due to sensor design, detector size, focal length, satellite altitude and time.

Four kinds of resolution are distinguished:
• Spatial Resolution
• Spectral Resolution
• Radiometric Resolution
• Temporal Resolution

Spatial Resolution

• The greater the altitude of the sensor, the larger the area seen, but the ability to distinguish some detail may be lost.
• Some sensors have a greater ability to see details, i.e., greater spatial resolution.
• The ground sample distance (GSD) is the ground size of one image pixel.
• IFOV: Instantaneous Field of View.
• GSD (m) = IFOV (radians) × Altitude (m); a short worked example follows below.
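A quick numeric illustration of the GSD relation above; the IFOV and altitude values are hypothetical, not taken from the slides:

    # GSD (m) = IFOV (radians) x altitude (m)
    ifov = 2.5e-5        # hypothetical IFOV of 0.025 milliradians
    altitude = 700_000   # hypothetical orbit altitude of 700 km, in metres
    gsd = ifov * altitude
    print(f"GSD = {gsd:.1f} m")   # 17.5 m: each pixel covers about 17.5 m on the ground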

Spectral Resolution

• Surface features can be identified by analyzing their spectral responses over distinct wavelength ranges.
• The higher the spectral resolution of the sensor, the more distinctions can be made between surface materials.
• Multispectral vs. hyperspectral sensors.

Radiometric Resolution

• Refers to the sensor's ability to detect small changes in energy, and reflects the number of bits available for each pixel.
• Imagery data are represented by positive digital numbers in binary format; each bit records an exponent of 2 (see the short example below).
• 2-bit gives 2² = 4 levels, 8-bit gives 2⁸ = 256 levels, 10-bit gives 2¹⁰ = 1024 levels.
• 8-bit resolution of a region holds finer detail than 2-bit resolution.
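A small illustration of how the bit depth sets the range of digital numbers, purely for context:

    # Number of distinguishable brightness levels for a given bit depth.
    for bits in (2, 8, 10):
        levels = 2 ** bits
        print(f"{bits:>2}-bit data: digital numbers 0 to {levels - 1} ({levels} levels)")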

Temporal Resolution

• The surface of the earth is always changing, slowly or rapidly.
• The time period required to achieve repeat coverage of the same surface is called the revisit time.
• Temporal variations in surface features can be used to identify some features and to track systematically the changes in other features.

Remote Sensing Satellites

Satellite | Launch Date | Sensor Data | Altitude (km) | Swath (km) | Orbit Type | Orbit Period | Revisit Time
ERS-2 | 20/5/95 | SAR C band | 785 | 105.5 | Near polar; sun synchronous | 100 min | 35 days
IKONOS-2 | 24/9/99 | MSS, panchromatic | 681 | 20 | Near polar; sun synchronous | 98 min | 2.9 days
IRS | 2000 | Panchromatic/hyperspectral | 146 | – | – | – | 22 days
Landsat 5 | 1/3/84 | TM and MSS | 705 | 185 | Near polar; sun synchronous | 99 min | 16 days
Landsat 7 | 15/5/99 | ETM, MSS | 830 | 185 | Near polar; sun synchronous | 99 min | 16 days
Radarsat-1 | 20/12/95 | SAR C band | 798 | 50-500 | Circular; sun synchronous | 100.7 min | 24 days

(Resolution callouts on the slide: 1 m, 30 m.)

Satellite | Launch Date | Sensor Data | Altitude (km) | Swath (km) | Orbit Type | Orbit Period | Revisit Time
QUICKBIRD | 2000 | MSS, panchromatic | 470 | – | Not sun synchronous | – | Less than 5 days
SPOT 1 | 22/2/86 | MSS, panchromatic | 830 | 60-120 | Near polar; sun synchronous | 101 min | 26 days
SPOT 2 | 21/1/90 | MSS, panchromatic | 830 | 60-120 | Near polar; sun synchronous | 101 min | 26 days
SPOT 4 | 24/3/98 | MSS, panchromatic | 822 | 60-120 | Near polar; sun synchronous | 101 min | 26 days
SPOT 5 | 2001 | – | – | – | – | – | –
TERRA | 18/12/99 | MODIS, ASTER | 705 | 2100 | Near polar; sun synchronous | 96.5 min | 16 days

(Resolution callout on the slide: 0.6 m.)

Figures: sample satellite images at 80 m, 30 m, 20 m and 10 m resolution.

Feature Extraction


• Unsupervised: relies on color and tone as well as statistical clustering to identify features (see the sketch after this list).
• Supervised: requires comparative examples of imagery for each ground-feature category.
• Hybrid: a combination of the first two.
• Classification and regression tree: uses binary partitioning software to analyze the data and eventually arrive at a best estimate of the ground-feature identification.
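A minimal sketch of the unsupervised approach only, clustering pixel values into spectral classes with k-means. scikit-learn and the random placeholder image are my own assumptions, not tools named in the slides:

    # Unsupervised classification sketch: cluster the pixels of a 3-band image
    # into a chosen number of spectral classes with k-means.
    import numpy as np
    from sklearn.cluster import KMeans

    rows, cols, bands = 50, 50, 3
    image = np.random.rand(rows, cols, bands)        # placeholder multispectral image

    pixels = image.reshape(-1, bands)                # one row per pixel
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
    class_map = labels.reshape(rows, cols)           # thematic map of spectral classes
    print(np.bincount(labels))                       # pixel count per class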

Thank you

Any questions?