Wednesday, August 31, 2016

Software Scenario Functions: Watershed modelling

Watershed is a concept in hydrology that refers to the topographical boundary dividing two adjacent catchment basins. A watershed is an area of land that catches rain and snow and drains or seeps into a marsh, stream, river, lake or groundwater. Homes, farms, cottages, forests, small towns, big cities and more can make up watersheds. They come in all shapes and sizes and can vary from millions of acres, to a few acres that drain into a pond.

Modelling is the process of representing a real world object or phenomenon as a set of mathematical equations.

Watershed models represent the natural processes governing the flow of water, chemicals and microorganisms through a watershed, and help determine the impact of human activities on these processes. Watershed modelling is an important tool for focusing efforts to solve watershed-based water resource, environmental, social and economic problems.

A watershed model can be used for:

  • Water resources planning, development, design, operation and management
  • Flooding
  • Droughts
  • Upland erosion
  • Stream bank erosion
  • Coastal erosion
  • Sedimentation
  • Non-point source pollution
  • Water pollution from industrial, domestic and agricultural sources
  • Migration of microbes
  • Deterioration of lakes
  • Desertification and degradation of land
  • Irrigation of agricultural lands
  • Conjunctive use of surface and groundwater, etc

Watershed models are classified into

  • Black Box models that mathematically describe the relation between variables. 
           Ex: Unit hydrograph approach, ANN, Rational formula etc.
  • Lumped models that lie between the Black Box models and Distributed models. 
           Ex: Stanford watershed model, etc
  • Distributed models that are based on physical theory and solve the actual governing equations of flow.
           Ex: St. Venant equations for watershed modelling, etc
GIS plays an important role in watershed modeling.
The areas in which GIS is applied in watershed modeling are:

  • Hydrologic assessment
  • Model setup
  • Parameter determination and
  • Modeling

Hydrologic assessment involves using GIS for the analysis of various hydrologic factors for the purpose of risk assessment or susceptibility to pollution, flood, drought, erosion, etc.

Model setup involves defining the topography, boundaries and drainage networks of a watershed so as to form the basic framework for applying both lumped and distributed watershed models. A Digital Elevation Model (DEM) is the main data structure used for this work.
In the context of hydrologic assessment and model setup, GIS provides several valuable tools for data creation and management, automated feature extraction and watershed delineation.

Data creation is done by collecting elevations using GPS or digital contour maps to generate new DEMs where no data exists for the area of interest. Sometimes, contour data on paper-based maps can be converted to digital format using GIS digitizing tools.

Automated feature extraction is performed by various GIS software packages that offer automated routines for delineating watershed boundaries and drainage divides. GIS software can also be used for extracting surface drainage channel networks and generating other hydrography data from DEMs. Ex: WMS and Arc Hydro.
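The delineation routines these packages provide generally begin by computing flow directions from the DEM. The sketch below shows the core D8 flow-direction idea in plain Python, using a tiny hypothetical elevation grid (the function name and values are illustrative, not taken from WMS or Arc Hydro):

```python
# A tiny hypothetical DEM (elevations in metres); real packages operate on
# full raster DEMs, but the D8 flow-direction idea is the same.
dem = [
    [9.0, 8.0, 7.0, 6.0],
    [8.0, 7.0, 6.0, 5.0],
    [7.0, 6.0, 5.0, 4.0],
    [6.0, 5.0, 4.0, 3.0],
]

def d8_flow_direction(dem):
    """For each interior cell, find the (row, col) offset of the
    steepest-descent neighbour among the 8 surrounding cells."""
    rows, cols = len(dem), len(dem[0])
    directions = {}
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            best_drop, best_off = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr == 0 and dc == 0:
                        continue
                    dist = (dr * dr + dc * dc) ** 0.5  # diagonals are sqrt(2)
                    drop = (dem[r][c] - dem[r + dr][c + dc]) / dist
                    if drop > best_drop:
                        best_drop, best_off = drop, (dr, dc)
            directions[(r, c)] = best_off  # None would mean a pit or flat cell
    return directions

flow = d8_flow_direction(dem)
print(flow[(1, 1)])  # -> (1, 1): water at (1, 1) drains towards the SE corner
```

Watershed delineation then follows these flow directions upstream from an outlet cell, collecting every cell that drains through it.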

The application of watershed models with GIS requires bringing data from a variety of sources and formats into a common coordinate space for efficient processing and display. Most GIS software provides tools that assist in transforming datasets into a common coordinate space.

An important aspect of modeling watershed processes is to determine parameter inputs. The Watershed Modeling System (WMS) is capable of processing both vector and raster data for land use, soil type, rainfall zone and flow path networks to develop important modeling parameters. 

Friday, August 19, 2016

Software Scenario Functions: Environmental modelling

A model is an abstraction of reality. This helps by representing complex reality in the simplest way. A change in any parameter of the model can be used to visualise the impacts on the entire model. This is the purpose of modeling. The best model is always that which achieves the greatest match between model outputs and real-world observations. Modeling is a powerful tool for understanding observations and can be used to develop and test theories. Moreover, it is faster to get a result by modeling than to actually spend time, energy and resources on the field. Environmental modeling is a powerful tool to understand the interactions between the environment, ecosystems and populations of animals. This is essential for monitoring and managing sustainable means of human dependency on environmental systems.

Environmental models integrate both time and space to understand the nature and functioning of the ecosystem under study. Environment models are multi-component in nature requiring the understanding of interactions between the biotic and abiotic systems. The complexity increases with the increasing number of components and an understanding of these systems requires breaking them into manageable components, combining them and explicitly describing the interactions occurring.

Environmental systems cannot be adequately reproduced in the laboratory; environmental problems are multivariate, non-linear and complex. Modeling provides an integrated framework in which individual disciplines can work on different aspects of a research problem and contribute modules that are integrated within the modelling framework.

GIS and environmental modeling have been used for decision making, planning and environmental management. This combination has been used along with environmental models for applications like:

  • monitoring of deforestation
  • agro-ecological zonation
  • ozone layer depletion
  • flood early warning systems
  • climate and weather prediction
  • ocean monitoring and mapping
  • soil mapping
  • wetland degradation
  • natural disaster & hazard assessment and mapping
  • land cover for input to global climate models

GIS models may vary in space, in time or in state variables. GIS and remote sensing provide tools to extrapolate models in space as well as to scale models up or down.

A few examples of environmental models used in GIS are listed below with brief descriptions:

  1. RUSLE - The Revised Universal Soil Loss Equation has been successfully used with GIS. The process uses the raster processing capabilities of the Map Analysis and Processing System (MAPS) to overlay data themes containing spatially distributed values for the different RUSLE factors. This technique produces a map of relative levels of soil erosion potential caused by rainfall, soil type, terrain, vegetation and erosion control practice. The terrain factor derived from a DEM helps calculate soil loss potential for large areas. Thus the RUSLE and GIS interface can be used for soil degradation studies over a large scale.
  2. BIOCLIM - The BIOCLIM system determines the distribution of plants and animals based on climatic surfaces. Bioclimatic variables are used in species distribution modeling and related ecological modeling techniques. WorldClim is a set of global climate layers (gridded climate data) with a spatial resolution of about 1 km. This data can be used for mapping and spatial modeling. GIS can be used in conjunction with BIOCLIM to make grid maps of the distribution of biological diversity, or to find areas that have high, low or complementary levels of diversity. GIS can also be used to map and query climate data, and BIOCLIM and GIS together can be used to predict species distribution.
  3. CART - The Classification And Regression Tree (CART) model is a binary partitioning method yielding a class of models called tree-based models. The method has been applied to several environmental and ecological studies due to its capability of handling both continuous and discrete variables, its ability to model interactions among predictors and its hierarchical structure. When used in combination with GIS, the CART model output can be converted into suitability maps that show the abrupt transitions between areas of high and low suitability.
  4. Monte Carlo simulation - The Monte Carlo method involves generating random values for the parameters of a complex process in order to explore its behaviour. The numbers are generated using a probability distribution function that describes the occurrence probability of an event. The power of the method lies in the number of simulated samples. A Monte Carlo simulation provides an answer to what may happen and the probability associated with each scenario. The technique is widely used in spatial analysis, with applications in spatial data disaggregation and statistical testing.
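To make the Monte Carlo idea concrete, the sketch below estimates the area of a region of the unit square by random sampling (the function name, seed and sample count are illustrative):

```python
import random

def monte_carlo_area(n_samples, inside, seed=42):
    """Estimate the area of a region within the unit square by counting the
    fraction of uniformly random points that fall inside it."""
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    hits = sum(inside(rng.random(), rng.random()) for _ in range(n_samples))
    return hits / n_samples

# Area of the quarter circle x^2 + y^2 <= 1; true value is pi/4 ~ 0.7854.
estimate = monte_carlo_area(100_000, lambda x, y: x * x + y * y <= 1.0)
print(estimate)
```

The estimate tightens as the number of simulated samples grows, which is exactly where the power of the method lies.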

Wednesday, August 17, 2016

Characteristics of Indian Remote Sensing series of satellites

List of Indian Earth Observation Satellites:

  1. IRS-1C
  2. IRS-P3
  3. IRS-1D
  4. IRS-P4
  5. OceanSat-1
  6. TES
  7. ResourceSat-1 (IRS-P6)
  8. CartoSat-1 (IRS-P5)
  9. CartoSat-2 (IRS-2A)
  10. OceanSat-2 (IRS-P7)
  11. RISAT-1
  12. ResourceSat-2
  13. Megha-Tropiques
  14. RISAT-2
  15. ResourceSat-3
  16. HyperSpectral Image
  17. OceanSat-3
Current IRS missions:

ResourceSat-1 (IRS-P6)
The main features of this satellite are listed below:
It has a circular polar Sun synchronous orbit
Its orbit height is 821 km at an inclination of 98.76°
Its orbital period is 101.35 minutes and it performs 14 orbits per day
Its repeat cycle (LISS-3) is 24 days and its revisit period (AWiFS) is 5 days
Its 3-axis body is stabilized using reaction wheels, magnetic torquers and hydrazine thrusters
It is powered by a solar array generating 1250 W (at End of Life) using two 24 Ah Ni-Cd batteries
Its mission life is 5 to 7 years
The IRS-P6 has better radiometric resolution, red instead of pan-chromatic band and only one CCD array leading to better internal geometry
It is suitable for mapping and mobile cell phone planning
The LISS-IV camera can be operated in either monochromatic or multi-spectral mode

CartoSat-1 (IRS-P5)
The main features of this satellite are listed below:
It has a circular polar Sun synchronous orbit
Its orbit height is 618 km at an orbit inclination of 98.87°
Its orbit period is 97 minutes and it performs 15 orbits per day
Its 3-axis body is stabilized using reaction wheels, magnetic torquers and hydrazine thrusters
It is powered by a 5 sq. m solar array generating 1100 W (at EOL) using 24 Ah Ni-Cd batteries
Its mission life is 5 to 7 years
CartoSat has two panchromatic cameras for in-flight stereo viewing and this stereo data is provided to ground stations in real time
Its revisit capability is 5 days
Its swath is 27.5 km
It is capable of providing DEMs with a vertical accuracy of approximately 4 m

CartoSat-2 (IRS-2A)
The main features of this satellite are listed below:
Its orbit height is 630 km at an inclination of 97.91°
Its orbit period is 97.4 minutes and it completes 14 orbits per day
Its revisit period is 4 days and its repeat cycle is 310 days
Its 3-axis body is stabilized using reaction wheels, magnetic torquers and hydrazine thrusters
It is powered by a solar array generating 900 W, backed by two 18 Ah Ni-Cd batteries
Its operational life is 5 years.
Its resolution is 0.81 m and its swath is about 9.6 km

Future IRS missions are:
ResourceSat-2 that is identical to ResourceSat-1 with a few sensor enhancements
ResourceSat-3 having increased resolution and more spectral bands along with addition of new sensors with 25 km swath
ResourceSat-4 adds new sensors with 12.5 km swath based on 500m optics
CartoSat series of satellites with increased resolution and more spectral bands
RISAT is the first Indian Remote Sensing Synthetic Aperture Radar (IRS SAR) satellite with:
            - C-band SAR
            - 10 km swath in spot mode and 240 km swath in scan mode
            - 1 m to 50 m resolution
            - Single/dual polarization

Tuesday, August 16, 2016

Interpretation of remote sensing data

The basic principles of image interpretation are:
  1. Location
  2. Size
  3. Shape
  4. Shadow
  5. Tone and Colour
  6. Texture
  7. Pattern
  8. Height and Depth
  9. Situation and Association
Location refers to the geographic location and is an important tool that helps to identify the type of vegetation. This is because any type of vegetation is specific in its requirement of soil, climate and other factors that are typical to a certain location.

Size of objects on images is important with reference to the image scale. Length, width and perimeter are commonly measured. Measuring the size of an unknown object helps the interpreter to rule out possible alternatives. For example, the dimensions of standard objects are known and this makes it possible to determine the size of an unknown object by comparison.

Shape refers to the general form, configuration or outline of individual objects. In case of stereoscopic images, the objects height also defines the shape.

Shadows may either aid or hinder interpretation. Shadows cast over adjacent features can obscure objects that would otherwise be identified easily, yet a shadow cast by an object may also be a key to its identity. It is always recommended that photos are oriented so that shadows fall towards the interpreter; otherwise a pseudoscopic illusion is produced, making low points appear high and vice versa.

Tone and Colour arise because all matter reflects different proportions of energy in the blue, green, red and infra-red portions of the electromagnetic spectrum. This can be used as a spectral signature to identify the type of matter. Different shades of a colour are called tones: the darker an object appears, the less light it reflects. Colour imagery is preferred as humans can detect thousands of colours, and colour helps in the process of photo interpretation.

Texture is the frequency of tonal change on an image. It determines the overall smoothness or coarseness of image features. It is defined as the characteristic placement and arrangement of repetitions of tone or colour in an image. As the scale of an image is reduced, the texture of any given object or area becomes progressively finer and ultimately disappears. An interpreter can distinguish between features of similar reflectances based on their textural differences. For example: the contrasting textures of two tree species.

Pattern refers to the spatial arrangement of objects. Objects may be arranged systematically or randomly. A few other patterns are: Circular, Linear, Oval, Rectangular and Curvilinear to name a few. The repetition of a few general forms is characteristic of natural and constructed objects thus forming a pattern that helps the image interpreter in recognizing objects. For example: the spatial arrangement of trees in an orchard versus the random distribution of trees in a forest.

Height and depth, also known as elevation and bathymetry, form one of the most important diagnostic elements of image interpretation. Any object that rises above the local landscape will show some radial relief and will cast a shadow that provides information regarding its height.

Situation and Association Situation refers to the manner in which the objects in the image are organized and situated with respect to each other. Association refers to the occurrence of certain features in connection with others. Location, situation and association are normally interrelated in an image. As an example, consider a commercial complex: it has several large buildings and huge parking areas, and is usually located near a major road.

Remote sensing data products

The main products of remote sensing satellites are:

  • Sea Surface Temperature (SST), Ocean colour, Ocean winds and Sea surface height measured by satellite sensors.
  • Satellite derived chlorophyll concentration and ocean currents
  • Satellite remote sensing products can be used to generate potential habitat maps of aquatic life

The products of remote sensing data are extensively used in disaster management mainly in the following disasters:

  1. Extreme weather
  2. Floods
  3. Coastal hazards / Tsunamis
  4. Volcanoes
  5. Earthquakes
  6. Landslides
  7. Droughts
  8. Dust storms and
  9. Wild fires
Remote sensing data products have demonstrated their usefulness in combating, and in the long-term management of, the following problems:
  1. Climate change
  2. Pollution monitoring
  3. Plant health
  4. Land usage
  5. Population density
  6. Deforestation and
  7. Desertification
Remote sensing capabilities can be used to provide situational awareness for a wide area in a very short time frame. The data products of remote sensing are:

For extreme weather:

  • Atmospheric temperature and water vapour profiles are used as input by forecasters
  • Sea surface winds, cloud cover, rainfall and cloud profiles are used as inputs to models
  • Remote sensing imagery is used for tracking storms and damage

For floods:

  • SAR-generated DEMs are used to indicate risk areas
  • Weather forecasts can be made to warn the public
  • Satellite imagery is used to assess impact and track recovery
  • Remote sensing data can be used to predict risk due to areal precipitation

For droughts:

  • Remote sensing products such as sea surface temperature and height are used to forecast El Niño
  • Snow cover, surface temperature and rain measurements are used to forecast available water
  • Soil moisture, rainfall and vegetation health are used to observe the onset and progress of droughts

For pollution monitoring:

  • SAR imagery makes it possible to detect and track oil spills in the ocean
  • Atmospheric pollutants can be detected using Infra-Red (IR) radiation
  • Ocean colour is used to detect red tides

Friday, August 12, 2016



Characteristics of sensors

The characteristics of sensors are:
  • Spatial resolution
  • Spectral resolution
  • Radiometric resolution &
  • Temporal resolution
Spatial resolution Spatial resolution describes the level of detail visible in an image - the ability to distinguish between small, closely spaced objects. The spatial resolution of images obtained from satellite sensor systems is usually expressed in metres, corresponding to the ground dimension of one pixel.

Spectral resolution Sensors record EMR in separate spectral bands. Spectral reflectance curves, or spectral signatures, of different types of ground targets provide a knowledge base for extracting information from these bands. Spectral resolution refers to the number of bands and the width of each band.

Radiometric resolution Radiometric resolution refers to the 'colour depth'. Higher radiometric resolution implies higher sensitivity of the sensor to detect minute changes in electromagnetic energy (sensitivity of sensor to detect differences in reflected or emitted energy). 
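Radiometric resolution is usually quoted in bits, and the number of distinguishable intensity levels grows as a power of two; a quick sketch:

```python
def grey_levels(bits):
    """An n-bit sensor can record 2**n distinct intensity (grey) levels."""
    return 2 ** bits

# e.g. an 8-bit sensor distinguishes 256 levels, an 11-bit sensor 2048.
for bits in (1, 6, 8, 11):
    print(f"{bits}-bit sensor: {grey_levels(bits)} levels")
```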

Temporal resolution Temporal resolution refers to how frequently a sensor can image the same area, as well as the time (day or season) of image acquisition. High temporal resolution is helpful in evaluating change, impacts and the severity of any damage. An important aspect in this regard is the revisit period.

Scattering mechanisms for active sensors


The three common scattering mechanisms for active sensors are:
  1. Smooth surface or specular reflection: This kind of reflection comes from flat terrain like roads or calm water. Very little of the transmitted energy returns to the sensor, so pixels appear dark. Typical pixel values are less than -20 dB.
  2. Rough surface or diffuse scattering: This type of scattering occurs on rough surfaces, typically ploughed farms or vegetation. Radiation is scattered diffusely in all directions. Typical pixel values are greater than -20 dB.
  3. Double bounce backscatter: This type of scattering occurs when radiation bounces off a series of objects, such as vegetation and buildings; the reflected pulse hits one surface after another before returning to the sensor. Typical pixel values are greater than -10 dB.
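Using only the indicative dB thresholds above, a SAR pixel could be roughly labelled as follows (a deliberately simplistic sketch; operational classification also uses polarisation, texture and context):

```python
def classify_backscatter(db):
    """Map a backscatter value (in dB) to the likely scattering mechanism,
    per the rough thresholds quoted above."""
    if db < -20:
        return "specular (smooth surface)"
    elif db < -10:
        return "diffuse (rough surface)"
    else:
        return "double bounce"

print(classify_backscatter(-25))  # calm water or road: appears dark
print(classify_backscatter(-15))  # ploughed field or vegetation
print(classify_backscatter(-5))   # e.g. building wall and ground
```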

Thursday, August 11, 2016

Diagrammatic representation of types of sensors

Types of sensors

A sensor is a device that gathers energy (EMR) and converts it to a signal and presents it in a form suitable for obtaining information about the object under investigation.

Remote sensors can broadly be classified as passive sensors and active sensors.
Passive sensors measure natural energy, such as sunlight reflected from the Earth's surface or energy emitted by it.
Active sensors have their own source of light and the sensors measure the reflected energy.
The Earth’s surface interacts with the incoming Electro-Magnetic Radiation (EMR) from the Sun.  This is known as incident energy (Ei). The three fundamental interactions with incident energy are:
  1. Reflected energy (Er)
  2. Absorbed energy (Ea) and
  3. Transmitted energy (Et)
 Incident energy formula:
Ei = Er + Ea + Et
Passive sensors measure this natural energy at specific frequencies or wavelengths. Wavelength is conventionally measured in metres (m) or sub-multiples thereof (nm, μm, etc.). The wavelength ranges typically sensed are listed below:
Visible radiation – 390 to 700 nm
Infra-red radiation – 750 nm to 1 mm
Ultra-violet radiation – 100 to 400 nm
These wavelength ranges are known as “bands”.

Sensors can have multiple bands (3 to 10 bands) and this is known as MSS (Multi-Spectral Sensing).
Sensing with hundreds of finer bands is known as hyper-spectral imaging.

Part of the solar radiation incident on Earth is reflected back, and this reflected energy is detected by passive sensors.
Reflected energy formula
Er = Ei - Ea - Et

Different objects on Earth reflect, transmit and absorb different amounts of energy, which implies that each feature on Earth has a unique property called spectral reflectance (ρ).

Spectral reflectance formula:
ρ = Er / Ei
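The two energy formulas above combine into a short calculation (the energy values here are illustrative only):

```python
def spectral_reflectance(e_incident, e_absorbed, e_transmitted):
    """rho = Er / Ei, where Er = Ei - Ea - Et (the energy balance above)."""
    e_reflected = e_incident - e_absorbed - e_transmitted
    return e_reflected / e_incident

# 100 units incident, 40 absorbed, 10 transmitted -> 50 reflected
print(spectral_reflectance(100.0, 40.0, 10.0))  # -> 0.5
```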

Examples of active sensors are:
  1. Radar
  2. Camera with flashlight
Examples of passive sensors are:
  1. Camera without flashlight
  2. Most optical remote sensing sensors (those that rely on reflected sunlight or emitted energy)
Non-scanning or framing sensors: These sensors measure the radiation coming from the entire scene at once.
Examples of non scanning sensors are:

  1. Our eyes
  2. Photographic cameras
Imaging sensors: These sensors form an image from the collected radiation. They may be scanning or non-scanning (framing) sensors. In scanning sensors, the image is sensed point by point. These scanners may be along-track scanners, in which the image is acquired line by line, or across-track scanners, in which the image is acquired pixel by pixel.

Non-imaging sensors: These types of sensors do not form an image. They are used to record a spectral quantity as a function of time.
Examples are: Sensors for temperature measurement, study of the atmosphere, etc.

Image plane scanning: In this type of sensor, the lens is used after the scan mirror to focus the light on the detector

Object plane scanning: In this type of sensor, the lens is placed before the scan mirror to focus the light on the detector.

Most of the active sensors operate in the microwave portion of the electromagnetic spectrum. A few of the active sensors are listed below:

  1. Laser Altimeter
  2. Radar
  3. Lidar
  4. Ranging Instrument
  5. Scatterometer and
  6. Sounder
Passive sensors include different types of radiometers and spectrometers. They operate in the visible, infrared, thermal infrared and microwave regions of the electromagnetic spectrum. Commonly used passive sensors are listed below:
  1. Accelerometer
  2. Hyperspectral radiometer
  3. Imaging radiometer
  4. Radiometer
  5. Sounder
  6. Spectrometer and
  7. Spectroradiometer

Thursday, August 4, 2016

Interaction of EMR with Earth's surface

The interaction of Electro-magnetic radiation (EMR) with the atmosphere and the Earth's surface plays a very important role in remote sensing. Each atmospheric gas molecule has a set of absorption bands in the electromagnetic spectrum. Hence, only the wavelength regions outside the absorption bands of atmospheric gases can be used for remote sensing. These regions are known as Atmospheric Transmission Windows.
These windows are found in the visible, near infrared, certain bands of thermal infrared and microwave region.
The figure below shows the fate of atmospheric radiation
The gases in the atmosphere interact with solar irradiation and with the radiation reflected from the Earth's surface. The electromagnetic radiation (EMR ) will experience varying degrees of transmission, absorption, emittance and/or scattering.

Software Scenario Functions: Visibility Analysis

The portion of the terrain that can be seen from a particular location is known as its viewshed. The process of determining visibility and intervisibility from a particular point on a topographic surface is called viewshed analysis or visibility analysis.

Some of the uses of this technique are:

  1. Siting television, radio and cellular phone transmitters and receiving stations
  2. Locating towers for observing forest fires
  3. Routing highways that are not visible to nearby residents
Visibility analysis is useful in planning that requires features to be either visible or concealed.

The simplest method is to connect the observer location to every possible target location.
Ray tracing is then carried out by following the line from each target point back to the observer point; any intervening terrain higher than this line of sight obstructs the observer's view.
Among the many possible ways to determine intervisibility, the ray tracing technique is simple and useful, although it is less accurate.
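The ray-tracing test can be sketched for a single observer-to-target terrain profile (elevations are hypothetical; a real viewshed repeats this check for every candidate target):

```python
def visible(profile, observer_height=0.0):
    """Return True if the last point of the terrain profile is visible from
    the first: no intermediate point may rise above the straight sight line
    from observer eye level to the target."""
    eye = profile[0] + observer_height
    target = profile[-1]
    n = len(profile) - 1
    for i in range(1, n):
        sight_line = eye + (target - eye) * i / n  # linear interpolation
        if profile[i] > sight_line:
            return False  # a higher point obstructs the view
    return True

print(visible([100, 90, 85, 80]))   # -> True: the terrain stays below the line
print(visible([100, 120, 95, 80]))  # -> False: the 120 m ridge blocks the view
```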

Visibility analysis commonly uses a Triangulated Irregular Network (TIN) data model, in which the surface is defined by triangular facets joining elevation vertices.

Example: Consider a builder constructing houses in the foothills of a mountain range who wants each house to offer a beautiful view of the landscape. After shortlisting the potential locations, the GIS software uses the TIN model of each location to look in all directions from its vertices. The software retrieves the elevation values and compares them with the elevation of each potential building site. All areas higher in elevation are classified as invisible. The resulting polygon map shows the visible areas for each coverage tested.
Raster methods of visibility analysis are similar except that they are less elegant and more computationally expensive.

Tuesday, July 26, 2016

Electromagnetic radiation and its characteristics

Electromagnetic radiation: Electromagnetic radiation (EMR) is a form of energy propagated through free space (vacuum) or a medium in the form of electromagnetic waves. EMR is termed as such because it is composed of an electric field and a magnetic field that oscillate simultaneously in planes mutually perpendicular to each other as well as to the direction of propagation of the radiation.

The two defining characteristics of electromagnetic radiation are its:

  1. Frequency and
  2. Wavelength
Frequency is the number of waves that pass a point in a specified time. It is measured in Hertz (Hz) or cycles per second.
Wavelength is the distance between two successive peaks of a wave. It is measured in meters (m) or its multiples (nm, mm, cm etc)

The range of electromagnetic waves is called electromagnetic spectrum.

(Velocity of light, C ≈ 3 × 10^8 m/s)
Velocity, wavelength and frequency are related by the equation: C = frequency × wavelength
It is evident from this equation that frequency and wavelength are inversely proportional.
It follows that:
  • a wave with a longer wavelength has a lower frequency and thus lower energy
  • a wave with a shorter wavelength has a higher frequency and thus higher energy.
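The relation C = frequency × wavelength can be used directly to convert between the two quantities, for example:

```python
C = 3.0e8  # approximate speed of light in m/s

def frequency_hz(wavelength_m):
    """frequency = C / wavelength: the inverse proportionality noted above."""
    return C / wavelength_m

print(f"{frequency_hz(700e-9):.3e} Hz")  # red light (~700 nm)
print(f"{frequency_hz(0.10):.3e} Hz")    # a 10 cm microwave/radar wavelength
```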
The electromagnetic spectrum is divided into seven regions. They are:
  1. Radio waves
  2. Microwaves
  3. InfraRed (IR) waves
  4. Visible light
  5. UltraViolet (UV) rays
  6. X rays and
  7. Gamma rays
Usually, low-energy radiation (radio waves and microwaves) is described in terms of frequency, while infrared (IR), visible and ultraviolet (UV) radiation are described by their wavelengths.
Radio waves
Radio waves are at the lowest range of the EM spectrum, with wavelengths greater than about 10 mm. Radio is used primarily for communications including voice, data and entertainment media.

Microwaves have wavelengths of about 10 mm to 100 micrometers (μm). Microwaves are used for high-bandwidth communications, radar and as a heat source for microwave ovens and industrial applications.

Infrared is in the range of wavelengths of about 100 μm to 740 nanometers (nm). IR light is invisible to human eyes, but we can feel it as heat if the intensity is sufficient.

Visible light
Visible light is found in the middle of the EM spectrum, between IR and UV. It has wavelengths of about 740 nm to 380 nm. Visible light is defined as the wavelengths that are visible to most human eyes.

Ultraviolet light is in the range of the EM spectrum between visible light and X-rays. It has wavelengths of about 380 nm to about 10 nm. UV light is a component of sunlight; however, it is invisible to the human eye. It has numerous medical and industrial applications, but it can damage living tissue.

X-rays are roughly classified into two types: soft X-rays and hard X-rays. Soft X-rays comprise the range of the EM spectrum between UV and gamma rays. Soft X-rays have wavelengths of about 10 nm to about 100 picometers (pm). Hard X-rays occupy the same region of the EM spectrum as gamma rays. The only difference between them is their source: X-rays are produced by accelerating electrons, while gamma rays are produced by atomic nuclei.

Gamma-rays are in the range of the spectrum above soft X-rays. Gamma-rays have wavelengths of less than 100 pm (4 × 10−9 inches). Gamma radiation causes damage to living tissue, which makes it useful for killing cancer cells when applied in carefully measured doses to small regions. Uncontrolled exposure, though, is extremely dangerous to humans.

Maps - Basic components, Types of maps & Map analysis

Basic components of a map:
The basic components of any map are listed below:

  1. The Title of the map indicates what the map is trying to show
  2. The Key explains the symbols shown in the map
  3. The Scale gives the relationship between distance on the map to the actual distance on the Earth.
  4. The map shows the Latitudes (parallels N or S of the equator) and Longitudes (meridians E or W of the prime meridian)
  5. Compass rose showing the directions on a map.
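The scale relationship can be applied numerically. For instance, a distance measured on a 1:50,000 map converts to ground distance as follows (illustrative values):

```python
def ground_distance_m(map_distance_cm, scale_denominator):
    """On a 1:N map, 1 cm on the map represents N cm on the ground;
    the result is converted from centimetres to metres."""
    return map_distance_cm * scale_denominator / 100.0

print(ground_distance_m(4, 50_000))  # -> 2000.0 (4 cm on the map is 2 km)
```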
The types of maps are:
  1. Physical map
  2. Political map
  3. Thematic map
  4. Cartogram and
  5. Flow-line map

Map Analysis involves answering questions based on:
  1. Title of the map
  2. Type of map
  3. Location of an object, area or phenomena
  4. Meanings of the symbols and inferences from patterns
  5. Relationship between locations and events over a period of time
  6. The main idea or theme being represented by the map.

GIS-Unit 5-Syllabus-OU


Introduction to remote sensing: Electromagnetic radiation, Characteristics, Interaction with Earth's surface, Sensor types, Satellite characteristics, IRS series, Data products, Interpretation of data.

Software Scenario Functions: Watershed modelling, Environmental modelling and Visibility analysis.

Map transformations


Map transformation involves transformation of points from one map to points on another map while minimizing the differences between the two sets of points.

  • The 'Helmert transformation' is the best choice for the vast majority of applications.
  • The 'Helmert transformation' translates the points of one map horizontally and vertically and also rotates and scales the points (it uses 4 parameters).
  • The 'affine transformation' is useful in cases where the paper has pronounced directional shrinking due to orientation of the fibers.
  • The 'affine transformation' is also useful to compensate for some shearing in the map or for computing the shearing angle.
  • The 'affine transformation' with five parameters consists of a translation in the x direction, a translation in the y direction, a rotation and two scale factors (one in the x direction and one in the y direction)
  • The 'affine transformation' with six parameters consists of:
    • translation in x direction
    • translation in y direction
    • two rotations and two scale factors (both axes are rotated and scaled separately)
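One possible parameterisation of the six-parameter case is sketched below (translations, plus a separate rotation and scale per axis). The exact functional form differs between GIS packages, so treat the function and its parameter names as illustrative, not as any package's API:

```python
import math

def affine_6(x, y, tx, ty, sx, sy, ax, ay):
    """Apply a six-parameter affine transform: translate by (tx, ty),
    scale each axis by (sx, sy) and rotate each axis by (ax, ay) radians."""
    xp = tx + sx * (x * math.cos(ax) + y * math.sin(ax))
    yp = ty + sy * (-x * math.sin(ay) + y * math.cos(ay))
    return xp, yp

# With zero rotations this reduces to independent scaling plus translation:
print(affine_6(2.0, 3.0, tx=10.0, ty=20.0, sx=2.0, sy=3.0, ax=0.0, ay=0.0))
# -> (14.0, 29.0)
```

In practice the parameters are estimated by least squares from control points identified on both maps.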
Other map transformations used are:

  1. Robust Helmert
  2. Huber estimator
  3. Vestimator and
  4. Hampel estimator