SBL is at the forefront of using the latest mapping technologies, such as Photogrammetry and Remote Sensing, to cater to GIS service demands across industries worldwide. As a leading GIS services provider, SBL has executed several complex Aerial Photogrammetry projects to meet the demand for Photogrammetry mapping.
The following is a brief introduction to Photogrammetry and Remote Sensing for those who are new to these technologies.
Photogrammetry, as its name implies, is a 3-dimensional coordinate measuring technique that uses photographs as the fundamental medium for metrology (or measurement). The fundamental principle used by Photogrammetry is triangulation or, more specifically, Aerial Triangulation. By taking photographs from at least two different locations, so-called “lines of sight” can be developed from each camera to points on the object. These lines of sight (sometimes called rays owing to their optical nature) are mathematically intersected to produce the 3-dimensional coordinates of the points of interest.
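The intersection of two lines of sight can be sketched numerically. Below is a minimal, illustrative example (the function name and camera positions are made up for this sketch, not taken from the text): given each camera's centre and the direction of its ray toward the object point, a small least-squares system finds where the two rays come closest, and the midpoint of that common perpendicular is taken as the triangulated 3D point.

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Return the 3D point closest to two lines of sight.

    c1, c2 -- camera centres; d1, d2 -- direction vectors of the rays.
    Solves for the ray parameters t1, t2 that minimise the distance
    between the rays, then averages the two closest points.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Normal equations for minimising |(c1 + t1*d1) - (c2 + t2*d2)|
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    p1 = c1 + t1 * d1  # closest point on ray 1
    p2 = c2 + t2 * d2  # closest point on ray 2
    return (p1 + p2) / 2

# Two hypothetical camera stations both sighting the point (0, 0, 10)
c1, c2 = np.array([-5.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
target = np.array([0.0, 0.0, 10.0])
point = triangulate(c1, target - c1, c2, target - c2)
```

With perfectly intersecting rays, as here, the recovered point matches the target exactly; with real, noisy image measurements the midpoint gives a least-squares compromise between the two rays.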
The term Photogrammetry was first used by the Prussian architect Albrecht Meydenbauer in 1867, who fashioned some of the earliest topographic maps and elevation drawings. The use of Photogrammetry in topographic mapping is well established, but in recent years the technique has been widely applied in the fields of architecture, industry, engineering, forensics, underwater measurement, medicine, geology and many others for the production of precise 3D data.
Branches of photogrammetry
There are two broad branches of Photogrammetry:
- Metric Photogrammetry: Deals with precise measurements and computations on photographs regarding the size, shape and position of photographic features, and/or with obtaining other information such as the relative locations (coordinates) of features, areas and volumes. These photographs are taken using a metric camera, and this branch is mostly used in engineering fields, e.g. surveying.
- Interpretive Photogrammetry: Deals with the recognition and identification of features on a photograph, such as shape, size, shadow and pattern, to add value and intelligence to the information seen on the photograph (annotation).
Remote Sensing is a technology closely aligned with Photogrammetry in that it also collects information from imagery. The term derives from the fact that information about objects and features is collected without coming into contact with them. Where Remote Sensing differs from Photogrammetry is in the type of information collected, which tends to be based on differences in color, so land use and land cover are among the primary outputs of remote sensing processing. Remote sensing was originally conceived to exploit the large number of color bands in satellite imagery to create 2D data, primarily for GIS. Nowadays remote sensing tools are used with all types of imagery to assist in 2D data collection and derivation of products such as slope. Today's software tools tend to support a much wider range of imaging technologies, such as image mosaicking, 3D visualisation, GIS, radar as well as softcopy Photogrammetry. The quality of remotely sensed imagery is commonly described in terms of four types of resolution:
- Spatial resolution.
- Radiometric resolution.
- Spectral resolution.
- Temporal resolution.
- Spatial resolution describes the ability of a sensor to identify the smallest detail of a pattern in an image; in other words, the minimum distance between distinguishable patterns or objects in an image at which they can still be separated from each other. It is often expressed in meters.
- Spectral resolution is the sensitivity of a sensor to a specific frequency range (mostly relevant for satellite and airborne sensors). The frequency ranges covered often include not only visible light but also non-visible electromagnetic radiation. Objects on the ground can be identified by the different wavelengths they reflect (interpreted as different colours), but the sensor used must be able to detect these wavelengths in order to see these features.
- Radiometric resolution is often called contrast. It describes the ability of the sensor to measure differences in signal strength, or brightness, of objects, and is commonly expressed as the number of brightness levels the sensor can record (its bit depth). The more sensitive a sensor is to the reflectance of an object as compared to its surroundings, the smaller the objects that can be detected and identified.
- Temporal resolution depends on several factors: how long it takes for a satellite to return to (approximately) the same location in space, the swath of the sensor (related to its ‘footprint’), and whether or not the sensor can be directed off-nadir. It is more formally known as the ‘revisit period’.
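Spatial resolution, described above, is often quantified for aerial imagery as ground sample distance (GSD): by similar triangles, the ground distance covered by one pixel is the pixel size multiplied by the flying height and divided by the lens focal length. A minimal sketch (the function name and figures are illustrative, not from the text):

```python
def ground_sample_distance(pixel_size_m, focal_length_m, altitude_m):
    """Ground distance covered by one pixel, by similar triangles."""
    return pixel_size_m * altitude_m / focal_length_m

# A hypothetical 4-micron pixel behind a 50 mm lens, flown at 1000 m,
# covers 0.08 m of ground per pixel.
gsd = ground_sample_distance(4e-6, 0.05, 1000.0)
```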
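As an illustration of the spectral-resolution point, a sensor that can detect both red and near-infrared wavelengths allows indices such as NDVI (Normalised Difference Vegetation Index) to be computed, because healthy vegetation reflects strongly in near-infrared while absorbing red light. A minimal sketch, with made-up sample reflectances:

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI: (NIR - Red) / (NIR + Red), in the range [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# First pixel: strong NIR, weak red -> vegetated; second: equal bands -> bare
values = ndvi([0.5, 0.3], [0.1, 0.3])
```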