Analysing Drone and Satellite Imagery using Vegetation Indices

A majority of our ecosystem monitoring work involves acquiring, analysing and visualising satellite and aerial imagery. Creating true-colour composites, using the Red, Green and Blue (RGB) bands, allows us to actually view the land areas we’re studying. However, this is only a first step; creating detailed reports on deforestation, habitat destruction or urban heat islands requires us to extract more detailed information, which we do by conducting mathematical operations on the spectral bands available from any given sensor. For example, we can extract surface temperature from Landsat 8 satellite data, as detailed in a previous blogpost.

A true-colour composite image created using data from Landsat 8 bands 2, 3 and 4.

As you may imagine, understanding how much vegetation is present in any given pixel is essential to many of our projects, and for this purpose we make use of Vegetation Indices. In remote sensing terms, a Vegetation Index is a single number that quantifies vegetation within a pixel. It is extracted by mathematically combining a number of spectral bands based on the physical properties of vegetation, primarily the fact that it absorbs more light in the red (R) than in the near-infrared (NIR) region of the spectrum. These indices can be used to ascertain information such as vegetation presence, photosynthetic activity and plant health, which in turn can be used to study climate trends, soil quality, drought and changes in forest cover. In this blogpost, we’re going to provide a technical overview of some of the vegetation indices available for analysing both aerial and satellite imagery. We’ve included the basic formulae used to calculate the indices, using a bracketing system that allows the formulae to be copy-pasted directly into the Raster Algebra (ArcMap) and Raster Calculator (QGIS) tools; don’t forget to replace the Bx terms with the relevant band filenames when doing the calculations! We’ve also noted down the relevant band combinations for data from Landsat 8’s Operational Land Imager and the MultiSpectral Instrument on both Sentinel-2 satellites.

We’ve created maps for most of the vegetation indices described below, using data from Landsat 8 acquired over Goa, India on the 28th of December 2018. Each band was clipped to the area of interest and the Digital Numbers were rescaled to calculate Top-of-Atmosphere radiance values. All the index calculations were then executed on these clipped and corrected bands. We used a single min-max stretched red-to-green colour gradient to visualise each index. For actual projects, we’d then classify each image to provide our partners with meaningful information.
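The rescaling step mentioned above is a simple per-band linear operation. As a sketch (the gain and offset values below are illustrative placeholders, not the actual coefficients from our scene’s MTL metadata file):

```python
import numpy as np

# Rescale Landsat 8 Digital Numbers (DNs) to Top-of-Atmosphere radiance:
#   L = ML * Qcal + AL
# ML (RADIANCE_MULT_BAND_x) and AL (RADIANCE_ADD_BAND_x) come from the
# scene's MTL metadata file; the values below are illustrative only.
ML = 0.01
AL = -50.0

def dn_to_toa_radiance(dn, mult, add):
    """Apply the linear gain/offset rescaling to a DN array."""
    return mult * dn.astype(np.float64) + add

dn = np.array([[0, 5000], [10000, 20000]], dtype=np.uint16)
radiance = dn_to_toa_radiance(dn, ML, AL)
```

The same linear form (with different coefficients) applies per band; the actual coefficients must always be read from the scene’s own metadata.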

The Basic Vegetation Indices

Ratio Vegetation Index

One of the first Vegetation Indices developed was the Ratio Vegetation Index (RVI) (Jordan 1969), which can be used to estimate and monitor above-ground biomass. While the RVI is very effective for the estimation of biomass, especially in densely-vegetated areas, it is sensitive to atmospheric effects when the vegetation cover is less than 50% (Xue et al. 2017).

RVI = R / NIR

Sentinel 2: B4 / B8

Landsat 8: B4 / B5


Difference Vegetation Index

The Difference Vegetation Index (DVI) (Richardson and Weigand 1977) was developed to distinguish between soil and vegetation and, as the name suggests, is a simple difference between the near-infrared and red bands.

DVI = NIR - R

Sentinel 2: B8 - B4

Landsat 8: B5 - B4

Normalised Difference Vegetation Index

The Normalised Difference Vegetation Index (NDVI) (Rouse Jr. et al. 1974) was developed as an index of plant “greenness” and attempts to track photosynthetic activity. It has since become one of the most widely applied indices. Like the RVI and the DVI, it is also based on the principle that well-nourished, living plants absorb red light and reflect near-infrared light. However, it also takes into account the fact that stressed or dead vegetation absorbs comparatively less red light than healthy vegetation, bare soil reflects both red and near-infrared light about equally, and open water absorbs more infrared than red light. The NDVI is a relative value and cannot be used to compare between images taken at different times or from different sensors. NDVI values range from -1 to +1, where higher positive values indicate the presence of greener and healthier plants. The NDVI is widely used due to its simplicity, and several indices have been developed to replicate or improve upon it.

NDVI = ( NIR - R ) / ( NIR + R )

Sentinel 2: ( B8 - B4 ) / ( B8 + B4 )

Landsat 8: ( B5 - B4 ) / ( B5 + B4 )
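As a minimal sketch of how this calculation looks outside a GIS raster calculator, here is the NDVI computed per pixel with numpy (the band values are illustrative reflectances, not data from our Goa scene):

```python
import numpy as np

# NDVI = (NIR - R) / (NIR + R), computed element-wise on two band arrays.
def ndvi(nir, red):
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    denom = nir + red
    # Guard against division by zero on pixels where both bands are zero.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Illustrative pixels: dense vegetation, sparser vegetation, open water.
values = ndvi([0.5, 0.4, 0.1], [0.1, 0.2, 0.3])
```

For Landsat 8 the inputs would be the B5 (NIR) and B4 (Red) rasters; greener pixels yield higher positive values, and the result is always bounded between -1 and +1.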


Synthetic NDVI

The Synthetic NDVI is an index that attempts to predict NDVI values using only the Red and Green bands. It can therefore be applied to imagery collected by any RGB sensor, including those used on consumer-level drones. Like the NDVI, its values range from -1 to +1, with higher values suggesting the presence of healthier plants. However, it is not as accurate as the NDVI and needs to be calibrated using ground information to be truly useful. It is also known as the Green Red Vegetation Index (GRVI) (Motohka et al. 2010).

Synthetic NDVI = ( G - R ) / ( G + R )

Sentinel 2: ( B3 - B4 ) / ( B3 + B4 )

Landsat 8: ( B3 - B4 ) / ( B3 + B4 )


Visible Difference Vegetation Index

Similarly, the Visible Difference Vegetation Index (VDVI) (Wang et al. 2015) can be calculated using information from only the visible portion of the electromagnetic spectrum. Some studies indicate that the VDVI is better at extracting vegetation information and predicting NDVI than other RGB-only indices.

VDVI = ( (2*G) - R - B ) / ( (2 * G) + R + B )

Sentinel 2:  ( ( 2 * B3 ) - B4 - B2 ) / ( (2 * B3 ) + B4 + B2 )

Landsat 8: ( ( 2 * B3 ) - B4 - B2 ) / ( ( 2 * B3 ) + B4 + B2 ) 


Excess Green Index

The Excess Green Index (ExGI) contrasts the green portion of the spectrum against the red and blue portions to distinguish vegetation from soil, and can also be used to predict NDVI values. It has been shown to outperform other visible-spectrum indices at distinguishing vegetation (Larrinaga and Brotons 2019).

ExGI = ( 2 * G ) - ( R + B )

Sentinel 2: ( 2 * B3) - ( B4 + B2 )

Landsat 8: ( 2 * B3 ) - ( B4 + B2 )

Green Chromatic Coordinate

The Green Chromatic Coordinate (GCC) is also an RGB index (Sonnentag et al. 2012) which has been used to examine plant phenology in forests.

GCC = G / ( R + G + B )

Sentinel 2: B3 / ( B4 + B3 + B2 )

Landsat 8: B3 / ( B4 + B3 + B2 )
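Since the four RGB-only indices above (Synthetic NDVI/GRVI, VDVI, ExGI and GCC) all work from the same three visible bands, they can be sketched together. The pixel values below are illustrative reflectances for a single vegetated pixel, not measured data:

```python
# Four RGB-only vegetation indices, applied to one hypothetical pixel.
def synthetic_ndvi(r, g):          # also known as GRVI
    return (g - r) / (g + r)

def vdvi(r, g, b):
    return ((2 * g) - r - b) / ((2 * g) + r + b)

def exgi(r, g, b):
    return (2 * g) - (r + b)

def gcc(r, g, b):
    return g / (r + g + b)

# A green-dominated pixel, as we'd expect over healthy vegetation:
r, g, b = 0.10, 0.30, 0.05
```

Applied to whole rasters, each function maps directly onto the band-combination formulas listed above, with the drone orthomosaic’s R, G and B bands as inputs.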

One of the primary shortcomings of the NDVI is that it is sensitive to atmospheric interference, soil reflectance and cloud and canopy shadows. Indices have thus been developed to address some of these shortcomings.

Indices that address Atmospheric (and other) Effects

Enhanced Vegetation Index

The Enhanced Vegetation Index (EVI) was devised as an improvement over the NDVI (Huete et al. 2002), designed to be more effective in areas of high biomass, where NDVI values can become saturated. In remote sensing terms, a saturated index is one that fails to capture variation because the maximum value is registered for some pixels. The EVI attempts to reduce atmospheric influences, including aerosol scattering, and to correct for canopy background signals.

EVI = 2.5 * ( ( NIR - R ) / ( NIR + (6 * R) - ( 7.5 * B ) + 1 ) )

Sentinel 2: 2.5 * ( ( B8 - B4) / ( B8 + ( 6 * B4) - ( 7.5 * B2 ) + 1) )

Landsat 8: 2.5 * ( ( B5 - B4) / ( B5 + ( 6 * B4) - ( 7.5 * B2 ) + 1 ) )
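A sketch of the EVI calculation with the standard coefficients (the input values are illustrative; note that the 6, 7.5 and +1 coefficients assume the bands are expressed as reflectance in the 0–1 range rather than raw DNs):

```python
# EVI = 2.5 * (NIR - R) / (NIR + 6*R - 7.5*B + 1)
def evi(nir, r, b):
    return 2.5 * (nir - r) / (nir + (6 * r) - (7.5 * b) + 1)

# Hypothetical reflectances for a vegetated pixel:
value = evi(0.5, 0.1, 0.05)
```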


Atmospherically Resistant Vegetation Index

The Atmospherically Resistant Vegetation Index (ARVI) was developed specifically to reduce atmospheric disturbances (Kaufman and Tanré 1992). However, completely eliminating aerosol and ozone effects requires an atmospheric transport model, which is complicated to compute and for which the necessary data are not always easily available. Without integrating this model into the calculation, the ARVI is not expected to outperform the NDVI in terms of accounting for atmospheric effects, but it can still be a useful alternative.

ARVI (w/o atmospheric transport model) = ( NIR - ( ( 2 * R ) - B ) ) / ( NIR + ( ( 2 * R ) - B ) )

Sentinel 2: ( B8 - ( ( 2 * B4 ) - B2 ) ) / ( B8 + ( ( 2 * B4 ) - B2 ) )

Landsat 8: ( B5 - ( ( 2 * B4 ) - B2 ) ) / ( B5 + ( ( 2 * B4 ) - B2 ) )


Green Atmospherically Resistant Index

The Green Atmospherically Resistant Index (GARI) was also developed to counter the effects of atmospheric interference in satellite imagery. It shows much higher sensitivity to chlorophyll content (Gitelson et al. 1996) and lower sensitivity to atmospheric interference.

GARI = ( NIR - ( G - ( γ * ( B - R ) ) ) ) / ( NIR + ( G - ( γ * ( B - R ) ) ) )

Sentinel 2: ( B8 - ( B3 - ( γ * ( B2 - B4 ) ) ) ) / ( B8 + ( B3 - ( γ * ( B2 - B4 ) ) ) )

Landsat 8: ( B5 - ( B3 - ( γ * ( B2 - B4 ) ) ) ) / ( B5 + ( B3 - ( γ * ( B2 - B4 ) ) ) )

In the formula above, γ is a weighting constant that the authors suggest setting to 1.7 (Gitelson et al. 1996, p. 296), though it may need to be recalibrated in areas of complete canopy coverage. For this image, we used a γ value of 1.
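Because γ may need recalibrating, it helps to keep it as an explicit parameter when scripting the index. A sketch with illustrative single-pixel reflectances (not measured data):

```python
# GARI with the gamma weighting left as a parameter; Gitelson et al. (1996)
# suggest 1.7, while we used 1 for our image.
def gari(nir, g, b, r, gamma=1.0):
    adjusted_green = g - (gamma * (b - r))
    return (nir - adjusted_green) / (nir + adjusted_green)

# Hypothetical reflectances: NIR, Green, Blue, Red.
value = gari(0.5, 0.2, 0.05, 0.1)
```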


Visible Atmospherically Resistant Index

The Visible Atmospherically Resistant Index (VARI) can be used to account for atmospheric effects in RGB imagery.

VARI = ( G - R ) / ( G + R - B )

Sentinel 2: ( B3 - B4 ) / ( B3 + B4 - B2 )

Landsat 8: ( B3 - B4 ) / ( B3 + B4 - B2 )

Addressing Soil Reflectance

As in the case of atmospheric effects, indices were also developed to address the effects of varying soil reflectance.

Soil Adjusted Vegetation Index

The Soil Adjusted Vegetation Index (SAVI) is a modified version of the NDVI designed specifically for areas with very little vegetative cover, usually less than 40% by area. Depending on their type and water content, soils reflect varying amounts of red and infrared light; the SAVI accounts for this by suppressing bare-soil pixels.

SAVI = ( ( NIR - R ) / ( NIR + R + L ) ) * ( 1 + L )

Sentinel 2: ( ( B8 - B4 ) / ( B8 + B4 + L ) ) * ( 1 + L )

Landsat 8: ( ( B5 - B4 ) / ( B5 + B4 + L ) ) * ( 1 + L )

In the above equations, L is a function of vegetation density, and calculating it requires a priori information about vegetation presence in the study area. It ranges from 0 to 1 (Xue et al. 2017), with values approaching 0 for high vegetation cover and approaching 1 for very sparse cover.
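Keeping L as a parameter makes the relationship to the NDVI explicit: setting L to 0 reduces the SAVI to the NDVI. A sketch with illustrative reflectances (not measured data):

```python
# SAVI = ((NIR - R) / (NIR + R + L)) * (1 + L)
# L is chosen from prior knowledge of vegetation density; with L = 0
# the correction vanishes and the result equals the NDVI.
def savi(nir, red, L=0.5):
    return ((nir - red) / (nir + red + L)) * (1 + L)

# Hypothetical NIR and Red reflectances for one pixel:
moderate = savi(0.5, 0.1, L=0.5)
no_adjustment = savi(0.5, 0.1, L=0.0)   # identical to NDVI for this pixel
```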


Modified Chlorophyll Absorption in Reflectance Index

The Modified Chlorophyll Absorption in Reflectance Index (MCARI) was developed as a vegetation status index. Its predecessor, the Chlorophyll Absorption Ratio Index (Kim 1994), was initially designed to distinguish non-photosynthetic material from photosynthetically active vegetation. The MCARI is a modification of this index, and is defined as the depth of chlorophyll absorption in the Red region of the spectrum relative to the reflectance in the Green and Red-Edge regions (Daughtry et al. 2000).

MCARI = ( RedEdge - R ) - 0.2 * ( RedEdge - G ) * ( RedEdge / R )

Sentinel 2: ( B5 - B4 ) - 0.2 * ( B5 - B3 ) * ( B5 / B4 )

Landsat 8: No true equivalent

Structure Insensitive Pigment Index

The Structure Insensitive Pigment Index (SIPI) is also a vegetation status index, with reduced sensitivity to canopy structure and increased sensitivity to pigmentation. Higher SIPI values are strongly correlated with an increase in carotenoid pigments, which in turn indicate vegetation stress. This index is thus very useful in the monitoring of vegetation health.

SIPI = ( R800 - R445 ) / ( R800 - R680 ), where Rλ is the reflectance at a wavelength of λ nm

Sentinel 2: ( B8 - B1 ) / ( B8 - B4 )

Landsat 8: ( B5 - B1 ) / ( B5 - B4 )

Agricultural Indices

Some indices that were initially designed for agricultural purposes can also be used for the ecological monitoring of vegetation.

Triangular Greenness Index

The Triangular Greenness Index (TGI) was developed to monitor chlorophyll and, indirectly, the nitrogen content of leaves (Hunt et al. 2013) in order to determine fertilizer application regimes for agricultural fields. It can be calculated using RGB imagery and serves as a proxy for chlorophyll content in areas of high leaf cover.

TGI = -0.5 * ( ( ( λR - λB ) * ( R - G ) ) - ( ( λR - λG ) * ( R - B ) ) )

Sentinel 2A: -0.5 * ( ( ( 664.6 - 492.4 ) * ( B4 - B3 ) ) - ( ( 664.6 - 559.8 ) * ( B4 - B2 ) ) )

Sentinel 2B: -0.5 * ( ( ( 664.9 - 492.1 ) * ( B4 - B3 ) ) - ( ( 664.9 - 559.0 ) * ( B4 - B2 ) ) )

Landsat 8: 0.5 * ( ( ( 654.59 - 482.04 ) * ( B4 - B3 ) ) - ( ( 654.59 - 561.41 ) * ( B4 - B2 ) ) )

In the above equations, λ represents the centre wavelengths of the respective bands; the central wavelengths of Sentinel 2A and Sentinel 2B differ slightly.
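Passing the band-centre wavelengths in explicitly lets one function serve all three sensors. Note that Hunt et al. (2013) define the index with a leading factor of -0.5, under which green vegetation yields positive TGI values; the pixel reflectances below are illustrative, not measured data:

```python
# TGI with the band-centre wavelengths (lambda, in nm) as parameters.
def tgi(r, g, b, lam_r, lam_g, lam_b):
    return -0.5 * (((lam_r - lam_b) * (r - g)) - ((lam_r - lam_g) * (r - b)))

# Landsat 8 OLI band-centre wavelengths, as listed above:
landsat8 = dict(lam_r=654.59, lam_g=561.41, lam_b=482.04)

# A green-dominated (vegetated) pixel:
value = tgi(0.10, 0.30, 0.05, **landsat8)
```

The Sentinel 2A and 2B wavelength sets from the formulas above can be substituted into the same call.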


Normalised Difference Infrared Index

The Normalised Difference Infrared Index (NDII) uses a normalised difference formulation instead of a simple ratio. It is a reflectance measurement that is sensitive to changes in the water content of plant canopies, with higher index values associated with increasing water content. The NDII can be used for agricultural crop management, forest canopy monitoring, and the detection of stressed vegetation.

NDII = ( NIR - SWIR ) / (NIR + SWIR )

Sentinel 2 : ( B8 - B11 ) / ( B8 + B11 )

Landsat 8: ( B5 - B6 ) / ( B5 + B6 )

Green Leaf Index

The Green Leaf Index (GLI) was originally designed for use with a digital RGB camera to measure wheat cover. It can also be applied to aerial and satellite imagery.

GLI = ( ( G - R ) + ( G - B ) ) / ( ( 2 * G ) + ( B + R ) )

Sentinel 2: ( ( B3 - B4 ) + ( B3 - B2 ) ) / ( ( 2 * B3 ) + ( B2 + B4 ) )

Landsat 8: ( ( B3 - B4 ) + ( B3 - B2 ) ) / ( ( 2 * B3 ) + ( B2 + B4 ) )


Task-specific Vegetation Indices

As we can see, one index might be more appropriate than another based on the purpose of your study and the source of the imagery. The following section lists indices developed to meet the needs of specific research requirements.

Transformed Difference Vegetation Index

The Transformed Difference Vegetation Index (TDVI) was developed to detect vegetation in urban settings where NDVI is often saturated.

TDVI = 1.5 * ( NIR - R ) / sqrt( ( NIR ^ 2 ) + R + 0.5 )

Sentinel 2: 1.5 * ( B8 - B4 ) / sqrt( ( B8 ^ 2 ) + B4 + 0.5 )

Landsat 8: 1.5 * ( B5 - B4 ) / sqrt( ( B5 ^ 2 ) + B4 + 0.5 )

Note that the QGIS Raster Calculator and ArcMap’s Raster Algebra use different syntaxes for square roots: QGIS uses ‘sqrt’ while ArcMap uses ‘SquareRoot’.
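Outside a GIS, the same calculation is straightforward to script; here `math.sqrt` plays the role of QGIS’s `sqrt` or ArcMap’s `SquareRoot`, and the input reflectances are illustrative:

```python
import math

# TDVI = 1.5 * (NIR - R) / sqrt(NIR^2 + R + 0.5)
def tdvi(nir, red):
    return 1.5 * (nir - red) / math.sqrt((nir ** 2) + red + 0.5)

# Hypothetical NIR and Red reflectances for one pixel:
value = tdvi(0.5, 0.1)
```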

Leaf Chlorophyll Index

The Leaf Chlorophyll Index (LCI) was developed to assess chlorophyll content in areas of complete leaf coverage.

LCI = ( NIR - RedEdge ) / ( NIR + R )

Sentinel 2: ( B8 - B5 ) / ( B8 + B4 )

Landsat 8: No true equivalent

Vegetation Fraction

The Vegetation Fraction is defined as the percentage of the ground area occupied by vegetation; since it is calculated from NDVI values, it is subject to the same errors. It is a comprehensive quantitative index in forest management and an important parameter in ecological models, and can also be used to determine the emissivity parameter when calculating Land Surface Temperature.

Vegetation Fraction = ( NDVI - NDVImin ) / ( NDVImax - NDVImin )
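As a sketch, this min-max rescaling of an NDVI raster looks as follows; here the minimum and maximum default to the image’s own extremes, though in practice they are often fixed to reference NDVI values for bare soil and full vegetation cover (the input NDVI values are illustrative):

```python
import numpy as np

# Vegetation Fraction as a min-max rescaling of an NDVI array.
def vegetation_fraction(ndvi, ndvi_min=None, ndvi_max=None):
    ndvi = np.asarray(ndvi, dtype=np.float64)
    lo = ndvi.min() if ndvi_min is None else ndvi_min
    hi = ndvi.max() if ndvi_max is None else ndvi_max
    return (ndvi - lo) / (hi - lo)

# Three hypothetical NDVI pixels: sparse, moderate, dense vegetation.
vf = vegetation_fraction([0.1, 0.3, 0.7])
```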

In this blogpost, we’ve listed and organised the vegetation indices that we’ve found while improving our ecological monitoring techniques. We make extensive use of both satellite and drone imagery, and will be using this blogpost internally as a quick reference guide to vegetation indices.

Find us on Twitter @techforwildlife if you have any questions or comments, or email us at contact@techforwildlife.com. We’ve also opened up the comments for a few days, so please feel free to point out any errors or leave any other feedback!

P.S.: Hat-tip to Harris Geospatial (@GeoByHarris) for a comprehensive list of vegetation indices, which can be found here.

P.P.S.: We’ll be updating this post with Sentinel-2A imagery in the next few days.


· C. F. Jordan (1969) Derivation of leaf-area index from quality of light on the forest floor. Ecology, vol. 50, no. 4, pp. 663–666.

· Daughtry, C. S. T., Walthall, C. L., Kim, M. S., De Colstoun, E. B., & McMurtrey III, J. E. (2000) Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sensing of Environment, 74(2), 229–239.

· Gitelson, A., Kaufman, Y., & Merzlyak, M. (1996) Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sensing of Environment, 58, 289–298.

· Huete, A., et al. (2002) Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sensing of Environment, 83, 195–213.

· Hunt, E. R. Jr., Doraiswamy, P. C., McMurtrey, J. E., Daughtry, C. S. T., Perry, E. M., & Akhmedov, B. (2013) A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Publications from USDA-ARS / UNL Faculty, 1156.

· J. Richardson and C. Weigand (1977) Distinguishing vegetation from soil background information. Photogrammetric Engineering and Remote Sensing, 43.

· Jinru Xue and Baofeng Su (2017) Significant remote sensing vegetation indices: a review of developments and applications. Journal of Sensors, vol. 2017, Article ID 1353691, 17 pages.

· Kim, M. S. (1994) The use of narrow spectral bands for improving remote sensing estimations of fractionally absorbed photosynthetically active radiation. Doctoral dissertation, University of Maryland at College Park.

· Larrinaga, A., & Brotons, L. (2019) Greenness indices from a low-cost UAV imagery as tools for monitoring post-fire forest recovery. Drones, 3(1), 6.

· Motohka, T., Nasahara, K. N., Oguma, H., & Tsuchida, S. (2010) Applicability of green-red vegetation index for remote sensing of vegetation phenology. Remote Sensing, 2(10), 2369–2387.

· Sonnentag, O., Hufkens, K., Teshera-Sterne, C., Young, A. M., Friedl, M., Braswell, B. H., Milliman, T., O’Keefe, J., & Richardson, A. D. (2012) Digital repeat photography for phenological research in forest ecosystems. Agricultural and Forest Meteorology, 152, 159–177.

· X. Wang, M. Wang, S. Wang, and Y. Wu (2015) Extraction of vegetation information from visible unmanned aerial vehicle images. Transactions of the Chinese Society of Agricultural Engineering, 31(5), 152–159.

· Y. J. Kaufman and D. Tanré (1992) Atmospherically Resistant Vegetation Index (ARVI) for EOS-MODIS. IEEE Transactions on Geoscience and Remote Sensing, 30(2), 261–270.

A first person account of the capture of a tiger in Northern India: Four elephants, a bulldozer and a drone.

(Cross-posted from WildLabs)

In February 2017, a tiger killed two people within a span of three days near the Pilibhit Tiger Reserve, in the western part of the Indian state of Uttar Pradesh, and was declared a man-eater. With state elections around the corner, and local villagers threatening to boycott the polls unless the tiger was removed from the area, the Uttar Pradesh Forest Department (UPFD) began an operation with the objective of capturing or killing the tiger. They also called in a drone team to be a part of the operation, primarily to have a highly visible way of broadcasting to the local communities that something was being done to catch the tiger.

Drones are still a novelty in India; the Directorate General of Civil Aviation banned their use by civilians in October 2014 until further regulations were issued, and those regulations have yet to arrive. However, there are civilian companies who provide drone services, bypassing the regulatory ban through the use of waivers from the authorities, or by working directly for government agencies, as was the case in this operation. I was tasked with coordinating between the drone team and the UPFD, and we arrived in Pilibhit on the afternoon of the 10th of February.


The tiger had been located in a sugarcane patch the previous evening, but had managed to give its hunters the slip. On the day we arrived, extensive search operations were underway over a large area to locate the general whereabouts of the tiger. While waiting for information, we demonstrated the use of the drones to the UP Forest Department staff. Both were quadcopters, that is, drones with four rotors that can take off vertically and hover in mid-air like helicopters. One was the consumer-level DJI Phantom 4, while the other was the professional-level DJI Inspire; both are equipped with controllable cameras, and are commonly used for videography. It turned out that the Forest Department also had a Phantom 4 of their own, which they’d brought down from Dudhwa Tiger Reserve. Our drone operators used the afternoon to conduct basic training, showing the UP Forest Department staff how to fly their drone safely and use it for surveillance.

Tiger capture operations can last anywhere from a few days to a few weeks, and often end inconclusively. So our initial plan was to spend at least three days in the area conducting drone operations. Subsequently, depending on how things played out, and how useful the drones were perceived to be, we’d either head back to Delhi or extend our stay in the area.

As it turned out, this tiger really was a man-eater; it made its third kill in 5 days in the early morning of the 11th of February. We reached the kill site, in a village to the west of Pilibhit, shortly after we heard the news and saw that a large crowd of people had gathered. One group of people surrounded the Forest Department staff who were interviewing the victim’s brother, while others surrounded local headmen who were giving interviews to the press. There was also a continuous flow of movement as people went to view the body of the last victim, which lay in a sugarcane patch nearby. On the ground, next to a pile of sand not far from the body, was a clearly defined pugmark.

Shortly after we got to the kill site, the Forest Department received information that fresh tiger pugmarks had been found about 2km north of the kill. All the action quickly re-centered itself; two trained Forest Department elephants were summoned and we headed out to that area to join the operation. We sent up the Phantom 4 to scan two large sugarcane patches where the tiger could potentially be hiding, with the camera pointing downwards. We weren’t really expecting to see anything through the dense greenery, and we didn’t. However, while there was a chance that we’d actually find something, these flights also served to keep the crowds that had gathered distracted and away from the elephants, which were searching some distance away.

Most people in India haven’t seen drones in action; I live in Delhi and work on drone policy issues, but even I’d only seen them used twice in India before this operation. While one does eventually get used to them, there’s something fascinating about watching these small robots take flight, and the local residents who’d come out in droves to watch the tiger being captured weren’t immune. The open-top safari jeep we were operating the drones out of was constantly surrounded by people, and it’s the closest I’ve ever been to feeling like a movie star. Since there were so many people in close proximity, we were launching the Phantom 4 off the hood of the vehicle but landing it by direct hand-capture, which is a very showy manoeuvre. We did this for about 20 minutes, and then word came that the elephants had pinpointed the square plot of sugarcane the tiger was actually hiding in.

We headed there and kept the drones out of the air till they were called for, watching as the operation unfolded. Forest Department staff set up nets around the outer perimeter, while veterinarians and forest guards, armed with tranquilizer guns and regular firearms, rode on elephant-back within the sugarcane patch itself. Shortly after the nets were put up, there was the sudden trumpeting of an elephant. We heard that the tiger had charged at one of them, scratching it near its right eye and on its trunk. The operation then halted for a while as everyone waited for two more trained elephants, a bulldozer and a truck carrying a cage to arrive on site.

Once all the resources were in place, the bulldozer began spiraling inwards into the sugarcane patch, gradually removing the tiger’s cover while leaving a thin fence of sugarcane along the perimeter. We sent up the larger drone at this point, both to document the operation and to keep it ready in case the Forest Department staff wanted to try and use it to flush the tiger out of the sugarcane.

From our vantage point outside the sugarcane patch, we could see the top of the bulldozer as it slowly mowed down the sugarcane, followed by a view of the elephants, with their riders, moving placidly through the patch. It could have been any other calm afternoon, but the peace was suddenly disrupted by a swift burst of confused action: the elephants surged into motion, and I heard trumpeting, a roar and two distinct gunshots. Later review of the video footage from the drone made the sequence of events much clearer. The bulldozer had removed most of the sugarcane, leaving only a small central patch standing, and once it was done, the four elephants took a circuit of the central patch. At the point at which the action began, two of the elephants either sensed the tiger, or their riders spotted it. Either way, both elephants wheeled to the left and charged into the sugarcane patch, and the other two followed. All four flailed around in the sugarcane. A little distance away, there was movement in the underbrush, and then the tiger burst out into the circle cleared by the bulldozer before dashing back into cover in the sugarcane left standing along the perimeter.

As it turned out, the veterinarians had managed to shoot the tiger with at least one tranquilizer dart. The elephants and their riders were pulled back as everyone waited to make sure that the tranquilizers had taken effect. Meanwhile, the tiger doubled back into the sugarcane patch and then passed out. Two of the elephants went back into the patch, and once the personnel on elephant-back confirmed that the tiger had been incapacitated, they called in the truck with the cage. They dismounted from the elephants, quickly carried the tiger to the truck, pushed it into the cage and locked it.

It’s at this point that the surrounding crowds stormed the site and climbed onto the truck, snatched the keys from its hapless driver and slashed its tires. Newspaper reports of the day claim that the ‘angry locals’ also tried to set it on fire, but I didn’t see any evidence of that. Also, while I’m sure that there was anger and resentment on the part of the local communities against the man-eating tiger, the Forest Department and the State Government, I don’t think that the crowd itself was angry. I’ve grown up in Kolkata when it was under Communist rule, and I’ve seen angry mobs on the streets. This, however, seemed to be a crowd composed primarily of young men who were torn between wanting to hurt the tiger, see the tiger or merely be a part of the giant party in progress. The abundance of freshly-bulldozed sugarcane proved to be attractive to the mob - while some of it was being gnawed upon by those on the fringes of the mob, a lot of it was thrown at the truck, at other people or into the air. The elephants with their mahouts were still on the field near the truck, being used to help control the crowd. When one of the airborne sugarcane sticks went very close to one of the elephants, its mahout glared at the section of people from where it had come, and that was the end of the sugarcane throwing.

The Forest Department staff behaved admirably; once it was clear that the crowd couldn’t actually get to the tiger within the cage and harm it, they pulled the elephants out to one side, and kept them facing away from the crowd. They’d sent for a tractor to pull the disabled truck with the cage away, and in the meantime let the crowd spend its passion and energy climbing all over the truck and the cage. It was only when the tractor arrived that the Forest Department staff, and some police who’d been deputised to help, set up a cordon around the truck and made a real effort to push the people off. The truck was then attached to the tractor by ropes, and towed away with its captive and unconscious inhabitant.

The removal of the tiger from the site marked the end of the operation. The tiger was successfully captured alive and sent to the Lucknow zoo. The elephants, their mahouts and the drone operators were thanked for their service, and everyone went on their way. While the drone deployment provided a useful record, and a unique perspective on the events of the day, it was the tried and tested age-old technique of hunting tigers, using beaters (or in this case, a bulldozer) and riders on elephant-back, that resulted in a successful resolution to the story of one of the man-eaters of Pilibhit in 2017.

With thanks to Ayush, Shakti, Harshad, Mudit, Naresh and Jaspreet.

Let’s open up the skies for drones

(Cross-posted from the Hindu Business Line, October 19th 2015)

Unmanned aerial vehicles are flying robots that provide some of the benefits of manned flight without its attendant risks and inconveniences. Commonly known as drones, they proved their worth on the battlefield during the 1973 Yom Kippur and 1982 Lebanon wars, after which numerous military forces began implementing their own surveillance and weaponised drone programmes. Today, India is reported to have some 200 Israeli-made drones in service, and is in the process of developing indigenous ones for military use. Civilians, however, are banned from flying drones.

Drones are not just used for military purposes; they have also been used by civilians around the world for a diverse set of non-conflict use cases. These include assisting aid agencies during humanitarian crises, helping farmers with their fields, providing a new perspective to journalists, letting conservationists rapidly monitor wildlife and conduct anti-poaching patrols, as well as simple recreational activity; flying a drone can be a lot of fun.

Drones, thus, have commercial value; they provide a much cheaper alternative to manned flight, and enable applications that were impossible earlier. Unfortunately, most new technologies come with their own dangers, and drones are no exception. They can occasionally crash. This matters most when the drone being flown is large and heavy, as a crash can damage property and harm people. Drones also occupy airspace that is used by manned aircraft, and an in-air collision or even a near-miss, could be disastrous. These are dangers that could occur unintentionally. However, there is also the fear that drones could be used to intentionally cause harm.

For these reasons, the relevant regulatory bodies of some countries have limited the public use of drones until these concerns can be addressed. In India, the Directorate General of Civil Aviation (DGCA) completely banned their use by civilians as of October 7, 2014. However, the authorities in other countries haven’t gone as far; in the US, the Federal Aviation Administration (FAA) allows the civilian use of drones with caveats, while their commercial use is licensed. While the various countries of the European Union (EU) currently have multiple regulations covering drone flights, the European Aviation Safety Agency intends to create common drone regulations, with the intention of permitting commercial operations across the EU starting in 2016.

The regulatory authorities of these countries have understood that drones are here to stay, and that their use can be extremely beneficial to the economy. A report by the Association for Unmanned Vehicle Systems International (AUVSI), a non-profit trade organisation that works on “advancing the unmanned systems community and promoting unmanned systems”, states that by 2025, the commercial drone industry will have created over 100,000 jobs in the US alone, with an economic impact of $82 billion. Drones can also contribute to the export market. For example, in Japan, where commercial drones have been licensed since the 1980s, Yamaha Corp has been producing drones for aerial spraying for agricultural purposes which are now exported to the US, South Korea, and Australia, generating $41 million in revenue for Yamaha in 2013-14. That’s small change compared to the current global market leader’s expected sales for 2015. SZ DJI Technology Co Ltd of Shenzhen, China, was only founded in 2006, but by 2015, they controlled 70 per cent of the global commercial drone market and a higher percentage of the consumer drone market, for an estimated revenue of $1 billion.

These countries and companies have addressed the inherent dangers of drone technology with both technology- and policy-based solutions. The FAA and the UK’s Civil Aviation Authority (CAA) prohibit the flying of drones within five km of an airport or other notified locations, and drone manufacturers like DJI and Yamaha can enforce these rules by incorporating them into the drone’s control software, rendering a drone inoperable within these restricted zones. Outside these zones, drone misuse can be treated as a criminal offence. In the US, two individuals were recently arrested in separate drone-related incidents: in one, the operator’s drone crashed into a row of seats at a stadium during a tennis match, and in the other, the operator flew his drone near a police helicopter.
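The geofencing described above amounts to a simple position check in the drone's control software: before (and during) flight, compare the drone's GPS position against a list of restricted points and refuse to operate within the exclusion radius. A minimal sketch of such a check, using the standard haversine great-circle distance and the five-km radius mentioned above, might look like the following; the zone coordinates here are illustrative placeholders, not an actual no-fly database.

```python
import math

EARTH_RADIUS_KM = 6371.0
EXCLUSION_RADIUS_KM = 5.0  # the 5 km airport buffer discussed above

# Hypothetical no-fly reference points (latitude, longitude) --
# a real implementation would ship a regulator-supplied database.
NO_FLY_ZONES = [
    (28.5562, 77.1000),
    (19.0896, 72.8656),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def flight_permitted(lat, lon):
    """False if the position lies inside any exclusion zone."""
    return all(
        haversine_km(lat, lon, zlat, zlon) > EXCLUSION_RADIUS_KM
        for zlat, zlon in NO_FLY_ZONES
    )
```

In practice a manufacturer would run this check continuously against live GPS data and refuse take-off, or force a landing, when `flight_permitted` returns `False`.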

In India, the DGCA’s October 2014 public notification states that due to safety and security issues, and because the International Civil Aviation Organisation (ICAO) hasn’t issued its standards and recommended practices (SARPs) for drones yet, civilian drone use is banned until further notice. One year later, there are still no regulations available from the DGCA; the ICAO expects to issue its initial SARPs by 2018, with the overall process taking up to 2025 or beyond. Meanwhile, the loss to India’s economy, and the threat to its national security, will be enormous. Today, it is still possible to import, buy, build or fly small drones in India, despite the DGCA’s ban. This means that drone-users in India currently exist in an illicit and unregulated economy, which is far more of a threat to the nation than regulated drone use could ever become.

Finally, flying drones safely in India will require research and development to understand how they can be best used in India’s unique landscape. Such R&D flourishes in a market-oriented environment, which cannot emerge unless civilian drone use is permitted; it is difficult to build profitable companies around drone use when the core business model is illegal.

Like civil aviation regulators in other countries, the DGCA should take a proactive role in permitting the civilian use of drones, whether for commercial purposes or otherwise. A useful first step would be a one-window licensing scheme at the DGCA, under which drone users would need separate permissions from the ministries of defence and home affairs only in special circumstances. Setting up a go/no-go spatial database would allow the DGCA to distinguish areas where drone flight is permitted from those where it is not, and manufacturers could be required to encode it into their drones’ control systems. The DGCA’s regulations should also differentiate drones by size and weight: the smaller and lighter the drone, the less risk it poses.

Whether it is assisting fishermen with finding shoals off the Indian coastline, conducting rapid anti-poaching patrols in protected areas across the country, mapping refugee settlements in Assam and Bengal for better aid provision, or assessing the quality of national highways, drones can transform the way we conduct operations in India. A blanket ban on civilian drones is thus more of a hindrance to development than a solution to a problem. Drones are here to stay, and the sooner India’s civilians are allowed to use them, the faster we can put them to work.

[ This article was commissioned by the Centre for the Advanced Study of India at the University of Pennsylvania and is also available on their blog. ]