
From the Space Shuttle to a block of wood

Creating a 3D model of the Nanda Devi Sanctuary using SRTM data and a CNC router


I’ve had Nanda Devi and the Sanctuary surrounding her in my thoughts for a very long time, and she seemed like a fitting first subject for bringing spatial data out of the digital world and into reality. For the uninitiated, Nanda Devi is a mountain in the Indian Himalaya, and she’s always referred to as she: the goddess in the clouds. Surrounded by a protective ring of mountains, she towers over them all, and the space between the ring and the central peak is known as the Nanda Devi Sanctuary. Because of this ring, the first entry into the Sanctuary was only made in 1934, by Shipton, Tilman and their three porters, who entered via the gorge of the Rishi Ganga. The mountain herself was first summited in 1936 (see Nanda Devi: Exploration and Ascent by Shipton and Tilman).

The geography of the region is fascinating (and the history as well; there’s a nuclear-powered CIA device somewhere inside the Sanctuary!), and the heights and depths of the various relief features make it a joy to visualise. In this post, I’m going to describe, in brief, the steps I used to get from the data to the final model in wood. While I’m sure most of this can be done using open-source tools, thanks to my current University of Cambridge student status and my @cammakespace membership I have access to (extremely expensive) ESRI and Vectric software, which I’ve used liberally.

Relief map of the Nanda Devi Sanctuary and the Rishi Ganga gorge. The lighter the colour, the higher the elevation.

I have a repository of digital elevation data collected by the Space Shuttle Endeavour in 2000 (STS-99, the Shuttle Radar Topography Mission). It’s freely available from CGIAR-CSI (http://srtm.csi.cgiar.org/) and is not difficult to use. In QGIS, I clipped it down to my area of interest around Nanda Devi; I was looking for a rough crop that would include the peak, the ring and the Rishi Ganga gorge. This relief map was exported as a GeoTIFF and opened in ArcScene, ESRI’s 3D cartography/analysis workhorse. ArcScene allowed me to convert the raster image into a multipoint file; as the tool description states, it “converts raster cell centers into multipoint features whose Z values reflect the raster cell value.” For some reason, this required a lot of tweaking to accurately represent the Z-values, but I finally got the point cloud to look the way I wanted it to in ArcScene.
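(If you’d rather avoid ArcScene, the raster-to-multipoint step is easy to approximate in plain Python. The sketch below assumes the clipped DEM has already been read into a 2-D numpy array, say with rasterio or GDAL, and the 90m cell size and nodata value are just the usual SRTM v4 defaults; adjust both for your own data.)

```python
import numpy as np

def raster_to_points(elev, cell_size=90.0, nodata=-32768):
    """Convert a 2-D elevation grid into an (N, 3) XYZ point cloud.

    Each raster cell centre becomes one point whose Z value is that
    cell's elevation; nodata cells are dropped. cell_size is the
    ground resolution (SRTM v4 is roughly 90 m at the equator).
    """
    rows, cols = elev.shape
    # x/y coordinates of every cell centre, in metres from the origin
    xs = (np.arange(cols) + 0.5) * cell_size
    ys = (np.arange(rows) + 0.5) * cell_size
    xx, yy = np.meshgrid(xs, ys)
    pts = np.column_stack([xx.ravel(), yy.ravel(), elev.ravel()])
    return pts[pts[:, 2] != nodata]

# Tiny synthetic grid: one nodata hole, one 7816 m "summit"
grid = np.array([[1000, 2000], [-32768, 7816]])
print(raster_to_points(grid))
```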

 

The point cloud (red dots), overlaid on the relief map in ESRI ArcScene

I then exported the point cloud in the .wrl format (wrl for ‘world’; it’s a VRML file), which is the only 3D format ArcScene knows, and used MeshLab, an open-source Swiss-Army-knife of a tool for 3D formats, to convert the .wrl file into a stereolithography (.stl) file, which the next tool in the workflow, Vectric Cut3D, was very happy with. As a side note, Makerware was also satisfied with the .stl file, so it is 3D-print ready.
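(For the curious, the .stl format itself is simple enough that you can write a surface mesh straight from a height grid and skip the conversion chain entirely. This is only a rough pure-Python sketch, not the route I actually took: each grid square becomes two triangles, and the normals are left at zero for downstream tools to recompute.)

```python
def heightmap_to_stl(elev, path, cell=1.0):
    """Triangulate a 2-D height grid into an ASCII STL surface.

    elev is a list of rows of elevations; cell is the grid spacing.
    """
    rows, cols = len(elev), len(elev[0])
    def v(r, c):
        return (c * cell, r * cell, elev[r][c])
    with open(path, "w") as f:
        f.write("solid heightmap\n")
        for r in range(rows - 1):
            for c in range(cols - 1):
                a, b = v(r, c), v(r, c + 1)
                d, e = v(r + 1, c), v(r + 1, c + 1)
                # split the square a-b-e-d into two triangles
                for tri in ((a, b, d), (b, e, d)):
                    f.write("  facet normal 0 0 0\n    outer loop\n")
                    for x, y, z in tri:
                        f.write(f"      vertex {x} {y} {z}\n")
                    f.write("    endloop\n  endfacet\n")
        f.write("endsolid heightmap\n")

heightmap_to_stl([[0, 1], [1, 2]], "surface.stl")
```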

The final model in Vectric Cut3D, ready to be sent to the CNC router for carving.

After some more tweaking in Cut3D to get the appearance right and the toolpaths in order, I was ready to actually begin machining. After an abortive first attempt in which the router pulled up my workpiece and ate it, I spent some more time on the clamping for my second attempt. First, I used the router to cut a pocket in a piece of scrap plywood to act as my job clamp; this pocket matched the dimensions of my workpiece exactly. After a bit of drilling, I had my workpiece securely attached to the job clamp, which was screwed into the spoilboard on the router.

The CNC router doing its thing

For the actual routing itself, I used two tools: a 4mm ballnose mill and a 2mm endmill, for the roughing and finishing respectively. It took about 45 minutes for the CNC router to create this piece. I love the machine, and am very grateful to the Cambridge Makespace for the access I have to it. In the near future, I’m going to try using different CNC router tools and types of wood to make the final product look neater; specifically, a 1mm ballnose tool for the finishing toolpath would be very nice! I’m also going to try to make relief models of a few other interesting physical features.

The final product: A model of the Nanda Devi sanctuary in wood, based on data from the Space Shuttle and carved using a CNC router.

 

 

S696, Maps and Atlases: The Cambridge Map Room

I’ve re-entered the academic world as a student at the University of Cambridge in the United Kingdom, and one of the benefits I’m enjoying the most is near-unlimited access to one of the world’s largest repositories of recorded information: the Cambridge University Library. Commonly known as the UL, it is a legal deposit library, which means that under British law it has the right to request a free copy of any work published in the UK. Currently, the UL holds over 8 million items, including books, periodicals, magazines and, of course, maps.

 

The Map Room in the UL is a fascinating place; it functions as the reading room for the Map Department, which holds over a million maps (as the librarian told me; Wikipedia claims it has 1.5 million). It’s not a very large room, as reading rooms go, but it is a beautiful space and is very well managed. Everything is catalogued very efficiently with a filing-card system, with one card recording the name, date of publication and classmark (UID/coordinates) of each map. Visitors are not allowed to simply browse through the map collections; to refer to a map, one must fill out a request form with the appropriate details and submit it to the library assistants, who will then pull out the required map folio from its storage location. The title of this post comes from the fact that map holdings with classmarks beginning with ‘S696’, ‘Maps’ or ‘Atlases’ are held in the Map Room, in various drawers and cabinets.

The Map Room is a pen-free zone; if you’re writing something, use a pencil. Smartphones and hand-held cameras are allowed, but under UL policy photos cannot be taken of the building itself. With prior permission, however, it is possible to take images of material in the UL, which I did. The first series is from a map on display in the UL; titled “A map containing the towns villages gentlemen’s houses roads river and other remarks for 20 miles around London”, it was printed for a William Knight in 1710 and is a wonderful piece of cartography. The second series is from a map I requested using the card-index system; this map dates back to 1949 and beautifully illustrates the tea-growing regions of the Indian subcontinent.

If there’s a map in the UL you want an image of (for non-commercial or private-study purposes only!), I’d be happy to do what I can to help; I would actually be very grateful for an excuse to spend an afternoon looking at maps.

 

Using data from the Landsat 8 TIRS instrument to estimate surface* temperature

The Thermal InfraRed Sensor (TIRS) is a new instrument carried onboard the Landsat 8 satellite, dedicated to capturing thermal infrared radiation. Using the radiation measurements from the two electromagnetic spectral bands covered by this sensor, it is possible to estimate the temperature at the Earth’s surface (albeit at a 100m resolution, compared to the 30m resolution of the satellite’s other instrument, the Operational Land Imager).

The city-state of Delhi, India as visualised from band 10 of Landsat 8's TIRS instrument.

I used data from the TIRS to estimate the surface temperature in the city-state of Delhi, India as of the 29th of May, 2013. The relevant tarball containing the data was downloaded using the United States Geological Survey’s (USGS) EarthExplorer tool; the area of interest was encompassed by [scene identifier: path 146 row 040] in the WRS-2 scheme. I think I’ll leave the specific explanations of WRS-2, path/row values and the other miscellaneous small data-management operations for a later post; for now, I’ll just note that these are important things to know when actually obtaining this data. When the tarball is unpacked, the bands from the TIRS instrument are bands 10 and 11; the relevant .tif files are [“identifier”_B10.tif] and [“identifier”_B11.tif], and these were clipped to the administrative boundary of Delhi. There’s also a text file containing metadata: [“identifier”_MTL.txt] is essential for the math we’re going to do on these two bands.

Each pixel of processed Level 1 (L1) Landsat 8 OLI/TIRS data is stored as a Digital Number (DN) with a value between 0 and 65535 (2^16 - 1)**. This means that when you visualise the two TIRS bands in your GIS/image-manipulation program of choice, you’re going to see a grey-scale image, where dark is cool and white is hot. The two bands differ very slightly from each other; this is because each registers a different band of radiation, which is useful for correcting for atmospheric distortions.

To obtain actual surface-temperature information, we need to a) convert these DNs to top-of-the-atmosphere (ToA) radiance values, and then b) convert the ToA radiance values to ToA brightness temperature in kelvin. The interesting thing is that the satellite stores information as radiance values; it’s converted to DNs when processed from L0 (raw data) to L1, so we’re essentially reconverting this information to its original format.

Since the math does get a bit technical, I've posted the details at the end of this blogpost.

Since the TIRS has two bands, I conducted the mathematical operation on both, and then averaged the results. I know that there are more sophisticated ways of using both bands to accurately estimate surface temperature; as the USGS comment to this post has noted, I’ve only derived ToA brightness temperature here, and another step is required to calculate the actual surface temperature. In fact, a fellow Landsat 8 data user, Valerio de Luca, has informed me that a quadratic form of the split-window algorithm can be used for this purpose; thank you! I’ll update this post again once I test that. Finally, the conversion from kelvin to anything else has been sufficiently well documented that I’m going to leave it out; I’ve converted this map into degrees Celsius.
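(For those who want to script the two conversions, here’s a minimal numpy sketch for band 10. The ML, AL, K1 and K2 values below are the band-10 figures I’ve seen in Landsat 8 metadata files, but you should always read them out of your own scene’s [“identifier”_MTL.txt]; the DNs are made up for illustration.)

```python
import numpy as np

# Band-10 rescaling factors and thermal constants. These are the values
# typically published in Landsat 8 metadata (RADIANCE_MULT_BAND_10,
# RADIANCE_ADD_BAND_10, K1/K2_CONSTANT_BAND_10) -- always read the real
# numbers from your own metadata file.
ML, AL = 3.342e-4, 0.1
K1, K2 = 774.8853, 1321.0789

def dn_to_radiance(dn):
    """Equation I: Digital Numbers -> ToA spectral radiance."""
    return ML * dn + AL

def radiance_to_kelvin(radiance):
    """Equation II (inverted Planck): radiance -> ToA brightness temperature."""
    return K2 / np.log(K1 / radiance + 1.0)

# Made-up DNs standing in for pixels of a band-10 .tif
dn = np.array([20000, 25000, 30000])
temp_c = radiance_to_kelvin(dn_to_radiance(dn)) - 273.15
print(np.round(temp_c, 1))
```

Run the same two functions on band 11 with its own constants and average the two temperature grids to reproduce the map above.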


An optional step in the middle is to correct for atmospheric distortion and solar irradiance by converting the DNs to ToA planetary reflectance values and then converting these to ToA radiance values. This requires information such as Earth-Sun distance and the solar angle, some of which is in the metadata file. I’ll be experimenting with this later as well.
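(A sketch of that optional reflectance step, assuming the REFLECTANCE_MULT/ADD rescaling factors and the SUN_ELEVATION angle have been pulled from the metadata file; the default factor values below are the commonly published Landsat 8 ones, so verify them against your own scene.)

```python
import math

def dn_to_toa_reflectance(dn, sun_elev_deg, m_rho=2.0e-5, a_rho=-0.1):
    """Digital Number -> ToA planetary reflectance.

    m_rho and a_rho are the REFLECTANCE_MULT/ADD rescaling factors and
    sun_elev_deg is SUN_ELEVATION, all read from the MTL metadata file.
    """
    rho_prime = m_rho * dn + a_rho  # reflectance without solar-angle correction
    return rho_prime / math.sin(math.radians(sun_elev_deg))

# A made-up DN with the sun 65 degrees above the horizon
print(round(dn_to_toa_reflectance(30000, 65.0), 4))
```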

Please pitch in if you have any suggestions on improving these techniques; I would definitely welcome points on better ways to use data from the TIRS instrument!

*Update 1 (2:45am IST on the 9th of August): Firstly, I’d like to thank the USGS Landsat team, who’ve commented on this post; I’m very grateful for their encouragement and have updated the post accordingly. Please read the map title as ToA brightness temperature in Delhi; another step is required before it accurately depicts actual surface-temperature values.

**Update 2 (2105 GMT on the 2nd of November) – The Landsat 8 Digital Number range has been corrected to reflect the fact that the sensor stores 16-bit data; thanks for the comment, Ciprian!

***Update 3 (0110 IST on the 5th of November 2017) - This article has been cross-posted from geohackers.in

I’ve referred to the documents below while figuring the math and writing this blogpost; they’re very useful.

USGS equations re: Landsat 8 data

https://landsat.usgs.gov/Landsat8_Using_Product.php

Converting Digital Numbers to Kelvin

http://www.yale.edu/ceo/Documentation/Landsat_DN_to_Kelvin.pdf

Information re: ToA reflectance

http://www.pancroma.com/Landsat%20Top%20of%20Atmosphere%20Reflectance.html

GRASS methodology

http://grass.osgeo.org/grass64/manuals/i.landsat.toar.html

More information on the inverted Planck function

http://www.yale.edu/ceo/Documentation/ComputingThePlanckFunction.pdf

Markham et al. (2005): Processing data from the EO-1 ALI instrument

http://www.asprs.org/a/publications/proceedings/pecora16/Markham_B.pdf

Converting from spectral radiance to planetary reflectance

http://eo1.usgs.gov/faq/question?id=21

<math>

a)             Digital Numbers to ToA radiance values

Lλ = (ML * Qcal) + AL        (equation I)

where:

Lλ        =          TOA spectral radiance

ML       =          Band-specific multiplicative rescaling factor

AL        =          Band-specific additive rescaling factor

Qcal        =        Quantized and calibrated standard product pixel values (DN)

Human-readable format:

Spectral Radiance = Band-specific multiplicative rescaling factor times Digital Number, plus Band-specific additive rescaling factor, both from the metadata. From the Markham et al. (2005) paper on processing data from the EO-1 ALI instrument, I’m led to understand that the multiplicative and additive scaling factors result in ‘better preservation of the ALI’s precision and dynamic range in going from raw to calibrated data’; this probably holds true for the L8 instruments as well, but do correct me if I’m wrong.

In practice, this is:

[Spec_rad_B"X".tif] = RADIANCE_MULT_BAND_"X" * ["identifier"_B"X".tif] + RADIANCE_ADD_BAND_"X"

where "X" is the specific band number, the RADIANCE values are obtained from the metadata file and Spec_rad_B"X".tif is the output spectral radiance file name (user-determined).

b)            ToA radiance values to ToA brightness-temperature in Kelvin

T = K2 / ln((K1 / Lλ) + 1)        (equation II)

(The physics-inclined will realise that equation II above is the inverted form of the Planck function.)

where:

T          =          ToA brightness temperature (K)
Lλ        =          TOA spectral radiance
K1        =          Band-specific thermal conversion constant
K2        =          Band-specific thermal conversion constant

Human-readable format:

Temperature (kelvin) = Band-specific thermal conversion constant K2, divided by the natural log of (band-specific thermal conversion constant K1 divided by spectral radiance, plus 1)

In practice, this is:

[Temp_B"X".tif] = K2_CONSTANT_BAND_"X" / ln((K1_CONSTANT_BAND_"X" / [Spec_rad_B"X".tif]) + 1)

where "X" is the specific band number, the CONSTANT values are extracted from the metadata file and the Temp_B"X".tif file is the output temperature file name (user-determined).

 </math>

Landsat 8 data and maps of Delhi

This will be a relatively short post; I’ve been working with Landsat data for a few years now, and I find it absolutely fascinating. The new Landsat satellite, initially named the Landsat Data Continuity Mission and now known as Landsat 8, is actually only the seventh of the series to reach orbit; Landsat 6 never made it. When Landsat 8 was launched on the 11th of February 2013, I was really anxious and excited, and when it made it to orbit successfully, I was ecstatic. I downloaded my first set of Landsat data (Path 146/Row 040, covering the Indian city of Delhi) off the USGS EarthExplorer website last week, and have been tinkering with it ever since.


 

The map above depicts various band-combinations; each band covers a discrete section of the electromagnetic spectrum. While bands 2, 3 and 4 lie within the visible spectrum and correspond to the colours blue, green and red, the others extend into the non-visible electromagnetic spectrum. A variety of mathematical operations can be conducted using different band combinations to obtain meaningful information about the Earth’s surface. For example, the following maps depict the surface temperature in Delhi, with a close-up of the Yamuna, on the 29th of May. A longer post on Landsat 8, its instrumentation and the various use-cases will follow once I’ve had my fill of playing with this data.
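(As an illustration of how such composites are put together: a natural-colour image simply stacks bands 4, 3 and 2 as red, green and blue after a contrast stretch. The sketch below uses tiny made-up arrays in place of the real clipped bands.)

```python
import numpy as np

def percent_stretch(band, lo=2, hi=98):
    """Linear stretch between the 2nd and 98th percentiles, clipped to
    the 0-1 range -- a common contrast stretch for display."""
    a, b = np.percentile(band, [lo, hi])
    return np.clip((band - a) / (b - a), 0.0, 1.0)

def natural_colour(b4, b3, b2):
    """Stack red (band 4), green (band 3) and blue (band 2) into an
    (H, W, 3) RGB array ready for display."""
    return np.dstack([percent_stretch(b) for b in (b4, b3, b2)])

# Tiny made-up arrays standing in for the clipped band .tifs
b4 = np.array([[5000, 30000], [10000, 20000]])
b3 = np.array([[6000, 25000], [9000, 15000]])
b2 = np.array([[7000, 22000], [8000, 12000]])
rgb = natural_colour(b4, b3, b2)
print(rgb.shape)
```

Swapping other bands into the three slots gives the false-colour combinations shown in the map.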

Mapping Mandis

I was employed as a spatial data and cartographic consultant on a project to analyse specific agricultural commodities and Agricultural Produce Marketing Committees (APMCs) in the Indian states of Karnataka and Madhya Pradesh. The final product was a set of maps for various publications, as well as the clean datasets themselves.

This animated map depicts wheat procurement data from the state of Madhya Pradesh in India over a five-year period.

Agricultural market datasets for the states of Karnataka and Madhya Pradesh were obtained for the purposes of spatial visualisation; these contained information on wheat procurement in Madhya Pradesh (2008 – 2012), tuar production in Karnataka (2007 – 2009) and the locations and categories of APMCs in both these states. Some of the data was linked to district names, while the rest was geocoded using a free online geocoding service. I used Quantum GIS, TextEdit and Microsoft Excel extensively for this project; Excel and TextEdit are invaluable when processing CSV files, and QGIS is where all the actual mapping itself takes place.

This map was published in the article titled 'States of Wheat' in Vol. 47, Issue No. 52 of the Economic and Political Weekly.

The actual process itself involved lots of data-cleaning and a little bit of mapping. First, for the geocoding, I ran the column containing the village names through the geocoder thrice; at each repetition, I tweaked the names a little more to get more accurate coordinate results. I then had to similarly tweak the district names to get them to match up with my source shapefiles; fixing bad spellings can be a LOT of work. In its entirety, this was a tedious process that involved organising, cleaning and validating four distinct datasets with both automated and manual operations. However, the final products were datasets that were clean, had accurate spatial locations and could easily be used to produce analytically valuable maps.
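(The spelling-fixing step can be partly automated with Python’s standard library; difflib’s fuzzy matching catches most near-misses and leaves the rest for manual review. The district names below are a hypothetical sample, not the actual project data.)

```python
from difflib import get_close_matches

# Canonical district spellings from the source shapefile
# (a hypothetical sample, not the real project list)
canonical = ["Hoshangabad", "Vidisha", "Sehore", "Raisen"]

def match_district(name, cutoff=0.75):
    """Map a possibly misspelled district name onto the shapefile's
    spelling; returns None when nothing is close enough, leaving that
    record for manual review."""
    hits = get_close_matches(name, canonical, n=1, cutoff=cutoff)
    return hits[0] if hits else None

for raw in ["Hoshungabad", "Vidhisha", "Seore", "Gwalior"]:
    print(raw, "->", match_district(raw))
```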