
Tuesday, November 5, 2013

Supervised Classification

Supervised Classification using Bands 4, 5, and 6
The final regular season lab for Remote Sensing was a supervised classification of a developing city (Germantown in Maryland).  We began with an aerial image of the town along with coordinates for 12 known land cover types.  After locating these points in the image, I used the technique of growing spectral signatures from a seed.  This involves setting a Spectral Euclidean Distance (mostly an estimate at first) and letting ERDAS Imagine build a polygon of pixels starting at the point you designate and emanating out to the spectral distance you selected.  This produces very interesting results and it is both fun and useful to play a bit with the distance to get the best signature of the known terrain.

Once the signatures are complete, it is important to verify that there is no spectral confusion (overlap) between signatures.  I found the widest separation among the signatures in Bands 4, 5, and 6, so I used these bands to create the supervised image.  I also merged like signatures (we had multiple signatures for the same land cover, like "agriculture") into single classifications. Creating a set of classes also allows one to estimate the area dedicated in the image to each class.  The final supervised classification is presented above.
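The area estimate is just bookkeeping once the classes exist: count pixels per class and multiply by the pixel footprint. A minimal sketch, assuming 30 m Landsat pixels (the class codes and raster here are made up):

```python
from collections import Counter

PIXEL_AREA_M2 = 30 * 30  # assumes 30 m Landsat pixels

def class_areas_hectares(classified):
    """Tally pixels per class in a classified raster, convert to hectares."""
    counts = Counter(v for row in classified for v in row)
    return {cls: n * PIXEL_AREA_M2 / 10000 for cls, n in counts.items()}

# Toy classified raster: 1 = agriculture, 2 = urban, 3 = water
raster = [
    [1, 1, 2],
    [1, 2, 2],
    [3, 3, 2],
]
print(class_areas_hectares(raster))
```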

Thursday, October 31, 2013

Unsupervised Classification

Unsupervised Classification of the UWF Campus
This week in Remote Sensing we used an aerial image of the UWF campus to try some unsupervised classification techniques.  In unsupervised classification, we rely on the computer to group like pixels together and then assume that these groups represent similar features.  In the image above, we started by using ERDAS Imagine tools to create 50 unsupervised classes.  Then I manually assigned these 50 classes to 5 categories (Buildings/Roads, Trees, Grass, Shadows, and Mixed).  This process actually is pretty easy and goes quickly with the ERDAS tools.  This was one of the first assignments where I felt like the ERDAS tools were easier to use for this task than a similar task in ArcMap.  From the 5 new classifications, I was able to create an estimate of the permeable versus impermeable surfaces that make up the UWF campus.
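The recode-then-summarize step is easy to express outside of ERDAS too. Here is a toy sketch: the cluster-to-category table and which categories count as impermeable are hypothetical (a real table would cover all 50 clusters).

```python
# Hypothetical recode table: unsupervised cluster id -> land-cover category.
RECODE = {0: "Buildings/Roads", 1: "Grass", 2: "Trees"}
IMPERMEABLE = {"Buildings/Roads"}

def permeable_fraction(cluster_raster, recode, impermeable):
    """Recode cluster ids to categories, then return the share of pixels
    whose category counts as a permeable surface."""
    labels = [recode[v] for row in cluster_raster for v in row]
    permeable = sum(1 for lab in labels if lab not in impermeable)
    return permeable / len(labels)

raster = [
    [0, 0, 1],
    [2, 1, 1],
    [2, 2, 0],
]
print(permeable_fraction(raster, RECODE, IMPERMEABLE))  # 6 of 9 pixels
```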

Tuesday, October 22, 2013

Thermal Imagery and Analysis

Thermal Imagery using ETM+ Thermal and TM True Color

This week in Remote Sensing, we focused on Thermal Imagery.  Thermal imagery has all sorts of uses, from military to environmental.  I chose an environmental perspective for this lab project.

I started by creating ETMcomposite.img, a combination of 8 ETM images using ArcMap’s Composite Bands tool.  I wanted to work directly with the thermal band, so I started out with just band 6 of ETMcomposite selected.  I selected a color ramp that intuitively matches what we expect for cool to warm colors (blue to red).  By adjusting the Symbology/Histogram breakpoints, I was able to accentuate the range of values in the overall image, really increasing the contrast between warm and cool areas.  This is really when I noticed Santay Island for the first time.  It was so cool (as in cold) in the middle of this large urban area and really not very far away physically.  It seemed to make a good ecological point that these small refuges can exist within urban areas if we take care of them.  Unfortunately, it looks like someone is already clear cutting in the middle of the island and that can be seen clearly in the thermal image.
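Adjusting the histogram breakpoints is essentially a linear contrast stretch between two chosen cut points. A rough pure-Python sketch of the idea, using percentile breakpoints (the thermal values and the 2/98 percentile choice are just illustrative, not what ArcMap computed):

```python
def percentile(values, pct):
    """Nearest-rank percentile of a list of values."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * (len(ordered) - 1))))
    return ordered[k]

def stretch(band, low_pct=2, high_pct=98):
    """Linear stretch: clip to the low/high percentile breakpoints and
    rescale to 0-255, exaggerating warm/cool contrast."""
    lo, hi = percentile(band, low_pct), percentile(band, high_pct)
    span = hi - lo or 1
    return [round(255 * (min(max(v, lo), hi) - lo) / span) for v in band]

# Toy thermal-band values: mostly cool pixels plus a couple of hot ones.
thermal = [120, 122, 125, 130, 180, 200, 121, 124]
print(stretch(thermal))
```

After the stretch, the small differences among the cool pixels spread across many more display values, which is exactly what made a cold feature like Santay Island pop out.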

Tuesday, October 15, 2013

Multispectral Analysis


This week in Remote Sensing we experimented with multispectral analysis.  Using a Landsat image of western Washington state (my home state!), we attempted to determine what features were causing various spikes on histogram representations of the image.  In the first image, there was a prominent spike at the dark side of Band 4, indicating to me that this must represent all the water in the arms of Puget Sound.  I used TM False Natural Color to best present the water feature.


In the second image, we had smaller spikes in the bright end of the histogram.  This appears to be the snow on the peaks of the Olympic range.  True Color seemed the best representation for white snow.


Finally, some bodies of water appeared lighter and brighter than others. I captured this in the bottom map of Grays Harbor.  We can see that either a lot of sediment or algae is making this image appear lighter.  TM False Color IR really helps this brighter feature stand out in this image.
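Locating a histogram spike like the ones above boils down to binning the pixel values and finding the tallest bin. A toy sketch (the band values and bin width are invented for illustration):

```python
from collections import Counter

def histogram_spike(band_values, bin_width=10):
    """Bin pixel values and return (bin_start, count) for the tallest bin --
    the kind of spike that flagged water at the dark end of Band 4."""
    bins = Counter((v // bin_width) * bin_width for v in band_values)
    start, count = max(bins.items(), key=lambda kv: kv[1])
    return start, count

# Toy Band 4 values: many dark (water) pixels plus scattered brighter land.
band4 = [5, 7, 9, 6, 8, 5, 120, 150, 180, 90, 7, 6]
print(histogram_spike(band4))  # spike sits in the darkest bin
```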


Friday, October 11, 2013

Spatial Enhancements


This week we experimented with spatial enhancements on some imagery.  The original image was a panchromatic layer that exhibited pretty distinct banding.  Through the use of a Fourier transform and some experimentation with different convolutions, I was able to produce the image above.  This image attempts to minimize banding while finding a happy medium between being too generalized (low pass convolution) and too "edgy" (high pass convolution).  In the end, the banding is generalized so it is not distracting and the urban features remain relatively sharp and distinct.
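The lab used ERDAS's interactive Fourier tools, but the underlying trick can be sketched with numpy: horizontal banding concentrates energy along the vertical-frequency axis of the 2-D spectrum, so zeroing that column (away from the very low frequencies) suppresses the stripes. This is a rough notch-filter sketch on a synthetic image, not a reproduction of the convolutions I actually tried:

```python
import numpy as np

def remove_banding(image, keep_rows=2):
    """Suppress horizontal banding by zeroing the central column of the
    shifted 2-D FFT, except a small window around the DC component."""
    f = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    cr, cc = rows // 2, cols // 2
    notch = f.copy()
    # Horizontal stripes show up as energy along the spectrum's center column.
    notch[:cr - keep_rows, cc] = 0
    notch[cr + keep_rows + 1:, cc] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(notch)))

# Toy image: a smooth gradient plus strong period-4 horizontal banding.
y = np.arange(32).reshape(-1, 1)
x = np.arange(32).reshape(1, -1)
banded = x + y + 20 * np.sin(2 * np.pi * y / 4)
cleaned = remove_banding(banded)
print(float(np.var(banded[:, 0])), float(np.var(cleaned[:, 0])))
```

Widening the kept window around DC is the same trade-off as in the lab: keep too little and the image over-generalizes, keep too much and the banding survives.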

Monday, September 23, 2013

Intro to ERDAS Imagine


This week in Remote Sensing we were introduced to ERDAS Imagine.  ERDAS Imagine is a geospatial image processing application.  We started out with a pretty simple introduction to the application's menus and functions, working with an existing raster and creating a new raster from a subset selection.  The simple navigation and manipulations we did today were fairly easy; we'll see what comes next.  For the final product (above) we had to move back to ArcGIS to create our map.

I'll be honest, I'm not super excited about jumping into a new, complicated, and apparently buggy tool this late in the program.  I feel like we were just getting to the good stuff in ArcGIS and I'd much rather come away feeling like we plumbed the depths there.  Hopefully the pay-off will be worth this detour.

Sunday, September 22, 2013

Ground Truthing


Our Remote Sensing lab this week focused on accuracy and ground truthing the classification that we made last week.  We started with our existing LULC classifications and then created a set of sample points.  I recalled an ArcGIS function that creates a fishnet grid over a map and decided to try that as a means to implement systematic sampling.  When the grid was created (as a polyline) ArcGIS also created a set of points in the center of each cell as a separate shapefile.  This, it turns out, was perfect for what I needed: an evenly spaced, unbiased set of sample points.  I then used Google Street View to zoom in and verify that each point was indeed of the classification that I had assigned.
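Those cell-center points are easy to generate by hand as well. A minimal sketch of fishnet label points over a study-area extent (the extent and grid size below are made up, and this ignores map projections entirely):

```python
def fishnet_centers(xmin, ymin, xmax, ymax, ncols, nrows):
    """Center points of an ncols x nrows fishnet over the given extent --
    a systematic sample like the label points ArcGIS emits with a fishnet."""
    dx = (xmax - xmin) / ncols
    dy = (ymax - ymin) / nrows
    return [(xmin + (i + 0.5) * dx, ymin + (j + 0.5) * dy)
            for j in range(nrows) for i in range(ncols)]

# 4 x 3 grid over a 400 x 300 unit study area -> 12 sample points
points = fishnet_centers(0, 0, 400, 300, ncols=4, nrows=3)
print(len(points), points[0], points[-1])
```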

I ended up with about 90% success and 10% mis-classifications.  The bad classifications tended to be the Industrial areas (not so industrial).  Overall, though, I think it went pretty well and showed that we can discern the land use classifications of an urban area reasonably well.

Wednesday, September 18, 2013

LULC Analysis


This week's assignment in Remote Sensing focused on land use/land cover analysis of an aerial photo.  Our job was to analyze apparent land use and classify it according to a USGS two-level classification code.

The map seems to have two main “super-areas” – the bay and the urban land.  I began by creating a polygon for the whole bay (the shore and frame edges).  I then created a series of polygons for the wetland islands.  Later I would “erase” the bay with the islands to create a shapefile that was just the water areas of the bay (avoiding polygon overlap).

In the urban region, I began by identifying “natural” areas like the rivers, estuaries, lakes, and small forested areas. There didn’t seem to be any agricultural areas in this photo. These natural areas were pretty easy to identify, though I could see some difference of opinion on forest types and various classifications of wetland, estuary, and streams.  Similar to how the islands were handled in the bay, the deciduous forest was “unioned” and then the lakes “erased” so that the forest and lake polygons wouldn’t overlap.

Next, I started classifying the urban areas.  Several of these were quite easy to identify after the lecture and text descriptions – industrial areas, schools, and retail areas.  I was also quite happy to recognize the cemetery in the lower right.   The most difficult was the area I classified as “Commercial and Services” region that borders the highway.  There is quite a mix in there and some may even be residential.

Finally, everything that wasn’t otherwise classified in the Urban region, I deemed “Residential”. This is all the single residence housing that fills the urban region. To create this region I used the Urban Land polygon and “erased” a union of all the urban classified sub-regions out of it.
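The Erase steps above are really just set difference. ArcGIS does this on polygon geometry; the sketch below shows the same idea on rasterized cell footprints, since that fits in a few dependency-free lines (the urban block and sub-regions are toy data):

```python
def erase(region, erasers):
    """Set-difference on rasterized footprints: the cells of `region` not
    covered by any eraser -- the same idea as ArcGIS's Erase on polygons."""
    covered = set().union(*erasers) if erasers else set()
    return region - covered

# Cells as (row, col): the whole urban block, minus classified sub-regions.
urban = {(r, c) for r in range(4) for c in range(4)}
industrial = {(0, 0), (0, 1)}
school = {(3, 3)}
residential = erase(urban, [industrial, school])
print(len(residential))  # 16 cells minus the 3 already classified
```

Passing the union of every classified sub-region as the eraser, as in the Residential step, guarantees the result never overlaps anything already classified.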

Tuesday, September 17, 2013

Visual Interpretation


For our first Remote Sensing lab assignment, we began with the basics of the visual interpretation of aerial photographs. In our first exercise, we simply identified regions in the image (above) that fall along a five-step scale in tone (light to dark colors) and texture (smooth to rough).  These two features (tone and texture) can help us understand what we are looking down on in aerial images.  In the image above, for example, the very smooth area (lower left, in blue) is water while a very "rough" area is the section of residential housing left of center.
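Texture has a simple numeric analogue: the variance of pixel values in a small window. This toy sketch (window size and pixel values are my own, purely for illustration) shows why water reads "smooth" and housing reads "rough":

```python
def local_variance(image, r, c, win=1):
    """Variance in a (2*win+1)-square window around (r, c) -- a simple
    numeric proxy for photo-interpretation 'texture'."""
    vals = [image[i][j]
            for i in range(max(0, r - win), min(len(image), r + win + 1))
            for j in range(max(0, c - win), min(len(image[0]), c + win + 1))]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

smooth = [[50, 50, 50], [50, 50, 50], [50, 50, 50]]  # water-like, uniform tone
rough = [[10, 90, 20], [80, 15, 95], [25, 85, 30]]   # rooftops, yards, streets
print(local_variance(smooth, 1, 1), local_variance(rough, 1, 1))
```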


The second part of the exercise had us identifying objects using one of several strategies: Shape & Size, Shadows, Patterns, or Association.  Some objects are quite obvious just by their shape and size.  I happen to be quite familiar with docks and piers, for example, and the pier in the image above was easy for me to identify by shape.  Similarly, the water tower may not be easily identifiable with only a top-down view, but the shadow is quite distinct to anyone who has ever lived in the Midwest.  The parking lot appears to have a distinctive herringbone pattern to it.  Finally, we can use association to identify features.  The beach is not particularly distinctive in this view.  However, we know we have water in the image (from the wave pattern) so the association of this feature adjacent to the water is easy to make.