Table of Contents for
QGIS: Becoming a GIS Power User


QGIS: Becoming a GIS Power User by Alexander Bruy. Published by Packt Publishing, 2017
  1. Cover
  2. Table of Contents
  3. QGIS: Becoming a GIS Power User
  6. Credits
  7. Preface
  8. What you need for this learning path
  9. Who this learning path is for
  10. Reader feedback
  11. Customer support
  12. 1. Module 1
  13. 1. Getting Started with QGIS
  14. Running QGIS for the first time
  15. Introducing the QGIS user interface
  16. Finding help and reporting issues
  17. Summary
  18. 2. Viewing Spatial Data
  19. Dealing with coordinate reference systems
  20. Loading raster files
  21. Loading data from databases
  22. Loading data from OGC web services
  23. Styling raster layers
  24. Styling vector layers
  25. Loading background maps
  26. Dealing with project files
  27. Summary
  28. 3. Data Creation and Editing
  29. Working with feature selection tools
  30. Editing vector geometries
  31. Using measuring tools
  32. Editing attributes
  33. Reprojecting and converting vector and raster data
  34. Joining tabular data
  35. Using temporary scratch layers
  36. Checking for topological errors and fixing them
  37. Adding data to spatial databases
  38. Summary
  39. 4. Spatial Analysis
  40. Combining raster and vector data
  41. Vector and raster analysis with Processing
  42. Leveraging the power of spatial databases
  43. Summary
  44. 5. Creating Great Maps
  45. Labeling
  46. Designing print maps
  47. Presenting your maps online
  48. Summary
  49. 6. Extending QGIS with Python
  50. Getting to know the Python Console
  51. Creating custom geoprocessing scripts using Python
  52. Developing your first plugin
  53. Summary
  54. 2. Module 2
  55. 1. Exploring Places – from Concept to Interface
  56. Acquiring data for geospatial applications
  57. Visualizing GIS data
  58. The basemap
  59. Summary
  60. 2. Identifying the Best Places
  61. Raster analysis
  62. Publishing the results as a web application
  63. Summary
  64. 3. Discovering Physical Relationships
  65. Spatial join for a performant operational layer interaction
  66. The CartoDB platform
  67. Leaflet and an external API: CartoDB SQL
  68. Summary
  69. 4. Finding the Best Way to Get There
  70. OpenStreetMap data for topology
  71. Database importing and topological relationships
  72. Creating the travel time isochron polygons
  73. Generating the shortest paths for all students
  74. Web applications – creating safe corridors
  75. Summary
  76. 5. Demonstrating Change
  77. TopoJSON
  78. The D3 data visualization library
  79. Summary
  80. 6. Estimating Unknown Values
  81. Interpolated model values
  82. A dynamic web application – OpenLayers AJAX with Python and SpatiaLite
  83. Summary
  84. 7. Mapping for Enterprises and Communities
  85. The cartographic rendering of geospatial data – MBTiles and UTFGrid
  86. Interacting with Mapbox services
  87. Putting it all together
  88. Going further – local MBTiles hosting with TileStream
  89. Summary
  90. 3. Module 3
  91. 1. Data Input and Output
  92. Finding geospatial data on your computer
  93. Describing data sources
  94. Importing data from text files
  95. Importing KML/KMZ files
  96. Importing DXF/DWG files
  97. Opening a NetCDF file
  98. Saving a vector layer
  99. Saving a raster layer
  100. Reprojecting a layer
  101. Batch format conversion
  102. Batch reprojection
  103. Loading vector layers into SpatiaLite
  104. Loading vector layers into PostGIS
  105. 2. Data Management
  106. Joining layer data
  107. Cleaning up the attribute table
  108. Configuring relations
  109. Joining tables in databases
  110. Creating views in SpatiaLite
  111. Creating views in PostGIS
  112. Creating spatial indexes
  113. Georeferencing rasters
  114. Georeferencing vector layers
  115. Creating raster overviews (pyramids)
  116. Building virtual rasters (catalogs)
  117. 3. Common Data Preprocessing Steps
  118. Converting points to lines to polygons and back – QGIS
  119. Converting points to lines to polygons and back – SpatiaLite
  120. Converting points to lines to polygons and back – PostGIS
  121. Cropping rasters
  122. Clipping vectors
  123. Extracting vectors
  124. Converting rasters to vectors
  125. Converting vectors to rasters
  126. Building DateTime strings
  127. Geotagging photos
  128. 4. Data Exploration
  129. Listing unique values in a column
  130. Exploring numeric value distribution in a column
  131. Exploring spatiotemporal vector data using Time Manager
  132. Creating animations using Time Manager
  133. Designing time-dependent styles
  134. Loading BaseMaps with the QuickMapServices plugin
  135. Loading BaseMaps with the OpenLayers plugin
  136. Viewing geotagged photos
  137. 5. Classic Vector Analysis
  138. Selecting optimum sites
  139. Dasymetric mapping
  140. Calculating regional statistics
  141. Estimating density heatmaps
  142. Estimating values based on samples
  143. 6. Network Analysis
  144. Creating a simple routing network
  145. Calculating the shortest paths using the Road graph plugin
  146. Routing with one-way streets in the Road graph plugin
  147. Calculating the shortest paths with the QGIS network analysis library
  148. Routing point sequences
  149. Automating multiple route computation using batch processing
  150. Matching points to the nearest line
  151. Creating a routing network for pgRouting
  152. Visualizing the pgRouting results in QGIS
  153. Using the pgRoutingLayer plugin for convenience
  154. Getting network data from the OSM
  155. 7. Raster Analysis I
  156. Using the raster calculator
  157. Preparing elevation data
  158. Calculating a slope
  159. Calculating a hillshade layer
  160. Analyzing hydrology
  161. Calculating a topographic index
  162. Automating analysis tasks using the graphical modeler
  163. 8. Raster Analysis II
  164. Calculating NDVI
  165. Handling null values
  166. Setting extents with masks
  167. Sampling a raster layer
  168. Visualizing multispectral layers
  169. Modifying and reclassifying values in raster layers
  170. Performing supervised classification of raster layers
  171. 9. QGIS and the Web
  172. Using web services
  173. Using WFS and WFS-T
  174. Searching CSW
  175. Using WMS and WMS Tiles
  176. Using WCS
  177. Using GDAL
  178. Serving web maps with the QGIS server
  179. Scale-dependent rendering
  180. Hooking up web clients
  181. Managing GeoServer from QGIS
  182. 10. Cartography Tips
  183. Using Rule Based Rendering
  184. Handling transparencies
  185. Understanding the feature and layer blending modes
  186. Saving and loading styles
  187. Configuring data-defined labels
  188. Creating custom SVG graphics
  189. Making pretty graticules in any projection
  190. Making useful graticules in printed maps
  191. Creating a map series using Atlas
  192. 11. Extending QGIS
  193. Defining custom projections
  194. Working near the dateline
  195. Working offline
  196. Using the QspatiaLite plugin
  197. Adding plugins with Python dependencies
  198. Using the Python console
  199. Writing Processing algorithms
  200. Writing QGIS plugins
  201. Using external tools
  202. 12. Up and Coming
  203. Preparing LiDAR data
  204. Opening File Geodatabases with the OpenFileGDB driver
  205. Using Geopackages
  206. The PostGIS Topology Editor plugin
  207. The Topology Checker plugin
  208. GRASS Topology tools
  209. Hunting for bugs
  210. Reporting bugs
  211. Bibliography
  212. Index

Chapter 3. Discovering Physical Relationships

In this chapter, we will create an application for a raster physical modeling example. First, we'll use raster analysis to model the physical conditions for some basic hydrological analysis. Next, we'll redo these steps using a model automation tool. Then, we will attach the raster values to the vector objects for efficient lookup in a web application. Finally, we will use a cloud platform to enable dynamic queries from the client-side application code. We will take a look at an environmental planning case, giving stakeholders the ability to discover upstream toxic sites.

In this chapter, we will cover the following topics:

  • Hydrological modeling
  • Workflow automation with graphical models
  • Spatial relationships for performant access to information
  • The NNJoin plugin
  • The CartoDB cloud platform
  • Leaflet SQL queries using an external API: CartoDB

Hydrological modeling

The behavior of water is closely tied to the characteristics of the terrain's surface, particularly the values connected to elevation. In this section, we will use a basic hydrological model to analyze the location and direction of the hydrological network: streams, creeks, and rivers. To do this, we will use a digital elevation model (DEM): a raster grid in which the value of each cell is equal to the elevation at that location. A more complex model would employ additional physical parameters, such as infrastructure and vegetation. These modeling steps will lay the necessary foundation for our web application, which will display the upstream toxic sites (brownfields), both active and historical, for a given location.

There are a number of different plugins and Processing Framework algorithms (operations) that enable hydrological modeling. For this exercise, we will use SAGA algorithms, many of which are available, with some help from GDAL for the raster preparation. Note that some of the hydrological modeling operations may take much longer to finish than you are accustomed to (approximately an hour).

Preparing the data

Some work is needed to prepare the DEM data for hydrological modeling. The DEM path is c3/data/original/dem/dem.tif. Add this layer to the map (navigate to Layer | Add Layer | Add Raster Layer). Also, add the county shapefile at c3/data/original/county.shp (navigate to Layer | Add Layer | Add Vector Layer).

Filling the grid sinks

Filling the grid sinks smooths out the elevation surface to exclude the unusual low points in the surface that would cause the modeled streams to—unrealistically—drain to these local lows instead of to larger outlets. The steps to fill the grid sinks are as follows:

  1. Navigate to Processing Toolbox (Advanced Interface).
  2. Search for Fill Sinks (under SAGA | Terrain Analysis | Hydrology).
  3. Run the Fill Sinks tool.
  4. In addition to the default parameters, define DEM as dem and Filled DEM as c3/data/output/fill.tif.
  5. Click on Run, as shown in the following screenshot:
    Filling the grid sinks
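SAGA's Fill Sinks tool implements this idea far more robustly, but the core behavior can be sketched in a few lines of plain Python. The following is a minimal priority-flood variant (the grid and its values are invented for illustration; this is not SAGA's algorithm):

```python
import heapq

def fill_sinks(dem):
    """Raise cells inside closed depressions to their spill elevation.
    dem is a list of lists of elevation values."""
    rows, cols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    heap, seen = [], set()
    # Seed the queue with all edge cells: water can always drain off the grid.
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                heapq.heappush(heap, (filled[r][c], r, c))
                seen.add((r, c))
    # Flood inward from the lowest known outlet.
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                # A neighbor lower than the spill level is a sink: raise it.
                filled[nr][nc] = max(filled[nr][nc], z)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
                seen.add((nr, nc))
    return filled

# A one-cell pit (value 1) surrounded by higher ground:
dem = [[5, 5, 5],
       [5, 1, 5],
       [5, 5, 5]]
print(fill_sinks(dem)[1][1])  # → 5
```

The pit is raised to the elevation at which water would spill out of it, so every cell can drain to the grid edge instead of pooling in a local low.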

Clipping the grid to the study area by mask layer

By limiting the raster processing extent, we exclude unnecessary data, improving the speed of the operation. At the same time, we also output a more useful grid that conforms to our extent of interest. In QGIS/SAGA, in order to limit the processing to a fixed extent or area, it is necessary to eliminate those cells from the grid: in other words, to set the cells outside the area or extent to a null value, often referred to as NoData (or no-data, and so on) in raster software.

Tip

Unlike ArcGIS or GRASS, the SAGA package under QGIS does not offer a way to set an extent or area that limits the raster processing.

In QGIS, the raster processing extent limitation can be accomplished using a vector polygon, or a set of polygons, with the Clip raster by mask layer tool. The following steps achieve this:

  1. Navigate to Processing Toolbox (Advanced Interface).
  2. Search for Mask (under GDAL | Extraction).
  3. Run the Clip raster by mask layer tool.
  4. Enter the following parameters, keeping others as default:
    • Input layer: This is the layer corresponding to fill.tif, created in the previous Fill Sinks section
    • Mask layer: county
    • Output layer: c3/data/output/clip.tif
  5. Click on Run, as shown in the following screenshot:
    Clipping the grid to the study area by mask layer

Note

This function is not available in some versions of QGIS for Mac OS.

The output from the Clip raster by mask layer tool, showing the grid clipped to the county polygon, will look similar to the following image (the black and white color gradient, or the mapping to a null value, may be reversed):

Clipping the grid to the study area by mask layer
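The cell-masking idea described above can be sketched in plain Python: test each cell center against the mask polygon and set outside cells to a NoData value. This is a toy even-odd ray-casting test, not what GDAL does internally, and the NODATA value, grid, and polygon are all illustrative:

```python
NODATA = -9999

def point_in_polygon(x, y, poly):
    """Even-odd ray casting; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def clip_by_mask(grid, poly, cell_size=1.0):
    """Set cells whose centers fall outside poly to NODATA."""
    out = []
    for r, row in enumerate(grid):
        out_row = []
        for c, z in enumerate(row):
            # Cell center in grid coordinates (row 0 at the top).
            cx, cy = (c + 0.5) * cell_size, (r + 0.5) * cell_size
            out_row.append(z if point_in_polygon(cx, cy, poly) else NODATA)
        out.append(out_row)
    return out

# A 4x4 grid of ones, masked by a triangle covering its upper-left half:
grid = [[1, 1, 1, 1] for _ in range(4)]
clipped = clip_by_mask(grid, [(0, 0), (4, 0), (0, 4)])
print(clipped[0][0], clipped[3][3])  # → 1 -9999
```

Cells inside the mask keep their elevation; cells outside become NoData, which is exactly the state the Clip raster by mask layer tool leaves the grid in.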

Modeling the hydrological network based on elevation

Now that our elevation grid has been prepared, it is time to actually model the hydrological network location and direction. To do this, we will use Channel network and drainage basins, which requires only a single input: the (filled and clipped) elevation model. This tool will produce the hydrological lines using a Strahler order threshold, which relates to the hierarchy level of the returned streams (for example, to exclude very small ditches). The default of 5 is perfect for our purposes, including enough hydrological lines but not too many. The results look quite realistic. This tool also produces many additional related grids, which we do not need for this project. Perform the following steps:

  1. Navigate to Processing Toolbox (Advanced Interface).
  2. Search for Channel network and drainage basins (under SAGA | Terrain Analysis | Hydrology).
  3. Run the Channel network and drainage basins tool.
  4. In the Elevation field, input the filled and clipped DEM, given as the output in the previous section.
  5. In the Threshold field, keep it at the default value (5.0).
  6. In the Channels field, input c3/data/output/channels.shp.

    Ensure that Open output file after running algorithm is selected.

  7. Unselect Open output file after running algorithm for all other outputs.
  8. Click on Run, as shown in the following screenshot:
    Modeling the hydrological network based on elevation

The output from the Channel network and drainage basins, showing the hydrological line location, will look similar to the following image:

Modeling the hydrological network based on elevation
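The Strahler ordering used by the threshold above can be sketched for a toy stream network. SAGA computes this on the raster, not on a tree of segment ids; the segment names and the recursive helper below are hypothetical, chosen only to show the ordering rule:

```python
def strahler(tributaries):
    """Strahler order of a stream segment, given the orders of its
    upstream tributaries (an empty list means a headwater segment)."""
    if not tributaries:
        return 1
    top = max(tributaries)
    # The order increases only where two streams of the top order meet.
    return top + 1 if tributaries.count(top) >= 2 else top

# A small network: each segment lists its direct upstream segments.
network = {
    "outlet": ["a", "b"],
    "a": ["a1", "a2"],   # two order-1 headwaters join: order 2
    "b": [],             # a single headwater: order 1
    "a1": [], "a2": [],
}

def order_of(segment):
    return strahler([order_of(t) for t in network[segment]])

print(order_of("outlet"))  # → 2
```

Raising the Threshold parameter keeps only segments whose order reaches that level, which is why a higher value drops the small ditches and a lower one floods the map with minor channels.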

Workflow automation with the graphical models

Graphical Modeler is a tool within QGIS that is useful for modeling and automating workflows. It differs from batch processing in that you can tie together many separate operations into a processing sequence. It is part of the Processing framework. Graphical Modeler is particularly useful for workflows that contain many steps to be repeated.

By building a graphical model, we can operationalize our hydrological modeling process. This provides a few benefits, as follows:

  • Our modeling process is graphically documented and preserved
  • The model can be rerun in its entirety with little to no interaction
  • The model can be redistributed
  • The model is parameterized so that we could rerun the same process on different data layers
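The way the modeler wires one algorithm's output into the next algorithm's input can be sketched as a tiny parameterized pipeline in Python. The step functions here are string-building stand-ins, not the real SAGA/GDAL tools, and the runner is an illustration of the idea rather than how Processing is implemented:

```python
def run_model(steps, inputs):
    """Run a sequence of named steps over a shared results dict,
    mirroring how the modeler feeds outputs into later inputs."""
    results = dict(inputs)
    for name, func, in_keys, out_key in steps:
        results[out_key] = func(*(results[k] for k in in_keys))
    return results

# Toy stand-ins for the three algorithms used in this chapter:
fill = lambda dem: dem + "->filled"
clip = lambda dem, mask: f"{dem}->clipped_by_{mask}"
channels = lambda dem: dem + "->channels"

model = [
    ("Fill Sinks",  fill,     ["elevation"],        "filled"),
    ("Clip raster", clip,     ["filled", "extent"], "clipped"),
    ("Channels",    channels, ["clipped"],          "channels"),
]

out = run_model(model, {"elevation": "dem", "extent": "county"})
print(out["channels"])  # → dem->filled->clipped_by_county->channels
```

Because the inputs dict is supplied at run time, the same chain of steps can be rerun on different layers, which is exactly the parameterization benefit listed above.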

Creating a graphical model

  1. Bring up the Graphical Modeler dialog from the Processing menu.
    1. Navigate to Processing | Graphical Modeler.
  2. Enter a model name and a group name.
  3. Save your model under c3/data/output/c3.model.

    The dialog is modal and must be closed before you can return to other work in QGIS, so saving early is useful.

Adding the input parameters

Some of the inputs to your model's algorithms will be the outputs of other model algorithms; for others, you will need to add a corresponding input parameter.

Adding the raster parameter – elevation

We will add the first data input parameter to the model so that it is available to the model algorithms. It is our original DEM elevation data. Perform the following steps:

  1. Select the Inputs tab from the lower left corner of the Processing modeler display.
  2. Drag Raster layer from the parameters list into the modeler pane. This parameter will represent our elevation grid (DEM).
  3. Input elevation for Parameter name.
  4. Click on OK, as shown in the following screenshot:
    Adding the raster parameter – elevation

Adding the vector parameter – extent

We will add the next data input parameter to the model so that it is available to the model algorithms. It is our vector county data and the extent of our study.

  1. Add a vector layer for our extent polygon (county). Make sure you select Polygon as the type, and call this parameter extent.
  2. You will need to input a parameter name. It would be easiest to use the same layer/parameter names that we have been using so far, as shown in the following screenshot:
    Adding the vector parameter – extent

Adding the algorithms

The modeler connects the individual algorithms to their input data, and their output data to other algorithms. We will now add the algorithms.

Fill Sinks

The first algorithm we will add is Fill Sinks, which as we noted earlier, removes the problematic low elevations from the elevation data. Perform the following steps:

  1. Select the Algorithms tab from the lower-left corner.
  2. After you drag in an algorithm, you will be prompted to choose the parameters.
  3. Use the search input to locate Fill Sinks and then open.
  4. Select elevation for the DEM parameter and click on OK, as shown in the following screenshot:
    Fill Sinks

Clip raster

The next algorithm we will add is Clip raster by mask layer, which we used earlier to limit the extent of the subsequent raster processing. Perform the following steps:

  1. Use the search input to locate Clip raster by mask layer.
  2. Select 'Filled DEM' from algorithm 'Fill Sinks' for the Input layer parameter.
  3. Select extent for the Mask layer parameter.
  4. Click on OK, accepting the other parameter defaults, as shown in the following screenshot:
    Clip raster

Channel network and drainage basins

The final algorithm we will add is Channel network and drainage basins, which produces a model of our hydrological network. Perform the following steps:

  1. Use the search input to locate Channel network and drainage basins.
  2. Select 'Output Layer' from algorithm 'Clip raster by mask layer' for the Elevation parameter.
  3. Click on OK, accepting the other parameter defaults.
  4. Once you populate all three algorithms, your model will look similar to the following image:
    Channel network and drainage basins

Running the model

Now that our model is complete, we can execute all the steps in an automated sequential fashion:

  1. Run your model by clicking on the Run model button on the right-hand side of the row of buttons.
  2. You'll be prompted to select values for the elevation and the extent input layer parameters you defined earlier. Select the dem and county layers for these inputs, respectively, as shown in the following screenshot:
    Running the model
  3. After you define and run your model, all the outputs you defined earlier will be produced. These will be located at the paths that you defined in the parameters dialog or in the model algorithms themselves.

Note

If you don't specify an output directory, the data will be saved to the temp directory for the processing framework, for example:

C:\Users\[YOURUSERNAME]\AppData\Local\Temp\processing\
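On other platforms, the equivalent per-user temp location can be found with Python's standard library. This is a sketch; the "processing" subfolder name is taken from the example path above and may differ between QGIS versions:

```python
import os
import tempfile

# Cross-platform guess at the processing framework's temp output folder:
processing_tmp = os.path.join(tempfile.gettempdir(), "processing")
print(processing_tmp)
```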

Now that we've completed the hydrological modeling, we'll look at a technique for preparing our outputs for dynamic web interaction.