Table of Contents for
QGIS: Becoming a GIS Power User

QGIS: Becoming a GIS Power User by Alexander Bruy, published by Packt Publishing, 2017
  1. Cover
  2. Table of Contents
  3. QGIS: Becoming a GIS Power User
  4. QGIS: Becoming a GIS Power User
  5. QGIS: Becoming a GIS Power User
  6. Credits
  7. Preface
  8. What you need for this learning path
  9. Who this learning path is for
  10. Reader feedback
  11. Customer support
  12. 1. Module 1
  13. 1. Getting Started with QGIS
  14. Running QGIS for the first time
  15. Introducing the QGIS user interface
  16. Finding help and reporting issues
  17. Summary
  18. 2. Viewing Spatial Data
  19. Dealing with coordinate reference systems
  20. Loading raster files
  21. Loading data from databases
  22. Loading data from OGC web services
  23. Styling raster layers
  24. Styling vector layers
  25. Loading background maps
  26. Dealing with project files
  27. Summary
  28. 3. Data Creation and Editing
  29. Working with feature selection tools
  30. Editing vector geometries
  31. Using measuring tools
  32. Editing attributes
  33. Reprojecting and converting vector and raster data
  34. Joining tabular data
  35. Using temporary scratch layers
  36. Checking for topological errors and fixing them
  37. Adding data to spatial databases
  38. Summary
  39. 4. Spatial Analysis
  40. Combining raster and vector data
  41. Vector and raster analysis with Processing
  42. Leveraging the power of spatial databases
  43. Summary
  44. 5. Creating Great Maps
  45. Labeling
  46. Designing print maps
  47. Presenting your maps online
  48. Summary
  49. 6. Extending QGIS with Python
  50. Getting to know the Python Console
  51. Creating custom geoprocessing scripts using Python
  52. Developing your first plugin
  53. Summary
  54. 2. Module 2
  55. 1. Exploring Places – from Concept to Interface
  56. Acquiring data for geospatial applications
  57. Visualizing GIS data
  58. The basemap
  59. Summary
  60. 2. Identifying the Best Places
  61. Raster analysis
  62. Publishing the results as a web application
  63. Summary
  64. 3. Discovering Physical Relationships
  65. Spatial join for a performant operational layer interaction
  66. The CartoDB platform
  67. Leaflet and an external API: CartoDB SQL
  68. Summary
  69. 4. Finding the Best Way to Get There
  70. OpenStreetMap data for topology
  71. Database importing and topological relationships
  72. Creating the travel time isochron polygons
  73. Generating the shortest paths for all students
  74. Web applications – creating safe corridors
  75. Summary
  76. 5. Demonstrating Change
  77. TopoJSON
  78. The D3 data visualization library
  79. Summary
  80. 6. Estimating Unknown Values
  81. Interpolated model values
  82. A dynamic web application – OpenLayers AJAX with Python and SpatiaLite
  83. Summary
  84. 7. Mapping for Enterprises and Communities
  85. The cartographic rendering of geospatial data – MBTiles and UTFGrid
  86. Interacting with Mapbox services
  87. Putting it all together
  88. Going further – local MBTiles hosting with TileStream
  89. Summary
  90. 3. Module 3
  91. 1. Data Input and Output
  92. Finding geospatial data on your computer
  93. Describing data sources
  94. Importing data from text files
  95. Importing KML/KMZ files
  96. Importing DXF/DWG files
  97. Opening a NetCDF file
  98. Saving a vector layer
  99. Saving a raster layer
  100. Reprojecting a layer
  101. Batch format conversion
  102. Batch reprojection
  103. Loading vector layers into SpatiaLite
  104. Loading vector layers into PostGIS
  105. 2. Data Management
  106. Joining layer data
  107. Cleaning up the attribute table
  108. Configuring relations
  109. Joining tables in databases
  110. Creating views in SpatiaLite
  111. Creating views in PostGIS
  112. Creating spatial indexes
  113. Georeferencing rasters
  114. Georeferencing vector layers
  115. Creating raster overviews (pyramids)
  116. Building virtual rasters (catalogs)
  117. 3. Common Data Preprocessing Steps
  118. Converting points to lines to polygons and back – QGIS
  119. Converting points to lines to polygons and back – SpatiaLite
  120. Converting points to lines to polygons and back – PostGIS
  121. Cropping rasters
  122. Clipping vectors
  123. Extracting vectors
  124. Converting rasters to vectors
  125. Converting vectors to rasters
  126. Building DateTime strings
  127. Geotagging photos
  128. 4. Data Exploration
  129. Listing unique values in a column
  130. Exploring numeric value distribution in a column
  131. Exploring spatiotemporal vector data using Time Manager
  132. Creating animations using Time Manager
  133. Designing time-dependent styles
  134. Loading BaseMaps with the QuickMapServices plugin
  135. Loading BaseMaps with the OpenLayers plugin
  136. Viewing geotagged photos
  137. 5. Classic Vector Analysis
  138. Selecting optimum sites
  139. Dasymetric mapping
  140. Calculating regional statistics
  141. Estimating density heatmaps
  142. Estimating values based on samples
  143. 6. Network Analysis
  144. Creating a simple routing network
  145. Calculating the shortest paths using the Road graph plugin
  146. Routing with one-way streets in the Road graph plugin
  147. Calculating the shortest paths with the QGIS network analysis library
  148. Routing point sequences
  149. Automating multiple route computation using batch processing
  150. Matching points to the nearest line
  151. Creating a routing network for pgRouting
  152. Visualizing the pgRouting results in QGIS
  153. Using the pgRoutingLayer plugin for convenience
  154. Getting network data from the OSM
  155. 7. Raster Analysis I
  156. Using the raster calculator
  157. Preparing elevation data
  158. Calculating a slope
  159. Calculating a hillshade layer
  160. Analyzing hydrology
  161. Calculating a topographic index
  162. Automating analysis tasks using the graphical modeler
  163. 8. Raster Analysis II
  164. Calculating NDVI
  165. Handling null values
  166. Setting extents with masks
  167. Sampling a raster layer
  168. Visualizing multispectral layers
  169. Modifying and reclassifying values in raster layers
  170. Performing supervised classification of raster layers
  171. 9. QGIS and the Web
  172. Using web services
  173. Using WFS and WFS-T
  174. Searching CSW
  175. Using WMS and WMS Tiles
  176. Using WCS
  177. Using GDAL
  178. Serving web maps with the QGIS server
  179. Scale-dependent rendering
  180. Hooking up web clients
  181. Managing GeoServer from QGIS
  182. 10. Cartography Tips
  183. Using Rule Based Rendering
  184. Handling transparencies
  185. Understanding the feature and layer blending modes
  186. Saving and loading styles
  187. Configuring data-defined labels
  188. Creating custom SVG graphics
  189. Making pretty graticules in any projection
  190. Making useful graticules in printed maps
  191. Creating a map series using Atlas
  192. 11. Extending QGIS
  193. Defining custom projections
  194. Working near the dateline
  195. Working offline
  196. Using the QspatiaLite plugin
  197. Adding plugins with Python dependencies
  198. Using the Python console
  199. Writing Processing algorithms
  200. Writing QGIS plugins
  201. Using external tools
  202. 12. Up and Coming
  203. Preparing LiDAR data
  204. Opening File Geodatabases with the OpenFileGDB driver
  205. Using Geopackages
  206. The PostGIS Topology Editor plugin
  207. The Topology Checker plugin
  208. GRASS Topology tools
  209. Hunting for bugs
  210. Reporting bugs
  211. Bibliography
  212. Index

Selecting optimum sites

Optimum site selection is a common problem, for example, when planning shop or warehouse locations or when looking for a new apartment. In this recipe, you will learn how to perform optimum site selection manually using tools from the Processing Toolbox, and you will also see how to automate this workflow by creating a Processing model.

For the optimum site selection in this recipe, we will combine several vector analysis tools to find potential locations in Wake County that match the following criteria:

  • Locations are near a big lake (up to 500 m)
  • Locations are close to an elementary school (up to 500 m)
  • Locations are within a reasonable distance (up to 2 km) from a high school
  • Locations are at least 1 km from a main road

Getting ready

To follow this exercise, load the following datasets: lakes.shp, schools_wake.shp, and roadsmajor.shp.

As all datasets in our test data already use the same CRS, we can get right to the analysis. If you are using different data, you may have to get all your datasets into the same CRS first. In this case, please refer to Chapter 1, Data Input and Output.
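
If you prefer working in the Python console, the same layers can be loaded and their CRSs checked with a few lines of PyQGIS. This is a minimal sketch assuming the QGIS 3.x API; the data directory path is a placeholder that you need to adjust to wherever your copy of the sample data lives:

    # Load the three sample datasets and print their CRS (QGIS 3.x Python console).
    from qgis.utils import iface

    data_dir = '/path/to/sample_data/'  # placeholder -- adjust to your data location
    for name in ('lakes', 'schools_wake', 'roadsmajor'):
        layer = iface.addVectorLayer(data_dir + name + '.shp', name, 'ogr')
        if layer is None:
            raise IOError('Could not load ' + name)
        print(name, layer.crs().authid())  # all three should report the same CRS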

How to do it…

The following steps show you how to perform optimum site selection using the Processing Toolbox (a scripted equivalent for the Python console follows the step list):

  1. First, we have to filter the lakes layer for big lakes. To do this, we use the Select by expression tool from the Processing Toolbox, select the lakes layer, and enter "AREA" > 1000000 AND "FTYPE" = 'LAKE/POND' in the Expression textbox, as shown in the following screenshot:
    [Screenshot: the Select by expression dialog with the lake filter expression]
  2. Next, we create the buffers that will represent the proximity areas around lakes, schools, and roads. Use Fixed distance buffer from the Processing Toolbox to create the following buffers:
    1. For the lakes, set Distance to 500 meters and enable Dissolve result by checking the box, as shown in the following screenshot. By dissolving the result, we make sure that overlapping buffer areas are combined into one polygon; otherwise, each buffer remains a separate feature in the resulting layer:
      [Screenshot: the Fixed distance buffer dialog with Dissolve result checked]

      Tip

      It's your choice whether you want to save the buffer results permanently by specifying an output file or work with temporary files by leaving the Buffer output file field empty.

    2. To create the elementary school buffers, first select only the schools with "GLEVEL" = 'E' using the Select by expression tool, as we did for the lakes. Then, buffer the selection by 500 meters with Dissolve result enabled, just as for the lakes buffer.
    3. Repeat the process for the high schools using "GLEVEL" = 'H' and a buffer distance of 2,000 meters.
    4. Finally, for the roads, create a buffer with a distance of 1,000 meters.
  3. With all these buffers ready, we can now combine them to fulfill these rules:
    1. Use the Intersection tool from the Processing Toolbox on the buffers around elementary and high schools to get the areas that are within the vicinity of both school types.
    2. Use the Intersection tool on the lake buffers and the result of the previous step to limit the results to lakeside areas.
    3. Use the Difference tool to remove the areas around major roads (that is, the buffered road layer) from the result of the previous step.
  4. Check the resulting layer to view the potential sites that fit all the criteria that we previously specified. You'll find that there is only one area close to WAKEFIELD ELEMENTARY and WAKEFIELD HIGH that fits the bill, as shown in the following screenshot:
    [Screenshot: the resulting candidate site near WAKEFIELD ELEMENTARY and WAKEFIELD HIGH]
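
If you prefer to script this workflow rather than click through the dialogs, the following sketch reproduces the steps above in the Python console. It assumes the QGIS 3.x Processing API: the native:* algorithm IDs and parameter names are the current equivalents of the toolbox tools shown in the screenshots (the older toolbox used names such as Fixed distance buffer), and the three layers are assumed to be loaded as described in the Getting ready section:

    import processing
    from qgis.core import QgsProject

    # Fetch the loaded layers by name (see the Getting ready section).
    lakes   = QgsProject.instance().mapLayersByName('lakes')[0]
    schools = QgsProject.instance().mapLayersByName('schools_wake')[0]
    roads   = QgsProject.instance().mapLayersByName('roadsmajor')[0]

    def extract(layer, expression):
        # Keep only the features matching the expression (temporary layer).
        return processing.run('native:extractbyexpression',
                              {'INPUT': layer, 'EXPRESSION': expression,
                               'OUTPUT': 'memory:'})['OUTPUT']

    def buffered(layer, distance):
        # Dissolved buffer; the distance is in layer units (meters here).
        return processing.run('native:buffer',
                              {'INPUT': layer, 'DISTANCE': distance,
                               'DISSOLVE': True, 'OUTPUT': 'memory:'})['OUTPUT']

    # Steps 1 and 2: filter and buffer.
    big_lakes  = extract(lakes, '"AREA" > 1000000 AND "FTYPE" = \'LAKE/POND\'')
    elementary = extract(schools, '"GLEVEL" = \'E\'')
    high       = extract(schools, '"GLEVEL" = \'H\'')

    near_lakes      = buffered(big_lakes, 500)
    near_elementary = buffered(elementary, 500)
    near_high       = buffered(high, 2000)
    near_roads      = buffered(roads, 1000)

    # Step 3: near both school types, near a big lake, away from major roads.
    near_schools = processing.run('native:intersection',
                                  {'INPUT': near_elementary, 'OVERLAY': near_high,
                                   'OUTPUT': 'memory:'})['OUTPUT']
    candidates = processing.run('native:intersection',
                                {'INPUT': near_schools, 'OVERLAY': near_lakes,
                                 'OUTPUT': 'memory:'})['OUTPUT']
    result = processing.run('native:difference',
                            {'INPUT': candidates, 'OVERLAY': near_roads,
                             'OUTPUT': 'memory:'})['OUTPUT']

    result.setName('optimum_sites')
    QgsProject.instance().addMapLayer(result)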

How it works…

In step 3, we used Intersection to model the requirement that our preferred site be near both an elementary and a high school, and then the Difference tool enabled us to remove areas close to major roads. The following figure gives an overview of the available vector overlay tools that can be useful for similar analyses. For example, Union could be used to model requirements such as "close to at least an elementary or a high school", while Symmetrical Difference would result in "close to an elementary or a high school, but not both" (a short scripted comparison follows the figure):

[Figure: overview of the vector overlay tools: Intersection, Union, Symmetrical Difference, and Difference]
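
To make the comparison concrete, here is a small sketch of the two alternative requirements expressed with the same Processing API as in the previous snippet (near_elementary and near_high are the school buffers created there):

    import processing

    # "Close to at least an elementary or a high school": union of the buffers.
    either_school = processing.run('native:union',
                                   {'INPUT': near_elementary, 'OVERLAY': near_high,
                                    'OUTPUT': 'memory:'})['OUTPUT']

    # "Close to an elementary or a high school, but not both": symmetrical difference.
    one_type_only = processing.run('native:symmetricaldifference',
                                   {'INPUT': near_elementary, 'OVERLAY': near_high,
                                    'OUTPUT': 'memory:'})['OUTPUT']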

There's more…

We were lucky and found a potential site that matched all the criteria. Of course, this is not always the case, and you may have to adjust your criteria to find a matching site. As you can imagine, it can be tedious and time-consuming to repeat these steps again and again with different settings. Therefore, it's a good idea to create a Processing model to automate this task.

The model (as shown in the following screenshot) basically contains the same tools that we used in the manual process, as follows:

  • Use two Select by expression instances to select elementary and high schools. As you can see in the following screenshot, we used the descriptions Select "GLEVEL" = 'E' and Select "GLEVEL" = 'H' to name these model steps.
  • For elementary schools, compute Fixed distance buffers of 500 meters. This step is called Buffer "GLEVEL" = 'E'.
  • For high schools, compute Fixed distance buffers of 2,000 meters. This step is called Buffer "GLEVEL" = 'H'.
  • Select the big lakes using Select by expression (refer to the Select big lakes step) and buffer them using a Fixed distance buffer of 500 meters (refer to the Buffer lakes step).
  • Buffer the roads using Fixed distance buffer (refer to the Buffer roads step). The buffer size is controlled by a number model input called road_buffer_size. You can extend this approach of controlling model parameters through additional inputs to all the other buffer steps in this model. (We chose to show only one example in order to keep the model screenshot readable.)
  • Use Intersection to get areas near schools (refer to the Intersection: near schools step).
  • Use Intersection to get areas near schools and lakes (refer to the Intersection: schools and lakes step).
  • Use Difference to remove areas near roads (refer to the Difference: avoid roads step).

This is what the final model looks like:

[Screenshot: the final Processing model]

You can run this model from the Processing Toolbox, or even use it as a building block in other models. It is worth noting that this model produces intermediate results in the form of buffer layers (near_elementary, near_highschool, and so on). While these intermediate results are useful when developing and debugging the model, you may eventually want to remove them. This can be done by editing the buffer steps and removing the Buffer <OutputVector> names.
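
Once saved, the model can also be invoked from the Python console like any other Processing algorithm. The following sketch is based on assumptions: the model ID (model:optimum_sites) and the input name (road_buffer_size) depend on the names you chose when saving the model, so check the algorithm help for your own model first:

    import processing

    # Print the exact parameter names of the saved model (the ID is hypothetical).
    processing.algorithmHelp('model:optimum_sites')

    # Re-run the analysis with a different road buffer size, e.g. 1,500 meters.
    results = processing.run('model:optimum_sites', {'road_buffer_size': 1500})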