Table of Contents for
QGIS: Becoming a GIS Power User


QGIS: Becoming a GIS Power User by Alexander Bruy. Published by Packt Publishing, 2017.
  1. Cover
  2. Table of Contents
  3. QGIS: Becoming a GIS Power User
  4. QGIS: Becoming a GIS Power User
  5. QGIS: Becoming a GIS Power User
  6. Credits
  7. Preface
  8. What you need for this learning path
  9. Who this learning path is for
  10. Reader feedback
  11. Customer support
  12. 1. Module 1
  13. 1. Getting Started with QGIS
  14. Running QGIS for the first time
  15. Introducing the QGIS user interface
  16. Finding help and reporting issues
  17. Summary
  18. 2. Viewing Spatial Data
  19. Dealing with coordinate reference systems
  20. Loading raster files
  21. Loading data from databases
  22. Loading data from OGC web services
  23. Styling raster layers
  24. Styling vector layers
  25. Loading background maps
  26. Dealing with project files
  27. Summary
  28. 3. Data Creation and Editing
  29. Working with feature selection tools
  30. Editing vector geometries
  31. Using measuring tools
  32. Editing attributes
  33. Reprojecting and converting vector and raster data
  34. Joining tabular data
  35. Using temporary scratch layers
  36. Checking for topological errors and fixing them
  37. Adding data to spatial databases
  38. Summary
  39. 4. Spatial Analysis
  40. Combining raster and vector data
  41. Vector and raster analysis with Processing
  42. Leveraging the power of spatial databases
  43. Summary
  44. 5. Creating Great Maps
  45. Labeling
  46. Designing print maps
  47. Presenting your maps online
  48. Summary
  49. 6. Extending QGIS with Python
  50. Getting to know the Python Console
  51. Creating custom geoprocessing scripts using Python
  52. Developing your first plugin
  53. Summary
  54. 2. Module 2
  55. 1. Exploring Places – from Concept to Interface
  56. Acquiring data for geospatial applications
  57. Visualizing GIS data
  58. The basemap
  59. Summary
  60. 2. Identifying the Best Places
  61. Raster analysis
  62. Publishing the results as a web application
  63. Summary
  64. 3. Discovering Physical Relationships
  65. Spatial join for a performant operational layer interaction
  66. The CartoDB platform
  67. Leaflet and an external API: CartoDB SQL
  68. Summary
  69. 4. Finding the Best Way to Get There
  70. OpenStreetMap data for topology
  71. Database importing and topological relationships
  72. Creating the travel time isochron polygons
  73. Generating the shortest paths for all students
  74. Web applications – creating safe corridors
  75. Summary
  76. 5. Demonstrating Change
  77. TopoJSON
  78. The D3 data visualization library
  79. Summary
  80. 6. Estimating Unknown Values
  81. Interpolated model values
  82. A dynamic web application – OpenLayers AJAX with Python and SpatiaLite
  83. Summary
  84. 7. Mapping for Enterprises and Communities
  85. The cartographic rendering of geospatial data – MBTiles and UTFGrid
  86. Interacting with Mapbox services
  87. Putting it all together
  88. Going further – local MBTiles hosting with TileStream
  89. Summary
  90. 3. Module 3
  91. 1. Data Input and Output
  92. Finding geospatial data on your computer
  93. Describing data sources
  94. Importing data from text files
  95. Importing KML/KMZ files
  96. Importing DXF/DWG files
  97. Opening a NetCDF file
  98. Saving a vector layer
  99. Saving a raster layer
  100. Reprojecting a layer
  101. Batch format conversion
  102. Batch reprojection
  103. Loading vector layers into SpatiaLite
  104. Loading vector layers into PostGIS
  105. 2. Data Management
  106. Joining layer data
  107. Cleaning up the attribute table
  108. Configuring relations
  109. Joining tables in databases
  110. Creating views in SpatiaLite
  111. Creating views in PostGIS
  112. Creating spatial indexes
  113. Georeferencing rasters
  114. Georeferencing vector layers
  115. Creating raster overviews (pyramids)
  116. Building virtual rasters (catalogs)
  117. 3. Common Data Preprocessing Steps
  118. Converting points to lines to polygons and back – QGIS
  119. Converting points to lines to polygons and back – SpatiaLite
  120. Converting points to lines to polygons and back – PostGIS
  121. Cropping rasters
  122. Clipping vectors
  123. Extracting vectors
  124. Converting rasters to vectors
  125. Converting vectors to rasters
  126. Building DateTime strings
  127. Geotagging photos
  128. 4. Data Exploration
  129. Listing unique values in a column
  130. Exploring numeric value distribution in a column
  131. Exploring spatiotemporal vector data using Time Manager
  132. Creating animations using Time Manager
  133. Designing time-dependent styles
  134. Loading BaseMaps with the QuickMapServices plugin
  135. Loading BaseMaps with the OpenLayers plugin
  136. Viewing geotagged photos
  137. 5. Classic Vector Analysis
  138. Selecting optimum sites
  139. Dasymetric mapping
  140. Calculating regional statistics
  141. Estimating density heatmaps
  142. Estimating values based on samples
  143. 6. Network Analysis
  144. Creating a simple routing network
  145. Calculating the shortest paths using the Road graph plugin
  146. Routing with one-way streets in the Road graph plugin
  147. Calculating the shortest paths with the QGIS network analysis library
  148. Routing point sequences
  149. Automating multiple route computation using batch processing
  150. Matching points to the nearest line
  151. Creating a routing network for pgRouting
  152. Visualizing the pgRouting results in QGIS
  153. Using the pgRoutingLayer plugin for convenience
  154. Getting network data from the OSM
  155. 7. Raster Analysis I
  156. Using the raster calculator
  157. Preparing elevation data
  158. Calculating a slope
  159. Calculating a hillshade layer
  160. Analyzing hydrology
  161. Calculating a topographic index
  162. Automating analysis tasks using the graphical modeler
  163. 8. Raster Analysis II
  164. Calculating NDVI
  165. Handling null values
  166. Setting extents with masks
  167. Sampling a raster layer
  168. Visualizing multispectral layers
  169. Modifying and reclassifying values in raster layers
  170. Performing supervised classification of raster layers
  171. 9. QGIS and the Web
  172. Using web services
  173. Using WFS and WFS-T
  174. Searching CSW
  175. Using WMS and WMS Tiles
  176. Using WCS
  177. Using GDAL
  178. Serving web maps with the QGIS server
  179. Scale-dependent rendering
  180. Hooking up web clients
  181. Managing GeoServer from QGIS
  182. 10. Cartography Tips
  183. Using Rule Based Rendering
  184. Handling transparencies
  185. Understanding the feature and layer blending modes
  186. Saving and loading styles
  187. Configuring data-defined labels
  188. Creating custom SVG graphics
  189. Making pretty graticules in any projection
  190. Making useful graticules in printed maps
  191. Creating a map series using Atlas
  192. 11. Extending QGIS
  193. Defining custom projections
  194. Working near the dateline
  195. Working offline
  196. Using the QspatiaLite plugin
  197. Adding plugins with Python dependencies
  198. Using the Python console
  199. Writing Processing algorithms
  200. Writing QGIS plugins
  201. Using external tools
  202. 12. Up and Coming
  203. Preparing LiDAR data
  204. Opening File Geodatabases with the OpenFileGDB driver
  205. Using Geopackages
  206. The PostGIS Topology Editor plugin
  207. The Topology Checker plugin
  208. GRASS Topology tools
  209. Hunting for bugs
  210. Reporting bugs
  211. Bibliography
  212. Index

Chapter 6. Estimating Unknown Values

In this chapter, we will use interpolation methods to estimate the unknown values at one location based on the known values at other locations.

Interpolation is a technique for estimating unknown values based entirely on their geographic relationship to values at known locations. Because space can be measured with infinite precision, data measurement is always limited by the data collector's finite resources. Interpolation and other, more sophisticated spatial estimation techniques are useful for estimating values at locations that have not been measured. In this chapter, you will learn how to interpolate values from weather station data, which will be scored and used in a model of vulnerability to a particular agricultural condition: mildew. We've subset the weather data to a single month of the year during which vulnerability is historically high. An end user could use this application to ground truth the model, that is, to match high or low predicted vulnerability with the presence or absence of mildew. If the model were extended historically or to near real time, the application could be used to see trends in vulnerability over time or to indicate that a grower needs to take action to prevent mildew. The parameters, including precipitation, relative humidity, and temperature, were selected because they are used in real models that predict the vulnerability of fields and crops to mildew.

In this chapter, we will cover the following topics:

  • Adding data from MySQL
  • Using the NetCDF multidimensional data format
  • Interpolating the unknown values for visualization and reporting
  • Applying a simple algebraic risk model
  • Using the Python GDAL wrappers to filter and update data through SQLite queries
  • Interpolation
  • Map algebra modeling
  • Sampling a raster grid with a layer of gridded points
  • Python CGI hosting
  • Testing and debugging during CGI development
  • The Python SpatiaLite/SQLite3 wrapper
  • Generating an OpenLayers 3 (OL3) map with the QGIS plugin
  • Adding AJAX interactivity to an OL3 map
  • Dynamic response in the OL3 pixel popup

Importing the data

Often, the data to be used in a highly interactive, dynamic web application is stored in an existing enterprise database. Although these are usually not spatial databases, they often contain coordinate locations that can easily be leveraged in a spatial application.

Connecting and importing from MySQL in QGIS

The following section is provided as an illustration only; database installation and setup are needlessly time-consuming for a short demonstration of their use.

Note

If you do wish to install and set up MySQL, you can download it from http://dev.mysql.com/downloads/. MySQL Community Server is freely available under the open source GPL license. You will also want to install MySQL Workbench and MySQL Utilities, which are available at the same location, to interact with your new MySQL Community Server instance. You can then restore the database used in this demonstration from MySQL Workbench using the Data Import/Restore command with the provided backup file (c6/original/packt.sql).

To connect to and add data from your MySQL database to your QGIS project, you need to do the following (again, this is for demonstration only; you do not need to install and set up a database to follow along):

  1. Navigate to Layer | Add Layer | Add vector layer.
    • Source type: Database
    • Type: MySQL, as shown in the following screenshot:
    [Screenshot: Connecting and importing from MySQL in QGIS]
  2. Once you've indicated that you wish to add a MySQL Database layer, you will have the option to create a new connection. In Connections, click on New. In the dialog that opens, enter the following parameters, which we would have initially set up when we created our MySQL Database and imported the .sql backup of the packt schema:
    • Name: packt
    • Host: localhost
    • Database: packt
    • Port: 3306
    • Username: packt
    • Password: packt, as shown in the following screenshot:
    [Screenshot: Connecting and importing from MySQL in QGIS]
  3. Click on Test Connect.
  4. Click on OK.
  5. Click on Open, and the Select vector layers to add dialog will appear.
  6. From the Select vector layers dialog, click on Select All. This includes the following layers:
    • fields
    • precipitation
    • relative_humidity
    • temperature
  7. Click on OK.

The layers (actually just the data tables) from the MySQL Database will now appear in the QGIS Layers panel of your project.
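
If you prefer scripting, the same tables can be added from the QGIS Python Console through the OGR MySQL driver. The following is a minimal sketch only, assuming the demo connection parameters shown above:

    # Connection string format documented for GDAL/OGR's MySQL driver;
    # the credentials mirror the dialog entries above.
    uri = "MySQL:packt,host=localhost,port=3306,user=packt,password=packt,tables=fields"
    # iface is the QGIS interface object available in the Python Console.
    layer = iface.addVectorLayer(uri, "fields", "ogr")
    if layer is None:
        print("Could not load the fields table")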

Converting to spatial format

The fields layer (table) is the only one of the four tables we added to our project that has latitude and longitude fields. We want this table to be recognized by QGIS as geospatial data and its coordinate pairs to be plotted. Perform the following steps:

  1. Export the fields layer as CSV by right-clicking on the layer under the Layers panel and then clicking on Save as.
  2. In the Save vector layer as… dialog, perform the following steps:
    1. Click on Browse to choose a filesystem path to store the new .csv file. This file is included in the data under c6/data/output/fields.csv.
    2. For GEOMETRY, select <Default>.
    3. All the other default fields can remain as they are given.
    4. Click on OK to save the new CSV, as shown in the following screenshot:
    [Screenshot: Converting to spatial format]

Now, to import the CSV with the coordinate fields that are recognized as geospatial data and to plot the locations, perform the following steps:

  1. From the Layer menu, navigate to Add Layer | Add Delimited Text Layer.
  2. In the Create a Layer from a Delimited Text File dialog, perform the following steps:
    1. Click on the Browse… button and browse to the location where you previously saved your fields.csv file (for example, c6/data/output/fields.csv).
    2. All the other parameters should be correctly populated by default. Take a look at the following image.
    3. Click on OK to create the new layer in your QGIS project.
    [Screenshot: Converting to spatial format]

You will receive a notification that, as no coordinate system was detected in this file, WGS 1984 was assigned. This is the correct coordinate system in our case, so no further intervention is necessary. After you dismiss this message, you will see the field locations plotted on your map. If you don't, right-click on the new layer and select Zoom to Layer.

Note that this new layer is not reflected in a new file on the filesystem but is only stored with this QGIS project. This would be a good time to save your project.
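
The delimited text import can also be scripted from the Python Console. This is a sketch only; the coordinate column names (longitude and latitude) and the file path are assumptions about the exported CSV, so adjust them to match your file:

    # Build a delimited text provider URI pointing at the exported CSV.
    uri = ("file:///path/to/c6/data/output/fields.csv"
           "?delimiter=,&xField=longitude&yField=latitude&crs=EPSG:4326")
    # iface is available in the QGIS Python Console.
    layer = iface.addVectorLayer(uri, "fields_csv", "delimitedtext")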

Finally, join the other tables (precipitation, relative_humidity, and temperature) to the newly plotted layer (fields) using the field_id field from each table, one at a time. For a refresher on how to do this, refer to the Table join section of Chapter 1, Exploring Places – from Concept to Interface. To export each joined layer as a separate shapefile (one per join with precipitation, relative_humidity, and temperature), right-click on the layer, click on Save as, set the path you want to save to, and then save it.
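
For reference, a join like this can also be set up programmatically. Here is a minimal sketch using the QGIS 3 API, assuming fields_layer and precip_layer hold the already loaded layers:

    # Join the precipitation table to the plotted fields layer on field_id.
    join = QgsVectorLayerJoinInfo()
    join.setJoinLayer(precip_layer)      # the non-spatial table
    join.setJoinFieldName("field_id")    # key column in the joined table
    join.setTargetFieldName("field_id")  # key column in the fields layer
    fields_layer.addJoin(join)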

The layer/table relations

The newer versions of QGIS support layer/table relations, which would allow us to model the one-to-many relationship between our locations and an abstract measurement class that would include all the parameters. However, the use of table relations is limited to a preliminary exploration of the relationships between layer objects and tables; layer/table relations are not recognized by any processing functions. Perform the following steps to explore the one-to-many layer/table relations (a scripted equivalent appears after these steps):

  1. Add a relation by navigating to Project | Project Properties | Relations. The following image is what you will see once the relationships to the three tables are established:
    [Screenshot: The layer/table relations]
  2. To add a relation, select a nonlayer table (for example, precipitation) in the Referencing Layer (Child) field and a location table (for example, fields) in the Referenced Layer (Parent) field. Use the common Id field (for example, field_id), which references the layer, to relate the tables. The name field can be filled arbitrarily, as shown in the following screenshot:
    [Screenshot: The layer/table relations]
  3. Now, to use the relation, click on a geographic object in the parent layer using the identify tool (you need to check Auto open form in the identify tool options panel). You'll see all the child entities (rows) connected to this object.
    [Screenshot: The layer/table relations]
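
As mentioned above, the same relations can be created from the Python Console instead of the Project Properties dialog. A sketch using the QGIS 3 QgsRelation API, again assuming precip_layer and fields_layer are already loaded:

    # Define the one-to-many relation: many precipitation rows per field.
    rel = QgsRelation()
    rel.setId("precip_fields")
    rel.setName("precipitation")
    rel.setReferencingLayer(precip_layer.id())  # child table
    rel.setReferencedLayer(fields_layer.id())   # parent layer
    rel.addFieldPair("field_id", "field_id")    # child field, parent field
    QgsProject.instance().relationManager().addRelation(rel)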

NetCDF

Network Common Data Form (NetCDF) is a standard, and powerful, format for environmental data such as meteorological data. NetCDF's strong suit is holding multidimensional data. With its abstract concept of a dimension, NetCDF can handle latitude, longitude, and time in the same way that it handles other physical, continuous, or ordinal data scales, such as air pressure levels.

For this project, we used the monthly global gridded high-resolution station (land) data for air temperature and precipitation from 1901 to 2010, a NetCDF dataset that the University of Delaware maintains in collaboration with NOAA. You can download further data from this source at http://www.esrl.noaa.gov/psd/data/gridded/data.UDel_AirT_Precip.html.
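
A quick way to see these dimensions and variables is to list the file's subdatasets with GDAL's Python bindings. A short sketch, assuming the file path used later in this chapter:

    from osgeo import gdal

    ds = gdal.Open("c6/data/original/air.mon.mean.v301.nc")
    # Each NetCDF variable is exposed as a subdataset named
    # NETCDF:"<file>":<variable>.
    for name, description in ds.GetSubDatasets():
        print(name, "->", description)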

Viewing NetCDF in QGIS

While there is a plugin available, NetCDF can be viewed directly in QGIS, in GDAL via the command line, and in the QGIS Python Console. Perform the following steps:

  1. Navigate to Layer | Add Raster Layer.
  2. Browse to c6/data/original/air.mon.mean.v301.nc and add this layer.
  3. Navigate to Raster | Miscellaneous | Information to find the range of the values in a band. In the initial dialog, click on OK to go to the information dialog, and then look for air_valid_range. You can see this information highlighted in the following image. Although QGIS's classifier will calculate the range for you, it is often thrown off by a numeric nodata value, which typically skews the range to the lower end; a scripted way to read the same metadata is shown after these steps.
    [Screenshot: Viewing NetCDF in QGIS]
  4. Enter the range information (-90 to 50) into the Style tab of the Layer Properties dialog.
  5. Click on Invert to show cool to hot colors from less to more, just as you would expect with temperature.
  6. Click on Classify to create the new bins based on the number and color range. The following screenshot shows what an ideal selection of bins and colors would look like:
    [Screenshot: Viewing NetCDF in QGIS]
  7. Click on OK. The end result will look similar to the following image:
    [Screenshot: Viewing NetCDF in QGIS]
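
The valid-range lookup from step 3 can also be scripted. A sketch using GDAL's Python bindings, which opens the air temperature variable as a subdataset and prints the same report as the Information dialog:

    from osgeo import gdal

    # Open the air temperature variable of the NetCDF file as a subdataset.
    ds = gdal.Open('NETCDF:"c6/data/original/air.mon.mean.v301.nc":air')
    # gdal.Info() (GDAL >= 2.1) returns the gdalinfo report; look for the
    # valid_range metadata entry to find the -90 to 50 range used above.
    print(gdal.Info(ds))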

To make the gridded NetCDF data accessible to certain models, to databases, and to web interaction, you could write a workflow program similar to the following, which samples the gridded values and attaches them to the points for each time period.
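
The original listing is not reproduced here; instead, here is a minimal sketch of such a workflow. It assumes a fields table in a SQLite database with field_id, longitude, and latitude columns and an air_temp column to fill; these names, like the output path, are illustrative only:

    import sqlite3
    from osgeo import gdal

    # Open the temperature variable; each raster band is one monthly time step.
    ds = gdal.Open('NETCDF:"c6/data/original/air.mon.mean.v301.nc":air')
    gt = ds.GetGeoTransform()
    grid = ds.GetRasterBand(ds.RasterCount).ReadAsArray()  # most recent month

    conn = sqlite3.connect("c6/data/output/fields.sqlite")
    cur = conn.cursor()
    rows = cur.execute(
        "SELECT field_id, longitude, latitude FROM fields").fetchall()
    for field_id, lon, lat in rows:
        if lon < 0:
            lon += 360  # this grid stores longitudes as 0-360 degrees
        # Convert geographic coordinates to pixel/line via the geotransform.
        px = int((lon - gt[0]) / gt[1])
        py = int((lat - gt[3]) / gt[5])
        cur.execute("UPDATE fields SET air_temp = ? WHERE field_id = ?",
                    (float(grid[py, px]), field_id))
    conn.commit()
    conn.close()

From here, the sampled values could be scored and fed into the mildew vulnerability model described at the start of this chapter.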