Table of Contents for
QGIS: Becoming a GIS Power User


QGIS: Becoming a GIS Power User by Alexander Bruy. Published by Packt Publishing, 2017.
  1. Cover
  2. Table of Contents
  3. QGIS: Becoming a GIS Power User
  4. QGIS: Becoming a GIS Power User
  5. QGIS: Becoming a GIS Power User
  6. Credits
  7. Preface
  8. What you need for this learning path
  9. Who this learning path is for
  10. Reader feedback
  11. Customer support
  12. 1. Module 1
  13. 1. Getting Started with QGIS
  14. Running QGIS for the first time
  15. Introducing the QGIS user interface
  16. Finding help and reporting issues
  17. Summary
  18. 2. Viewing Spatial Data
  19. Dealing with coordinate reference systems
  20. Loading raster files
  21. Loading data from databases
  22. Loading data from OGC web services
  23. Styling raster layers
  24. Styling vector layers
  25. Loading background maps
  26. Dealing with project files
  27. Summary
  28. 3. Data Creation and Editing
  29. Working with feature selection tools
  30. Editing vector geometries
  31. Using measuring tools
  32. Editing attributes
  33. Reprojecting and converting vector and raster data
  34. Joining tabular data
  35. Using temporary scratch layers
  36. Checking for topological errors and fixing them
  37. Adding data to spatial databases
  38. Summary
  39. 4. Spatial Analysis
  40. Combining raster and vector data
  41. Vector and raster analysis with Processing
  42. Leveraging the power of spatial databases
  43. Summary
  44. 5. Creating Great Maps
  45. Labeling
  46. Designing print maps
  47. Presenting your maps online
  48. Summary
  49. 6. Extending QGIS with Python
  50. Getting to know the Python Console
  51. Creating custom geoprocessing scripts using Python
  52. Developing your first plugin
  53. Summary
  54. 2. Module 2
  55. 1. Exploring Places – from Concept to Interface
  56. Acquiring data for geospatial applications
  57. Visualizing GIS data
  58. The basemap
  59. Summary
  60. 2. Identifying the Best Places
  61. Raster analysis
  62. Publishing the results as a web application
  63. Summary
  64. 3. Discovering Physical Relationships
  65. Spatial join for a performant operational layer interaction
  66. The CartoDB platform
  67. Leaflet and an external API: CartoDB SQL
  68. Summary
  69. 4. Finding the Best Way to Get There
  70. OpenStreetMap data for topology
  71. Database importing and topological relationships
  72. Creating the travel time isochron polygons
  73. Generating the shortest paths for all students
  74. Web applications – creating safe corridors
  75. Summary
  76. 5. Demonstrating Change
  77. TopoJSON
  78. The D3 data visualization library
  79. Summary
  80. 6. Estimating Unknown Values
  81. Interpolated model values
  82. A dynamic web application – OpenLayers AJAX with Python and SpatiaLite
  83. Summary
  84. 7. Mapping for Enterprises and Communities
  85. The cartographic rendering of geospatial data – MBTiles and UTFGrid
  86. Interacting with Mapbox services
  87. Putting it all together
  88. Going further – local MBTiles hosting with TileStream
  89. Summary
  90. 3. Module 3
  91. 1. Data Input and Output
  92. Finding geospatial data on your computer
  93. Describing data sources
  94. Importing data from text files
  95. Importing KML/KMZ files
  96. Importing DXF/DWG files
  97. Opening a NetCDF file
  98. Saving a vector layer
  99. Saving a raster layer
  100. Reprojecting a layer
  101. Batch format conversion
  102. Batch reprojection
  103. Loading vector layers into SpatiaLite
  104. Loading vector layers into PostGIS
  105. 2. Data Management
  106. Joining layer data
  107. Cleaning up the attribute table
  108. Configuring relations
  109. Joining tables in databases
  110. Creating views in SpatiaLite
  111. Creating views in PostGIS
  112. Creating spatial indexes
  113. Georeferencing rasters
  114. Georeferencing vector layers
  115. Creating raster overviews (pyramids)
  116. Building virtual rasters (catalogs)
  117. 3. Common Data Preprocessing Steps
  118. Converting points to lines to polygons and back – QGIS
  119. Converting points to lines to polygons and back – SpatiaLite
  120. Converting points to lines to polygons and back – PostGIS
  121. Cropping rasters
  122. Clipping vectors
  123. Extracting vectors
  124. Converting rasters to vectors
  125. Converting vectors to rasters
  126. Building DateTime strings
  127. Geotagging photos
  128. 4. Data Exploration
  129. Listing unique values in a column
  130. Exploring numeric value distribution in a column
  131. Exploring spatiotemporal vector data using Time Manager
  132. Creating animations using Time Manager
  133. Designing time-dependent styles
  134. Loading BaseMaps with the QuickMapServices plugin
  135. Loading BaseMaps with the OpenLayers plugin
  136. Viewing geotagged photos
  137. 5. Classic Vector Analysis
  138. Selecting optimum sites
  139. Dasymetric mapping
  140. Calculating regional statistics
  141. Estimating density heatmaps
  142. Estimating values based on samples
  143. 6. Network Analysis
  144. Creating a simple routing network
  145. Calculating the shortest paths using the Road graph plugin
  146. Routing with one-way streets in the Road graph plugin
  147. Calculating the shortest paths with the QGIS network analysis library
  148. Routing point sequences
  149. Automating multiple route computation using batch processing
  150. Matching points to the nearest line
  151. Creating a routing network for pgRouting
  152. Visualizing the pgRouting results in QGIS
  153. Using the pgRoutingLayer plugin for convenience
  154. Getting network data from the OSM
  155. 7. Raster Analysis I
  156. Using the raster calculator
  157. Preparing elevation data
  158. Calculating a slope
  159. Calculating a hillshade layer
  160. Analyzing hydrology
  161. Calculating a topographic index
  162. Automating analysis tasks using the graphical modeler
  163. 8. Raster Analysis II
  164. Calculating NDVI
  165. Handling null values
  166. Setting extents with masks
  167. Sampling a raster layer
  168. Visualizing multispectral layers
  169. Modifying and reclassifying values in raster layers
  170. Performing supervised classification of raster layers
  171. 9. QGIS and the Web
  172. Using web services
  173. Using WFS and WFS-T
  174. Searching CSW
  175. Using WMS and WMS Tiles
  176. Using WCS
  177. Using GDAL
  178. Serving web maps with the QGIS server
  179. Scale-dependent rendering
  180. Hooking up web clients
  181. Managing GeoServer from QGIS
  182. 10. Cartography Tips
  183. Using Rule Based Rendering
  184. Handling transparencies
  185. Understanding the feature and layer blending modes
  186. Saving and loading styles
  187. Configuring data-defined labels
  188. Creating custom SVG graphics
  189. Making pretty graticules in any projection
  190. Making useful graticules in printed maps
  191. Creating a map series using Atlas
  192. 11. Extending QGIS
  193. Defining custom projections
  194. Working near the dateline
  195. Working offline
  196. Using the QspatiaLite plugin
  197. Adding plugins with Python dependencies
  198. Using the Python console
  199. Writing Processing algorithms
  200. Writing QGIS plugins
  201. Using external tools
  202. 12. Up and Coming
  203. Preparing LiDAR data
  204. Opening File Geodatabases with the OpenFileGDB driver
  205. Using Geopackages
  206. The PostGIS Topology Editor plugin
  207. The Topology Checker plugin
  208. GRASS Topology tools
  209. Hunting for bugs
  210. Reporting bugs
  211. Bibliography
  212. Index

Georeferencing rasters

Sometimes, you have a paper map, an image of a map from the Internet, or even a raster file with no projection data included. When working with these types of data, the first thing you'll need to do is reference them to existing spatial data so that they will work with your other data and GIS tools. This recipe will walk you through the process of referencing your raster (image) data, known as georeferencing.

Getting ready

You'll need a raster that lacks spatial reference information; that is, one whose projection is reported as unknown by QGIS. You'll also need a second layer (a reference map) whose projection is known and that you can use to pick reference points. The exception is when you have a paper map with coordinates marked on it, or a spatial dataset that simply didn't come with a reference file but whose CRS/SRS definition you happen to know. Load your reference map in QGIS.

This book's data includes a scanned USGS topographic map (o38121e7.tif) that's missing its projection information. This map covers Davis, CA, so the example data includes plenty of other possible reference layers; the streets layer, for example, would be a good choice.

Tip

Actually, the world file was just renamed to o38121e7.tfw.orig so that QGIS wouldn't detect it. You can use this later to compare your georeference quality.
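
If you want to confirm that QGIS really sees no georeferencing, a quick check with the GDAL Python bindings works too. This is only a minimal sketch; the filename matches the example data, but any raster path will do.

    # Sketch: confirm a raster carries no usable georeferencing.
    from osgeo import gdal

    ds = gdal.Open('o38121e7.tif')
    # A default geotransform of (0, 1, 0, 0, 0, 1) means "not georeferenced".
    print('Geotransform:', ds.GetGeoTransform())
    # An empty projection string also indicates no CRS is attached.
    print('Projection  :', ds.GetProjection() or '(none)')

For comparison later, the renamed world file (o38121e7.tfw.orig) holds the original affine coefficients that you can check your own result against.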

How to do it…

On the Raster menu, open the Georeferencer tool and perform the following steps:

  1. Use the file dialog to open your unknown map in the Georeferencing tool.
  2. Create Ground Control Points (GCPs): pairs of matching locations, with start coordinates in your unknown map and end coordinates in your reference data.

    Tip

    Building corners, street intersections, and other spots where line features cross or where a distinct edge can be found all make good control points.

  3. Add a point in your unknown map with the Add Point button. You can then either enter the coordinates directly (for instance, if it's a paper map with known coordinates marked on it) or select the matching location from the reference layer in the main QGIS window.
  4. Repeat this process to find at least four matches. If you want a really good fit, aim for 10 to 20 matches.
  5. (Optional) Save your GCPs as a text file for future reference and troubleshooting:

    Tip

    Try to spread out your control points so that you have good coverage of the whole map. It's all about averaging the differences.

  6. Now, choose Transformation Settings, as follows:
  7. You have a choice of transformation type here. Generally, you'll want one of the Polynomial options: you need at least 4 points for the first order, 6 for the second order, and 10 for the third order. The second order is the recommended choice here. Transformation types are discussed further in the There's more… section of this recipe.
  8. Set Target SRS to the same projection as the reference layer (in this case, EPSG:26910, UTM Zone 10N).
  9. Set Output Raster to a different name from the original so that you can easily tell them apart.

    Tip

    Save your GCP list to a file. If you don't like the results, you can come back, try a different algorithm, or change the number of GCPs used. If you want a reference for comparison, look at the o38121e6.tif.points text file in this book's data folder.

  10. When you're happy with your list of GCPs, click on Start Georeferencing in the File menu or on the green triangular button. If you'd rather script this step, a GDAL-based alternative is sketched after these steps.
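
The same workflow can be scripted with the GDAL Python bindings, which is handy when you have many rasters to reference. This is only a sketch under assumptions: the pixel/map coordinate pairs below are made-up placeholders, and you'd normally collect at least six points before switching to a second-order polynomial.

    # Sketch: georeference a raster from GCPs with the GDAL Python bindings.
    # The pixel/line -> map coordinate pairs are placeholders; replace them
    # with your own GCPs (map coordinates here are assumed to be EPSG:26910).
    from osgeo import gdal

    gcps = [
        # gdal.GCP(map_x, map_y, z, pixel, line)
        gdal.GCP(609000.0, 4270000.0, 0, 120.5, 340.2),
        gdal.GCP(615000.0, 4270500.0, 0, 2310.0, 310.8),
        gdal.GCP(609500.0, 4262000.0, 0, 150.3, 3105.4),
        gdal.GCP(614800.0, 4261500.0, 0, 2280.7, 3140.1),
    ]

    # Attach the GCPs and their CRS to a copy of the unreferenced raster.
    gdal.Translate('o38121e7_gcps.tif', 'o38121e7.tif',
                   GCPs=gcps, outputSRS='EPSG:26910')

    # Warp the copy using a first-order polynomial (use polynomialOrder=2
    # once you have six or more GCPs) and nearest neighbor resampling.
    gdal.Warp('o38121e7_georef.tif', 'o38121e7_gcps.tif',
              dstSRS='EPSG:26910', polynomialOrder=1, resampleAlg='near')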

How it works…

A mathematical function is created based on the differences between your two sets of points. This function is then applied to the whole image, stretching it in an attempt to fit. This is basically a translation or projection from one coordinate system to another.
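
To make that concrete, here is a minimal sketch of what a first-order (affine) fit does under the hood, using NumPy least squares on hypothetical point pairs. The RMSE it prints is the same kind of error estimate that the Georeferencer reports as residuals.

    # Sketch: fit a first-order polynomial from GCP pairs and report the RMSE.
    # The point pairs are hypothetical; real values come from your GCP table.
    import numpy as np

    # Pixel coordinates in the unknown image (column, row)...
    src = np.array([[120.5, 340.2], [2310.0, 310.8],
                    [150.3, 3105.4], [2280.7, 3140.1]])
    # ...and the matching map coordinates in the reference CRS (x, y).
    dst = np.array([[609000.0, 4270000.0], [615000.0, 4270500.0],
                    [609500.0, 4262000.0], [614800.0, 4261500.0]])

    # First order means x' = a0 + a1*col + a2*row (and likewise for y').
    A = np.column_stack([np.ones(len(src)), src])
    coeff_x, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)

    # Residuals: how far each GCP lands from where the fitted function puts it.
    predicted = np.column_stack([A @ coeff_x, A @ coeff_y])
    rmse = np.sqrt(np.mean(np.sum((predicted - dst) ** 2, axis=1)))
    print('x coefficients:', coeff_x)
    print('y coefficients:', coeff_y)
    print('RMSE:', rmse)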

There's more…

Picking a transformation type can be a little tricky; the list in QGIS is currently in alphabetical order rather than a recommended order. Polynomial 2 and Thin Plate Spline (TPS) are probably the two most common choices. Polynomial 1 is great when you just have minor shifts, zooming (scale), and rotation; for old, well-made maps in consistent projections, it applies the least amount of change. Polynomial 2 picks up from there and handles consistent distortion. Both of these provide you with an error estimate as the residual or RMSE (Root Mean Square Error). TPS handles variable distortion, varying its correction around each control point. This will almost always result in the best fit, at least at the GCPs that you provide. However, because it varies at every GCP location, you can't calculate a meaningful error estimate, and it may actually overfit (create new distortion). TPS is best for hand-drawn maps, non-flat scans of maps, or other variably distorted sources. Polynomial methods are good for sources that had high accuracy and reference marks to begin with.

If you really want a good match, once you have all your points, check the RMSE values in the table at the bottom. Generally, you want this value near or below 1. If a point has a huge value, consider deleting or redoing it. You can move existing points, and a line will be drawn in the direction of the estimated error. So, go back over the high values, zoom in extra close, and use the move GCP option.

Sometimes, just changing your transformation type will help, as shown in the following screenshot that compares Polynomial 1 versus Polynomial 2 for the same set of GCPs:


Polynomial 1

Note the difference in the residual values when changing to Polynomial 2 (assuming that you have the minimum number of points needed for Polynomial 2):


Polynomial 2

Tip

Resampling methods can also have a big impact on how the output looks. Some methods are more aggressive about trying to smooth out distortions. If you're not sure, stick with the default, nearest neighbor, which copies the value of the nearest pixel in the original to each new pixel in the output.
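
If you do want to experiment, the resampling algorithm is a single parameter when scripting with GDAL. This sketch simply reruns the warp from the earlier example with two different settings; the filenames are placeholders.

    # Sketch: compare nearest neighbor against cubic resampling on the same
    # GCP-tagged copy, then inspect both outputs side by side in QGIS.
    from osgeo import gdal

    for alg in ('near', 'cubic'):
        gdal.Warp('o38121e7_georef_%s.tif' % alg, 'o38121e7_gcps.tif',
                  dstSRS='EPSG:26910', polynomialOrder=1, resampleAlg=alg)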

See also