Table of Contents for
Mastering PostGIS


Mastering PostGIS by Tomasz Nycz. Published by Packt Publishing, 2017
  1. Mastering PostGIS
  2. Title Page
  3. Copyright
  4. Credits
  5. About the Authors
  6. About the Reviewers
  7. www.PacktPub.com
  8. Customer Feedback
  9. Table of Contents
  10. Preface
  11. What this book covers
  12. What you need for this book
  13. Who this book is for
  14. Conventions
  15. Reader feedback
  16. Customer support
  17. Downloading the example code
  18. Downloading the color images of this book
  19. Errata
  20. Piracy
  21. Questions
  22. Importing Spatial Data
  23. Obtaining test data
  24. Setting up the database
  25. Importing flat data
  26. Importing data using psql
  27. Importing data interactively
  28. Importing data non-interactively
  29. Importing data using pgAdmin
  30. Extracting spatial information from flat data
  31. Importing shape files using shp2pgsql
  32. shp2pgsql in cmd
  33. The shp2pgsql GUI version
  34. Importing vector data using ogr2ogr
  35. Importing GML
  36. Importing MIF and TAB
  37. Importing KML
  38. ogr2ogr GUI (Windows only)
  39. Importing data using GIS clients
  40. Exporting a shapefile to PostGIS using QGIS and SPIT
  41. Exporting shapefile to PostGIS using QGIS and DbManager
  42. Exporting spatial data to PostGIS from Manifold GIS
  43. Importing OpenStreetMap data
  44. Connecting to external data sources with foreign data wrappers
  45. Connecting to SQL Server Spatial
  46. Connecting to WFS service
  47. Loading rasters using raster2pgsql
  48. Importing a single raster
  49. Importing multiple rasters
  50. Importing data with pg_restore
  51. Summary
  52. Spatial Data Analysis
  53. Composing and decomposing geometries
  54. Creating points
  55. Extracting coordinates from points
  56. Composing and decomposing Multi-geometries
  57. Multi-geometry decomposition
  58. Composing and decomposing LineStrings
  59. LineString composition
  60. LineString decomposition
  61. Composing and decomposing polygons
  62. Polygon composition
  63. Polygon decomposition
  64. Spatial measurement
  65. General warning - mind the SRID!
  66. Measuring distances between two geometries
  67. Measuring the length, area, and perimeter of geometries
  68. Line length
  69. Polygon perimeter
  70. Polygon area
  71. Geometry bounding boxes
  72. Accessing bounding boxes
  73. Creating bounding boxes
  74. Using bounding boxes in spatial queries
  75. Geometry simplification
  76. Geometry validation
  77. Simplicity and validity
  78. Testing for simplicity and validity
  79. Checking for validity
  80. Repairing geometry errors
  81. Validity constraint
  82. Intersecting geometries
  83. Nearest feature queries
  84. Summary
  85. Data Processing - Vector Ops
  86. Primer - obtaining and importing OpenStreetMap data
  87. Merging geometries
  88. Merging polygons
  89. Merging MultiLineStrings
  90. Slicing geometries
  91. Splitting a polygon by LineString
  92. Splitting a LineString with another LineString
  93. Extracting a section of LineString
  94. Buffering and offsetting geometries
  95. Offsetting features
  96. Creating convex and concave hulls
  97. Computing centroids, points-on-surface, and points-on-line
  98. Reprojecting geometries
  99. Spatial relationships
  100. Touching
  101. Crossing
  102. Overlapping
  103. Containing
  104. Radius queries
  105. Summary
  106. Data Processing - Raster Ops
  107. Preparing data
  108. Processing and analysis
  109. Analytic and statistical functions
  110. Vector to raster conversion
  111. Raster to vector conversion
  112. Spatial relationship
  113. Metadata
  114. Summary
  115. Exporting Spatial Data
  116. Exporting data using \COPY in psql
  117. Exporting data in psql interactively
  118. Exporting data in psql non-interactively
  119. Exporting data in pgAdmin
  120. Exporting vector data using pgsql2shp
  121. pgsql2shp command line
  122. pgsql2shp GUI
  123. Exporting vector data using ogr2ogr
  124. Exporting KML revisited
  125. Exporting SHP
  126. Exporting MapInfo TAB and MIF
  127. Exporting to SQL Server
  128. ogr2ogr GUI
  129. Exporting data using GIS clients
  130. Exporting data using QGIS
  131. Exporting data using Manifold
  132. Outputting rasters using GDAL
  133. Outputting rasters using psql
  134. Exporting data using the PostgreSQL backup functionality
  135. Summary
  136. ETL Using Node.js
  137. Setting up Node.js
  138. Making a simple Node.js hello world in the command line
  139. Making a simple HTTP server
  140. Handshaking with a database using Node.js PgSQL client
  141. Retrieving and processing JSON data
  142. Importing shapefiles revisited
  143. Consuming JSON data
  144. Geocoding address data
  145. Consuming WFS data
  146. Summary
  147. PostGIS – Creating Simple WebGIS Applications
  148. ExtJS says Hello World
  149. Configuring GeoServer web services
  150. Importing test data
  151. Outputting vector data as WMS services in GeoServer
  152. Outputting raster data as WMS services in GeoServer
  153. Outputting vector data as WFS services
  154. Making use of PgRaster in a simple WMS GetMap handler
  155. Consuming WMS
  156. Consuming WMS in ol3
  157. Consuming WMS in Leaflet
  158. Enabling CORS in Jetty
  159. Consuming WFS in ol3
  160. Outputting and consuming GeoJSON
  161. Consuming GeoJSON in ol3
  162. Consuming GeoJSON in Leaflet
  163. Outputting and consuming TopoJSON
  164. Consuming TopoJSON in ol3
  165. Consuming TopoJSON in Leaflet
  166. Implementing a simple CRUD application that demonstrates vector editing via web interfaces
  167. WebGIS CRUD server in Node.js
  168. WebGIS CRUD client
  169. Layer manager
  170. Drawing tools
  171. Analysis tools - buffering
  172. Summary
  173. PostGIS Topology
  174. The conceptual model
  175. The data
  176. Installation
  177. Creating an empty topology
  178. Importing Simple Feature data into topology
  179. Checking the validity of input geometries
  180. Creating a TopoGeometry column and a topology layer
  181. Populating a TopoGeometry column from an existing geometry
  182. Inspecting and validating a topology
  183. Topology validation
  184. Accessing the topology data
  185. Querying topological elements by a point
  186. Locating nodes
  187. Locating edges
  188. Locating faces
  189. Topology editing
  190. Adding new elements
  191. Creating TopoGeometries
  192. Splitting and merging features
  193. Splitting features
  194. Merging features
  195. Updating edge geometry
  196. Topology-aware simplification
  197. Importing sample data
  198. Topology output
  199. GML output
  200. TopoJSON output
  201. Summary
  202. pgRouting
  203. Installing the pgRouting extension
  204. Importing routing data
  205. Importing shapefiles
  206. Importing OSM data using osm2pgrouting
  207. pgRouting algorithms
  208. All pairs shortest path
  209. Shortest path
  210. Shortest path Dijkstra
  211. A-Star (A*)
  212. K-Dijkstra
  213. K-Shortest path
  214. Turn restrictions shortest path (TRSP)
  215. Driving distance
  216. Traveling Sales Person
  217. Handling one-way edges
  218. Consuming pgRouting functionality in a web app
  219. Summary

Consuming WFS data

Let's imagine that we work for a utilities company that has been contracted to build a stretch of underground pipeline. The job is not only the actual construction work; the company also has to negotiate with the landowners and obtain their legal agreement for the works. The company's GIS department has been tasked with preparing a list of parcels that will be affected by the pipeline itself and by the construction work - after all, builders do need to be able to get to the site with their heavy equipment.

Our job is, therefore, to do the following:

  1. Buffer the pipeline geometry with a radius of 5 m.
  2. Extract the buffer geometry from the database.
  3. Query a WFS service to obtain parcels that intersect with the pipeline buffer.
  4. Load the parcels data to the PostGIS database.
  5. Prepare a report with the parcel data.

We have visited Poland and the UK in the previous examples. For this example, we will fly over to New Zealand and consume a Web Feature Service provided by LINZ (Land Information New Zealand). In order to use the LINZ services, we need to register and create an API key. You can register at https://data.linz.govt.nz/accounts/register/. Then, when ready, follow the API key generation instructions at http://www.linz.govt.nz/data/linz-data-service/guides-and-documentation/creating-an-api-key.

We have received the geometry of the pipeline in question as a shapefile, so let's import it into the database. First, let's ensure that our schema is in place:

create schema if not exists etl_pipeline;  

Next, we'll let ogr2ogr do the work for us:

ogr2ogr -f "PostgreSQL" PG:"host=localhost port=5434 user=postgres dbname=mastering_postgis" "pipeline.shp" -t_srs EPSG:2193 -nln etl_pipeline.pipeline -overwrite -lco GEOMETRY_NAME=geom

The data for this example can be found in the data/08_consumig_wfs directory.

The pipeline is located just on the outskirts of New Plymouth, near Egmont National Park, and is marked in red in the following screenshot:

At this stage, we're ready to write some code again. We will warm up by buffering our pipeline. Let's assume that we need 5 m on each side for heavy equipment access:

const pg = require('pg'); // node-postgres; dbCredentials, pipelineSchema and pipelineTable are defined earlier in the chapter

/**
 * Buffers the pipeline and returns the buffer geometry as GML
 */
const getPipelineBuffer = function(){
    return new Promise((resolve, reject) => {
        console.log('Buffering pipeline...');

        let client = new pg.Client(dbCredentials);

        client.connect((err) => {
            if(err){
                reject(err.message);
                return;
            }

            // ST_Buffer with a 5 m radius and round caps/joins; ST_AsGML turns
            // the buffer into GML that we can later embed in a WFS filter
            client.query(`select ST_AsGML(ST_Buffer(geom, 5, 'endcap=round join=round')) as gml from ${pipelineSchema}.${pipelineTable} limit 1;`, function(err, result){
                if(err){
                    try {
                        client.end();
                    } catch(e){}
                    reject(err.message);
                    return;
                }

                client.end();

                if(result.rows.length !== 1){
                    reject('Hmm, it looks like we have a little problem with the pipeline...');
                }
                else {
                    console.log('Done!');
                    resolve(result.rows[0].gml);
                }
            });
        });
    });
};

Once we have our buffer GML ready, we can query the WFS service. We will send a POST GetFeature request to the LINZ WFS, requesting the data in the same projection as our pipeline dataset - EPSG:2193 (New Zealand Transverse Mercator 2000). Since we're using JavaScript and the WFS supports JSON output, we will opt for it.
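The GetFeature request itself can be issued with nothing more than Node's built-in https module. The following is a minimal sketch of such a getParcels function (the one used in the chain at the end of this section); the endpoint path pattern, the linz:layer-772 type name (guessed from the feature ids in the response below), the shape geometry property, and the linzApiKey placeholder are all assumptions rather than the book's exact code:

// A sketch of the WFS GetFeature call; the endpoint pattern, layer name and
// geometry property name are assumptions - check your LINZ account for the real ones
const https = require('https');

const linzApiKey = 'YOUR-LINZ-API-KEY'; // hypothetical placeholder

const getParcels = function(bufferGml){
    return new Promise((resolve, reject) => {
        console.log('Querying WFS...');

        // WFS 1.1.0 GetFeature with an ogc:Intersects filter wrapped around
        // the buffer GML obtained from PostGIS; JSON output in EPSG:2193
        const body = `<wfs:GetFeature service="WFS" version="1.1.0"
    outputFormat="application/json"
    xmlns:wfs="http://www.opengis.net/wfs"
    xmlns:ogc="http://www.opengis.net/ogc"
    xmlns:gml="http://www.opengis.net/gml">
    <wfs:Query typeName="linz:layer-772" srsName="EPSG:2193">
        <ogc:Filter>
            <ogc:Intersects>
                <ogc:PropertyName>shape</ogc:PropertyName>
                ${bufferGml}
            </ogc:Intersects>
        </ogc:Filter>
    </wfs:Query>
</wfs:GetFeature>`;

        const req = https.request({
            host: 'data.linz.govt.nz',
            path: `/services;key=${linzApiKey}/wfs`, // assumed endpoint pattern
            method: 'POST',
            headers: { 'Content-Type': 'text/xml' }
        }, (res) => {
            let raw = '';
            res.on('data', (chunk) => raw += chunk);
            res.on('end', () => {
                try {
                    console.log('Done!');
                    resolve(JSON.parse(raw));
                } catch(e){
                    reject(e.message);
                }
            });
        });

        req.on('error', (err) => reject(err.message));
        req.write(body);
        req.end();
    });
};

Embedding the buffer GML inside an ogc:Intersects filter is what limits the response to parcels touching our buffer, so no client-side filtering is needed.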

At this stage, we should have the data at hand, and since we asked for JSON output, our data should be similar to the following:

{
    "type": "FeatureCollection",
    "totalFeatures": "unknown",
    "features": [
        {
            "type": "Feature",
            "id": "layer-772.4611152",
            "geometry": {
                "type": "MultiPolygon",
                "coordinates": [...]
            },
            "geometry_name": "shape",
            "properties": {
                "id": 4611152,
                "appellation": "Lot 2 DP 13024",
                "affected_surveys": "DP 13024",
                "parcel_intent": "DCDB",
                "topology_type": "Primary",
                "statutory_actions": null,
                "land_district": "Taranaki",
                "titles": "TNF1/130",
                "survey_area": 202380,
                "calc_area": 202486
            }
        }
    ]
}

The geometry object is GeoJSON, so we should be able to easily make PostGIS read it with ST_GeomFromGeoJSON. Let's do just that and put our parcels data in the database now:

/**
 * Saves WFS JSON parcels to the database
 */
const saveParcels = function(data){
    return new Promise((resolve, reject) => {

        console.log('Saving parcels...');

        let client = new pg.Client(dbCredentials);

        client.connect((err) => {
            if(err){
                reject(err.message);
                return;
            }

            // first drop and recreate the output table...
            const sql = [
                executeNonQuery(client, `DROP TABLE IF EXISTS ${pipelineSchema}.${pipelineParcels};`),
                executeNonQuery(
                    client,
                    `CREATE TABLE ${pipelineSchema}.${pipelineParcels} (id numeric, appellation varchar, affected_surveys varchar, parcel_intent varchar, topology_type varchar, statutory_actions varchar, land_district varchar, titles varchar, survey_area numeric, geom geometry);`
                )
            ];

            // ...then insert one record per parcel feature; the GeoJSON geometry
            // is decoded by ST_GeomFromGeoJSON and stamped with SRID 2193
            for(let f of data.features){
                sql.push(
                    executeNonQuery(
                        client,
                        `INSERT INTO ${pipelineSchema}.${pipelineParcels} (id, appellation, affected_surveys, parcel_intent, topology_type, statutory_actions, land_district, titles, survey_area, geom) VALUES ($1,$2,$3,$4,$5,$6,$7,$8,$9,ST_SetSRID(ST_GeomFromGeoJSON($10),2193));`,
                        [
                            f.properties.id,
                            f.properties.appellation,
                            f.properties.affected_surveys,
                            f.properties.parcel_intent,
                            f.properties.topology_type,
                            f.properties.statutory_actions,
                            f.properties.land_district,
                            f.properties.titles,
                            f.properties.survey_area,
                            JSON.stringify(f.geometry)
                        ]
                    )
                );
            }

            Promise.all(sql)
                .then(() => {
                    client.end();
                    console.log('Done!');
                    resolve();
                })
                .catch((err) => {
                    client.end();
                    reject(err);
                });
        });
    });
};
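A note on the executeNonQuery helper used above: it wraps client.query in a Promise so that all the statements can be collected in an array and awaited with Promise.all. The helper is defined earlier in the chapter; if you are working through this section in isolation, a minimal version might look like the following (a sketch under that assumption, not necessarily the book's exact implementation):

/**
 * Minimal sketch of the executeNonQuery helper - it simply promisifies
 * client.query so statements can be awaited with Promise.all
 */
const executeNonQuery = function(client, sql, params){
    return new Promise((resolve, reject) => {
        client.query(sql, params || [], (err, result) => {
            if(err){
                reject(err.message);
                return;
            }
            resolve(result);
        });
    });
};

Since node-postgres queues queries issued against a single client, the DROP, CREATE, and INSERT statements execute in the order they were pushed, even though Promise.all awaits them collectively.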

Finally, let's chain our ops:

//chain all the stuff together
getPipelineBuffer()
    .then(getParcels)
    .then(saveParcels)
    .catch(err => console.log(`oops, an error has occurred: ${err}`));

Voila! We have just obtained a set of parcels that intersect with our 5 m buffer around the pipeline, and our parcels map now looks like this:

We can now pass the parcel information on to our legal department for further negotiations.
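This also covers the last item on our to-do list - preparing the report can be as simple as querying the freshly created table. A minimal sketch, assuming the parcels ended up in etl_pipeline.pipeline_parcels (the actual table name depends on the pipelineParcels variable) and with a column choice of our own:

-- list the affected parcels with their titles for the legal team, largest first
SELECT appellation, titles, land_district, survey_area
FROM etl_pipeline.pipeline_parcels
ORDER BY survey_area DESC;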