Table of Contents for
Mastering PostGIS


Mastering PostGIS by Tomasz Nycz. Published by Packt Publishing, 2017.
  1. Mastering PostGIS
  2. Title Page
  3. Copyright
  4. Credits
  5. About the Authors
  6. About the Reviewers
  7. www.PacktPub.com
  8. Customer Feedback
  9. Table of Contents
  10. Preface
  11. What this book covers
  12. What you need for this book
  13. Who this book is for
  14. Conventions
  15. Reader feedback
  16. Customer support
  17. Downloading the example code
  18. Downloading the color images of this book
  19. Errata
  20. Piracy
  21. Questions
  22. Importing Spatial Data
  23. Obtaining test data
  24. Setting up the database
  25. Importing flat data
  26. Importing data using psql
  27. Importing data interactively
  28. Importing data non-interactively
  29. Importing data using pgAdmin
  30. Extracting spatial information from flat data
  31. Importing shape files using shp2pgsql
  32. shp2pgsql in cmd
  33. The shp2pgsql GUI version
  34. Importing vector data using ogr2ogr
  35. Importing GML
  36. Importing MIF and TAB
  37. Importing KML
  38. ogr2ogr GUI (Windows only)
  39. Importing data using GIS clients
  40. Exporting a shapefile to PostGIS using QGIS and SPIT
  41. Exporting shapefile to PostGIS using QGIS and DbManager
  42. Exporting spatial data to PostGIS from Manifold GIS
  43. Importing OpenStreetMap data
  44. Connecting to external data sources with foreign data wrappers
  45. Connecting to SQL Server Spatial
  46. Connecting to WFS service
  47. Loading rasters using raster2pgsql
  48. Importing a single raster
  49. Importing multiple rasters
  50. Importing data with pg_restore
  51. Summary
  52. Spatial Data Analysis
  53. Composing and decomposing geometries
  54. Creating points
  55. Extracting coordinates from points
  56. Composing and decomposing Multi-geometries
  57. Multi-geometry decomposition
  58. Composing and decomposing LineStrings
  59. LineString composition
  60. LineString decomposition
  61. Composing and decomposing polygons
  62. Polygon composition
  63. Polygon decomposition
  64. Spatial measurement
  65. General warning - mind the SRID!
  66. Measuring distances between two geometries
  67. Measuring the length, area, and perimeter of geometries
  68. Line length
  69. Polygon perimeter
  70. Polygon area
  71. Geometry bounding boxes
  72. Accessing bounding boxes
  73. Creating bounding boxes
  74. Using bounding boxes in spatial queries
  75. Geometry simplification
  76. Geometry validation
  77. Simplicity and validity
  78. Testing for simplicity and validity
  79. Checking for validity
  80. Repairing geometry errors
  81. Validity constraint
  82. Intersecting geometries
  83. Nearest feature queries
  84. Summary
  85. Data Processing - Vector Ops
  86. Primer - obtaining and importing OpenStreetMap data
  87. Merging geometries
  88. Merging polygons
  89. Merging MultiLineStrings
  90. Slicing geometries
  91. Splitting a polygon by LineString
  92. Splitting a LineString with another LineString
  93. Extracting a section of LineString
  94. Buffering and offsetting geometries
  95. Offsetting features
  96. Creating convex and concave hulls
  97. Computing centroids, points-on-surface, and points-on-line
  98. Reprojecting geometries
  99. Spatial relationships
  100. Touching
  101. Crossing
  102. Overlapping
  103. Containing
  104. Radius queries
  105. Summary
  106. Data Processing - Raster Ops
  107. Preparing data
  108. Processing and analysis
  109. Analytic and statistical functions
  110. Vector to raster conversion
  111. Raster to vector conversion
  112. Spatial relationship
  113. Metadata
  114. Summary
  115. Exporting Spatial Data
  116. Exporting data using \COPY in psql
  117. Exporting data in psql interactively
  118. Exporting data in psql non-interactively
  119. Exporting data in pgAdmin
  120. Exporting vector data using pgsql2shp
  121. pgsql2shp command line
  122. pgsql2shp GUI
  123. Exporting vector data using ogr2ogr
  124. Exporting KML revisited
  125. Exporting SHP
  126. Exporting MapInfo TAB and MIF
  127. Exporting to SQL Server
  128. ogr2ogr GUI
  129. Exporting data using GIS clients
  130. Exporting data using QGIS
  131. Exporting data using Manifold
  132. Outputting rasters using GDAL
  133. Outputting raster using psql
  134. Exporting data using the PostgreSQL backup functionality
  135. Summary
  136. ETL Using Node.js
  137. Setting up Node.js
  138. Making a simple Node.js hello world in the command line
  139. Making a simple HTTP server
  140. Handshaking with a database using Node.js PgSQL client
  141. Retrieving and processing JSON data
  142. Importing shapefiles revisited
  143. Consuming JSON data
  144. Geocoding address data
  145. Consuming WFS data
  146. Summary
  147. PostGIS – Creating Simple WebGIS Applications
  148. ExtJS says Hello World
  149. Configuring GeoServer web services
  150. Importing test data
  151. Outputting vector data as WMS services in GeoServer
  152. Outputting raster data as WMS services in GeoServer
  153. Outputting vector data as WFS services
  154. Making use of PgRaster in a simple WMS GetMap handler
  155. Consuming WMS
  156. Consuming WMS in ol3
  157. Consuming WMS in Leaflet
  158. Enabling CORS in Jetty
  159. Consuming WFS in ol3
  160. Outputting and consuming GeoJSON
  161. Consuming GeoJSON in ol3
  162. Consuming GeoJSON in Leaflet
  163. Outputting and consuming TopoJSON
  164. Consuming TopoJSON in ol3
  165. Consuming TopoJSON in Leaflet
  166. Implementing a simple CRUD application that demonstrates vector editing via web interfaces
  167. WebGIS CRUD server in Node.js
  168. WebGIS CRUD client
  169. Layer manager
  170. Drawing tools
  171. Analysis tools - buffering
  172. Summary
  173. PostGIS Topology
  174. The conceptual model
  175. The data
  176. Installation
  177. Creating an empty topology
  178. Importing Simple Feature data into topology
  179. Checking the validity of input geometries
  180. Creating a TopoGeometry column and a topology layer
  181. Populating a TopoGeometry column from an existing geometry
  182. Inspecting and validating a topology
  183. Topology validation
  184. Accessing the topology data
  185. Querying topological elements by a point
  186. Locating nodes
  187. Locating edges
  188. Locating faces
  189. Topology editing
  190. Adding new elements
  191. Creating TopoGeometries
  192. Splitting and merging features
  193. Splitting features
  194. Merging features
  195. Updating edge geometry
  196. Topology-aware simplification
  197. Importing sample data
  198. Topology output
  199. GML output
  200. TopoJSON output
  201. Summary
  202. pgRouting
  203. Installing the pgRouting extension
  204. Importing routing data
  205. Importing shapefiles
  206. Importing OSM data using osm2pgrouting
  207. pgRouting algorithms
  208. All pairs shortest path
  209. Shortest path
  210. Shortest path Dijkstra
  211. A-Star (A*)
  212. K-Dijkstra
  213. K-Shortest path
  214. Turn restrictions shortest path (TRSP)
  215. Driving distance
  216. Traveling sales person
  217. Handling one-way edges
  218. Consuming pgRouting functionality in a web app
  219. Summary

Geocoding address data

Let's imagine that we have a potential customer database built from the yellow pages, with the customer locations expressed as addresses. Using the yellow pages may not be the best idea these days, but it makes a good starting point for this example.

We need to send our sales representatives to these addresses in order to establish relationships with the new customers, but first we should assign the customers to the proper sales regions. Having already seen ST_Intersects in action, searching for points in polygons seems like a trivial task. We need point geometries for this, though, and we'll soon see how to go from addresses to geometries using some simple Node.js code.

Let's prepare our hypothetical new customers database first. We will reuse some data we have already seen, namely the Ordnance Survey GB address points. We imported this data in Chapter 1, Importing Spatial Data, and I assume you have not deleted the dataset yet - it should be in data_import.osgb_addresses.

The records seem to be spread quite nicely, so we will simply select 100 records where we have a meaningful name and also where the building number is known:

--schema prepare / cleanup
create schema if not exists etl_geocoding;
drop table if exists etl_geocoding.customers;

--customers table
create table etl_geocoding.customers (
    id serial NOT NULL,
    name varchar,
    street varchar,
    street_no varchar,
    postcode varchar,
    town varchar,
    lon numeric,
    lat numeric,
    geom geometry,
    geocoded boolean
);

--get some hypothetical customers
insert into etl_geocoding.customers (
    name,
    street,
    street_no,
    postcode,
    town,
    geocoded
)
select organisation_name, thoroughfare, building_number, postcode, post_town, false
from data_import.osgb_addresses
where organisation_name != '' and building_number != ''
limit 100;

Having prepared our customer database, we can now define the steps we need to take to end up with geocoded addresses:

  • Extract the non-geocoded records from the database
  • Use an external geocoding API in order to obtain the locations
  • Pump the data back to the database

Our geocoding API will be the Google Maps Geocoding API. It has its own Node.js module, so we will be able to focus on the task without having to bother with assembling a valid URL to call the API over HTTP GET. You will find more information on the Google Maps Node.js module at https://github.com/googlemaps/google-maps-services-js.

In order to use Google services, one has to generate an API key. API keys are freely available and can be created via a Google Account at https://developers.google.com/console.

Once our geocoding node module has been created, we will need to install some external packages:

npm install pg --save
npm install @google/maps --save
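The snippets that follow reference a handful of shared constants - dbCredentials, customersSchema, customersTable, and gMapsApiKey - declared once at the top of the script. A minimal sketch; the credential values and the API key shown here are placeholders, not values from the chapter:

```javascript
//shared setup referenced by the snippets below;
//all concrete values are placeholders - adjust them to your environment
const pg = require('pg');

const dbCredentials = {
    host: 'localhost',
    port: 5432,
    database: 'mastering_postgis',
    user: 'postgres',
    password: 'postgres'
};

const customersSchema = 'etl_geocoding';
const customersTable = 'customers';

//generated via the Google developers console
const gMapsApiKey = 'YOUR_API_KEY';
```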

Our first step is to extract the customer records that have not yet been geocoded:

/**
 * reads non-geocoded customer records
 */
const readCustomers = function(){
    return new Promise((resolve, reject) => {
        console.log('Extracting customer records...');

        let client = new pg.Client(dbCredentials);

        client.connect((err) => {
            if(err){
                reject(err.message);
                return;
            }

            client.query(`SELECT * FROM ${customersSchema}.${customersTable}
                WHERE geocoded = false LIMIT 10;`, function(err, result){
                if(err){
                    try {
                        client.end();
                    } catch(e){}
                    reject(err.message);
                    return;
                }

                client.end();

                console.log('Done!');
                resolve(result.rows);
            });
        });
    });
}
You may have noticed that I am reading only ten records at a time. This is because it lets me debug the code without having to reset the geocoding status in the database. Also, there are some usage limits with the free Google Maps account, so I am avoiding too many API calls.

Once we have the customer records at hand, we can geocode them. However, we should probably stop for a second and familiarize ourselves with the geocoder API output so that we can properly extract the information later. The following is an output example for our first address in the database: 30, GUILDHALL SHOPPING CENTRE, EX4 3HJ, EXETER:

{
  "results": [
    {
      "address_components": [
        {"long_name": "30-32", "short_name": "30-32",
         "types": ["street_number"]},
        {"long_name": "Guildhall Shopping Centre",
         "short_name": "Guildhall Shopping Centre",
         "types": ["route"]},
        {"long_name": "Exeter", "short_name": "Exeter",
         "types": ["locality", "political"]},
        {"long_name": "Exeter", "short_name": "Exeter",
         "types": ["postal_town"]},
        {"long_name": "Devon", "short_name": "Devon",
         "types": ["administrative_area_level_2", "political"]},
        {"long_name": "England", "short_name": "England",
         "types": ["administrative_area_level_1", "political"]},
        {"long_name": "United Kingdom", "short_name": "GB",
         "types": ["country", "political"]},
        {"long_name": "EX4 3HH", "short_name": "EX4 3HH",
         "types": ["postal_code"]}
      ],
      "formatted_address": "30-32 Guildhall Shopping Centre, Exeter EX4 3HH, UK",
      "geometry": {
        "location": {
          "lat": 50.7235944,
          "lng": -3.5333662
        },
        "location_type": "ROOFTOP",
        "viewport": {
          "northeast": {"lat": 50.7249433802915, "lng": -3.532017219708498},
          "southwest": {"lat": 50.7222454197085, "lng": -3.534715180291502}
        }
      },
      "partial_match": true,
      "place_id": "ChIJy3ZkNDqkbUgR1WXtac_0ClE",
      "types": ["street_address"]
    }
  ],
  "status": "OK"
}

Once we know what the geocoder data looks like, we can easily code the geocoding procedure:

/**
 * generates a geocoding call
 */
const generateGeocodingCall = function(gMapsClient, customer){
    return new Promise((resolve, reject) => {
        let address = `${customer.street_no} ${customer.street},
            ${customer.postcode}, ${customer.town}`;

        gMapsClient.geocode({
            address: address
        }, (err, response) => {
            if(err){
                reject(err.message);
                return;
            }

            if(response.json.error_message){
                console.log(response.json.status,
                    response.json.error_message);
                //err is null at this point, so reject with the
                //API's own error message instead
                reject(response.json.error_message);
                return;
            }

            //update customer
            let geocoded = response.json.results[0];
            if(geocoded){
                customer.geocoded = true;
                customer.lon = geocoded.geometry.location.lng;
                customer.lat = geocoded.geometry.location.lat;
            }

            resolve();
        });
    });
}

In order to make our geocoding call work for us, we need to call it for the retrieved records. Let's do it this way:

/**
 * geocodes specified customer addresses
 */
const geocodeAddresses = function(customers){
    return new Promise((resolve, reject) => {
        console.log('Geocoding addresses...');

        let gMapsClient = require('@google/maps').createClient({
            key: gMapsApiKey
        });

        //prepare geocoding calls
        let geocodingCalls = [];
        for(let c of customers){
            geocodingCalls.push(
                generateGeocodingCall(gMapsClient, c)
            );
        }

        //and execute them
        Promise.all(geocodingCalls)
            .then(() => resolve(customers))
            .catch((err) => reject(err));
    });
}

At this stage, we should have our customer records geocoded so we can save them back to the database. As you may expect, this is rather straightforward:

/**
 * saves geocoded customers back to the database
 */
const saveCustomers = function(customers){
    return new Promise((resolve, reject) => {
        console.log('Saving geocoded customer records...');

        let client = new pg.Client(dbCredentials);

        client.connect((err) => {
            if(err){
                reject(err.message);
                return;
            }

            const updateSQLs = [];

            for(let c of customers){
                updateSQLs.push(executeNonQuery(client, `UPDATE
                    ${customersSchema}.${customersTable} SET
                    lon=$1, lat=$2, geocoded=true WHERE id=$3;`,
                    [c.lon, c.lat, c.id]));
            }

            Promise.all(updateSQLs)
                .then(() => {
                    client.end();
                    resolve();
                })
                .catch((err) => {
                    try{
                        client.end();
                    }
                    catch(e){}
                    reject(err);
                });
        });
    });
}
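The saveCustomers function relies on an executeNonQuery helper that does not appear in this excerpt. A minimal sketch, assuming it simply wraps a parameterized client.query call in a Promise that resolves on success and rejects with the error message otherwise:

```javascript
/**
 * executes a parameterized, non-returning query on an already
 * connected client; a sketch of the helper used by saveCustomers
 * (its actual definition is not shown in this excerpt)
 */
const executeNonQuery = function(client, sql, params){
    return new Promise((resolve, reject) => {
        client.query(sql, params, (err) => {
            if(err){
                reject(err.message);
                return;
            }
            resolve();
        });
    });
}
```

Because the helper only touches client.query, it works with any object exposing that callback signature, which also makes it easy to exercise without a live database.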

Finally, let's call our methods in a sequence and watch the magic happen:

//chain all the stuff together
readCustomers()
    .then(geocodeAddresses)
    .then(saveCustomers)
    .catch(err => console.log(`oops, an error has occurred: ${err}`));
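With lon and lat stored, the geom column created earlier can be populated in a single UPDATE back in the database. A sketch, assuming the Geocoding API's WGS84 coordinates map to SRID 4326:

```sql
--build point geometries from the geocoded lon/lat pairs
--(the Geocoding API returns WGS84 coordinates, hence SRID 4326)
update etl_geocoding.customers
set geom = ST_SetSRID(ST_MakePoint(lon, lat), 4326)
where geocoded = true;
```

From here, assigning customers to sales regions is the point-in-polygon ST_Intersects query mentioned at the start of this section.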