Table of Contents for
PostGIS Cookbook - Second Edition

PostGIS Cookbook - Second Edition by Thomas J Kraft. Published by Packt Publishing, 2018.
  1. PostGIS Cookbook, Second Edition
  2. Title Page
  3. Copyright and Credits
  4. PostGIS Cookbook Second Edition
  5. Packt Upsell
  6. Why subscribe?
  7. PacktPub.com
  8. Contributors
  9. About the authors
  10. Packt is searching for authors like you
  11. Table of Contents
  12. Preface
  13. Who this book is for
  14. What this book covers
  15. To get the most out of this book
  16. Download the example code files
  17. Download the color images
  18. Conventions used
  19. Sections
  20. Getting ready
  21. How to do it…
  22. How it works…
  23. There's more…
  24. See also
  25. Get in touch
  26. Reviews
  27. Moving Data In and Out of PostGIS
  28. Introduction
  29. Importing nonspatial tabular data (CSV) using PostGIS functions
  30. Getting ready
  31. How to do it...
  32. How it works...
  33. Importing nonspatial tabular data (CSV) using GDAL
  34. Getting ready
  35. How to do it...
  36. How it works...
  37. Importing shapefiles with shp2pgsql
  38. How to do it...
  39. How it works...
  40. There's more...
  41. Importing and exporting data with the ogr2ogr GDAL command
  42. How to do it...
  43. How it works...
  44. See also
  45. Handling batch importing and exporting of datasets
  46. Getting ready
  47. How to do it...
  48. How it works...
  49. Exporting data to a shapefile with the pgsql2shp PostGIS command
  50. How to do it...
  51. How it works...
  52. Importing OpenStreetMap data with the osm2pgsql command
  53. Getting ready
  54. How to do it...
  55. How it works...
  56. Importing raster data with the raster2pgsql PostGIS command
  57. Getting ready
  58. How to do it...
  59. How it works...
  60. Importing multiple rasters at a time
  61. Getting ready
  62. How to do it...
  63. How it works...
  64. Exporting rasters with the gdal_translate and gdalwarp GDAL commands
  65. Getting ready
  66. How to do it...
  67. How it works...
  68. See also
  69. Structures That Work
  70. Introduction
  71. Using geospatial views
  72. Getting ready
  73. How to do it...
  74. How it works...
  75. There's more...
  76. See also
  77. Using triggers to populate the geometry column
  78. Getting ready
  79. How to do it...
  80. There's more...
  81. Extending further...
  82. See also
  83. Structuring spatial data with table inheritance
  84. Getting ready
  85. How to do it...
  86. How it works...
  87. See also
  88. Extending inheritance – table partitioning
  89. Getting ready
  90. How to do it...
  91. How it works...
  92. See also
  93. Normalizing imports
  94. Getting ready
  95. How to do it...
  96. How it works...
  97. There's more...
  98. Normalizing internal overlays
  99. Getting ready
  100. How to do it...
  101. How it works...
  102. There's more...
  103. Using polygon overlays for proportional census estimates
  104. Getting ready
  105. How to do it...
  106. How it works...
  107. Working with Vector Data – The Basics
  108. Introduction
  109. Working with GPS data
  110. Getting ready
  111. How to do it...
  112. How it works...
  113. Fixing invalid geometries
  114. Getting ready
  115. How to do it...
  116. How it works...
  117. GIS analysis with spatial joins
  118. Getting ready
  119. How to do it...
  120. How it works...
  121. Simplifying geometries
  122. How to do it...
  123. How it works...
  124. Measuring distances
  125. Getting ready
  126. How to do it...
  127. How it works...
  128. Merging polygons using a common attribute
  129. Getting ready
  130. How to do it...
  131. How it works...
  132. Computing intersections
  133. Getting ready
  134. How to do it...
  135. How it works...
  136. Clipping geometries to deploy data
  137. Getting ready
  138. How to do it...
  139. How it works...
  140. Simplifying geometries with PostGIS topology
  141. Getting ready
  142. How to do it...
  143. How it works...
  144. Working with Vector Data – Advanced Recipes
  145. Introduction
  146. Improving proximity filtering with KNN
  147. Getting ready
  148. How to do it...
  149. How it works...
  150. See also
  151. Improving proximity filtering with KNN – advanced
  152. Getting ready
  153. How to do it...
  154. How it works...
  155. See also
  156. Rotating geometries
  157. Getting ready
  158. How to do it...
  159. How it works...
  160. See also
  161. Improving ST_Polygonize
  162. Getting ready
  163. How to do it...
  164. See also
  165. Translating, scaling, and rotating geometries – advanced
  166. Getting ready
  167. How to do it...
  168. How it works...
  169. See also
  170. Detailed building footprints from LiDAR
  171. Getting ready
  172. How to do it...
  173. How it works...
  174. Creating a fixed number of clusters from a set of points
  175. Getting ready
  176. How to do it...
  177. Calculating Voronoi diagrams
  178. Getting ready
  179. How to do it...
  180. Working with Raster Data
  181. Introduction
  182. Getting and loading rasters
  183. Getting ready
  184. How to do it...
  185. How it works...
  186. Working with basic raster information and analysis
  187. Getting ready
  188. How to do it...
  189. How it works...
  190. Performing simple map-algebra operations
  191. Getting ready
  192. How to do it...
  193. How it works...
  194. Combining geometries with rasters for analysis
  195. Getting ready
  196. How to do it...
  197. How it works...
  198. Converting between rasters and geometries
  199. Getting ready
  200. How to do it...
  201. How it works...
  202. Processing and loading rasters with GDAL VRT
  203. Getting ready
  204. How to do it...
  205. How it works...
  206. Warping and resampling rasters
  207. Getting ready
  208. How to do it...
  209. How it works...
  210. Performing advanced map-algebra operations
  211. Getting ready
  212. How to do it...
  213. How it works...
  214. Executing DEM operations
  215. Getting ready
  216. How to do it...
  217. How it works...
  218. Sharing and visualizing rasters through SQL
  219. Getting ready
  220. How to do it...
  221. How it works...
  222. Working with pgRouting
  223. Introduction
  224. Startup – Dijkstra routing
  225. Getting ready
  226. How to do it...
  227. Loading data from OpenStreetMap and finding the shortest path using A*
  228. Getting ready
  229. How to do it...
  230. How it works...
  231. Calculating the driving distance/service area
  232. Getting ready
  233. How to do it...
  234. See also
  235. Calculating the driving distance with demographics
  236. Getting ready
  237. How to do it...
  238. Extracting the centerlines of polygons
  239. Getting ready
  240. How to do it...
  241. There's more...
  242. Into the Nth Dimension
  243. Introduction
  244. Importing LiDAR data
  245. Getting ready
  246. How to do it...
  247. See also
  248. Performing 3D queries on a LiDAR point cloud
  249. How to do it...
  250. Constructing and serving buildings 2.5D
  251. Getting ready
  252. How to do it...
  253. Using ST_Extrude to extrude building footprints
  254. How to do it...
  255. Creating arbitrary 3D objects for PostGIS
  256. Getting ready
  257. How to do it...
  258. Exporting models as X3D for the web
  259. Getting ready
  260. How to do it...
  261. There's more...
  262. Reconstructing Unmanned Aerial Vehicle (UAV) image footprints with PostGIS 3D
  263. Getting started
  264. How to do it...
  265. UAV photogrammetry in PostGIS – point cloud
  266. Getting ready
  267. How to do it...
  268. UAV photogrammetry in PostGIS – DSM creation
  269. Getting ready
  270. How to do it...
  271. PostGIS Programming
  272. Introduction
  273. Writing PostGIS vector data with Psycopg
  274. Getting ready
  275. How to do it...
  276. How it works...
  277. Writing PostGIS vector data with OGR Python bindings
  278. Getting ready
  279. How to do it...
  280. How it works...
  281. Writing PostGIS functions with PL/Python
  282. Getting ready
  283. How to do it...
  284. How it works...
  285. Geocoding and reverse geocoding using the GeoNames datasets
  286. Getting ready
  287. How to do it...
  288. How it works...
  289. Geocoding using the OSM datasets with trigrams
  290. Getting ready
  291. How to do it...
  292. How it works...
  293. Geocoding with geopy and PL/Python
  294. Getting ready
  295. How to do it...
  296. How it works...
  297. Importing NetCDF datasets with Python and GDAL
  298. Getting ready
  299. How to do it...
  300. How it works...
  301. PostGIS and the Web
  302. Introduction
  303. Creating WMS and WFS services with MapServer
  304. Getting ready
  305. How to do it...
  306. How it works...
  307. See also
  308. Creating WMS and WFS services with GeoServer
  309. Getting ready
  310. How to do it...
  311. How it works...
  312. See also
  313. Creating a WMS Time service with MapServer
  314. Getting ready
  315. How to do it...
  316. How it works...
  317. Consuming WMS services with OpenLayers
  318. Getting ready
  319. How to do it...
  320. How it works...
  321. Consuming WMS services with Leaflet
  322. How to do it...
  323. How it works...
  324. Consuming WFS-T services with OpenLayers
  325. Getting ready
  326. How to do it...
  327. How it works...
  328. Developing web applications with GeoDjango – part 1
  329. Getting ready
  330. How to do it...
  331. How it works...
  332. Developing web applications with GeoDjango – part 2
  333. Getting ready
  334. How to do it...
  335. How it works...
  336. Developing a web GPX viewer with Mapbox
  337. How to do it...
  338. How it works...
  339. Maintenance, Optimization, and Performance Tuning
  340. Introduction
  341. Organizing the database
  342. Getting ready
  343. How to do it...
  344. How it works...
  345. Setting up the correct data privilege mechanism
  346. Getting ready
  347. How to do it...
  348. How it works...
  349. Backing up the database
  350. Getting ready
  351. How to do it...
  352. How it works...
  353. Using indexes
  354. Getting ready
  355. How to do it...
  356. How it works...
  357. Clustering for efficiency
  358. Getting ready
  359. How to do it...
  360. How it works...
  361. Optimizing SQL queries
  362. Getting ready
  363. How to do it...
  364. How it works...
  365. Migrating a PostGIS database to a different server
  366. Getting ready
  367. How to do it...
  368. How it works...
  369. Replicating a PostGIS database with streaming replication
  370. Getting ready
  371. How to do it...
  372. How it works...
  373. Geospatial sharding
  374. Getting ready
  375. How to do it...
  376. How it works...
  377. Parallelizing in PostgreSQL
  378. Getting ready
  379. How to do it...
  380. How it works...
  381. Using Desktop Clients
  382. Introduction
  383. Adding PostGIS layers – QGIS
  384. Getting ready
  385. How to do it...
  386. How it works...
  387. Using the Database Manager plugin – QGIS
  388. Getting ready
  389. How to do it...
  390. How it works...
  391. Adding PostGIS layers – OpenJUMP GIS
  392. Getting ready
  393. How to do it...
  394. How it works...
  395. Running database queries – OpenJUMP GIS
  396. Getting ready
  397. How to do it...
  398. How it works...
  399. Adding PostGIS layers – gvSIG
  400. Getting ready
  401. How to do it...
  402. How it works...
  403. Adding PostGIS layers – uDig
  404. How to do it...
  405. How it works...
  406. Introduction to Location Privacy Protection Mechanisms
  407. Introduction
  408. Definition of Location Privacy Protection Mechanisms – LPPMs
  409. Classifying LPPMs
  410. Adding noise to protect location data
  411. Getting ready
  412. How to do it...
  413. How it works...
  414. Creating redundancy in geographical query results
  415. Getting ready
  416. How to do it...
  417. How it works...
  418. References
  419. Other Books You May Enjoy
  420. Leave a review - let other readers know what you think

How to do it...

If you did not follow the recipes in Chapter 1, Moving Data In and Out of PostGIS, be sure to import the hotspots dataset (Global_24h.csv) into PostGIS. The following steps explain how to do it with ogr2ogr (you should import the dataset in its original SRID, 4326, to make spatial operations faster):

  1. Start a session in the postgis_cookbook database:
      > psql -d postgis_cookbook -U me
  2. Create a new schema, chp10, in the postgis_cookbook database:
      postgis_cookbook=# CREATE SCHEMA chp10;
  3. Create the hotspots_dist table, which will serve as the parent for the foreign tables:
      postgis_cookbook=# CREATE TABLE chp10.hotspots_dist (
        id serial PRIMARY KEY,
        the_geom public.geometry(Point, 4326)
      );
  4. Exit the psql environment:
      postgis_cookbook=# \q
  5. Connect to the psql environment as the postgres user:
      > psql -U postgres
  6. Create the remote databases, connect to each of them, create the postgis extension, and create the remote tables that will receive the sharded data (we use lowercase names throughout, since PostgreSQL folds unquoted identifiers to lowercase). Then, exit the psql environment. For this, execute the following SQL commands:
      postgres=# CREATE DATABASE quad_nw;
      CREATE DATABASE quad_ne;
      CREATE DATABASE quad_sw;
      CREATE DATABASE quad_se;
      postgres=# \c quad_nw
      quad_nw=# CREATE EXTENSION postgis;
      quad_nw=# CREATE TABLE hotspots_quad_nw (
        id serial PRIMARY KEY,
        the_geom public.geometry(Point, 4326)
      );
      quad_nw=# \c quad_ne
      quad_ne=# CREATE EXTENSION postgis;
      quad_ne=# CREATE TABLE hotspots_quad_ne (
        id serial PRIMARY KEY,
        the_geom public.geometry(Point, 4326)
      );
      quad_ne=# \c quad_sw
      quad_sw=# CREATE EXTENSION postgis;
      quad_sw=# CREATE TABLE hotspots_quad_sw (
        id serial PRIMARY KEY,
        the_geom public.geometry(Point, 4326)
      );
      quad_sw=# \c quad_se
      quad_se=# CREATE EXTENSION postgis;
      quad_se=# CREATE TABLE hotspots_quad_se (
        id serial PRIMARY KEY,
        the_geom public.geometry(Point, 4326)
      );
      quad_se=# \q
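As a quick sanity check (an optional query, not part of the original recipe), you can confirm that the four shard databases now exist by running the following from any psql session:
      postgres=# SELECT datname FROM pg_database WHERE datname LIKE 'quad_%';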
  7. In order to import the fire dataset, create a GDAL virtual data source composed of just one layer derived from the Global_24h.csv file. To do so, create a text file named global_24h.vrt in the same directory as the CSV file and edit it as follows:
        <OGRVRTDataSource>
          <OGRVRTLayer name="Global_24h">
            <SrcDataSource>Global_24h.csv</SrcDataSource>
            <GeometryType>wkbPoint</GeometryType>
            <LayerSRS>EPSG:4326</LayerSRS>
            <GeometryField encoding="PointFromColumns"
              x="longitude" y="latitude"/>
          </OGRVRTLayer>
        </OGRVRTDataSource>
  8. Import the Global_24h.csv file into PostGIS using the global_24h.vrt virtual data source you created in the previous step:
      $ ogr2ogr -f PostgreSQL PG:"dbname='postgis_cookbook' user='me' password='mypassword'" \
        -lco SCHEMA=chp10 -lco OVERWRITE=YES -lco GEOMETRY_NAME=the_geom \
        -nln hotspots global_24h.vrt
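If you want to verify the import (an optional check, not part of the original recipe), count the rows that landed in the hotspots table; the import should yield one row per record in Global_24h.csv:
      postgis_cookbook=# SELECT count(*) FROM chp10.hotspots;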
  9. Create the postgres_fdw extension in the postgis_cookbook database:
      postgis_cookbook=# CREATE EXTENSION postgres_fdw;
  10. Define the servers that will host the external databases. You need to define the name of the database, the host address, and the port on which the database will receive connections. In this case, we will define four servers, one per global quadrant, splitting the data according to latitude and longitude in WGS 84 (SRID 4326). Execute the following commands to create the four servers:
      postgis_cookbook=# CREATE SERVER quad_nw
      FOREIGN DATA WRAPPER postgres_fdw OPTIONS
      (dbname 'quad_nw', host 'localhost', port '5432');
      CREATE SERVER quad_sw FOREIGN DATA WRAPPER postgres_fdw OPTIONS
      (dbname 'quad_sw', host 'localhost', port '5432');
      CREATE SERVER quad_ne FOREIGN DATA WRAPPER postgres_fdw OPTIONS
      (dbname 'quad_ne', host 'localhost', port '5432');
      CREATE SERVER quad_se FOREIGN DATA WRAPPER postgres_fdw OPTIONS
      (dbname 'quad_se', host 'localhost', port '5432');
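If you want to confirm the server definitions (an optional check, not part of the original recipe), the pg_foreign_server system catalog lists each server together with its options:
      postgis_cookbook=# SELECT srvname, srvoptions FROM pg_foreign_server;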
For this example, we will be using local databases, but the host parameter can be either an IP address or a hostname. The user who executes these commands will be defined as the local owner of the servers.
  11. Create the user mappings in order to be able to connect to the foreign databases. For each server, you provide the login credentials of a user of the corresponding foreign database:
      postgis_cookbook=# CREATE USER MAPPING FOR postgres SERVER quad_nw
      OPTIONS (user 'remoteme1', password 'myPassremote1');
      CREATE USER MAPPING FOR postgres SERVER quad_sw
      OPTIONS (user 'remoteme2', password 'myPassremote2');
      CREATE USER MAPPING FOR postgres SERVER quad_ne
      OPTIONS (user 'remoteme3', password 'myPassremote3');
      CREATE USER MAPPING FOR postgres SERVER quad_se
      OPTIONS (user 'remoteme4', password 'myPassremote4');
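The mappings can be listed through the pg_user_mappings view (an optional check, not part of the original recipe); each server should show the remote user it will connect as:
      postgis_cookbook=# SELECT srvname, usename FROM pg_user_mappings;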
  12. Create the foreign tables, which inherit from the local parent table chp10.hotspots_dist and point to the tables in the remote databases:
      postgis_cookbook=# CREATE FOREIGN TABLE hotspots_quad_nw ()
      INHERITS (chp10.hotspots_dist) SERVER quad_nw
      OPTIONS (table_name 'hotspots_quad_nw');
      CREATE FOREIGN TABLE hotspots_quad_sw () INHERITS (chp10.hotspots_dist)
      SERVER quad_sw OPTIONS (table_name 'hotspots_quad_sw');
      CREATE FOREIGN TABLE hotspots_quad_ne () INHERITS (chp10.hotspots_dist)
      SERVER quad_ne OPTIONS (table_name 'hotspots_quad_ne');
      CREATE FOREIGN TABLE hotspots_quad_se () INHERITS (chp10.hotspots_dist)
      SERVER quad_se OPTIONS (table_name 'hotspots_quad_se');
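To double-check which foreign table is bound to which server (optional, not part of the original recipe), you can query information_schema.foreign_tables:
      postgis_cookbook=# SELECT foreign_table_name, foreign_server_name
      FROM information_schema.foreign_tables;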
Note that the names passed in the table_name option must match the names of the remote tables exactly; since unquoted identifiers are folded to lowercase in PostgreSQL, these names should be written in lowercase.
  13. Create a function that computes the quadrant of each point to be inserted, together with a trigger that redirects the row to the corresponding foreign table. The azimuth from the origin (0 0) to the new point, converted to degrees, falls in [0, 90) for the NE quadrant, [90, 180) for SE, [180, 270) for SW, and [270, 360) for NW. Returning NULL from the trigger function cancels the insert on the parent table itself, so each row is stored only in its shard:
      postgis_cookbook=# CREATE OR REPLACE
      FUNCTION __trigger_users_before_insert(
      ) RETURNS trigger AS $__$
      DECLARE
        angle double precision;
      BEGIN
        EXECUTE $$ SELECT (ST_Azimuth(ST_GeomFromText('POINT(0 0)', 4326), $1)
          / (2 * pi())) * 360 $$ INTO angle USING NEW.the_geom;
        IF (angle >= 0 AND angle < 90) THEN
          EXECUTE $$ INSERT INTO hotspots_quad_ne (the_geom) VALUES ($1) $$
          USING NEW.the_geom;
        END IF;
        IF (angle >= 90 AND angle < 180) THEN
          EXECUTE $$ INSERT INTO hotspots_quad_se (the_geom) VALUES ($1) $$
          USING NEW.the_geom;
        END IF;
        IF (angle >= 180 AND angle < 270) THEN
          EXECUTE $$ INSERT INTO hotspots_quad_sw (the_geom) VALUES ($1) $$
          USING NEW.the_geom;
        END IF;
        IF (angle >= 270 AND angle < 360) THEN
          EXECUTE $$ INSERT INTO hotspots_quad_nw (the_geom) VALUES ($1) $$
          USING NEW.the_geom;
        END IF;
        RETURN NULL;
      END;
      $__$ LANGUAGE plpgsql;
      CREATE TRIGGER users_before_insert
      BEFORE INSERT ON chp10.hotspots_dist
      FOR EACH ROW EXECUTE PROCEDURE __trigger_users_before_insert();
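To see why the angle ranges map to the quadrants this way, you can evaluate the azimuth for a sample point by hand (an illustrative query, not part of the original recipe); a point in the NE quadrant, such as (10 10), yields 45 degrees:
      postgis_cookbook=# SELECT degrees(ST_Azimuth(
        ST_GeomFromText('POINT(0 0)', 4326),
        ST_GeomFromText('POINT(10 10)', 4326)));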
  14. Insert the test coordinates (10, 10), (-10, 10), and (-10, -10). The first one should be stored in the NE quadrant, the second in the NW quadrant, and the third in the SW quadrant:
      postgis_cookbook=# INSERT INTO chp10.hotspots_dist (the_geom)
      VALUES (ST_GeomFromText('POINT(10 10)', 4326));
      INSERT INTO chp10.hotspots_dist (the_geom)
      VALUES (ST_GeomFromText('POINT(-10 10)', 4326));
      INSERT INTO chp10.hotspots_dist (the_geom)
      VALUES (ST_GeomFromText('POINT(-10 -10)', 4326));
  15. Check the data insertion, both in the local parent table (which behaves as a logical view over all the shards) and in the foreign table hotspots_quad_ne:
      postgis_cookbook=# SELECT ST_AsText(the_geom)
      FROM chp10.hotspots_dist;
  16. As can be seen, the local version shows all the points that were inserted. Now, execute the query against a remote database:
      postgis_cookbook=# SELECT ST_AsText(the_geom) FROM hotspots_quad_ne;

The remote database only has the point that it should store, based on the trigger function defined earlier.
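If you want to see which shard each row actually lives in, PostgreSQL's tableoid system column reports, for every row read through the parent table, the child table it came from (an optional check, not part of the original recipe):
      postgis_cookbook=# SELECT tableoid::regclass AS shard, ST_AsText(the_geom)
      FROM chp10.hotspots_dist;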

  17. Now, insert all the points from the original hotspots table, imported in step 8. For this test, we will just insert the geometry information. Execute the following SQL statement:
      postgis_cookbook=# INSERT INTO chp10.hotspots_dist (the_geom)
      SELECT the_geom FROM chp10.hotspots;
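To verify how the bulk insert was distributed (an optional check, not part of the original recipe), count the rows per shard through the parent table:
      postgis_cookbook=# SELECT tableoid::regclass AS shard, count(*)
      FROM chp10.hotspots_dist GROUP BY shard ORDER BY shard;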
  18. As in step 15, in order to check whether the results were classified and stored correctly, execute the following queries against the local table hotspots_dist and the remote table hotspots_quad_ne:
      postgis_cookbook=# SELECT ST_AsText(the_geom)
      FROM chp10.hotspots_dist;
  19. The results show the first 10 points stored in the local logical version of the database. Now run the same query against the remote table:
      postgis_cookbook=# SELECT ST_AsText(the_geom) FROM hotspots_quad_ne;
  20. The results show the first 10 points stored in the remote database, with all the points in the NE quadrant. The points indeed all have positive latitude and longitude values. When displayed in a GIS application, the points all fall within the NE quadrant of the map.
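As a final sanity check (an extra query, not part of the original recipe), you can confirm that no point with a negative coordinate slipped into the NE shard; this count should be zero:
      postgis_cookbook=# SELECT count(*) FROM hotspots_quad_ne
      WHERE ST_X(the_geom) < 0 OR ST_Y(the_geom) < 0;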