justjkk edited this page Aug 20, 2011 · 9 revisions

Transit Routing Tutorial

Installation from source

Pre-requisites

  1. cmake
  2. postgresql-server
  3. boost-graph library 1.46

Ubuntu

The following commands and instructions should help you install and set up pgRouting on a fresh Ubuntu system.

Installing Postgresql

sudo apt-get install postgresql
#TODO: Create a user
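The TODO above can be completed along these lines; the role name transit_user is an example, not something the tutorial prescribes:

```shell
# Create a database role for this tutorial; 'transit_user' is an example name.
# --pwprompt asks for a password, --createdb lets the role create databases.
sudo -u postgres createuser --pwprompt --createdb transit_user
```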

Installing Postgis

sudo apt-get install postgis

Refer here for instructions on creating a postgis template database.
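In case the linked instructions are unavailable, a minimal sketch of creating such a template follows; the contrib paths below assume PostgreSQL 8.4 with PostGIS 1.5 and will differ on other versions:

```shell
# Create an empty database and load the PostGIS functions into it.
# Adjust the contrib path to match your PostgreSQL/PostGIS versions.
sudo -u postgres createdb template_postgis
sudo -u postgres psql -d template_postgis -f /usr/share/postgresql/8.4/contrib/postgis-1.5/postgis.sql
sudo -u postgres psql -d template_postgis -f /usr/share/postgresql/8.4/contrib/postgis-1.5/spatial_ref_sys.sql
# Mark the database as a template so createdb -T can use it.
sudo -u postgres psql -c "UPDATE pg_database SET datistemplate = TRUE WHERE datname = 'template_postgis';"
```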

Installing Boost-1.46

If you are using Ubuntu 11.10, Boost 1.46 can be installed from the repositories:

sudo apt-get install libboost-graph1.46-dev

Otherwise, download the source and copy the graph library into the system include folder:

cd /tmp
wget http://sourceforge.net/projects/boost/files/boost/1.46.1/boost_1_46_1.tar.gz
tar -xzvf boost_1_46_1.tar.gz
sudo mkdir -p /usr/local/include/boost
sudo cp -r boost_1_46_1/boost/graph /usr/local/include/boost/

Installing Build utilities

sudo apt-get install gcc g++ cmake

Installing Required libraries

sudo apt-get install postgresql-server-dev-all libpq-dev libcunit1-dev \
libgmp-dev

Compiling

PGROUTING_PATH=/usr/local/src/pgrouting
cd $PGROUTING_PATH
cmake -DWITH_TRANSIT=1 .
make
sudo make install

Testing

  1. Copy test/settings.sample.py into settings.py
  2. Edit settings.py with the actual database username, password, etc.
  3. If you don't have a postgis template, refer here
  4. Invoke make test from inside PGROUTING_PATH
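The four steps above can be sketched as shell commands, assuming settings.py is expected next to the sample file:

```shell
cd /usr/local/src/pgrouting          # PGROUTING_PATH from the Compiling section
cp test/settings.sample.py test/settings.py
editor test/settings.py              # fill in database username, password, etc.
make test                            # run the test suite
```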

Configuring Database

Create a transit database

DBNAME=transit
createdb $DBNAME -T template_postgis
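Equivalently, the database can be created from psql; this is just the SQL form of the createdb call above:

```shell
# SQL equivalent of: createdb transit -T template_postgis
sudo -u postgres psql -c "CREATE DATABASE transit TEMPLATE template_postgis;"
```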

Loading GTFS data

GTFS data can be easily loaded using gtfs2pgrouting. Please check the project's README file for instructions.

Loading pgrouting sql functions

Open a client connection to the transit database (psql transit) and run the following:

\i /usr/share/pgrouting/routing_core.sql
\i /usr/share/pgrouting/routing_core_wrappers.sql
\i /usr/share/pgrouting/routing_transit.sql
\i /usr/share/pgrouting/routing_transit_wrappers.sql -- This step may take several minutes, depending on the volume of GTFS data loaded.
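The same four files can also be loaded non-interactively from the shell with psql -f, which is convenient for scripting:

```shell
# Load the pgRouting SQL functions into the transit database, in order.
for f in routing_core routing_core_wrappers routing_transit routing_transit_wrappers; do
  psql -d transit -f "/usr/share/pgrouting/$f.sql"
done
```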

Executing Queries

Preparing Dataset for routing

SELECT prepare_scheduled('gtfs'); -- where 'gtfs' is the schema name

The prepare_scheduled function adds a new column named 'stop_id_int4' to the 'stops' table. NOTE: The above query may take several minutes to execute and may consume 100% CPU while it runs.
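To confirm that the preparation step succeeded, you can check for the new column from the shell:

```shell
# Should list 'stop_id_int4' among the columns of gtfs.stops.
psql -d transit -c "SELECT column_name FROM information_schema.columns \
  WHERE table_schema = 'gtfs' AND table_name = 'stops';"
```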

Executing core queries

SELECT s.stop_id, trip_id, waiting_time, travel_time
FROM scheduled_route('gtfs',
  (SELECT stop_id_int4 FROM gtfs.stops WHERE stop_id = 'SRC_ID'),
  (SELECT stop_id_int4 FROM gtfs.stops WHERE stop_id = 'DEST_ID'),
  extract(epoch from timestamp '20-Aug-2011 8:30PM Asia/Kolkata')::INTEGER
) sr,
gtfs.stops s
WHERE sr.stop_id = s.stop_id_int4;

The above query returns a list of stops (changeovers) and the trips to take between them, along with the waiting time and travel time (in seconds) for each trip.

Executing wrapped queries

The same result can be obtained using a wrapper function:

SELECT stop_id, trip_id, waiting_time, travel_time
FROM gtfs_route('SRC_ID', 'DEST_ID', '20-Aug-2011 8:30PM Asia/Kolkata');