Traffic Count Data Publishing
The traffic count data publishing tools load traffic count data collected from road tubes into a series of Esri ArcSDE feature classes as well as the City of Austin's Open Data Portal.
The general workflow of the processing is as follows:
- Traffic engineer technicians export traffic count reports to a network fileshare.
- A series of Python scripts translates and merges the traffic count reports into a database-friendly schema.
- An FME Server workspace loads the processed count reports into an Esri ArcSDE geodatabase.
- The CTM GEODATAPUSHER application synchronizes the file geodatabase with an ArcSDE feature class on GISMAINT1.
The core components of the tool are described below.
Traffic engineer technicians process raw traffic count data collected from road tubes using the TimeMark VIAS software. Using the VIAS software, the engineers export three traffic count reports in CSV format, each of which corresponds to a downstream feature class on GISMAINT1. The three traffic count report types are:
- Classification
- Speed
- Volume
Each traffic count report shares common 'Data File' and 'Site Code' identifiers, which uniquely identify the traffic study. Engineer technicians export each of the three traffic count reports and store them in a network fileshare at [path to unprocessed source files]\[study_year]
The filename of each CSV report follows the pattern "data file name" + "report type". For example, a traffic count with data file identifier "RambleLn802BlkBD" has three corresponding reports:
- RambleLn802BlkBDCls.csv << Traffic Classification Report
- RambleLn802BlkBDSpd.csv << Traffic Speed Report
- RambleLn802BlkBDVol.csv << Traffic Volume Report
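The filename convention above can be expressed as a small helper. The suffix-to-report-type mapping comes from the examples; the function name itself is hypothetical, not part of the tooling:

```python
# Map the TimeMark filename suffix to its report type (per the examples above).
REPORT_SUFFIXES = {
    "Cls": "CLASSIFICATION",
    "Spd": "SPEED",
    "Vol": "VOLUME",
}

def report_filenames(data_file_id):
    """Return the three expected CSV report names for one traffic study."""
    return {
        report_type: f"{data_file_id}{suffix}.csv"
        for suffix, report_type in REPORT_SUFFIXES.items()
    }
```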
A series of Python scripts translates the TimeMark traffic count reports into a database-ready schema. The scripts are:
- traffic_count_cls.py <= translates classification reports
- traffic_count_spd.py <= translates speed reports
- traffic_count_vol.py <= translates volume reports
- traffic_count_loader.py <= merges processed reports of any type into a master table
- traffic_count_pub.py <= publishes master data to data.austintexas.gov (in-progress)
The source code for the translation scripts is available here: https://github.com/cityofaustin/transportation-data-publishing/
The translation scripts are run on a nightly basis from the Austin Transportation Arterial Management Division scripting VM (ATDATMSSCRIPT).
Each traffic count report processed by the translation scripts is moved to: [path to unprocessed source files]\[study_year]\processed\
The translated traffic count reports are stored at: [path to data-friendly traffic count root directory]\SOURCE_FILES\[report_type]
Where report_type is one of CLASSIFICATION, VOLUME, or SPEED.
The translated traffic counts are merged into master tables located here: [path to data-friendly traffic count root directory]\MASTER_DATA\
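The merge into the master tables can be sketched as a keyed upsert over row dictionaries. Keying on `ROW_ID` is an assumption based on the FME workflow described below; the function name is hypothetical:

```python
def merge_into_master(master_rows, new_rows, key="ROW_ID"):
    """Merge translated report rows into a master table.

    Rows in new_rows replace any master row sharing the same key
    (keying on ROW_ID is an assumption, not confirmed by the docs).
    """
    merged = {row[key]: row for row in master_rows}
    for row in new_rows:
        merged[row[key]] = row  # insert new rows, overwrite matching ones
    return list(merged.values())
```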
This work is ongoing.
Each FME workspace evaluates each row in each traffic count report and does the following:
- Determine whether the record should be written as an INSERT or an UPDATE (based on the row's ROW_ID attribute)
- If the record is new, generate a unique primary key (TRAFFIC_COUNT_ID) for the row
- INSERT or UPDATE the record in the corresponding feature class in traffic_count.gdb
- Once all rows in the report have been processed, the report is moved to: G:\ATD\ATD_GIS\02_ENT_APPLICATIONS\TRAFFIC_COUNT\SOURCE_FILES\[report_type]\PROCESSED
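The INSERT/UPDATE decision the workspace makes per row can be sketched as follows. Sequential TRAFFIC_COUNT_ID assignment via a counter is an assumption for illustration; the actual key-generation scheme lives inside the FME workspace:

```python
import itertools

def classify_rows(report_rows, existing_row_ids, next_id):
    """Decide INSERT vs UPDATE for each report row based on ROW_ID, and
    assign a new TRAFFIC_COUNT_ID to inserts (ID scheme is an assumption)."""
    id_counter = itertools.count(next_id)
    actions = []
    for row in report_rows:
        if row["ROW_ID"] in existing_row_ids:
            actions.append(("UPDATE", row))  # ROW_ID already in the feature class
        else:
            # New record: generate a unique primary key before inserting
            row = dict(row, TRAFFIC_COUNT_ID=next(id_counter))
            actions.append(("INSERT", row))
    return actions
```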
The FME server workspaces are maintained here: G:\ATD\ATD_GIS\02_ENT_APPLICATIONS\TRAFFIC_COUNT\FME
TBD
The database schemas for each traffic count report type are available here: G:\ATD\ATD_GIS\02_ENT_APPLICATIONS\TRAFFIC_COUNT\SCHEMA