LA Metro Open Data - what's available and where can I find it? While we have no official "Open Data Portal", there is some data out there. Our Developer Site is the de facto aggregator at the moment. These are the broad categories of data we currently have available:
Government should release machine-readable data openly for the public to use, by default. Doing so benefits both the public and government itself. Open data that is released should be:
Contact: Naomi Iwasaki (Office of Race & Equity)
Equity Focus Communities Map/Data: https://www.arcgis.com/home/webmap/viewer.html?webmap=e2b8b6597bcd4db9a1c47737595de070&extent=-119.1491,33.6991,-117.184,34.6794
Figure out how we can combine our work with what's being done on OneMAP (contacts: Anika and Kenneth)
OneMap Hub for employee guides on how to use ArcGIS: https://la-metro-onemap-program-lametro.hub.arcgis.com/
ArcGIS Online uses Metro SSO: https://lametro.maps.arcgis.com/home/index.html
Not sure what this is, but I believe it's only accessible from the Metro internal network: https://gis.metro.net/portal/
Contact: Dan Nguyen
The Ridership Stats Dashboard and its underlying data are open to our team for rewriting and redesigning. It's built on .NET/VB, and there is no staff in Operations available or knowledgeable enough to continue development in its current form. It is in dire need of improvements to its interface and the ability to download the raw data. (Joe/Rollin)
References from other agencies:
Talk to Ana Vallianatos for contacts.
Automate and possibly move/restructure.
The current process for updating GTFS bus data:
Rollin Baker uploads a zip file of the new GTFS data to Google Drive. He then uses the Google Drive share feature to email the files to a list that he maintains:
amanda.pacheco@goswift.ly; Baker, Rollin BAKERRO@metro.net; bingmapstransit@microsoft.com; chloe.haines@itoworld.com; crivera@devsar.com; data@transitapp.com; drew@interline.io; emma.bridgeford@itoworld.com; fiona_mcdonnell2@apple.com; g.genna@goswift.ly; gabriella@moovit.com; Gordon, John GordonJ@metro.net; ian@interline.io; jacob.whitbeck@pactera.com; jacob.whitbeck@pacteraedge.com; justine.coates@microsoft.com; kayla@goswift.ly; Lam, Lan-Chi LAML@metro.net; maps_data_validation@apple.com; Martinez, Al MartinezAl@metro.net; mgridley@apple.com; michael.jacklin@itoworld.com; moovitaccess@moovit.com; Nguyen, Dan NguyenD@metro.net; nicolas@transit.app; pavel.belsky@moovit.com; rob.gaffney@goswift.ly; shane.reynolds@itoworld.com; Simpson, Joe SimpsonJ@metro.net; support@goswift.ly; transit_contact@group.apple.com; Wang, Peishan WANGP@metro.net; yochi.danino@moovit.com
He includes a brief high-level description of the changes to the GTFS. Our web team receives this email notification of the new GTFS data.
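If the notification step ever gets automated, it could be a small script that emails the same list. A minimal sketch, assuming a hypothetical SMTP relay and a recipients.txt file with one address per line (both placeholders, not existing infrastructure):

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

# Hypothetical values: the real relay host, sender address, and recipient file
# would come from whoever owns the notification step today.
SMTP_HOST = "smtp.example.metro.net"
SENDER = "gtfs-updates@metro.net"
RECIPIENTS = [line.strip() for line in Path("recipients.txt").read_text().splitlines() if line.strip()]

def send_gtfs_notice(change_summary: str, drive_link: str) -> None:
    """Email the GTFS consumer list a high-level summary plus the download link."""
    msg = EmailMessage()
    msg["Subject"] = "New LA Metro bus GTFS available"
    msg["From"] = SENDER
    msg["To"] = ", ".join(RECIPIENTS)
    msg.set_content(f"{change_summary}\n\nDownload: {drive_link}\n")
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    send_gtfs_notice("Brief high-level description of the changes goes here.",
                     "https://drive.google.com/...")
```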
Originally, we would have to process the files to convert their Windows-style line endings to Unix-style. The data would then be published to the GitLab repository as individual unzipped files in addition to the zip file, and the README would be updated with a description of the latest changes as well as a log of the zip file contents.
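A minimal sketch of that normalization step, assuming the shared zip has already been downloaded locally (the file and repo paths are placeholders):

```python
import zipfile
from pathlib import Path

GTFS_ZIP = Path("gtfs_bus.zip")   # zip as received from Google Drive (placeholder name)
REPO_DIR = Path("gtfs-bus-repo")  # local checkout of the publishing repo (placeholder)

def unzip_and_normalize(zip_path: Path, dest: Path) -> None:
    """Extract each GTFS text file and rewrite Windows (CRLF) line endings as Unix (LF)."""
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            raw = zf.read(name)
            (dest / name).write_bytes(raw.replace(b"\r\n", b"\n"))

if __name__ == "__main__":
    unzip_and_normalize(GTFS_ZIP, REPO_DIR)
    # The zip itself is committed alongside the unzipped files, and the README
    # is updated by hand with the change description and a log of the zip contents.
```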
Because of the OpenTripPlanner implementation on metro.net, we would have to publish the new GTFS to GitLab at a specific time - Saturday evening - so that it wouldn't be picked up by OTP too soon. That's because the new GTFS's calendar dates don't start until the day of the shakeup, and we don't release a merged version of the old and new GTFS (because it was too big for Google to consume back in the day).
Now that we don't have OTP on our website anymore, some of these constraints no longer apply and we can redo the way we publish our GTFS. The new setup:
1) It is on GitHub instead of GitLab.
2) It archives old GTFS versions.
3) It can be released well in advance of when service goes into effect.
4) It builds in automation (see the sketch below).
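A rough sketch of what the archive-and-publish step could look like, with placeholder paths and a placeholder repo layout; it assumes a local checkout of the GitHub repo and makes no assumptions about the rest of the pipeline:

```python
import shutil
import subprocess
from datetime import date
from pathlib import Path

REPO = Path("gtfs-bus-repo")         # local checkout of the GitHub repo (placeholder)
ARCHIVE = REPO / "archive"           # where superseded feeds are kept (assumed layout)
CURRENT_ZIP = REPO / "gtfs_bus.zip"  # the feed currently being served (placeholder name)

def archive_and_publish(new_zip: Path) -> None:
    """Move the outgoing feed into archive/, drop in the new one, and commit both changes."""
    ARCHIVE.mkdir(exist_ok=True)
    if CURRENT_ZIP.exists():
        # Stamp the archived copy so old versions stay retrievable by date.
        shutil.move(str(CURRENT_ZIP), str(ARCHIVE / f"gtfs_bus_{date.today():%Y%m%d}.zip"))
    shutil.copy(str(new_zip), str(CURRENT_ZIP))
    subprocess.run(["git", "-C", str(REPO), "add", "-A"], check=True)
    subprocess.run(["git", "-C", str(REPO), "commit", "-m", "Publish new bus GTFS"], check=True)
    subprocess.run(["git", "-C", str(REPO), "push"], check=True)

if __name__ == "__main__":
    archive_and_publish(Path("gtfs_bus_new.zip"))
```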
This will require us to communicate with data consumers as external stakeholders. Internal stakeholders to include: Dan Nguyen and Rollin Baker.
The way GTFS gets published currently:
Normally, bus GTFS updates very infrequently, but starting the week of 9/13, new calendar_dates.txt files will be generated weekly.
This file has two components: the base data, which is generated from HASTUS, and the Dodger Stadium/SoFi Stadium service, which is NOT in HASTUS and is manually created by Rollin Baker. The two datasets can easily be appended. We can also choose to optimize the data by removing non-relevant date ranges.
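A sketch of that append-and-trim step using the standard csv module; the input file names and the definition of a "relevant" window are assumptions, not the actual process:

```python
import csv
from datetime import date, timedelta
from pathlib import Path

BASE = Path("calendar_dates_hastus.txt")      # base data exported from HASTUS (placeholder name)
STADIUM = Path("calendar_dates_stadium.txt")  # Dodger Stadium/SoFi Stadium service (placeholder name)
OUTPUT = Path("calendar_dates.txt")
KEEP_DAYS = 180  # assumed horizon for "relevant" dates

def merge_calendar_dates(base: Path, extra: Path, out: Path) -> None:
    """Append the stadium rows to the base file, keeping only dates from today through the horizon."""
    cutoff = date.today() + timedelta(days=KEEP_DAYS)
    fieldnames, rows = None, []
    for path in (base, extra):
        with path.open(newline="") as f:
            reader = csv.DictReader(f)
            fieldnames = fieldnames or reader.fieldnames
            for row in reader:
                # GTFS dates are YYYYMMDD strings.
                d = date(int(row["date"][:4]), int(row["date"][4:6]), int(row["date"][6:8]))
                if date.today() <= d <= cutoff:
                    rows.append(row)
    with out.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    merge_calendar_dates(BASE, STADIUM, OUTPUT)
```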
The thought is: each week, a new calendar_dates.txt file is generated. Does the stadium service get appended to calendar_dates.txt before it reaches us, or do we want our script to do that?