diff --git a/README.md b/README.md index a8c29e0..ed5711c 100644 --- a/README.md +++ b/README.md @@ -8,20 +8,26 @@ The Travel Time Request App is a simple web application designed to help City st This app was originally developed as a [class project by U of T students](https://www.youtube.com/watch?v=y6lnefduogo) in partnership with the City, though it has undergone substantial development by the Data & Analytics Unit since then. ## How to use the app -When you [visit the app](https://trans-bdit.intra.prod-toronto.ca/traveltime-request/), you will be prompted to add/create at least one of each of the following: -* a corridor, drawn on the map -* a time range, given in hours of the day, 00 - 23 -* a date range (note that the end of the date range is exclusive) -* a day of week selection -* a selection of whether or not to include statutory holidays -The app will combine these factors together to request travel times for all possible combinations. If one of each type of factor is selected, only a single travel time will be estimated with the given parameters. +### Via the front-end user-interface + +When you [visit the app](https://trans-bdit.intra.prod-toronto.ca/traveltime-request/), you will be prompted to create at least one of each of the following factors: + +| Factor | Description | +| ----- | ----------- | +| Corridor | Drawn on the map, it is a shortest path between two intersections of your choice. Draw it in both directions if you need both directions of travel. | +| Time Range | Times must start and end on the hour and the app accepts integer values between 0 and 24. The final hour is _exclusive_, meaning that a range of 7am to 9am covers two hours, not three. Values of 0 and 24 both interchangeably represent midnight; a time range of 0 - 24 will return all hours of the day. A time range starting after it ends (e.g. 10pm to 4am) will wrap around midnight[^1]. | +| Date Range | Use the calendar widget to select a date range. 
Note that selected ranges are displayed with an exclusive end date. | +| Day of Week | Identify the days of week to include in the aggregation. | +| Holiday Inclusion | Decide whether to include or exclude Ontario's statutory holidays if applicable. You can also opt to do it both ways. | + +The app will combine these factors together to request travel times for all valid combinations. If one of each type of factor is selected, only a single travel time will be estimated with the given parameters. Once each factor type has been validly entered it will turn from red to green. Once one or more of each type of factor is ready, a button will appear allowing you to submit the query. Once the data is returned from the server (this can take a while when there are many combinations to process) you will be prompted to download the data as either CSV or JSON. If you have any trouble using the app, please send an email to Nate Wessel (nate.wessel@toronto.ca) or feel free to open an issue in this repository if you are at all familiar with that process. -## Outputs +#### Outputs The app can return results in either CSV or JSON format. The fields in either case are the same: @@ -40,10 +46,15 @@ The app can return results in either CSV or JSON format. The fields in either ca | `mean_travel_time_minutes` | The mean travel time in minutes is given as a floating point number rounded to three decimal places. Where insufficient data was available to complete the request, the value will be null. | | `mean_travel_time_seconds` | Same as above, but measured in seconds. | +### By querying the back-end API directly + +The front-end UI pulls all data from the backend service available at https://trans-bdit.intra.prod-toronto.ca/tt-request-backend/. This API defines several endpoints, all of which return JSON-structured data. Those endpoints are documented at the link above. 
+ +Generally, the API returns much more data than is available through the UI and this allows some extended use cases which are just starting to be documented in the [`analysis/`](./analysis) folder. These may include looking at travel time variability within a given window and conducting statistical comparisons between different time periods. ## Methodology -Data for travel time estimation through the app are sourced from [HERE](https://github.com/CityofToronto/bdit_data-sources/tree/master/here)'s [traffic API](https://developer.here.com/documentation/traffic-api/api-reference.html) and are available back to about 2017. HERE collects data from motor vehicles that report their speed and position to HERE, most likely as a by-poduct of the driver making use of an in-car navigation system connected to the Internet. +Data for travel time estimation through the app are sourced from [HERE](https://github.com/CityofToronto/bdit_data-sources/tree/master/here)'s [traffic API](https://developer.here.com/documentation/traffic-api/api-reference.html) and are available back to 2017-09-01. HERE collects data from motor vehicles that report their speed and position to HERE, most likely as a by-product of the driver making use of an in-car navigation system connected to the Internet. The number of vehicles within the City of Toronto reporting their position to HERE in this way has been [estimated](./analysis/total-fleet-size.r) to be around 2,000 to 3,000 vehicles during the AM and PM peak periods, with lower numbers in the off hours. While this may seem like a lot, in practice many of these vehicles are on the highways and the coverage of any particular city street within a several hour time window can be very minimal if not nil. For this reason, we are currently restricting travel time estimates to "arterial" streets and highways. 
@@ -59,8 +70,10 @@ We aggregate corridors together spatially as necessary into larger corridors whe ### Other means of estimating travel times -The City also has [bluetooth sensors](https://github.com/CityofToronto/bdit_data-sources/blob/master/bluetooth/README.md) at some intersections which can be used to get a more reliable measure of travel time. These sensors pick up a much larger proportion of vehicles than the HERE data, making it possible to do a temporally fine-grained analysis. The sensors however are only in a few locations, especially in the downtown core and along the Gardiner and DVP expressways. +The City also has [bluetooth sensors](https://github.com/CityofToronto/bdit_data-sources/blob/master/bluetooth/README.md) at some intersections which can be used to get a more reliable measure of travel time. These sensors pick up a much larger proportion of vehicles than the HERE data, making it possible to do a temporally fine-grained analysis. The sensors, however, are only in a few locations, mainly in the downtown core and along the Gardiner and DVP expressways. ## Development For information on development and deployment, see [Running the App](./running-the-app.md). + +[^1]: Time ranges that wrap midnight will result in some discontinuity of periods because of the interaction with the date range and day-of-week parameters. For example, if you select the time range `[22,2)` but only have one date within your date range (e.g. `[2024-01-01,2024-01-02)`) then the aggregation will include both the period from `[2024-01-01 00:00:00, 2024-01-01 02:00:00)` and `[2024-01-01 22:00:00, 2024-01-02 00:00:00)`, averaged together. That is, both the early morning and late evening of the same day. 
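The README's "querying the back-end API directly" section can be made concrete with a short client-side sketch. The helper below is illustrative only (it is not part of this PR's code); it assembles the path for the `aggregate-travel-times` endpoint in the same parameter order the backend route expects, matching the test URL given in `routes.py`.

```python
# Illustrative client sketch (not part of this PR): build a request path
# for the aggregate-travel-times endpoint described above.
BASE = 'https://trans-bdit.intra.prod-toronto.ca/tt-request-backend'

def aggregate_travel_times_path(start_node, end_node, start_time, end_time,
                                start_date, end_date, include_holidays, dow_list):
    """Assemble the endpoint path; parameter order matches the backend route."""
    dow = ''.join(str(d) for d in dow_list)
    return (f'/aggregate-travel-times/{start_node}/{end_node}'
            f'/{start_time}/{end_time}/{start_date}/{end_date}'
            f'/{str(include_holidays).lower()}/{dow}')

path = aggregate_travel_times_path(
    30310940, 30310942,          # start and end nodes of the corridor
    9, 12,                       # 9am (inclusive) to noon (exclusive)
    '2020-05-01', '2020-06-01',  # date range, exclusive end
    True,                        # include statutory holidays
    [2],                         # Tuesdays only (ISODOW)
)
# path == '/aggregate-travel-times/30310940/30310942/9/12/2020-05-01/2020-06-01/true/2'
# A real request would then be, e.g.:
#   import requests
#   data = requests.get(BASE + path).json()
```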
\ No newline at end of file diff --git a/backend/app/getGitHash.py b/backend/app/getGitHash.py new file mode 100644 index 0000000..f3063fa --- /dev/null +++ b/backend/app/getGitHash.py @@ -0,0 +1,4 @@ +from subprocess import check_output + +def getGitHash(): + return check_output(['git', 'rev-parse', 'HEAD']).decode('ascii').strip() diff --git a/backend/app/get_centreline_links.py b/backend/app/get_centreline_links.py new file mode 100644 index 0000000..5edabab --- /dev/null +++ b/backend/app/get_centreline_links.py @@ -0,0 +1,49 @@ +import json +from app.db import getConnection + +links_query = ''' +WITH centreline_path AS ( + SELECT unnest(links)::int AS centreline_id + FROM gis_core.get_centreline_btwn_intersections( + %(from_node_id)s, + %(to_node_id)s + ) +) + +SELECT + centreline_id, + linear_name_full_legal AS st_name, + ST_AsGeoJSON(geom) AS geojson, + ST_length(ST_Transform(geom, 2952)) AS length_m, + from_intersection_id, + to_intersection_id +FROM centreline_path +JOIN gis_core.centreline_latest USING (centreline_id) +''' + +# returns a json with geometries of links between two nodes +def get_centreline_links(from_node_id, to_node_id, map_version='23_4'): + with getConnection() as connection: + with connection.cursor() as cursor: + cursor.execute( + links_query, + { + "from_node_id": from_node_id, + "to_node_id": to_node_id + } + ) + + links = [ + { + 'centreline_id': centreline_id, + 'name': st_name, + #'sequence': seq, + 'geometry': json.loads(geojson), + 'length_m': length_m, + 'source': source, + 'target': target + } for centreline_id, st_name, geojson, length_m, source, target in cursor.fetchall() + ] + + connection.close() + return links diff --git a/backend/app/get_links.py b/backend/app/get_here_links.py similarity index 96% rename from backend/app/get_links.py rename to backend/app/get_here_links.py index b2a3b65..cfe411c 100644 --- a/backend/app/get_links.py +++ b/backend/app/get_here_links.py @@ -28,7 +28,7 @@ ''' # returns a json with 
geometries of links between two nodes -def get_links(from_node_id, to_node_id, map_version='23_4'): +def get_here_links(from_node_id, to_node_id, map_version='23_4'): parsed_links_query = sql.SQL(links_query).format( routing_function = sql.Identifier(f'get_links_btwn_nodes_{map_version}'), street_geoms_table = sql.Identifier(f'routing_streets_{map_version}'), diff --git a/backend/app/get_nearest_centreline_node.py b/backend/app/get_nearest_centreline_node.py new file mode 100644 index 0000000..c15bfa7 --- /dev/null +++ b/backend/app/get_nearest_centreline_node.py @@ -0,0 +1,48 @@ +from app.db import getConnection +from json import loads as loadJSON + +SQL = ''' +WITH nearest_centreline AS ( + SELECT + intersection_id, + geom::geography <-> ST_MakePoint(%(longitude)s, %(latitude)s)::geography AS distance + FROM gis_core.intersection_latest + ORDER BY geom <-> ST_SetSRID(ST_MakePoint(%(longitude)s, %(latitude)s), 4326) ASC + LIMIT 1 +) + +SELECT + intersection_id AS centreline_id, + ST_AsGeoJSON(geom) AS geojson, + distance, + ARRAY_AGG(DISTINCT linear_name_full_from) AS street_names +FROM nearest_centreline +JOIN gis_core.intersection_latest AS ci USING (intersection_id) +GROUP BY + intersection_id, + geom, + distance +''' + +def get_nearest_centreline_node(longitude, latitude): + """ + Return the nearest node from the latest city centreline network + + arguments: + longitude (float): longitude of the point to search around + latitude (float): latitude of the point to search around + """ + node = {} + with getConnection() as connection: + with connection.cursor() as cursor: + cursor.execute(SQL, {'longitude': longitude, 'latitude': latitude}) + centreline_id, geojson, distance, street_names = cursor.fetchone() + node = { + 'centreline_id': centreline_id, + 'street_names': street_names, + 'geometry': loadJSON(geojson), + 'distance': distance + } + connection.close() + return node + diff --git a/backend/app/get_node.py b/backend/app/get_node.py index ce72521..3c0955f 
100644 --- a/backend/app/get_node.py +++ b/backend/app/get_node.py @@ -2,21 +2,23 @@ import json from app.db import getConnection +from app.get_nearest_centreline_node import get_nearest_centreline_node SQL = ''' SELECT - ST_AsGeoJSON(cg_nodes.geom) AS geom, + ST_AsGeoJSON( + ST_GeometryN(here_nodes.geom, 1) -- necessary because currently stored as a multi-point + ) AS geom, array_agg(DISTINCT InitCap(streets.st_name)) FILTER (WHERE streets.st_name IS NOT NULL) AS street_names -FROM congestion.network_nodes AS cg_nodes -JOIN here.routing_nodes_21_1 AS here_nodes USING (node_id) -JOIN here_gis.streets_att_21_1 AS streets USING (link_id) -WHERE node_id = %(node_id)s +FROM here.routing_nodes_23_4 AS here_nodes +JOIN here_gis.streets_att_23_4 AS streets USING (link_id) +WHERE here_nodes.node_id = %(node_id)s GROUP BY - node_id, - cg_nodes.geom; + here_nodes.node_id, + here_nodes.geom; ''' -def get_node(node_id): +def get_node(node_id, conflate_with_centreline=False): node = {} with getConnection() as connection: with connection.cursor() as cursor: @@ -27,5 +29,11 @@ def get_node(node_id): 'street_names': street_names, 'geometry': json.loads(geojson) } + if conflate_with_centreline: + lon = node['geometry']['coordinates'][0] + lat = node['geometry']['coordinates'][1] + node['conflated'] = { + 'centreline': get_nearest_centreline_node(lon, lat) + } connection.close() return node diff --git a/backend/app/get_travel_time.py b/backend/app/get_travel_time.py index c411f92..67e249f 100644 --- a/backend/app/get_travel_time.py +++ b/backend/app/get_travel_time.py @@ -1,12 +1,15 @@ """Function for returning data from the aggregate-travel-times/ endpoint""" from app.db import getConnection -from app.get_links import get_links +from app.get_here_links import get_here_links from app.selectMapVersion import selectMapVersion +from traveltimetools.utils import timeFormats import numpy import math import pandas import random +import json +from app.getGitHash import getGitHash # the way 
we currently do it def mean_daily_mean(obs): @@ -19,23 +22,53 @@ def mean_daily_mean(obs): # average the days together return numpy.mean(daily_means) -def timeFormat(seconds): - return { - 'seconds': round(seconds,3), - 'minutes': round(seconds/60,3), - # format travel times in seconds like a clock for humans to read - 'clock': f'{math.floor(seconds/3600):02d}:{math.floor((seconds/60)%60):02d}:{round(seconds%60):02d}' - } +def checkCache(uri): + query = f''' + SELECT results + FROM nwessel.cached_travel_times + WHERE uri_string = %(uri)s AND commit_hash = %(hash)s + ''' + connection = getConnection() + with connection: + with connection.cursor() as cursor: + try: + cursor.execute(query, {'uri': uri, 'hash': getGitHash()}) + for (record,) in cursor: # will skip if no records + return record # there could only be one + except: + pass + +def cacheAndReturn(obj,uri): + query = f''' + INSERT INTO nwessel.cached_travel_times (uri_string, commit_hash, results) + VALUES (%(uri)s, %(hash)s, %(results)s) + ''' + connection = getConnection() + with connection: + with connection.cursor() as cursor: + try: + cursor.execute(query, {'uri': uri, 'hash': getGitHash(), 'results': json.dumps(obj)}) + finally: + return obj def get_travel_time(start_node, end_node, start_time, end_time, start_date, end_date, include_holidays, dow_list): """Function for returning data from the aggregate-travel-times/ endpoint""" + # first check the cache + cacheURI = f'/{start_node}/{end_node}/{start_time}/{end_time}/{start_date}/{end_date}/{str(include_holidays).lower()}/{"".join(map(str,dow_list))}' + cachedValue = checkCache(cacheURI) + if cachedValue: + return cachedValue + holiday_clause = '' if not include_holidays: holiday_clause = '''AND NOT EXISTS ( SELECT 1 FROM ref.holiday WHERE ta.dt = holiday.dt )''' + # if end_time is less than the start_time, then we wrap around midnight + ToD_and_or = 'AND' if end_time > start_time else 'OR' + query = f''' SELECT link_dir, @@ -45,8 +78,10 @@ def 
get_travel_time(start_node, end_node, start_time, end_time, start_date, end_ FROM here.ta WHERE link_dir = ANY(%(link_dir_list)s) - AND tod >= %(start_time)s::time - AND tod < %(end_time)s::time + AND ( + tod >= %(start_time)s::time + {ToD_and_or} tod < %(end_time)s::time + ) AND date_part('ISODOW', dt) = ANY(%(dow_list)s) AND dt >= %(start_date)s::date AND dt < %(end_date)s::date @@ -55,7 +90,7 @@ def get_travel_time(start_node, end_node, start_time, end_time, start_date, end_ map_version = selectMapVersion(start_date, end_date) - links = get_links( + links = get_here_links( start_node, end_node, map_version @@ -114,7 +149,7 @@ def get_travel_time(start_node, end_node, start_time, end_time, start_date, end_ if len(sample) < 1: # no travel times or related info to return here - return { + return cacheAndReturn({ 'results': { 'travel_time': None, 'observations': [], @@ -126,7 +161,7 @@ def get_travel_time(start_node, end_node, start_time, end_time, start_date, end_ 'corridor': {'links': links, 'map_version': map_version}, 'query_params': query_params } - } + }, cacheURI) tt_seconds = mean_daily_mean(sample) @@ -140,22 +175,22 @@ def get_travel_time(start_node, end_node, start_time, end_time, start_date, end_ p95lower, p95upper = numpy.percentile(sample_distribution, [2.5, 97.5]) reported_intervals = { 'p=0.95': { - 'lower': timeFormat(p95lower), - 'upper': timeFormat(p95upper) + 'lower': timeFormats(p95lower,1), + 'upper': timeFormats(p95upper,1) } } - return { + return cacheAndReturn({ 'results': { - 'travel_time': timeFormat(tt_seconds), + 'travel_time': timeFormats(tt_seconds,1), 'confidence': { 'sample': len(sample), 'intervals': reported_intervals }, - 'observations': [timeFormat(tt) for (dt,tt) in sample] + 'observations': [timeFormats(tt,1) for (dt,tt) in sample] }, 'query': { 'corridor': {'links': links, 'map_version': map_version}, 'query_params': query_params } - } + },cacheURI) diff --git a/backend/app/routes.py b/backend/app/routes.py index 
578ed5c..346df02 100644 --- a/backend/app/routes.py +++ b/backend/app/routes.py @@ -1,24 +1,49 @@ import json, re from datetime import datetime -from flask import jsonify +from flask import jsonify, request from app import app from app.db import getConnection from app.get_closest_nodes import get_nodes_within from app.get_node import get_node from app.get_travel_time import get_travel_time - -from app.get_links import get_links - +from app.get_here_links import get_here_links +from app.get_centreline_links import get_centreline_links +from app.getGitHash import getGitHash @app.route('/') def index(): + """Provide basic documentation about the available resources. + + All endpoints return JSON-formatted data. + """ + return jsonify({ + 'description': 'Travel Time App backend', + 'available_endpoints': [ + { + 'path': str(rule), + 'docstring': app.view_functions[rule.endpoint].__doc__ + } for rule in app.url_map.iter_rules() + ] + }) + +@app.route('/version') +def version(): + """Return the Git hash of the current application HEAD""" return jsonify({ - 'description': 'Travel Time App backend root', - 'endpoints': [str(rule) for rule in app.url_map.iter_rules()] + 'git-HEAD': getGitHash() }) # test URL /closest-node/-79.3400/43.6610 @app.route('/nodes-within/<meters>/<longitude>/<latitude>', methods=['GET']) -def closest_node(meters,longitude,latitude): +def closest_node(meters, longitude, latitude): + """Return up to 20 nodes within a given radius (in meters) of a point. + + Nodes are drawn from the Congestion Network, i.e. are fairly major intersections. 
+ + Arguments: + meters (float): distance around latitude and longitude to search + latitude (float): latitude of point to search around + longitude (float): longitude of point to search around + """ try: longitude = float(longitude) latitude = float(latitude) @@ -30,17 +55,44 @@ def closest_node(meters,longitude,latitude): # test URL /node/30357505 @app.route('/node/<node_id>', methods=['GET']) def node(node_id): + """Returns information about a given node in the Here street network. + + This uses the latest map version and may not recognize an older node_id. + + arguments: + node_id (int): identifier of the node in the latest Here map version + optional GET arg ?doConflation will also return the nearest node in the centreline network + """ try: node_id = int(node_id) except: return jsonify({'error': "node_id should be an integer"}) - return jsonify(get_node(node_id)) -# test URL /link-nodes/30421154/30421153 + doConflation = False + if request.args.get('doConflation') is not None: + doConflation = True + + return jsonify(get_node(node_id, doConflation)) + +# test URL /link-nodes/here/30421154/30421153 #shell function - outputs json for use on frontend -@app.route('/link-nodes/<from_node_id>/<to_node_id>', methods=['GET']) -def get_links_between_two_nodes(from_node_id, to_node_id): - """Returns links of the shortest path between two nodes on the HERE network""" +@app.route('/link-nodes/<network>/<from_node_id>/<to_node_id>') +def get_here_links_between_two_nodes(network, from_node_id, to_node_id): + """Returns a list of links/edges defining the shortest path between two nodes. 
+ + Each link has + * an ID (centreline_id or link_dir, depending on the reference network) + * a geometry, GeoJSON style + * a length in meters + * the name of the street + * source and target nodes in the reference network + Routing is done in PostgreSQL using `here_gis.get_links_btwn_nodes_{map_version}` + + arguments: + network (str): reference network to use; either 'here' or 'centreline' + from_node_id (int): origin node ID on the reference network + to_node_id (int): destination node ID on the reference network + """ try: from_node_id = int(from_node_id) to_node_id = int(to_node_id) @@ -50,35 +102,37 @@ def get_links_between_two_nodes(from_node_id, to_node_id): if from_node_id == to_node_id: return jsonify({'error': "Source node can not be the same as target node."}), 400 - links = get_links(from_node_id, to_node_id) + if network == 'here': + links = get_here_links(from_node_id, to_node_id) + elif network == 'centreline': + links = get_centreline_links(from_node_id, to_node_id) + else: + return jsonify({'error': "Network should be one of ['here','centreline']"}), 400 return jsonify({ "source": from_node_id, "target": to_node_id, - "links": links, - # the following three fields are for compatibility and should eventually be removed - "path_name": "", - "link_dirs": [ link['link_dir'] for link in links ], - "geometry": { - "type": "MultiLineString", - "coordinates": [ link['geometry']['coordinates'] for link in links ] - } + "links": links }) - - # test URL /aggregate-travel-times/30310940/30310942/9/12/2020-05-01/2020-06-01/true/2 -@app.route( - '/aggregate-travel-times/<start_node>/<end_node>/<start_time>/<end_time>/<start_date>/<end_date>/<include_holidays>/<dow_str>', - methods=['GET'] -) -# - start_node, end_node (int): the congestion network / HERE node_id's -# - start_time, end_time (int): starting (inclusive), ending (exclusive) hours of aggregation -# - start_date, end_date (YYYY-MM-DD): start (inclusive), end (exclusive) date of aggregation -# - include_holidays(str, boolean-ish): 'true' will include holidays -# - dow_list(str): flattened 
list of integers, i.e. [1,2,3,4] -> '1234', representing days of week to be included (ISODOW) +@app.route('/aggregate-travel-times/<start_node>/<end_node>/<start_time>/<end_time>/<start_date>/<end_date>/<include_holidays>/<dow_str>') def aggregate_travel_times(start_node, end_node, start_time, end_time, start_date, end_date, include_holidays, dow_str): + """ + Return averaged travel times given the specified parameters. + + This function just parses arguments and otherwise wraps around `get_travel_time`, which does the actual work of aggregation, returning averaged travel times along the selected corridor during the specified dates and times. + Also returns some helpful diagnostic data such as the parsed query args, the route identified between the nodes, and some measures of sampling error. + + Arguments: + start_node, end_node (int): HERE network node_id's from the current Here map version + start_time, end_time (int): starting (inclusive), ending (exclusive) hours. May include leading zeros. If the end_time is less than the start_time, the range will wrap around midnight. + start_date, end_date (str, YYYY-MM-DD): start (inclusive), end (exclusive) dates. end_date must be greater than start_date. + include_holidays (str, boolean): 'true' will include holidays, 'false' will exclude them if applicable + dow_str (str): concatenated list of integers representing days of week to be included; ISODOW specification. E.g. [6,7] -> '67' for Saturday and Sunday only. 
+ """ try: start_node = int(start_node) end_node = int(end_node) @@ -116,6 +170,7 @@ def aggregate_travel_times(start_node, end_node, start_time, end_time, start_dat # test URL /date-bounds @app.route('/date-range', methods=['GET']) def get_date_bounds(): + """Returns the dates of the earliest and latest available travel time data.""" connection = getConnection() with connection: with connection.cursor() as cursor: @@ -130,7 +185,10 @@ def get_date_bounds(): # test URL /holidays @app.route('/holidays', methods=['GET']) def get_holidays(): - "Return dates of all known holidays in ascending order" + """Return dates of all Ontario holidays in ascending order. + + Holidays will fully cover the range of any available travel time data. + """ connection = getConnection() query = f""" SELECT diff --git a/backend/requirements.txt b/backend/requirements.txt index bd39afc..2177ad6 100644 --- a/backend/requirements.txt +++ b/backend/requirements.txt @@ -6,4 +6,5 @@ packaging==20.7 pandas==2.2.2 psycopg==3.1.18 python-dotenv==0.15.0 -numpy==1.26.0 \ No newline at end of file +numpy==1.26.0 +git+ssh://git@github.com/Toronto-Big-Data-Innovation-Team/traveltimetools.git@v1.0.0 \ No newline at end of file diff --git a/backend/tables/travel_time_cache.sql b/backend/tables/travel_time_cache.sql new file mode 100644 index 0000000..a030f0e --- /dev/null +++ b/backend/tables/travel_time_cache.sql @@ -0,0 +1,8 @@ +CREATE TABLE nwessel.cached_travel_times ( + uri_string text CHECK(uri_string ~ '^\/\d+\/\d+\/\d{1,2}\/\d{1,2}\/\d{4}-\d{2}-\d{2}\/\d{4}-\d{2}-\d{2}\/(true|false)\/[1-7]{1,7}$'), + commit_hash text, + results jsonb NOT NULL, + PRIMARY KEY (uri_string, commit_hash) +); + +GRANT SELECT, INSERT ON nwessel.cached_travel_times TO tt_request_bot; diff --git a/frontend/package-lock.json b/frontend/package-lock.json index 35bc9a3..26cbaed 100644 --- a/frontend/package-lock.json +++ b/frontend/package-lock.json @@ -3643,9 +3643,9 @@ } }, "node_modules/cross-spawn": { - "version": 
"7.0.3", - "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz", - "integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==", + "version": "7.0.6", + "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz", + "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==", "dev": true, "dependencies": { "path-key": "^3.1.0", @@ -4606,9 +4606,9 @@ } }, "node_modules/express": { - "version": "4.21.1", - "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz", - "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==", + "version": "4.21.2", + "resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz", + "integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==", "dependencies": { "accepts": "~1.3.8", "array-flatten": "1.1.1", @@ -4629,7 +4629,7 @@ "methods": "~1.1.2", "on-finished": "2.4.1", "parseurl": "~1.3.3", - "path-to-regexp": "0.1.10", + "path-to-regexp": "0.1.12", "proxy-addr": "~2.0.7", "qs": "6.13.0", "range-parser": "~1.2.1", @@ -4644,6 +4644,10 @@ }, "engines": { "node": ">= 0.10.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/express" } }, "node_modules/express/node_modules/debug": { @@ -6551,9 +6555,9 @@ "integrity": "sha512-TvmkNhkv8yct0SVBSy+o8wYzXjE4Zz3PCesbfs8HiCXXdcTuocApFv11UWlNFWKYsP2okqrhb7JNlSm9InBhIw==" }, "node_modules/nanoid": { - "version": "3.3.7", - "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.7.tgz", - "integrity": "sha512-eSRppjcPIatRIMC1U6UngP8XFcz8MQWGQdt1MTBQ7NaAmvXDfvNxbvWV3x2y6CdEUciCSsDHDQZbhYaB8QEo2g==", + "version": "3.3.8", + "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.8.tgz", + "integrity": 
"sha512-WNLf5Sd8oZxOm+TzppcYk8gVOgP+l58xNy58D0nbUnOxOWRWvlcCV4kUF7ltmI6PsrLl/BgKEyS4mqsGChFN0w==", "dev": true, "funding": [ { @@ -6981,9 +6985,9 @@ "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==" }, "node_modules/path-to-regexp": { - "version": "0.1.10", - "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.10.tgz", - "integrity": "sha512-7lf7qcQidTku0Gu3YDPc8DJ1q7OOucfa/BSsIwjuh56VU7katFvuM8hULfkwB3Fns/rsVF7PwPKVw1sl5KQS9w==" + "version": "0.1.12", + "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz", + "integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==" }, "node_modules/path-type": { "version": "4.0.0", diff --git a/frontend/src/Sidebar/index.jsx b/frontend/src/Sidebar/index.jsx index b97faed..6deec17 100644 --- a/frontend/src/Sidebar/index.jsx +++ b/frontend/src/Sidebar/index.jsx @@ -153,7 +153,7 @@ function Welcome(){

Toronto Historic Travel Times

This application allows you to query averaged motor vehicle travel - times across the city, as far back as 2017. Data come from a small + times across the city, as far back as 2017-09-01. Data come from a small sample of probe vehicles that report their location data to Here. For more information on this application and our methodology diff --git a/frontend/src/segment.js b/frontend/src/segment.js index bd60ea6..09a41d0 100644 --- a/frontend/src/segment.js +++ b/frontend/src/segment.js @@ -15,7 +15,7 @@ export class Segment { get fromIntersection(){ return this.#fromIntersection } get toIntersection(){ return this.#toIntersection } fetchLinks(){ - return fetch(`${domain}/link-nodes/${this.#fromIntersection.id}/${this.toIntersection.id}`) + return fetch(`${domain}/link-nodes/here/${this.#fromIntersection.id}/${this.toIntersection.id}`) .then( resp => resp.json() ) .then( ({links}) => { this.#links = links diff --git a/frontend/src/timeRange.js b/frontend/src/timeRange.js index 22ef535..7a14e5b 100644 --- a/frontend/src/timeRange.js +++ b/frontend/src/timeRange.js @@ -21,7 +21,7 @@ export class TimeRange extends Factor { if(!(this.#startTime && this.#endTime)){ return false } - return this.startHour < this.endHour + return this.startHour != this.endHour } get name(){ if(this.#startTime || this.#endTime){ @@ -68,7 +68,11 @@ export class TimeRange extends Factor { } get hoursInRange(){ // how many hours are in the timeRange? if(! this.isComplete){ return undefined } - return this.endHour - this.startHour + if(this.endHour > this.startHour){ + return this.endHour - this.startHour + } else { + return 24 - this.startHour + this.endHour + } } }
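The midnight-wrap behaviour introduced above in `timeRange.js` (and the matching `AND`/`OR` switch in `get_travel_time.py`'s SQL) can be summarized in a small sketch. This is a Python transliteration for illustration only, not code from this PR; it assumes `start != end`, which the frontend's `isComplete` check enforces.

```python
def hours_in_range(start_hour, end_hour):
    # Mirrors hoursInRange in timeRange.js: a range whose end does not
    # come after its start wraps around midnight.
    if end_hour > start_hour:
        return end_hour - start_hour
    return 24 - start_hour + end_hour

def tod_in_range(hour, start_hour, end_hour):
    # Mirrors the AND/OR switch in the backend SQL: within a wrapping
    # range, an hour matches if it is after the start OR before the end.
    if end_hour > start_hour:
        return start_hour <= hour < end_hour
    return hour >= start_hour or hour < end_hour

# hours_in_range(7, 9)   -> 2  (7am-9am, end-exclusive)
# hours_in_range(22, 2)  -> 4  (10pm-2am, wrapping midnight)
# tod_in_range(23, 22, 2) -> True
```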