Simplifying a model network


This example shows how to leverage Polaris-Studio tools to suggest possible simplifications of Polaris networks that are deemed too dense. The main idea is to execute a static traffic assignment with an arbitrary full demand matrix and then look at which links were actually used during the assignment.

The procedure allows the user to execute either an All-or-Nothing assignment or an equilibrium assignment; the latter spreads flows across more paths and therefore retains a larger number of links, yielding a better-connected network.

The algorithm is a basic heuristic, however, so we recommend that users exercise caution and thoroughly review its results when simplifying networks.
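
The core idea (keep only links that appear on at least one shortest path) can be sketched on a toy network. This is not the Polaris implementation, just a pure-Python Dijkstra over a made-up set of links and travel times:

```python
# Toy illustration: run shortest paths between all node pairs and keep
# only the links that appear on at least one shortest path.
import heapq

# Hypothetical network: (from, to) -> travel time.  Link ("a", "d") is a
# slow direct connection that no shortest path will ever use.
links = {
    ("a", "b"): 1.0, ("b", "c"): 1.0, ("c", "d"): 1.0,
    ("a", "d"): 10.0,  # dominated by the a-b-c-d path
}
nodes = {n for link in links for n in link}

def shortest_path_links(origin):
    """Dijkstra from `origin`; returns the links on its shortest-path tree."""
    dist, pred = {origin: 0.0}, {}
    heap = [(0.0, origin)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for (a, b), cost in links.items():
            if a != u:
                continue
            nd = d + cost
            if nd < dist.get(b, float("inf")):
                dist[b], pred[b] = nd, (a, b)
                heapq.heappush(heap, (nd, b))
    return set(pred.values())

used = set()
for origin in nodes:
    used |= shortest_path_links(origin)

print(sorted(used))  # ("a", "d") never appears, so it could be pruned
```

An equilibrium assignment would instead split flow across several near-optimal paths, which is why it tends to keep more links than this all-or-nothing view.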

Icon source: https://foto.wuestenigel.com/strainer-in-close-up

Imports#

import shutil
from pathlib import Path
from uuid import uuid4

import numpy as np

# Polaris-Studio utilities used in this example
from polaris.network.data.data_table_cache import DataTableCache
from polaris.prepare.supply_tables.network.traffic_links import used_links_traffic
from polaris.runs.static_skimmer.static_graph import build_graph
from polaris.utils.database.db_utils import commit_and_close

Create a working copy of the supply file so the original is left untouched

source = Path("/tmp/Grid/Grid-Supply.sqlite")
supply_file = Path("/tmp/Grid") / f"{uuid4().hex}-Supply.sqlite"
shutil.copy(source, supply_file)
PosixPath('/tmp/Grid/2310a19cc11644dbb0a1920944c2210f-Supply.sqlite')

Open the newly-created supply file and build a traffic assignment graph from it

graph = build_graph(supply_file)
# The most basic use is to figure out which links are part of the shortest path between any two zones
links_usage = used_links_traffic(graph, 0.1, algorithm="all-or-nothing")

# Alternatively, we can use biconjugate Frank-Wolfe equilibrium assignment on this
# heavily congested network to *spread the flow* a little, which results in a
# larger number of links being used.
# When algorithm and max_iterations are not provided, the default is 50 equilibrium
# iterations, which may take a while to compute when the network is very large and
# the number of zones is high.

# links_usage = used_links_traffic(graph, 0.1, max_iterations=10)

# Let's get the links that ended up having flow on them (were part of at least one shortest path)
links_to_keep = links_usage.link_id.to_numpy()

# That's what we get back from our little algorithm
links_usage.head()
  link_id  demand_ab  demand_ba  demand_tot  Preload_AB  Preload_BA  Preload_tot  Congested_Time_AB  Congested_Time_BA  Congested_Time_Max  Delay_factor_AB  Delay_factor_BA  Delay_factor_Max    VOC_AB    VOC_BA   VOC_max  PCE_AB  PCE_BA  PCE_tot
0      34        0.4        0.0         0.4         0.0         0.0          0.0           0.519940           0.000000            0.519940              1.0              0.0               1.0  0.001000  0.000000  0.001000     0.4     0.0      0.4
1      35        0.4        0.2         0.6         0.0         0.0          0.0           0.634641           0.634641            0.634641              1.0              1.0               1.0  0.000667  0.000333  0.000667     0.4     0.2      0.6
2      36        0.4        0.1         0.5         0.0         0.0          0.0           0.481215           0.481215            0.481215              1.0              1.0               1.0  0.000667  0.000167  0.000667     0.4     0.1      0.5
3      37        0.1        0.0         0.1         0.0         0.0          0.0           0.691475           0.000000            0.691475              1.0              0.0               1.0  0.000250  0.000000  0.000250     0.1     0.0      0.1
4      38        0.3        0.0         0.3         0.0         0.0          0.0           0.482759           0.000000            0.482759              1.0              0.0               1.0  0.000750  0.000000  0.000750     0.3     0.0      0.3
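
If you want stricter pruning than "carried any flow at all", you can filter the returned frame yourself before extracting the IDs. A sketch using a toy DataFrame with two of the columns shown above (the values and the threshold are made up):

```python
import pandas as pd

# Toy frame mimicking two columns of the output above (values are made up)
links_usage = pd.DataFrame(
    {"link_id": [34, 35, 36, 37, 38],
     "demand_tot": [0.4, 0.6, 0.5, 0.1, 0.3]}
)

# Keep only links whose total demand clears an (arbitrary) threshold
threshold = 0.2
links_to_keep = links_usage.loc[links_usage.demand_tot > threshold, "link_id"].to_numpy()
print(links_to_keep)  # [34 35 36 38]
```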


You probably also want to keep all links of certain hierarchy levels

keep_types = ["FREEWAY", "MINOR", "MAJOR", "RAMP", "EXPRESSWAY", "PRINCIPAL", "FRONTAGE", "BRIDGE", "TUNNEL"]
links = DataTableCache(supply_file).get_table("link").reset_index()
links = links[links["type"].isin(keep_types)]

links_to_keep = np.unique(np.hstack([links_to_keep, links.link]))
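
The np.unique/np.hstack combination above is simply a set union on link IDs: concatenate both arrays, then sort and deduplicate. A minimal demonstration with made-up IDs:

```python
import numpy as np

# Links kept because they carried flow, and links kept because of their type
from_assignment = np.array([34, 35, 36, 38])
from_hierarchy = np.array([35, 101, 102])

# hstack concatenates the arrays; unique sorts and removes duplicates (35)
links_to_keep = np.unique(np.hstack([from_assignment, from_hierarchy]))
print(links_to_keep)  # [ 34  35  36  38 101 102]
```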

WARNING

We highly recommend reviewing the results of the procedure above. You will likely have to tune its parameters, and it is also likely that you will have to fix some issues in the network manually.

We can now delete the links that we are not going to use

with commit_and_close(supply_file, spatial=True) as conn:
    sql = "DELETE FROM Link WHERE link NOT IN (" + ",".join([str(x) for x in links_to_keep]) + ")"
    conn.execute(sql)
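
The deletion builds the ID list directly into the SQL string, which is safe here because the IDs are integers we generated ourselves. The same pattern on a throwaway in-memory database (the table and column names are illustrative, not the full Polaris schema):

```python
import sqlite3

links_to_keep = [1, 3]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Link (link INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO Link VALUES (?)", [(i,) for i in range(1, 6)])

# Same string-building pattern as above: delete every link not in the keep list
sql = "DELETE FROM Link WHERE link NOT IN (" + ",".join(str(x) for x in links_to_keep) + ")"
conn.execute(sql)
conn.commit()

remaining = [row[0] for row in conn.execute("SELECT link FROM Link ORDER BY link")]
print(remaining)  # [1, 3]
```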

Total running time of the script: (0 minutes 0.302 seconds)

Gallery generated by Sphinx-Gallery