API#
The entire Polaris library is predicated on the existence of a folder that contains all the data for a particular model, including input data and model results.
- class polaris.project.polaris.Polaris(project_folder=None, config_file=None)#
Bases:
object
Python interface for all things Polaris
Running polaris models with this interface is trivial
model = Polaris.from_dir("D:/src/argonne/MODELS/bloomington")
model.run()
- __init__(project_folder=None, config_file=None)#
- classmethod from_dir(project_folder, config_file=None)#
- classmethod from_config_file(config_file)#
- classmethod build_from_git(model_dir, city, db_name=None, overwrite=False, inplace=False, branch='main')#
Clones a polaris project from git and builds it into a runnable model
from polaris import Polaris
Polaris.from_dir(Polaris.build_from_git("e:/models/from_git", "atlanta"))
- property is_open: bool#
- open(model_path: PathLike, config_file: str | None = None) None #
Opens a Polaris model in memory. When a config file is provided, the model tries to load it
- Parameters:
model_path (PathLike) – Complete path for the folder containing the Polaris model.
config_file – Optional. Name of the convergence control yaml we want to work with. Defaults to convergence_control.yaml.

from polaris import Polaris
model = Polaris()
model.open('path/to/model', 'convergence_control_modified.yaml')
- property model_path: Path#
Path to the loaded project
- property supply_file: Path#
Path to the supply file in project
- property demand_file: Path#
Path to the demand file in project
- property result_file: Path#
Path to the result sqlite file in project
- property result_h5_file: Path#
Path to the result h5 file in project
- property network#
- property latest_output_dir: Path#
- property router#
- property skims#
- upgrade(max_migration: str | None = None, force_migrations: List[str] | None = None)#
Upgrade the underlying databases to the latest / greatest schema version.
- Parameters:
max_migration – a string (date) defining the latest migration id that should be applied. Useful if working with an older version of the POLARIS executable which isn’t compatible with the latest schema.
model.upgrade("202402")  # only apply upgrades (migrations) from before February 2024
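The cutoff idea can be illustrated with a small sketch. This is not the Polaris implementation; it only assumes date-prefixed migration ids (e.g. "20240115"), which sort lexicographically, so a max_migration cutoff reduces to a string comparison:

```python
# Illustrative sketch (not the Polaris implementation): date-prefixed
# migration ids sort lexicographically, so a max_migration cutoff can be
# applied with a simple string comparison on the id prefix.
def filter_migrations(pending_ids, max_migration=None):
    """Return the migration ids that should be applied, in order."""
    ids = sorted(pending_ids)
    if max_migration is None:
        return ids
    # Keep only migrations whose id does not sort after the cutoff prefix
    return [m for m in ids if m[:len(max_migration)] <= max_migration]

applied = filter_migrations(["20231101", "20240115", "20240301"], "202402")
# "20240301" is excluded: it dates from after February 2024
```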
- run(**kwargs) None #
- close()#
Eliminates all data from memory
model.close()
Since all three submodules are fairly independent, we list them below.
Network module#
The network module is predicated on the existence of a single network project that will be under analysis/modification, and therefore all submodules responsible for different capabilities will be generated from this network object, below.
- class polaris.network.network.Network#
Bases:
object
Polaris Network Class
# We can open the network just to figure out its projection
from polaris.network.network import Network

n = Network()
n.open(source)

# We get the projection used in this project
srid = n.srid

# Or we can also get the checker for this network
checker = n.checker

# In case we ran some query while the database was open, we can commit it
n.commit()

# We can create a new network file from scratch as well
new_net = Network()
new_net.srid = srid  # We use the SRID from the known network, for example

# If we don't have Spatialite on our system path environment variables, we can
# add it manually before attempting to create a new database
new_net.add_spatialite('path/to/spatialite/folder')

# If the second parameter is True, an SQLite file pre-populated with Spatialite
# extensions is used
new_net.new('path/to/new/file', True)

# Closing the connection is simple
n.close()
- __init__()#
Instantiates the network
- static from_file(network_file: PathLike, run_consistency=False)#
- static create(network_file: PathLike, srid: int, jumpstart: bool = False) None #
Creates a new empty network file
- Args:
network_file (str): Full path to the network file to be created.
srid (int): Spatial reference ID (SRID) for the new network.
jumpstart (bool): Copies base SQL already loaded with the Spatialite extension. It saves a few seconds of runtime.
- new(network_file: str, jumpstart=False) None #
- open(network_file: PathLike, run_consistency=False)#
Opens project for editing/querying
- Args:
network_file (str): Full path to the network file to be opened.
- upgrade() None #
Updates the network to the latest version available
- close(clear_issues=False)#
Closes database connection
- set_debug(level=10)#
Sets logging to debug mode throughout the package.
As a result of this method call, logging will become extremely verbose. Use only when necessary.
- Args:
level (int): Logging level to be used (DEBUG: 10, INFO: 20, WARN: 30, ERROR: 40, CRITICAL: 50)
- log_to_terminal()#
Adds the terminal as a logging output
- property transit: Transit#
Transit manipulation class
- property active#
Active transport network creation class
- property checker#
Network checker class
- property populate#
Network population class
- property diagnostics#
Network diagnostics class
- property geo_consistency#
Geo-consistency analysis class
- property osm: OSM#
- property geotools#
- property data_tables: DataTableCache#
- property consistency#
Network consistency class
- property ie: ImportExport#
Network Import-Export class
- get_location(location_id: int) Location #
Location object
- get_intersection(node: int) Intersection #
Network intersection class
- clear_editing_records()#
Removes previously addressed records from the editing table
- clear_log()#
Clears log file
- static has_edit_table(path_to_file)#
Due to the complexity of some of the supply model components, this page only lists the package submodules; for further detail on manipulating traffic and public transport components, go to the dedicated pages listed in the left sidebar.
Main submodules#
The PolarisNetwork has 6 submodules that bring critical features to the network, which are organized as follows.
Accessing these submodules is as simple as opening the network and using method calls dedicated to returning class instances of each submodule already properly configured for the network at hand.
import os
from polaris.network.network import Network
net_file = 'D:/src/argonne/chicago2018-Supply.sqlite'
net = Network()
net.open(net_file)
checker = net.checker
walk_nt = net.active
transit = net.transit
geo_con = net.geo_consistency
consistency = net.consistency
Analyze submodule#
Key Performance Indicators#
The KPI submodules are designed to compile data for a range of summary metrics useful when evaluating results and comparing across iterations and across model runs.
The ResultKPI class allows all metrics to be computed and cached for each simulation run, so it is convenient to compute them once the simulation is over for easy future access.
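The compute-once-and-cache idea can be sketched in a few lines. All names below are illustrative, not the actual ResultKPIs internals (which persist their cache to disk):

```python
# Minimal sketch of the compute-once caching pattern behind ResultKPIs
# (hypothetical names; the real class caches metrics on disk per run).
class MetricCache:
    def __init__(self):
        self._cache = {}
        self.computations = 0  # track how often we actually compute

    def get(self, name, compute_fn, skip_cache=False):
        """Return a cached metric, computing it only when needed."""
        if skip_cache or name not in self._cache:
            self._cache[name] = compute_fn()
            self.computations += 1
        return self._cache[name]

cache = MetricCache()
vmt = cache.get("vmt", lambda: 123.4)  # computed on first access
vmt = cache.get("vmt", lambda: 123.4)  # served from the cache afterwards
```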
- class polaris.analyze.result_kpis.ResultKPIs(inputs: PolarisInputs, cache_dir: Path, population_scale_factor: float)
Bases:
object
This class provides an easy way to extract relevant metrics for a single simulation run of POLARIS. The easiest way to generate an instance is via the factory method from_iteration, which takes the path to the outputs of a simulation run (or a ConvergenceIteration()).

from polaris.analyze.result_kpi import ResultKPIs
kpis = ResultKPIs.from_iteration(ref_project_dir / f"{city}_iteration_2")
Metric comparison plots can then be generated in a notebook using:
results = KpiComparator()
results.add_run(kpis, 'an-arbitrary-label')
results.plot_mode_share()
results.plot_vmt()
results.plot_vmt_by_link_type()
Any number of runs can be added using add_run up to the limit of readability on the generated plots.
- result_time_step = 3600
- __init__(inputs: PolarisInputs, cache_dir: Path, population_scale_factor: float)
- classmethod from_iteration(iteration: ConvergenceIteration, **kwargs)
Create a KPI object from a ConvergenceIteration object.
- classmethod from_dir(iteration_dir: Path, **kwargs)
Create a KPI object from a given directory.
- classmethod from_args(files: PolarisInputs, iteration_dir: Path, cache_name: str = 'kpi.cache', clear_cache=False, exit_if_no_cache=False, population_scale_factor=None)
- cache_all_available_metrics(verbose=True, metrics_to_cache=None)
- classmethod available_metrics()
- close()
- get_kpi_value(kpi_name)
- get_cached_kpi_value(kpi_name, skip_cache=False, force_cache=False)
- has_cached_kpi(kpi_name)
- cached_metrics()
- metric_summary()
- metric_gaps()
- static one_value(conn, query, default=0)
- metric_population()
- metric_num_adults()
- metric_num_employed()
- metric_num_persons_by_age_band_5()
- metric_num_persons_by_age_band_10()
- metric_num_hh()
- metric_num_hh_by_hh_size()
- metric_tts()
- metric_distance_by_act_type()
- metric_planned_modes()
- metric_executed_modes()
- metric_mode_shares()
- metric_executed_activity_mode_share_by_income()
- metric_activity_start_distribution()
- metric_activity_rate_distribution()
- metric_activity_duration_distribution()
- metric_vmt_vht()
- metric_ev_charging()
- metric_vmt_vht_by_link()
- load_link_types()
- metric_activity_distances()
- metric_vehicle_technology()
- metric_tnc_times_by_tnc_operator()
- metric_pmt_pht_by_tnc_mode()
- metric_vmt_vht_by_tnc_mode()
- metric_vmt_vht_by_tnc_operator()
- metric_empty_vmt_vht_by_tnc_operator()
- metric_tnc_result_db_by_tnc_operator()
- metric_avo_by_tnc_operator()
- metric_road_pricing()
- metric_transit_boardings()
- network_gaps_by_x(x)
- metric_network_gaps_by_link_type()
- metric_network_gaps_by_hour()
- metric_skim_stats()
- metric_rmse_vs_observed()
- metric_calibration_act_gen()
- metric_calibration_mode_share()
- metric_calibration_timing()
- metric_calibration_destination()
- metric_trip_length_distribution()
- metric_activity_start_time_distributions()
- metric_traffic_cumulative_gap()
The KPI comparator leverages KPIs produced by the ResultKPI class and produces a series of useful plots, which are also accessible through the QPolaris interface.
- class polaris.analyze.kpi_comparator.KpiComparator
Bases:
object
This class provides an easy way to group together multiple runs of POLARIS and compare their outputs. Run KPIs are added along with a string-based name which is used as the label for that run in any subsequently generated plots.
from polaris.analyze.kpi_comparator import KpiComparator

results = KpiComparator()
results.add_run(ResultKPIs.from_iteration(ref_project_dir / f"{city}_iteration_2"), 'REF_iteration_2')
results.add_run(ResultKPIs.from_iteration(eval_project_dir / f"{city}_iteration_2"), 'EVAL_iteration_2')
Metric comparison plots can then be generated in a notebook using:
results.plot_mode_share()
results.plot_vmt()
results.plot_vmt_by_link_type()
Any number of runs can be added using add_run up to the limit of readability on the generated plots.
The object can also be used to generate a set of csv files for input into Excel (if you really have to use Excel):
results.dump_to_csvs(output_dir = "my_csv_dump_dir")
- __init__()
- add_run(kpi: ResultKPIs, run_id: str)
- has_run(run_id)
- dump_to_csvs(output_dir, metrics_to_dump=None, **kwargs)
- plot_everything(**kwargs)
- classmethod available_plots()
- plot_mode_share(**kwargs)
- plot_population(**kwargs)
- plot_congestion_pricing(**kwargs)
- plot_transit(**kwargs)
- add_iter()
- across_iterations(cols, **kwargs)
- plot_act_dist(act_type: str | None = None, **kwargs)
- plot_vmt(**kwargs)
- plot_vehicle_connectivity(**kwargs)
- plot_vmt_by_link_type(**kwargs)
- plot_gaps(**kwargs)
- static plot_multiple_gaps(kpi_results)
- plot_pax_in_network(**kwargs)
- plot_cpu_mem(**kwargs)
- plot_network_gaps(**kwargs)
- plot_skim_stats(show_min_max=False, **kwargs)
- plot_trip_length_distributions(max_dist=None, **kwargs)
- plot_activity_start_time_distributions(**kwargs)
- plot_tnc(**kwargs)
- plot_rmse_vs_observed(**kwargs)
- plot_calibration_for_activity_generation(**kwargs)
- plot_calibration_for_mode_share(**kwargs)
- plot_calibration_timing(**kwargs)
- plot_calibration_destination(**kwargs)
Spatial metrics#
This module is the basis for all the transit metrics mapping in QPolaris. Note, however, that a couple of summary tables are created in the demand database the first time the computation of demand results is invoked, which may be somewhat time-consuming.
- class polaris.analyze.spatial_metrics.SpatialMetrics(inputs: PolarisInputs)
Bases:
object
Spatial metrics class that provides access to metrics associated with spatially-enabled elements of the model.
from polaris.runs.polaris_inputs import PolarisInputs
from polaris.analyze.spatial_metrics import SpatialMetrics

inputs = PolarisInputs.from_dir('path/to/model_dir')
metrics_object = SpatialMetrics(inputs)
- __init__(inputs: PolarisInputs)
- classmethod from_iteration(iteration: ConvergenceIteration)
Create a Spatial metrics object from a ConvergenceIteration object.
- classmethod from_dir(iteration_dir: Path)
Create a Spatial metrics object from a given directory.
- transit_supply_metrics() TransitSupplyMetrics
Returns a class of
TransitSupplyMetrics()
- transit_demand_metrics() TransitDemandMetrics
Returns a class of
TransitDemandMetrics()
- tnc_metrics() TNCMetrics
Returns a class of
TNCMetrics()
- set_demand_file(demand_file: Path)
- set_supply_file(supply_file: Path)
- set_result_file(result_file: Path)
Among these, the transit analysis submodules are designed to compile summary data for stops, patterns and routes, for both the supply used in the model run, as well as for the results recorded in the demand database.
Other metrics#
These methods allow for retrieval of path and activity data directly from the demand database for more detailed analysis.
- class polaris.analyze.path_metrics.PathMetrics(demand_file: Path, h5_file: Path)#
Bases:
object
Loads all data required for the computation of metrics on Paths.
- __init__(demand_file: Path, h5_file: Path)#
- Parameters:
demand_file (Path) – Path to the demand file we want to compute metrics for
h5_file (Path) – Path to the result H5 file we want to compute metrics for
- property data: DataFrame#
- class polaris.analyze.activity_metrics.ActivityMetrics(supply_file: Path, demand_file: Path)#
Bases:
DemandTableMetrics
Loads all data required for the computation of metrics on activities.
Time filtering defaults to instant zero whenever from_time is not provided and to the end of the simulation when to_time is not provided.
- __init__(supply_file: Path, demand_file: Path)#
- Parameters:
supply_file (Path) – Path to the supply file corresponding to the demand file we will compute metrics for
demand_file (Path) – Path to the demand file we want to compute metrics for
- get_trips(aggregation='zone') DataFrame #
Queries all trips for the current set of filters and for the aggregation of choice
- Parameters:
aggregation (str) – Filter to see either "location" or "zone". Default is "zone"
- Returns:
Statistics DataFrame
- Return type:
DataFrame
- set_mode(mode: str, mode_share=False)#
- set_start_hour(start_hour: int)#
- set_end_hour(end_hour: int)#
The hour for the end (INCLUDED) of the period.
- set_time_period(time_period: str)#
Sets the time period we want to retrieve statistics about
- property modes#
- property types#
- property data: DataFrame#
- property locations: DataFrame#
- vehicle_trip_matrix(from_start_time: float, to_start_time: float)#
Returns the expected trip matrix for the UNIVERSE of trips starting between from_start_time and to_start_time, according to the results of the ABM seen in the Activities table
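The time-filtering default described for this class can be sketched as follows. Names here are illustrative (END_OF_SIMULATION is an assumed constant, not part of the ActivityMetrics API):

```python
# Sketch of the assumed time-filtering default: a missing from_time falls
# back to instant zero, a missing to_time to the end of the simulation.
END_OF_SIMULATION = 86400  # e.g. one day of simulated seconds

def resolve_time_window(from_time=None, to_time=None):
    """Resolve an (start, end) window using the documented defaults."""
    start = 0 if from_time is None else from_time
    end = END_OF_SIMULATION if to_time is None else to_time
    return start, end

resolve_time_window()                # full simulation window
resolve_time_window(from_time=3600)  # from one hour in until the end
```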
Skim module#
The skim module is designed to provide convenient access to the Skim matrices produced by Polaris, particularly for visualization in QGIS and exporting to OMX.
- class polaris.skims.highway.highway_skim.HighwaySkim(filename: PathLike | None = None)#
Bases:
SkimBase
Polaris Skims class
from polaris.skims.highway.highway_skim import HighwaySkim

skims = HighwaySkim()

# Load skims for highway
skims.open('path/to/hwy/skims')

# accessing skims is easy
m1 = skims.time[1440]     # time for the interval 1440
m2 = skims.distance[720]  # distance for the interval 720
m3 = skims.cost[240]      # cost for the interval 240

# We can also access skims like we do for PT
time_morning = skims.get_skims(interval=240, metric="time")
- prefix = 'highway'#
- __init__(filename: PathLike | None = None)#
- open(path_to_file: PathLike)#
Loads the highway skim data
- Args:
path_to_file (str): Full file path to the highway skim
- create_empty(intervals: list, zones: int)#
Creates a new skim data cube for a given set of intervals and number of zones. All matrices are filled with zeros.
- Args:
intervals (list): List of all intervals this skim file should have
zones (int): Number of zones for this skim
- get_skims(interval=None, metric=None, **kwargs)#
Gets skim data for the specified interval/metric. These filters are not, however, required. If one or more parameters is not provided, a dictionary (or nested dictionaries) will be returned.
- Args:
interval (int, optional): The time interval of interest
metric (str, optional): Metric of interest
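The fallback behavior can be illustrated with a small mock. This is not the Polaris implementation; it only shows the pattern of returning a single matrix when all filters are given and a dictionary keyed by the unspecified dimension otherwise:

```python
# Illustrative mock (not the Polaris implementation) of the get_skims
# fallback: full filters return a matrix, partial filters return dicts.
SKIMS = {
    240: {"time": [[0.0, 5.0], [5.0, 0.0]],
          "distance": [[0.0, 2.1], [2.1, 0.0]]},
}

def get_skims(interval=None, metric=None):
    data = SKIMS
    if interval is None:
        return data          # dict of {interval: {metric: matrix}}
    data = data[interval]
    if metric is None:
        return data          # dict of {metric: matrix}
    return data[metric]      # a single matrix

matrix = get_skims(interval=240, metric="time")
by_metric = get_skims(interval=240)  # dictionary keyed by metric name
```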
- remove_interval(interval: int)#
Removes one interval from this skim. The operation happens in memory only; it does NOT alter the skim on disk.
- Args:
interval (int): Interval to remove from the skim
- add_interval(interval: int, copy_interval=None)#
Adds a new interval to the skim matrix
- Args:
interval (int): Interval to be added to the skim data cube
copy_interval (int, optional): Interval to be copied into the new interval. Arrays of zeros are added if not provided
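The copy-or-zeros semantics can be sketched with plain nested lists. The data layout here is hypothetical (the real class stores skim matrices internally); only the behavior is illustrated:

```python
# Sketch of the copy-or-zeros semantics of add_interval, using a plain
# dict of square matrices as a stand-in for the skim data cube.
def add_interval(cube, zones, interval, copy_interval=None):
    """Add an interval, copying an existing one or filling with zeros."""
    if copy_interval is not None:
        # Deep-copy the rows so later edits don't leak between intervals
        cube[interval] = [row[:] for row in cube[copy_interval]]
    else:
        cube[interval] = [[0.0] * zones for _ in range(zones)]
    return cube

cube = {240: [[0.0, 1.5], [1.5, 0.0]]}
add_interval(cube, 2, 480, copy_interval=240)  # 480 mirrors 240
add_interval(cube, 2, 720)                     # 720 starts as all zeros
```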
- write_to_file(path_to_file: PathLike)#
- class polaris.skims.transit.transit_skim.TransitSkim(filename: str | None = None)#
Bases:
SkimBase
Polaris Transit Skim class
from polaris.skims import TransitSkim

skims = TransitSkim()

# Load skims for transit
skims.open('path/to/pt/skims')

# We can retrieve skims
# The keys are not case-sensitive
bus_time_morning = skims.get_skims(interval=240, metric="time", mode="bus")

# to get the metrics and modes available you can do
skims.modes
# or
skims.metrics

# We can also export it
skims.export('path/to/omx/file.omx')
- prefix = 'transit'#
- __init__(filename: str | None = None)#
- open(path_to_file: PathLike)#
Loads the transit skim data
- Args:
path_to_file (str): Full file path to the transit skim
- get_skims(interval=None, metric=None, mode=None)#
Gets skim data for the specified mode/interval/metric. These filters are not, however, required. If one or more parameters is not provided, a dictionary (or nested dictionaries) will be returned.
- Args:
interval (int, optional): The time interval of interest
metric (str, optional): Metric of interest
mode (str, optional): Name of the transport mode
- create_empty(intervals: list, zones: int)#
Creates a new skim data cube for a given set of intervals and number of zones. All matrices are filled with zeros.
- Args:
intervals (list): List of all intervals this skim file should have
zones (int): Number of zones for this skim
- remove_interval(interval: int)#
Removes one interval from this skim. The operation happens in memory only; it does NOT alter the skim on disk.
- Args:
interval (int): Interval to remove from the skim
- add_interval(interval: int, copy_interval=None)#
Adds a new interval to the skim matrix
- Args:
interval (int): Interval to be added to the skim data cube
copy_interval (int, optional): Interval to be copied into the new interval. Arrays of zeros are added if not provided
- add_mode(mode_name: str, copy_mode: str)#
Adds a new mode to the skim matrix
- Args:
mode_name (str): Mode name to be added to the skim data cube
copy_mode (str, optional): Mode to be copied into the new mode. Arrays of zeros are added if not provided
- add_metric(metric: str, copy_metric: str)#
Adds a new metric to the skim matrix
- Args:
metric (str): Metric to be added to the skim data cube
copy_metric (str, optional): Metric to be copied into the new metric. Arrays of zeros are added if not provided
- write_to_file(path_to_file: PathLike)#