EULP Final Run Branch (DO NOT MERGE) #690

Status: Open. Wants to merge 7 commits into base branch `develop`.
7 changes: 3 additions & 4 deletions — measures/BuildExistingModel/measure.rb

```diff
@@ -98,7 +98,7 @@ def run(model, runner, user_arguments)
     buildstock_file = File.join(resources_dir, 'buildstock.rb')
     measures_dir = File.join(resources_dir, 'measures')
     lookup_file = File.join(resources_dir, 'options_lookup.tsv')
-    buildstock_csv = File.absolute_path(File.join(characteristics_dir, 'buildstock.csv')) # Should have been generated by the Worker Initialization Script (run_sampling.rb) or provided by the project
+    buildstock_csv_path = File.absolute_path(File.join(characteristics_dir, 'buildstock.csv')) # Should have been generated by the Worker Initialization Script (run_sampling.rb) or provided by the project
     if workflow_json.is_initialized
       workflow_json = File.join(resources_dir, workflow_json.get)
     else
@@ -111,13 +111,12 @@ def run(model, runner, user_arguments)
     # Check file/dir paths exist
     check_dir_exists(measures_dir, runner)
     check_file_exists(lookup_file, runner)
-    check_file_exists(buildstock_csv, runner)
+    check_file_exists(buildstock_csv_path, runner)
 
     lookup_csv_data = CSV.open(lookup_file, col_sep: "\t").each.to_a
-    buildstock_csv_data = CSV.open(buildstock_csv, headers: true).map(&:to_hash)
 
     # Retrieve all data associated with sample number
-    bldg_data = get_data_for_sample(buildstock_csv_data, building_id, runner)
+    bldg_data = get_data_for_sample(buildstock_csv_path, building_id, runner)
 
     # Retrieve order of parameters to run
     parameters_ordered = get_parameters_ordered_from_options_lookup_tsv(lookup_csv_data, characteristics_dir)
```
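The change above stops materializing the whole buildstock CSV into an array of hashes up front and instead hands `get_data_for_sample` a file path so rows can be streamed. A minimal sketch of the two approaches, using a hypothetical `buildstock.csv` with a `Building` ID column (the file name and columns here are illustrative, not from the PR):

```ruby
require 'csv'
require 'tempfile'

# Hypothetical buildstock.csv: a 'Building' ID column plus one characteristic.
csv = Tempfile.new(['buildstock', '.csv'])
csv.write("Building,Location\n1,CO\n2,NY\n3,TX\n")
csv.flush

# Old approach: parse every row into a hash before any lookup.
all_rows = CSV.open(csv.path, headers: true).map(&:to_hash)
eager = all_rows.find { |r| r['Building'].to_i == 2 }

# New approach: stream rows and stop at the first match; only the
# matching row is ever converted to a hash.
lazy = nil
CSV.open(csv.path, headers: true).each do |row|
  next if row['Building'].to_i != 2
  lazy = row.to_hash
  break
end

puts eager == lazy  # => true; both find {"Building"=>"2", "Location"=>"NY"}
```

For a 550,000-row sampling file read once per simulation, streaming avoids holding the full table in memory inside each worker.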
53 changes: 53 additions & 0 deletions — project_national/eulp_national_2018_final_runs.yml

```yaml
schema_version: '0.3'
buildstock_directory: ../  # Relative to this file or absolute
project_directory: project_national  # Relative to buildstock_directory
output_directory: /projects/enduse/resstock_runs/eulp_res_final_2018_v0
weather_files_path: /shared-projects/buildstock/weather/BuildStock_2018_FIPS.zip  # Relative to this file or absolute path to zipped weather files

baseline:
  n_buildings_represented: 133172057  # ACS 5-yr 2016

sampler:
  type: residential_quota
  args:
    n_datapoints: 550000

workflow_generator:
  type: residential_default
  args:
    residential_simulation_controls:
      timesteps_per_hr: 4
      begin_month: 1
      begin_day_of_month: 1
      end_month: 12
      end_day_of_month: 31
      calendar_year: 2018
    timeseries_csv_export:
      reporting_frequency: Timestep
      include_enduse_subcategories: true
    reporting_measures:
      - measure_dir_name: QOIReport
    server_directory_cleanup:
      retain_in_osm: true
      retain_in_idf: true

eagle:
  n_jobs: 200
  minutes_per_sim: 3
  account: enduse
  sampling:
    time: 60
  postprocessing:
    time: 1439
    n_workers: 32

postprocessing:
  aws:
    region_name: 'us-west-2'
    s3:
      bucket: eulp
      prefix: simulation_output/national_runs/resstock
    athena:
      glue_service_role: service-role/AWSGlueServiceRole-default
      database_name: enduse
      max_crawling_time: 1000  # time to wait for the crawler to complete before aborting it
```
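The `residential_simulation_controls` settings above imply a substantial timeseries volume; a back-of-envelope sketch of the row counts (my arithmetic, not a figure stated in the PR):

```ruby
# Timestep-frequency output over a full non-leap calendar year at
# timesteps_per_hr: 4 (15-minute steps), for n_datapoints: 550000.
timesteps_per_hr  = 4
hours_in_year     = 365 * 24          # 2018 is not a leap year
rows_per_building = timesteps_per_hr * hours_in_year
n_datapoints      = 550_000

puts rows_per_building                 # => 35040 timeseries rows per building
puts rows_per_building * n_datapoints  # => 19272000000 rows across the run
```

Roughly 19 billion timeseries rows per run helps explain the generous `postprocessing` allocation (1439 minutes, 32 workers) in the `eagle` section.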
53 changes: 53 additions & 0 deletions — project_national/eulp_national_2019_final_runs.yml

```yaml
schema_version: '0.3'
buildstock_directory: ../  # Relative to this file or absolute
project_directory: project_national  # Relative to buildstock_directory
output_directory: /projects/enduse/resstock_runs/eulp_res_final_2019_sept14
weather_files_path: /shared-projects/buildstock/weather/BuildStock_2019_FIPS.zip  # Relative to this file or absolute path to zipped weather files

baseline:
  n_buildings_represented: 133172057  # ACS 5-yr 2016

sampler:
  type: residential_quota
  args:
    n_datapoints: 550000

workflow_generator:
  type: residential_default
  args:
    residential_simulation_controls:
      timesteps_per_hr: 4
      begin_month: 1
      begin_day_of_month: 1
      end_month: 12
      end_day_of_month: 31
      calendar_year: 2018
    timeseries_csv_export:
      reporting_frequency: Timestep
      include_enduse_subcategories: true
    reporting_measures:
      - measure_dir_name: QOIReport
    server_directory_cleanup:
      retain_in_osm: true
      retain_in_idf: true

eagle:
  n_jobs: 200
  minutes_per_sim: 3
  account: enduse
  sampling:
    time: 60
  postprocessing:
    time: 1439
    n_workers: 32

postprocessing:
  aws:
    region_name: 'us-west-2'
    s3:
      bucket: eulp
      prefix: simulation_output/national_runs/resstock
    athena:
      glue_service_role: service-role/AWSGlueServiceRole-default
      database_name: enduse
      max_crawling_time: 1000  # time to wait for the crawler to complete before aborting it
```
53 changes: 53 additions & 0 deletions — project_national/eulp_national_TMY3_final_runs.yml

```yaml
schema_version: '0.3'
buildstock_directory: ../  # Relative to this file or absolute
project_directory: project_national  # Relative to buildstock_directory
output_directory: /projects/enduse/resstock_runs/eulp_res_final_TMY3_v0
weather_files_path: /shared-projects/buildstock/weather/BuildStock_TMY3_FIPS.zip  # Relative to this file or absolute path to zipped weather files

baseline:
  n_buildings_represented: 133172057  # ACS 5-yr 2016

sampler:
  type: residential_quota
  args:
    n_datapoints: 550000

workflow_generator:
  type: residential_default
  args:
    residential_simulation_controls:
      timesteps_per_hr: 4
      begin_month: 1
      begin_day_of_month: 1
      end_month: 12
      end_day_of_month: 31
      calendar_year: 2018
    timeseries_csv_export:
      reporting_frequency: Timestep
      include_enduse_subcategories: true
    reporting_measures:
      - measure_dir_name: QOIReport
    server_directory_cleanup:
      retain_in_osm: true
      retain_in_idf: true

eagle:
  n_jobs: 200
  minutes_per_sim: 3
  account: enduse
  sampling:
    time: 60
  postprocessing:
    time: 1439
    n_workers: 32

postprocessing:
  aws:
    region_name: 'us-west-2'
    s3:
      bucket: eulp
      prefix: simulation_output/national_runs/resstock
    athena:
      glue_service_role: service-role/AWSGlueServiceRole-default
      database_name: enduse
      max_crawling_time: 1000  # time to wait for the crawler to complete before aborting it
```
11 changes: 6 additions & 5 deletions — resources/buildstock.rb

```diff
@@ -402,14 +402,15 @@ def evaluate_logic(option_apply_logic, runner, past_results = true)
   return result
 end
 
-def get_data_for_sample(buildstock_csv_data, building_id, runner)
-  buildstock_csv_data.each do |sample|
-    next if sample['Building'].to_i != building_id
-
-    return sample
+def get_data_for_sample(buildstock_csv_path, building_id, runner)
+  buildstock_csv = CSV.open(buildstock_csv_path, headers: true)
+
+  buildstock_csv.each do |row|
+    next if row['Building'].to_i != building_id.to_i
+    return row.to_hash
   end
   # If we got this far, couldn't find the sample #
-  msg = "Could not find row for #{building_id} in #{File.basename(buildstock_csv)}."
+  msg = "Could not find row for #{building_id} in #{buildstock_csv_path}."
   runner.registerError(msg)
   fail msg
 end
```
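The rewritten `get_data_for_sample` can be exercised outside OpenStudio; this sketch reproduces the function as it appears in the diff, with the measure runner replaced by a minimal stub (an assumption here: only `registerError` is needed from the runner in this path) and a hypothetical two-row CSV:

```ruby
require 'csv'
require 'tempfile'

# Minimal stand-in for the OpenStudio measure runner (assumed interface:
# just registerError, which the function calls on a missing building ID).
class StubRunner
  attr_reader :errors
  def initialize
    @errors = []
  end

  def registerError(msg)
    @errors << msg
  end
end

# The function as rewritten in the diff above: stream the CSV by path
# and return the first row whose 'Building' column matches.
def get_data_for_sample(buildstock_csv_path, building_id, runner)
  buildstock_csv = CSV.open(buildstock_csv_path, headers: true)

  buildstock_csv.each do |row|
    next if row['Building'].to_i != building_id.to_i
    return row.to_hash
  end
  # If we got this far, couldn't find the sample #
  msg = "Could not find row for #{building_id} in #{buildstock_csv_path}."
  runner.registerError(msg)
  fail msg
end

# Hypothetical two-building sampling file.
csv = Tempfile.new(['buildstock', '.csv'])
csv.write("Building,Vintage\n1,1950s\n2,1980s\n")
csv.flush

puts get_data_for_sample(csv.path, 2, StubRunner.new)['Vintage']  # => 1980s
```

Note the two behavioral tweaks alongside the path change: `building_id.to_i` makes the comparison robust to a string-typed argument, and the error message now reports the full CSV path rather than just its basename.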