Releases: gem/oq-engine
OpenQuake Engine 3.6.0
[Michele Simionato (@micheles)]
- In some cases `applyToSources` was giving a fake error about the source
not being in the source model even if it actually was
[Chris Van Houtte (@cvanhoutte)]
- Adds the Van Houtte et al. (2018) significant duration model for New
Zealand
[Michele Simionato (@micheles)]
- Added a way to compute and plot the MFD coming from an event based calculation
- Storing the MFDs in TOML format inside the datastore
[Robin Gee (@rcgee)]
- Moves b4 constant into COEFFS table for GMPE Sharma et al., 2009
[Graeme Weatherill (@g-weatherill)]
- Adds functionality to Cauzzi et al. (2014) and Derras et al. (2014)
calibrated GMPEs for Germany to use either finite or point source distances
[Michele Simionato (@micheles)]
- Restored the ability to associate site model parameters to a grid of sites
- Made it possible to set `hazard_curves_from_gmfs=true` with
`ground_motion_fields=false` in the event based hazard calculator
(see the job.ini sketch after this list)
- Introduced a mechanism to split the tasks based on an estimated duration
- Integrated `oq plot_memory` into `oq plot`
- Removed NaN values for strike and dip when exporting gridded ruptures
- Fixed `oq reset` to work in multi-user mode
- Extended the source_id-filtering feature in the job.ini to multiple sources
- Supported WKT files for the binary perils in the multi_risk calculator
- Added an early check on the coefficients of variation and loss ratios of
vulnerability functions with the Beta distribution
- Made sure that `oq engine --dc` removes the HDF5 cache file too
- Removed the flag `optimize_same_id_sources` because it is useless now
- Introduced a soft limit at 65,536 sites for event_based calculations
- Fixed a performance regression in ucerf_classical that was filtering
before splitting, thus becoming extra-slow
- Improved the progress log, which was delayed for large classical calculations
- Exported the ruptures as 3D multi-polygons (instead of 2D ones)
- Changed the `aggregate_by` exports for consistency with the others
- Changed the losses_by_event exporter for ebrisk, to make it more
consistent with scenario_risk and event_based_risk
- Changed the agglosses and losses_by_event exporters in scenario_risk,
by adding a column with the realization index
- Changed the generation of the hazard statistics to consume very little
memory
- Fixed a bug with concurrent_tasks being inherited from the parent
calculation instead of using the standard default
- Removed the dependency on mock, since it is included in unittest.mock
- For scenario, replaced the `branch_path` with the GSIM representation in
the realizations output
- Added a check for suspiciously large source geometries
- Deprecated the XML disaggregation exporters in favor of the CSV exporters
- Turned the disaggregation calculator into a classical post-calculator
to use the precomputed distances and speed up the computation even more
- Fixed the disaggregation calculator by discarding the ruptures outside
the integration distance
- Optimized the speed of the disaggregation calculator by moving a statistical
function outside of the inner loop
- Changed the file names of the exported disaggregation outputs
- Fixed an export agg_curves issue with pre-imported exposures
- Fixed an export agg_curves issue when the hazard statistics are different
from the risk statistics
- Removed the disaggregation statistics: now the engine disaggregates only on
a single realization (default: the closest to the mean)
- Forbade disaggregation matrices with more than 1 million elements
- Reduced the data transfer when computing the hazard curves
- Optimized the reading of large CSV exposures
- Fixed the --hc functionality across users
- Optimized the reduction of the site collection on the exposure sites
- Made the gsim logic tree parser more robust: lines like
`<uncertaintyModel gmpe_table="../gm_tables/Woffshore_low_clC.hdf5">`
are accepted again
- Added a check against duplicated values in nodal plane distributions and
hypocenter depth distributions
- Changed the support for zipped exposures and source models: now the
name of the archive must be written explicitly in the job.ini
- Added support for numpy 1.16.3, scipy 1.3.0, h5py 2.9.0
- Removed the special case for event_based_risk running two calculations
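As a quick illustration of the `hazard_curves_from_gmfs`/`ground_motion_fields`
combination mentioned above, a job.ini for an event based calculation could
contain the fragment below; this is only a sketch, with the source model,
sites, IMTs and all other required parameters omitted:

```ini
[general]
description = Event based hazard, curves only
calculation_mode = event_based

[output]
# compute hazard curves from the simulated GMFs,
# but do not store the ground motion fields themselves
ground_motion_fields = false
hazard_curves_from_gmfs = true
```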
[Graeme Weatherill (@g-weatherill)]
- Adds the Tromans et al. (2019) adjustable GMPE for application to PSHA
in the UK
[Michele Simionato (@micheles)]
- Optimized src.sample_ruptures for (multi)point sources and area sources
- Fixed a mutability bug in the DistancesContext and made all context
arrays read-only: the fix may affect calculations using the GMPEs
berge_thierry_2003, cauzzi_faccioli_2008 and zhao_2006 (illustrated after this list)
- Fixed a bug with the minimum_distance feature
- Fixed a bug in the exporter of the aggregate loss curves: now the loss
ratios are computed correctly even in the presence of occupants
- Removed the long deprecated capability to read hazard curves and
ground motion fields from XML files: you must use CSV files instead
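The read-only context arrays mentioned above mean that a GMPE can no longer
modify distances or site parameters in place. The snippet below is a generic
NumPy illustration of the new behaviour, not engine code; the array name `rjb`
is just an example:

```python
import numpy as np

# a distance array as a GMPE might receive it from a context object
rjb = np.array([5.0, 10.0, 50.0])
rjb.flags.writeable = False  # the engine now hands out read-only arrays

try:
    rjb[rjb < 10.0] = 10.0          # in-place clipping now raises ValueError
except ValueError:
    rjb = np.clip(rjb, 10.0, None)  # work on a copy instead

print(rjb)  # [10. 10. 50.]
```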
[Marco Pagani (@mmpagani)]
- Implemented a modified GMPE that adds between-event and within-event standard
deviations to GMPEs supporting only the total standard deviation
[Michele Simionato (@micheles)]
- Added the ability to use a taxonomy_mapping.csv file
- Fixed a bug in classical_damage from CSV: for hazard intensity measure
levels different from the fragility levels, the engine was giving incorrect
results
- Serialized the source model logic tree inside the datastore too
- Added a check on missing intensity_measure_types in event based
- Fixed `oq prepare_site_model` in the case of an empty datadir
- Added a comment line with useful metadata to the engine CSV outputs
- Removed the long deprecated event loss table exporter for event based
risk and enhanced the losses_by_event exporter to export the realization ID
- Removed the long deprecated GMF XML exporter for scenario
- IMT-dependent weights in the gsim logic tree can be zero, to discard
contributions outside the range of validity of (some of the) GSIMs
- Now it is possible to export individual hazard curves from an event
based calculation
- Added a view gmvs_to_hazard
OpenQuake Engine 3.5.2
OpenQuake Engine 3.5.1
[Michele Simionato (@micheles)]
- Added a `rlzi` column to the sig_eps.csv output
- Accepted GMF CSV files without a `rlzi` column
- Accepted a list-like syntax like `return_periods=[30, 60, 120, 240, 480]`
in the job.ini, as written in the manual (see the fragment after this list)
- Fixed a bug in the asset_risk exporter for uppercase tags
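For reference, the list-like syntax mentioned above is written in the job.ini
exactly as in the manual; a minimal fragment (other parameters omitted):

```ini
[risk_calculation]
return_periods = [30, 60, 120, 240, 480]
```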
[Paul Henshaw (@pslh)]
- Fixed an encoding bug while reading XML files on Windows
OpenQuake Engine 3.5.0
[Michele Simionato (@micheles)]
- Added a view gmvs_to_hazard
[Giovanni Lanzano (@giovannilanzanoINGV)]
- Lanzano and Luzi (2019) GMPE for volcanic zones in Italy
[Michele Simionato (@micheles)]
- Now it is possible to export individual hazard curves from an event
based calculation by setting `hazard_curves_from_gmfs = true` and
`individual_curves = true` (before only the statistics were saved)
[Graeme Weatherill (@g-weatherill)]
- Adds adaptation of Abrahamson et al. (2016) 'BC Hydro' GMPEs calibrated
to Mediterranean data and with epistemic adjustment factors
[Chris Van Houtte (@cvanhoutte)]
- Added a new class to bradley_2013b.py for hazard maps
- Modified test case_37 to test multiple sites
[Marco Pagani (@mmpagani)]
- Fixed a bug in the logic tree parser and added a check to forbid logic
trees with applyToSources without applyToBranches, unless there is a
single source model branch
[Michele Simionato (@micheles)]
- Removed the experimental parameter `prefilter_sources`
[Daniele Viganò (@daniviga)]
- Multiple DbServer ZMQ connections are restored to avoid errors under heavy
load and/or on slower machines
[Michele Simionato (@micheles)]
- Removed the ugly registration of custom signals at import time: now they
are registered only if `engine.run_calc` is called
- Removed the dependency on rtree
- Removed all calls to ProcessPool.shutdown to speed up the tests and to
avoid non-deterministic errors in atexit._run_exitfuncs
[Marco Pagani (@mmpagani)]
- Added tabular GMPEs as provided by Michal Kolaj, Natural Resources Canada
[Michele Simionato (@micheles)]
- Extended the ebrisk calculator to support coefficients of variations
[Graeme Weatherill (@g-weatherill)]
- Adds Kotha et al. (2019) shallow crustal GMPE for SERA
- Adds 'ExperimentalWarning' to possible GMPE warnings
- Adds kwargs to check_gsim function
[Michele Simionato (@micheles)]
- Fixed problems like SA(0.7) != SA(0.70) in iml_disagg
- Exposed the outputs of the classical calculation in event based
calculations with `compare_with_classical=true`
- Made it possible to serialize together all kinds of risk functions,
including consequence functions that before were not HDF5-serializable
- Fixed a MemoryError when counting the number of bytes stored in large
HDF5 datasets
- Extended `asset_hazard_distance` to a dictionary for usage with multi_risk
- Extended `oq prepare_site_model` to work with sites.csv files
- Optimized the validation of the source model logic tree: now checking
the source IDs is 5x faster
- Went back to the old logic in sampling: the weights are used for the
sampling and the statistics are computed with identical weights
- Avoided transferring the epsilons by storing them in the cache file
and changed the event-to-epsilons associations
- Reduced the data transfer in the computation of the hazard curves, sometimes
causing huge speedups (over 100x)
- Implemented a flag `modal_damage_state` to display only the most likely
damage state in the `dmg_by_asset` output of scenario damage calculations
- Reduced substantially the memory occupation in classical calculations
by including the prefiltering phase in the calculation phase
[Daniele Viganò (@daniviga)]
- Added a 'serialize_jobs' setting to the openquake.cfg
which limits the maximum number of jobs that can be run in parallel
[Michele Simionato (@micheles)]
- Fixed two exporters for the ebrisk calculator (agg_curves-stats and
losses_by_event)
- Fixed two subtle bugs when reading site_model.csv files
- Added /extract/exposure_metadata and /extract/asset_risk
- Introduced an experimental multi_risk calculator for volcanic risk
[Guillaume Daniel (@guyomd)]
- Updating of Berge-Thierry (2003) GSIM and addition of several alternatives
for use with Mw
[Michele Simionato (@micheles)]
- Changed the classical_risk calculator to use the same loss ratios for all
taxonomies and then optimized all risk calculators
- Temporarily removed the `insured_losses` functionality
- Extended `oq restore` to download from URLs
- Removed the column 'gsims' from the output 'realizations'
- Better parallelized the source splitting in classical calculations
- Added a check for missing hazard in scenario_risk/scenario_damage
- Improved the GsimLogicTree parser to get the line number information, a
feature that was lost with the passage to Python 3.5
- Added a check against misspellings of the loss type in the risk keys
- Changed the aggregation WebAPI from
`aggregate_by/taxonomy,occupancy/avg_losses?kind=mean&loss_type=structural` to
`aggregate/avg_losses?kind=mean&loss_type=structural&tag=taxonomy&tag=occupancy`
(see the request example after this list)
- Do not export the stddevs in scenario_damage in the case of 1 event
- Fixed an export bug for GMFs imported from a file
- Fixed an encoding error when storing a GMPETable
- Fixed an error while exporting the hazard curves generated by a GMPETable
- Removed the deprecated feature aggregate_by/curves_by_tag
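To make the WebAPI change above concrete, the old and new aggregate-losses
requests compare roughly as follows; `$WEBAPI` is a placeholder for the server
URL and the calculation-specific prefix, which are not spelled out in these
notes:

```sh
# old form (removed): tags encoded in the URL path
curl "$WEBAPI/aggregate_by/taxonomy,occupancy/avg_losses?kind=mean&loss_type=structural"

# new form: tags passed as repeated query parameters
curl "$WEBAPI/aggregate/avg_losses?kind=mean&loss_type=structural&tag=taxonomy&tag=occupancy"
```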
OpenQuake Engine 3.4.0
[Michele Simionato (@micheles)]
- Compatibility with 'decorator' version >= 4.2
[Giovanni Lanzano (@giovannilanzanoINGV)]
- Contributed a GMPE SkarlatoudisEtAlSSlab2013
[Michele Simionato (@micheles)]
- Changed the event loss table exporter to export also rup_id and year
- Extended the ebrisk calculator to compute loss curves and maps
[Rodolfo Puglia (@rodolfopuglia)]
- Spectral acceleration amplitudes at 2.5, 2.75 and 4 seconds added
[Marco Pagani (@mmpagani)]
- Improved the event based calculator to account for cluster-based models
[Michele Simionato (@micheles)]
- Removed the now redundant command `oq extract hazard/rlzs`
[Daniele Viganò (@daniviga)]
- Fixed 'oq abort' and always mark killed jobs as 'aborted'
[Michele Simionato (@micheles)]
- Made it possible to use tasks without a monitor argument in the Starmap
- Stored the sigma and epsilon parameters for each event in event based
and scenario calculations and extended the gmf_data exporter accordingly
- Fixed the realizations CSV exporter which was truncating the names of the
GSIMs
- Deprecated the XML exporters for hcurves, hmaps, uhs
- Introduced a `sap.script` decorator
- Used the WebExtractor in `oq importcalc`
- Restored validation of the source_model_logic_tree.xml file
- Raised an early error for missing occupants in the exposure
- Added a check to forbid duplicate file names in the `uncertaintyModel` tag
- Made it possible to store the asset loss table in the ebrisk calculator
by specifying `asset_loss_table=true` in the job.ini (see the example after
this list)
- Added a flag `oq info --parameters` to show the job.ini parameters
- Removed the `source_name` column from the disagg by source output
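A small illustration of two of the items above, with only the relevant lines
shown (the rest of the job.ini is omitted):

```ini
[risk_calculation]
calculation_mode = ebrisk
# store the per-asset loss table (can be large)
asset_loss_table = true
```

and, from the command line:

```sh
# list the job.ini parameters known to the engine
oq info --parameters
```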
[Rao Anirudh]
- Fixed wrong investigation_time in the calculation of loss maps from
loss curves
[Robin Gee (@rcgee)]
- Added the capability to optionally specify a `time_cutoff` parameter for
the declustering time window
[Michele Simionato (@micheles)]
- Merged the commands `oq plot_hmaps` and `oq plot_uhs` into `oq plot`
- Changed the storage of hazard curves and hazard maps to make it consistent
with the risk outputs and Extractor-friendly
[Chris Van Houtte (@cvanhoutte)]
- Added necessary gsims to run the Canterbury Seismic Hazard Model
in Gerstenberger et al. (2014)
- Added a new gsim file mcverry_2006_chch.py to have the Canterbury-specific
classes
- Added a new gsim file bradley_2013b.py to implement the
Christchurch-specific modifications to the Bradley2013 base model
[Michele Simionato (@micheles)]
- Added a check on the intensity measure types and levels in the job.ini,
to make sure they are ordered by period
- Reduced the number of client sockets to the DbServer, which was (sporadically)
causing calculations to hang on Windows
- Extended the WebAPI to be able to extract specific hazard curves, maps
and UHS (i.e. IMT-specific and site-specific)
- Removed the realization index from the event loss table export, since
it is redundant
- Forced all-lowercase Python file names in the engine codebase
- Removed the dependency on nose
[Robin Gee (@rcgee)]
- Updated GMPE of Yu et al. (2013)
[Michele Simionato (@micheles)]
- Added an `Extractor` client class leveraging the WebAPI and enhanced
`oq plot_hmaps` to display remote hazard maps
- Added a check when disaggregation is attempted on a source model
with atomic source groups
- Implemented serialization/deserialization of GSIM instances to TOML
- Added a check against misspelled rupture distance names and fixed
the drouet_alpes_2015 GSIMs
- Changed the XML syntax used to define dictionaries IMT -> GSIM
- Now GSIM classes have an `.init()` method to manage nontrivial
initializations, i.e. expensive initializations or initializations
requiring access to the filesystem (see the sketch after this list)
- Fixed a bug in event based that made it impossible to use GMPETables
- Associated the events to the realizations even in scenario_risk: this
involved changing the generation of the epsilons in the case of asset
correlation. Now there is a single aggregate losses output for all
realizations
- Removed the rlzi column from the GMF CSV export
- Introduced a new parameter `ebrisk_maxweight` in the job.ini
- For classical calculations with few sites, store information about the
realization closest to the mean hazard curve for each site
- Removed the max_num_sites limit on the event based calculator
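The `.init()` hook mentioned above exists to keep `__init__` cheap and defer
expensive work. The snippet below is a generic Python illustration of that
pattern, not the actual hazardlib base class; the class name and the
`load_table` helper are made up for the example:

```python
class TableGMPE:
    """Sketch of the deferred-initialization pattern behind .init()."""

    def __init__(self, gmpe_table):
        self.gmpe_table = gmpe_table  # cheap: just remember the parameter
        self.table = None

    def init(self):
        # expensive: touch the filesystem only when really needed
        self.table = load_table(self.gmpe_table)


def load_table(path):
    # stand-in for reading an HDF5 table from `path`
    return {"path": path}


gsim = TableGMPE("Woffshore_low_clC.hdf5")  # fast instantiation
gsim.init()  # filesystem access happens here
```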
[Valerio Poggi (@klunk386)]
- Added an AvgSA intensity measure type and a GenericGmpeAvgSA which is
able to use it
[Michele Simionato (@micheles)]
- Introduced the ability to launch subtasks from tasks
- Stored rupture information in classical calculations with few sites
[Chris Van Houtte (@cvanhoutte)]
- Adding conversion from geometric mean to larger horizontal component in
bradley_2013.py
[Michele Simionato (@micheles)]
- Fixed a bug in applyToSources for the case of multiple sources
- Moved the prefiltering on the workers to save memory
- Exported the aggregated loss ratios in avg losses and agg losses
- Removed the variables quantile_loss_curves and mean_loss_curves: they
were duplicating quantile_hazard_curves and mean_hazard_curves
- Only ruptures with a bounding box close to the site collection are stored
[Marco Pagani (@mmpagani)]
- Added cluster model to classical PSHA calculator
[Michele Simionato (@micheles)]
- Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
- Avoided the MemoryError in the controller node by speeding up the saving
of the information about the sources
- Turned utils/reduce_sm into a proper command
- Fixed a wrong coefficient in the ShakeMap amplification
- Fixed a bug in the hazard curves export (the filename did not contain
the period of the IMT, thus producing duplicated files)
- Parallelized the reading of the exposure
[Marco Pagani (@mmpagani)]
- Fixed the implementation of mutex ruptures
[Michele Simionato (@micheles)]
- Changed the aggregated loss curves exporter
- Added an experimental calculator ebrisk
- Changed the ordering of the events (akin to a change of seed in the
asset correlation)
[Robin Gee (@rcgee)]
- Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff
- Fixed bug in tusa_langer_2016.py related to coeffs affecting Repi models
[Michele Simionato (@micheles)]
- Added a check to forbid setting `ses_per_logic_tree_path = 0`
- Added an API `/extract/event_info/eidx`
- Splitting the sources in classical calculators and not in event based
- Removed `max_site_model_distance`
- Extended the logic used in event_based_risk (read the hazard sites
from the site model, not from the exposure) to all calculators
- In classical_bcr calculations with a CSV exposure the retrofitted field
was not read. Now a missing retrofitted value is an error
OpenQuake Engine 3.3.2
[Robin Gee (@rcgee)]
- Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff
[Michele Simionato (@micheles)]
- Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
- Avoided the MemoryError in the controller node by speeding up the saving
of the information about the sources
- Fixed a wrong coefficient in the ShakeMap amplification
- Fixed a bug in the hazard curves export (the filename did not contain
the period of the IMT, thus producing duplicated files)
OpenQuake Engine 3.3.1
OpenQuake Engine 3.3.0
[Graeme Weatherill (@g-weatherill)]
- Adds GMPE suite for national PSHA for Germany
[Daniele Viganò (@daniviga)]
- Added a warning box when an unsupported browser is used to view the WebUI
- Updated Docker containers to support a multi-node deployment
with a shared directory
- Moved the Docker containers source code from oq-builders
- Updated the documentation related to the shared directory
which is now mandatory for multi-node deployments
[Matteo Nastasi (@nastasi-oq)]
- Removed tests folders
[Stéphane Drouet (@stephane-on)]
- Added Drouet & Cotton (2015) GMPE including 2017 erratum
[Michele Simionato (@micheles)]
- Optimized the memory occupation in classical calculations (Context.poe_map)
- Fixed a wrong counting of the ruptures in split fault sources with
a hypo_list/slip_list causing the calculation to fail
- Made the export of uniform hazard spectra fast
- Made the `std` hazard output properly exportable
- Replaced the `~` in the header of the UHS csv files with a `-`
- Restored the `individual_curves` flag even for the hazard curves
- Implemented dGMPE weights per intensity measure type
- Extended `--reuse-hazard` to all calculators
- Fixed a bug in event_based_risk from GMFs with coefficients of variation
[Graeme Weatherill (@g-weatherill)]
- Adds magnitude scaling relation for Germany
[Michele Simionato (@micheles)]
- Used floats for the GSIM realization weights, not Python Decimals
- Added a flag `fast_sampling`, by default False
- Added an API `/extract/src_loss_table/<loss_type>`
- Removed the rupture filtering from `sample_ruptures` and optimized it in
the `RuptureGetter` by making use of the bounding box
- Raised the limit on `ses_per_logic_tree_path` from 2 ** 16 to 2 ** 32
- Added a parameter `max_num_sites` to increase the number of sites accepted
by an event based calculation up to 2 ** 32 (the default is still 2 ** 16)
- Added a command `oq compare` to compare hazard curves and maps between
calculations
- Extended the engine to transparently read zipped source models and exposures
- Restored the check for invalid source IDs in applyToSources
- Extended the command `oq zip` to zip source models and exposures
- Parallelized the associations event ID -> realization ID
- Improved the message when assets are discarded in scenario calculations
- Implemented aggregation by multiple tags, plus a special case for the
country code in event based risk
[Marco Pagani (@mmpagani)]
- Added two modified versions of the Bindi et al. (2011) GMPE to be used in a
backbone approach to compute hazard in Italy
- Added a modified version of Berge-Thierry et al. (2003) supporting Mw
[Michele Simionato (@micheles)]
- Changed the way loss curves and loss maps are stored in order to unify
the aggregation logic with the one used for the average losses
- Now it is possible to compute the ruptures without specifying the sites
- Added an early check for the case of missing intensity measure types
- Deprecated the case of exposure, site model and region_grid_spacing all
set at the same time
- Implemented multi-exposure functionality in event based risk
- Changed the event based calculator to store the ruptures incrementally
without keeping them all in memory
- Refactored the UCERF event based calculator to work as much as possible
like the regular calculator
- Optimized the management and storage of the aggregate losses in the event
based risk calculation; also, reduced the memory consumption
- Changed the default for `individual_curves` to "false", which is the right
default for large calculations
- Optimized the saving of the events
- Removed the `save_ruptures` flag in the job.ini since ruptures must always
be saved
- Optimized the rupture generation in case of sampling and changed the
algorithm and seeds
- Fixed a bug with the IMT `SA(1)` being considered different from `SA(1.0)`
- Removed the long deprecated GMF exporter in XML format for event_based
- Added a re-use hazard feature in event_based_risk in single-file mode
- Made the event ID unique also in scenario calculations with
multiple realizations
- Removed the annoying hidden .zip archives littering the export directory
- Added an easy way to read the exposure header
- Added a way to run Python scripts using the engine libraries via `oq shell`
- Improved the minimum_magnitude feature
- Fixed the check on missing hazard IMTs
- Reduced substantially the memory occupation in event based risk
- Added the option `spatial_correlation=no correlation` for risk calculations
from ShakeMaps (see the job.ini fragment after this list)
- Removed the experimental calculator `ucerf_risk`
- Optimized the sampling of time-independent sources for the case of
`prefilter_sources=no`
- Changed the algorithm associating events to SESs and made the event based
hazard calculator faster in the case of many SESs
- Reduced substantially the memory consumption in event based risk
- Made it possible to read multiple site model files in the same calculation
- Implemented a smart single job.ini file mode for event based risk
- Now warnings for invalid parameters are logged in the database too
- Fixed `oq export avg_losses-stats` for the case of one realization
- Added `oq export losses_by_tag` and `oq export curves_by_tag`
- Extended `oq export` to work in a multi-user situation
- Forbade event based calculations with more than `max_potential_paths`
in the case of full enumeration
- Saved a large amount of memory in event_based_risk calculations
- Added a command `oq export losses_by_tag/<tagname> <calc_id>`
- Extended `oq zip` to zip the risk files together with the hazard files
- Changed the building convention for the event IDs and made them unique
in the event loss table, even in the case of full enumeration
- Optimized the splitting of complex fault sources
- Fixed the ShakeMap download procedure for `uncertainty.zip` archives
with an incorrect structure (for instance for ci3031111)
- Disabled the spatial correlation in risk-from-ShakeMap by default
- Optimized the rupture sampling when there is a large number of SESs
- Extended the `reqv` feature to multiple tectonic region types and
removed the spinning/floating for the TRTs using the feature
- Reduced the GMPE logic tree upfront for TRTs missing in the source model
- Fixed the ShakeMap downloader to use the USGS GeoJSON feed
- Improved the error message when there are more than 65,536 distinct tags
in the exposure
- Turned `vs30measured` into an optional parameter
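As a reminder of the syntax for the ShakeMap option mentioned in this list,
the relevant job.ini line looks like the fragment below; the calculation mode
and everything else needed for a risk-from-ShakeMap run are omitted:

```ini
[risk_calculation]
# take the ShakeMap ground motions as they are, with no spatial correlation
spatial_correlation = no correlation
```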
[Chris Van Houtte (@cvanhoutte)]
- Added `siteclass` as a site parameter, and `reference_site_class` as
a site parameter that can be specified by the user in the ini file
- Added new classes to mcverry_2006.py to take siteclass as a predictor
- Updated comments in mcverry_2006.py
- Added new mcverry_2006 test tables to account for the difference in the site
parameter
- Added qa_test_data classical case_32
[Michele Simionato (@micheles)]
- Fixed the rupture exporter for Canada
- Extended `oq prepare_site_model` to optionally generate the
fields z1pt0, z2pt5 and vs30measured
- It is now an error to specify both the sites and the site model in the
job.ini, to avoid confusion about the precedence
- Implemented a reader for site models in CSV format
- Made the export_dir relative to the input directory
- Better error message for ShakeMaps with zero stddev
- Added a source_id-filtering feature in the job.ini
- Added a check on non-homogeneous tectonic region types in a source group
- Fixed the option `oq engine --config-file` that broke a few releases ago
- Replaced `nodal_dist_collapsing_distance` and
`hypo_dist_collapsing_distance` with `pointsource_distance` and made
use of it in the classical and event based calculators
[Graeme Weatherill (@g-weatherill)]
- Fixes to hmtk completeness tables for consistent rates and addition of
more special methods to catalogue
[Michele Simionato (@micheles)]
- Restricted ChiouYoungs2008SWISS01 to StdDev.TOTAL to avoid a bug
when computing the GMFs with inter/intra stddevs
- Raised an error if assets are discarded because they are too far from the
hazard sites (before it was just a warning)
- Added an attribute .srcidx to every event based rupture and stored it
- Fixed an issue with the Byte Order Mark (BOM) for CSV exposures prepared
with Microsoft Excel
- Reduced the site collection instead of just filtering it; this fixes
a source filtering bug and changes the numbers in case of GMF-correlation
- Added a command `oq prepare_site_model` to prepare a sites.csv file
containing the vs30 and changed the engine to use it
- Added a cutoff when storing a PoE=1 from a CSV file, thus avoiding NaNs
in classical_damage calculations
- Reduced the data transfer in the risk model by only considering the
taxonomies relevant for the exposure
- Extended `oq engine --run` to accept a list of files
- Optimized the saving of the risk results in event based in the case of
many sites and changed the command `oq show portfolio_loss` to show the
mean and standard deviation of the portfolio loss for each loss type
[Marco Pagani (@mmpagani)]
- Added a first and preliminary version of the GMM for the Canada model
represented in analytical form
- Added a modified version of Atkinson and Macias to be used for the
calculation of hazard in NSHMP2014
- Added support for PGA to the Si and Midorikawa (1999) GMPE
[Michele Simionato (@micheles)]
- Made it possible to run the risk over a hazard calculation of another user
- Worked around the OverflowError (cannot serialize a bytes object larger
than 4 GiB) in event based calculations
- Started using Python 3.6 features
- Fixed the check on vulnerability function ID uniqueness for NRML 0.5
- Ruptures and GMFs are now computed concurrently, thus mitigating the
issue of slow tasks
OpenQuake Engine 3.2.0
[Matteo Nastasi (@nastasi-oq)]
- Specified 'amd64' as the only architecture supported by Ubuntu packages
[Michele Simionato (@micheles)]
- Changed the source writer: now the `srcs_weights` are written in the XML
file only if they are nontrivial
- Changed the algorithm assigning the seeds: they are now generated before
the source splitting; also, a seed-related bug in the splitting was fixed
- For event based, moved the rupture generation into the prefiltering phase
[Daniele Viganò (@daniviga)]
- Fixed a bug with CTRL-C when using the `processpool` distribution
[Robin Gee (@rcgee)]
- Raised the source ID length limit in the validation from 60 to 75 characters
to allow sources with longer IDs
[Michele Simionato (@micheles)]
- Introduced a `multi_node` flag in openquake.cfg and used it to
fully parallelize the prefiltering in a cluster
- Used the rupture seed as rupture ID in event based calculations
- Changed the deprecation mechanism of GSIMs to use a class attribute
`superseded_by=NewGsimClass`
- Solved the pickling bug in event based hazard by using generator tasks
- Improved the distribution of the risk tasks by changing the weight
[Pablo Heresi (@pheresi)]
- Contributed the HM2018CorrelationModel
[Michele Simionato (@micheles)]
- Restored the `individual_curves` flag, which for the moment is used for the
risk curves
- Introduced two experimental new parameters, `floating_distance` and
`spinning_distance`, to reduce hypocenter distributions and nodal plane
distributions of ruptures over the corresponding distances
- Optimized the parsing of the logic tree when there is no "applyToSources"
- Made the IMT classes extensible in client code
- Reduced the hazard maps from 64 to 32 bit, to be consistent with the
hazard curves and to halve the download time
[Graeme Weatherill (@g-weatherill)]
- Implements a fix of Montalva et al (2016) for new coefficients (now
Montalva et al. (2017))
[Michele Simionato (@micheles)]
- Parallelized the reading of the source models
- Optimized `oq info --report` by not splitting the sources in that case
- Sped up the download of the hazard curves, maps and uhs
- Honored `concurrent_tasks` in the prefiltering phase too
- It is now legal to compute uniform hazard spectra for a single period
- Added the command `oq plot_memory`
- Introduced a MultiGMPE concept
- Saved the size of the datastore in the database and used it in the WebUI
[Graeme Weatherill (@g-weatherill)]
- Adds geotechnical related IMTs
[Michele Simionato (@micheles)]
- Renamed /extract/agglosses -> /extract/agg_losses and same for aggdamages
- Supported equivalent epicentral distance with a `reqv_hdf5` file
- Fixed the risk from ShakeMap feature in the case of missing IMTs
- Changed the way gmf_data/indices and ruptures are stored
- Added experimental support for dask
- Added 11 new site parameters for geotechnical hazard
- Changed the SiteCollection to store only the parameters required by the
GSIMs
[Robin Gee (@rcgee)]
- The number of sites is now an argument in the method _get_stddevs()
in the GMPE of Kanno, 2006
[Michele Simionato (@micheles)]
- Changed the serialization of ruptures to HDF5: the geometries are now
stored in a different dataset
- Bug fix: the asset->site association was performed even when not needed
- Made it possible to serialize multipoint sources and nonparametric gridded
sources to .hdf5
- Added a check on source model logic tree files: the uncertaintyModel
values cannot be repeated in the same branchset
- Added a flag `std_hazard_curves`; by setting it to `true` the user can
compute the standard deviation of the hazard curves across realizations
(see the fragment after this list)
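The `std_hazard_curves` flag mentioned above is a plain boolean in the
job.ini; a minimal fragment:

```ini
[output]
# also compute the standard deviation of the hazard curves across realizations
std_hazard_curves = true
```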
[Marco Pagani (@mmpagani)]
- Added Thingbaijam et al. (2017) magnitude-scaling relationship
[Michele Simionato (@micheles)]
- Added an /extract/ API for event_based_mfd
- Fixed a bug in the classical_damage calculators: multiple loss types
were not treated correctly
[Marco Pagani (@mmpagani)]
- Adding tests to the method computing decimal time
[Michele Simionato (@micheles)]
- Removed the event_based_rupture calculator and three others
- Added a field `size_mb` to the `output` table in the database and made
it visible in the WebUI as a tooltip
- Added a command `oq check_input job.ini` to check the input files
- Made the loss curves and maps outputs from an event based risk calculation
visible to the engine and the WebUI (only the stats)
- Added a check on duplicated branchIDs in GMPE logic trees
[Daniele Viganò (@daniviga)]
- Fixed a bug when reading exposures with utf8 names on systems with non-utf8
terminals (Windows)
- Changed the openquake.cfg file and added a dbserver.listen parameter
- Added the hostname in the WebUI page. It can be customize by the user
via thelocal_settings.pyfile
[Michele Simionato (@micheles)]
- Added a Content-Length to the outputs downloadable from the WebUI
- Fixed a bug when extracting gmf_data from a hazard calculation with a
filtered site collection
- Stored an attribute `events.max_gmf_size`
- Added a check on exposures with missing loss types
- Added a LargeExposureGrid error to protect the user from tricky exposures
(i.e. France with assets in the Antilles)
- Changed the event_based_risk calculator to compute the loss curves and
maps directly; removed the asset_loss_table
- Changed the event_based_risk calculator to always distribute by GMFs
- Optimized the memory consumption in the UCERF classical calculator
- Added a parameter `minimum_magnitude` in the job.ini
- Added a utility utils/combine_mean_curves.py
OpenQuake Engine 3.1.0
[Marco Pagani (@mmpagani) and Changlong Li (@mstlgzfdh)]
- Added a version of the Yu et al. (2013) GMPE supporting Mw
[Michele Simionato (@micheles)]
- Reduced the data transfer in the UCERF calculators
- Stored the zipped input files in the datastore for reproducibility
- Fixed a regression when reading GMFs from an XML in absence of a sites.csv
file
[Robin Gee (@rcgee)]
- Extended the `oq to_shapefile` method to also work with YoungsCoppersmithMFD
and arbitraryMFD MFD typologies
[Michele Simionato (@micheles)]
- Now the hazard statistics can be computed efficiently even in a single
calculation, i.e. without the `--hc` option
- Added a check on the Python version in the `oq` command
- Reduced the data transfer when sending the site collection
- Changed the default `filter_distance`
[Daniele Viganò (@daniviga)]
- Fixed a bug where the PID was not saved into the database
when using the command line interface
- Made it impossible to fire multiple `CTRL-C` in sequence,
to allow process teardown and task revocation when Celery is used
[Michele Simionato (@micheles)]
- Used `scipy.spatial.distance.cdist` in `Mesh.get_min_distance`
- Prefiltered sites and assets in scenario calculations
- Made it possible to specify the `filter_distance` in the job.ini
- Made rtree optional again and disabled it on macOS
- Optimized the SiteCollection class and doubled the speed of distance
calculations in most continental scale calculations
- Fixed an ordering bug in event based risk from GMFs when using a
vulnerability function with PMF
- Replaced Rtree with KDtree except in the source filtering
- Parallelized the source prefiltering
- Removed the tiling feature from the classical calculator
- Undeprecated `hazardlib.calc.stochastic.stochastic_event_set` and
fixed its signature
- Removed the source typology from the ruptures and reduced the rupture
hierarchy
- Removed the mesh spacing from PlanarSurfaces
- Optimized the instantiation of the rtree index
- Replaced the old prefiltering mechanism with the new one
[Daniele Viganò (@daniviga)]
- Managed the case of a dead controlling terminal (SIGHUP)
[Michele Simionato (@micheles)]
- Removed Decimal numbers from the PMF distribution in hazardlib
- Fixed another tricky bug with rtree filtering across the international
date line
- Added a parameter `prefilter_sources` with values rtree|numpy|no
- Removed the prefiltering on the workers, resulting in a huge speedup
for gridded ruptures at the cost of a larger data transfer
- Changed the `losses_by_event` output to export a single .csv file with
all realizations
- Added a `cross_correlation` parameter used when working with shakemaps
- Now sites and exposure can be set at the same time in the job.ini
- Introduced a `preclassical` calculator
- Extended the scenario_damage calculator to export `dmg_by_event`
outputs as well as `losses_by_event` outputs if there is a consequence
model
- Unified the `region` and `region_constraint` parameters in the job.ini
- Added a check to forbid duplicated GSIMs in the logic tree
- Introduced some changes to the `realizations` exporter (renamed the field
`uid` -> `branch_path` and removed the `model` field)
- Added a command `oq celery inspect`
- Reduced the check on too many realizations to a warning, except for
event based calculations
- Improved the hazard exporter to export only data for the filtered
site collection and not the full site collection
- Extended the BCR exporter to export the asset tags
[Catalina Yepes (@CatalinaYepes)]
- Revised/enhanced the risk demos
[Michele Simionato (@micheles)]
- Added a warning about the option `optimize_same_id_sources` when the user
should take advantage of it
[Daniele Viganò (@daniviga)]
- Converted the celery-status script into the `oq celery status` command
- Removed Django < 1.10 backward compatibility
- Updated Python dependencies (numpy 1.14, scipy 1.0.1,
Django 1.10+, Celery 4+)
[Michele Simionato (@micheles)]
- Implemented scenario_risk/scenario_damage calculators from ShakeMaps
- Exported the asset tags in the asset based risk outputs
- Fixed a numeric issue for nonparametric sources causing the hazard curves
to saturate at high intensities
- Added a utility to download shakemaps
- Added an XML exporter for the site model
- Slight change to the correlation module to fix a bug in the SMTK
- Added a distribution mechanism `threadpool`