gravelamps.asimov

Asimov Pipeline Integration

The following implements Gravelamps as an Asimov pipeline, based on the generic instructions provided by Daniel Williams in the Asimov documentation. It provides the configuration required to run Gravelamps within the Asimov framework, allowing event handling to be automated.

Written by Daniel Williams and Mick Wright.

Classes

Gravelamps

Gravelamps specific Pipeline configuration.

Module Contents

class gravelamps.asimov.Gravelamps(production, category=None)

Bases: asimov.pipeline.Pipeline

Gravelamps specific Pipeline configuration.

Based primarily upon Asimov’s built-in Bilby Pipeline class. This handles building and submitting individual event runs from a properly configured ledger.

Methods

detect_completion

Assess if job has completed

before_submit

Pre submission hook

build_dag

Build Gravelamps DAG

submit_dag

Submits Gravelamps DAG to HTCondor

collect_assets

Collect result assets

samples

Collect result sample files for PESummary

after_completion

Post completion hook to run PESummary

collect_logs

Collect logs into dictionary

check_progress

Checks job progress

detect_completion()

Assess if job has completed.

The final job in the Gravelamps DAG is always the bilby_pipe DAG. To assess whether the DAG has completed, the function therefore checks for the existence of the final result file in the output directory.

Returns:
bool

Job completion status: True if complete, False otherwise.
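The check described above amounts to testing for the presence of a result file. A minimal sketch follows; note that the `result` subdirectory and the `*_result.hdf5` filename pattern are illustrative assumptions, not Gravelamps’ actual output layout.

```python
from pathlib import Path

def detect_completion(rundir):
    """Return True if the final bilby_pipe result file exists."""
    # The "result" subdirectory and "*_result.hdf5" pattern are
    # illustrative assumptions about the output layout.
    return any(Path(rundir).glob("result/*_result.hdf5"))
```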

before_submit()

Pre submission hook.

At present, the hook adds the preserve relative file path argument to the HTCondor submission file.

Notes

The hook currently adds the results directory from bilby_pipe to the individual submission files that transfer input files. This works around an ongoing issue in bilby_pipe that is due to be fixed in the next release; the hook will be updated once that happens.
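The essence of such a hook is appending an HTCondor submit command to each submission file. The sketch below uses the real HTCondor submit command `preserve_relative_paths`; the `*.sub` glob and the idempotency guard are illustrative assumptions.

```python
from pathlib import Path

def add_preserve_relative_paths(submit_dir):
    """Append HTCondor's preserve_relative_paths command to submit files."""
    # preserve_relative_paths is a real HTCondor submit command; the
    # "*.sub" glob and the duplicate check are illustrative assumptions.
    for subfile in Path(submit_dir).glob("*.sub"):
        text = subfile.read_text()
        if "preserve_relative_paths" not in text:
            subfile.write_text(text + "preserve_relative_paths = True\n")
```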

build_dag(psds=None, user=None, clobber_psd=None, dryrun=False)

Build Gravelamps DAG.

Construct a DAG file in order to submit a production to the condor scheduler using gravelamps_inference.

Parameters:
production : str

Production name

psds : dict, optional

The PSDs which should be used for this DAG. If no PSDs are provided, the PSD files specified in the INI file will be used instead.

user : str

The user accounting tag which should be used to run the job.

dryrun : bool

If set to True, the commands will not be run but will instead be printed to standard output. Defaults to False.

Raises:
PipelineException

Raised if the construction of the DAG fails
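The build step described above reduces to assembling a command line and raising PipelineException if it fails. This is a sketch only: the exact `gravelamps_inference` invocation shown is an assumption made for illustration, not the pipeline’s real interface.

```python
import subprocess

class PipelineException(Exception):
    """Raised when construction of the DAG fails."""

def build_dag(ini_file, dryrun=False):
    # The exact gravelamps_inference invocation is an assumption made
    # for illustration; consult the pipeline for its real interface.
    command = ["gravelamps_inference", ini_file]
    if dryrun:
        # Print the command instead of running it.
        print(" ".join(command))
        return
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        raise PipelineException(result.stderr)
```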

submit_dag(dryrun=False)

Submit the DAG file to the HTCondor cluster.

Parameters:
dryrun : bool

If set to True, the DAG will not be submitted; instead, all commands will be printed to standard output. Defaults to False.

Returns:
int

The cluster ID assigned to the running DAG file

PipelineLogger

The pipeline logger message.

Raises:
PipelineException

Raised if the pipeline fails to submit the job.
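Recovering the cluster ID from an HTCondor submission means parsing the scheduler’s output, which reports a line such as “1 job(s) submitted to cluster 12345.” A sketch of that parsing step:

```python
import re

def parse_cluster_id(scheduler_output):
    """Extract the cluster ID from condor_submit_dag's output."""
    # HTCondor reports e.g. "1 job(s) submitted to cluster 12345."
    match = re.search(r"submitted to cluster (\d+)", scheduler_output)
    if match is None:
        raise ValueError("no cluster ID found in scheduler output")
    return int(match.group(1))
```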

collect_assets()

Collect result assets.

The current result assets are deemed to be the samples produced by the nested sampling run.

samples(absolute=True)

Collect the combined samples file for PESummary.

Parameters:
absolute : bool

Flag to return the absolute or relative filepath

Returns:
sample_files : str

Path to the combined sample file

after_completion()

Tasks to run after the job is complete.

These tasks read the Gravelamps dependency to determine whether or not the result is interesting, and set the flag accordingly. If the result is interesting and the notification flag is set, the user will also be emailed.

interest_email(model, ini, log_bayes)

Send an email to interested parties when a job is determined to have interesting results.

get_ini()

Retrieve the INI file for the production.

get_comparison_production()

Get the comparison production from the dependencies.

get_comparison_data(comparison_production)

Get the information from the comparison production.

collect_logs()

Collect all of the log files produced by this production and return their contents as a dictionary.

Returns:
messages : dict

Dictionary containing the log file contents, or a notification that a file could not be opened.
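The collection described above can be sketched as a loop over log files that falls back to an error message when a file cannot be read. The `*.log` glob is an illustrative assumption about where the production writes its logs.

```python
from pathlib import Path

def collect_logs(log_dir):
    """Gather log file contents into a dictionary keyed by filename."""
    # The "*.log" glob is an illustrative assumption about the log layout.
    messages = {}
    for logfile in sorted(Path(log_dir).glob("*.log")):
        try:
            messages[logfile.name] = logfile.read_text()
        except OSError as error:
            # Record the failure instead of aborting the collection.
            messages[logfile.name] = f"Could not open log file: {error}"
    return messages
```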

check_progress()

Checks job progress.

Job progress is assessed by reading the number of iterations and the current dlogz value for the sampling runs. Combined, this information gives a rough estimate of how far through the run the job is. It is returned in dictionary format.

Returns:
messages : dict

Dictionary containing job progress in the form of the number of iterations and current dlogz value. Will contain a message noting if the log file for the job could not be opened.
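Extracting the iteration count and dlogz from a sampler log can be sketched with regular expressions. The `it: <n> ... dlogz: <x>` line format assumed here is illustrative; match it to the actual sampler output in practice.

```python
import re

def check_progress(log_text):
    """Return the latest iteration count and dlogz from a sampler log."""
    # The "it: <n>" and "dlogz: <x>" patterns are assumptions about the
    # sampler's log format, made for illustration.
    progress = {}
    iterations = re.findall(r"it:\s*(\d+)", log_text)
    dlogz = re.findall(r"dlogz:\s*([\d.]+)", log_text)
    if iterations and dlogz:
        progress["iterations"] = int(iterations[-1])
        progress["dlogz"] = float(dlogz[-1])
    else:
        progress["message"] = "log file could not be parsed"
    return progress
```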

classmethod read_ini(filepath)

Read and parse Gravelamps configuration file.

Gravelamps configuration files are INI compliant, with dedicated and important sections. Individual options can be repeated between sections.

Returns:
config_parser : ConfigParser

Object containing Gravelamps configuration settings based on the INI structure.
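Because the configuration files are INI compliant, the standard library parser suffices for a sketch. Preserving option-name case via `optionxform` is an assumption about the real implementation, made here so repeated options keep their written names.

```python
import configparser

def read_ini(filepath):
    """Read and parse an INI-compliant Gravelamps configuration file."""
    config = configparser.ConfigParser()
    # Preserve option-name case; whether the real pipeline does this is
    # an assumption made for illustration.
    config.optionxform = str
    config.read(filepath)
    return config
```

Note that the same option name can appear in more than one section; each section keeps its own value.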

html()

Return the HTML representation of this pipeline.

resurrect()

Attempt to resurrect a failed job.

A failed job will be resurrected a maximum of five times assuming that a rescue DAG has been produced.
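The resurrection condition can be sketched by counting rescue DAGs: HTCondor writes them as `<dag>.rescue001`, `.rescue002`, and so on, so the number of rescue files reflects previous failed attempts. The five-attempt cap below follows the text; treating each rescue file as one attempt is an illustrative assumption.

```python
from pathlib import Path

def can_resurrect(dag_file, max_attempts=5):
    """Return True if a rescue DAG exists and the retry budget remains."""
    # HTCondor names rescue DAGs <dag>.rescue001, .rescue002, ...;
    # counting one attempt per rescue file is an illustrative assumption.
    dag = Path(dag_file)
    rescues = list(dag.parent.glob(dag.name + ".rescue*"))
    return 0 < len(rescues) <= max_attempts
```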