gravelamps.asimov
=================

.. py:module:: gravelamps.asimov

.. autoapi-nested-parse::

   Asimov Pipeline Integration

   The following is the implementation of Gravelamps as an Asimov pipeline,
   based on the generic instructions provided by Daniel Williams within the
   Asimov documentation. It sets up the configuration required to run
   Gravelamps within the Asimov framework, allowing event handling to be
   automated.

   Written by Daniel Williams and Mick Wright.

   .. !! processed by numpydoc !!

Classes
-------

.. autoapisummary::

   gravelamps.asimov.Gravelamps

Module Contents
---------------

.. py:class:: Gravelamps(production, category=None)

   Bases: :py:obj:`asimov.pipeline.Pipeline`

   Gravelamps-specific Pipeline configuration.

   Based primarily upon Asimov's built-in Bilby Pipeline class, this handles
   building and submitting individual event runs from a properly configured
   ledger.

   .. rubric:: Methods

   ===================== ==========
   **detect_completion** Assess if job has completed
   **before_submit**     Pre-submission hook
   **build_dag**         Build Gravelamps DAG
   **submit_dag**        Submit Gravelamps DAG to HTCondor
   **collect_assets**    Collect result assets
   **samples**           Collect result sample files for PESummary
   **after_completion**  Post-completion hook to run PESummary
   **collect_logs**      Collect logs into dictionary
   **check_progress**    Check job progress
   ===================== ==========

   .. !! processed by numpydoc !!

   .. py:method:: detect_completion()

      Assess if job has completed.

      The Gravelamps DAG's final job is always the bilby_pipe DAG. To assess
      whether the DAG has completed, the function therefore checks for the
      existence of the final result file in the output directory.

      :Returns:

          bool
              Job completion status: true if complete, false otherwise.

      .. !! processed by numpydoc !!

   .. py:method:: before_submit()

      Pre-submission hook.

      At present, the hook adds the preserve-relative-file-path argument to
      the condor submission file.

      .. rubric:: Notes

      The hook currently also adds the results directory from bilby_pipe to
      the individual submission files that transfer input files. This works
      around an ongoing issue in bilby_pipe that is due to be fixed in the
      next release; the hook will be modified once that occurs.

      .. !! processed by numpydoc !!

   .. py:method:: build_dag(psds=None, user=None, clobber_psd=None, dryrun=False)

      Build Gravelamps DAG.

      Construct a DAG file in order to submit a production to the condor
      scheduler using gravelamps_inference.

      :Parameters:

          **production** : str
              Production name.

          **psds** : dict, optional
              The PSDs which should be used for this DAG. If no PSDs are
              provided, the PSD files specified in the INI file will be used
              instead.

          **user** : str
              The user accounting tag which should be used to run the job.

          **dryrun** : bool
              If set to true, the commands will not be run but will instead
              be printed to standard output. Defaults to False.

      :Raises:

          PipelineException
              Raised if the construction of the DAG fails.

      .. !! processed by numpydoc !!

   .. py:method:: submit_dag(dryrun=False)

      Submit the DAG file to the condor cluster.

      :Parameters:

          **dryrun** : bool
              If set to true, the DAG will not be submitted, but all commands
              will be printed to standard output instead. Defaults to False.

      :Returns:

          int
              The cluster ID assigned to the running DAG file.

          PipelineLogger
              The pipeline logger message.

      :Raises:

          PipelineException
              Raised if the pipeline fails to submit the job.

      .. !! processed by numpydoc !!

   .. py:method:: collect_assets()

      Collect result assets.

      The current result assets are deemed to be the samples produced by the
      nested sampling run.

      .. !! processed by numpydoc !!
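   As an illustration of the workflow above, the following is a minimal
   sketch that drives a single production through the pipeline in dry-run
   mode. How the ``production`` object is obtained depends on the Asimov
   version and ledger configuration, and the accounting tag used here is
   hypothetical.

   .. code-block:: python

      from gravelamps.asimov import Gravelamps

      # ``production`` is assumed to be an Asimov Production drawn from a
      # properly configured ledger.
      pipeline = Gravelamps(production)

      # Dry runs print the generated commands to standard output rather
      # than executing them.
      pipeline.build_dag(user="albert.einstein", dryrun=True)
      pipeline.submit_dag(dryrun=True)

      # For a live run, completion and progress can then be polled.
      if pipeline.detect_completion():
          print(pipeline.samples())         # combined samples file for PESummary
      else:
          print(pipeline.check_progress())  # iterations and dlogz per run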
   .. py:method:: samples(absolute=True)

      Collect the combined samples file for PESummary.

      :Parameters:

          **absolute** : bool
              Flag to return the absolute or relative filepath.

      :Returns:

          **sample_files** : str
              Path to the combined samples file.

      .. !! processed by numpydoc !!

   .. py:method:: after_completion()

      Tasks to run after the job is complete.

      These tasks read the Gravelamps dependency to determine whether or not
      the result is interesting and set the flag accordingly. If the result
      is interesting and the email flag is set, the user will also be
      emailed.

      .. !! processed by numpydoc !!

   .. py:method:: interest_email(model, ini, log_bayes)

      Send an email to interested parties when a job is determined to have
      interesting results.

      .. !! processed by numpydoc !!

   .. py:method:: get_ini()

      Retrieve the INI file for the production.

      .. !! processed by numpydoc !!

   .. py:method:: get_comparison_production()

      Get the comparison production from the dependencies.

      .. !! processed by numpydoc !!

   .. py:method:: get_comparison_data(comparison_production)

      Get the information from the comparison production.

      .. !! processed by numpydoc !!

   .. py:method:: collect_logs()

      Collect all of the log files which have been produced by this
      production and return their contents as a dictionary.

      :Returns:

          **messages** : dict
              Dictionary containing the log file contents, or a notification
              that a file could not be opened.

      .. !! processed by numpydoc !!

   .. py:method:: check_progress()

      Check job progress.

      Job progress is checked by finding the number of iterations and the
      current value of dlogz for the sampling runs. Combined, this
      information gives a rough estimate of how far through the run the job
      is. It is returned in dictionary format.

      :Returns:

          **messages** : dict
              Dictionary containing job progress in the form of the number of
              iterations and the current dlogz value. Will contain a message
              noting if the log file for the job could not be opened.

      .. !! processed by numpydoc !!

   .. py:method:: read_ini(filepath)
      :classmethod:

      Read and parse a Gravelamps configuration file.

      Gravelamps configuration files are INI compliant, with dedicated and
      important sections. Individual options can be repeated between
      sections.

      :Returns:

          **config_parser** : ConfigParser
              Object containing Gravelamps configuration settings based on
              the INI structure.

      .. !! processed by numpydoc !!

   .. py:method:: html()

      Return the HTML representation of this pipeline.

      .. !! processed by numpydoc !!

   .. py:method:: resurrect()

      Attempt to resurrect a failed job.

      A failed job will be resurrected a maximum of five times, assuming
      that a rescue DAG has been produced.

      .. !! processed by numpydoc !!
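   Since ``read_ini`` is a classmethod returning a standard
   :py:class:`configparser.ConfigParser`, a parsed configuration can be
   inspected without instantiating the pipeline. A brief sketch, with a
   hypothetical file path:

   .. code-block:: python

      from gravelamps.asimov import Gravelamps

      # Parse an INI-compliant Gravelamps configuration file.
      config = Gravelamps.read_ini("gravelamps_config.ini")

      # Walk the sections and their options; note that option names may be
      # repeated between sections.
      for section in config.sections():
          print(section, dict(config[section]))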