tools package

Submodules

tools.analysesCombinations module

class tools.analysesCombinations.AnaCombLikelihoodComputer(theoryPredictions: list, deltas_rel=None)[source]

Bases: object

Constructor.

Parameters
  • theoryPredictions – the list of theory predictions

  • deltas_rel – relative uncertainty in the signal (float). Default value is 20%.

CLs(mu: float = 1.0, expected: Union[str, bool] = False, return_type: str = 'CLs')[source]

Compute the exclusion confidence level of the model.

Parameters
  • mu – compute for the parameter of interest mu

  • expected – if False, compute observed; if True, compute a priori expected

  • return_type – (Text) can be “CLs-alpha”, “1-CLs”, “CLs”. CLs-alpha: returns CLs - 0.05; 1-CLs: returns the 1-CLs value; CLs: returns the CLs value

getCLsRootFunc(expected: bool = False, allowNegativeSignals: bool = False) Tuple[float, float, Callable][source]

Obtain the function “CLs-alpha[0.05]” whose root defines the upper limit, plus mu_hat and sigma_mu.

Parameters
  • expected – if True, compute expected likelihood, else observed

  • allowNegativeSignals – if True, also allow negative values for mu

getLlhds(muvals, expected=False, normalize=True)[source]

Compute the likelihoods for the individual analyses and the combined likelihood. Returns a dictionary with the analysis IDs as keys and the likelihood values as values.

Parameters
  • muvals – List with values for the signal strength for which the likelihoods must be evaluated.

  • expected – If True returns the expected likelihood values.

  • normalize – If True normalizes the likelihood by its integral over muvals.
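The normalization option amounts to dividing each likelihood curve by its integral over the mu grid, e.g. with the trapezoidal rule. A generic sketch of that step (illustrative only, not the SModelS implementation):

```python
import math

def trapz(ys, xs):
    """Trapezoidal-rule integral of the sampled curve ys over the grid xs."""
    return sum(0.5 * (y0 + y1) * (x1 - x0)
               for (x0, x1, y0, y1) in zip(xs, xs[1:], ys, ys[1:]))

def normalize_llhds(muvals, llhds):
    """Normalize each likelihood curve by its integral over muvals, so that
    the individual and combined likelihoods can be compared as densities
    in mu. Curves with vanishing integral are returned unchanged."""
    out = {}
    for ana_id, vals in llhds.items():
        norm = trapz(vals, muvals)
        out[ana_id] = [v / norm for v in vals] if norm > 0 else list(vals)
    return out

# toy example: a Gaussian-shaped likelihood on a mu grid from 0 to 2
mu = [i * 0.01 for i in range(201)]
curves = {"ana1": [math.exp(-0.5 * (m - 1.0) ** 2) for m in mu]}
normed = normalize_llhds(mu, curves)
```

After normalization, each curve integrates to one over the sampled mu range.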

getUpperLimitOnMu(expected=False, allowNegativeSignals=False)[source]

Get the upper limit on the signal strength multiplier, i.e. the value of mu for which CLs = 0.95.

Parameters
  • expected – if True, compute expected likelihood, else observed

  • allowNegativeSignals – if True, also allow negative values for mu

Returns

upper limit on the signal strength multiplier mu

getUpperLimitOnSigmaTimesEff(expected=False, allowNegativeSignals=False)[source]

Upper limit on the fiducial cross section sigma times efficiency, summed over all signal regions, i.e. sum_i xsec^prod_i eff_i, obtained from the defined data (using the signal prediction for each signal region/dataset) and using the q_mu test statistic from the CCGV paper (arXiv:1007.1727).

Parameters

expected – if False, compute observed; if True, compute a priori expected; if “posteriori”, compute a posteriori expected

Returns

upper limit on fiducial cross section

likelihood(mu: float = 1.0, expected: Union[bool, str] = False, return_nll: bool = False, useCached: bool = True) float[source]

Compute the likelihood at a given mu

Parameters
  • mu – signal strength

  • expected – if True, compute expected likelihood, else observed

  • return_nll – if True, return negative log likelihood, else likelihood

  • useCached – if True, will use the cached values from the theoryPrediction objects (if available)

lmax(allowNegativeSignals: bool = False, expected: Union[bool, str] = False, return_nll: bool = False) Optional[Dict][source]

find muhat and lmax.

Parameters
  • allowNegativeSignals – if True, also allow negative values for mu

  • expected – if True, compute a priori expected (= lsm); if “posteriori”, compute a posteriori expected

  • return_nll – if true, return negative log max likelihood instead of lmax

Returns

mu_hat, i.e. the maximum likelihood estimate of mu. If extended output is requested, returns a dictionary with mu_hat; sigma_mu, the standard deviation around mu_hat; and lmax, i.e. the likelihood at mu_hat.
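The mu_hat/lmax idea can be sketched numerically: minimize a negative log-likelihood over mu and report the maximum-likelihood point. The function and dictionary keys below are a hypothetical illustration (toy Gaussian NLL), not the SModelS API:

```python
import math
from scipy.optimize import minimize_scalar

def find_mu_hat(nll, allow_negative=False):
    """Numerically minimize a negative log-likelihood nll(mu) and return
    mu_hat plus lmax = exp(-nll(mu_hat)). Generic sketch, not SModelS code."""
    lo = -5.0 if allow_negative else 0.0          # lower bound on mu
    res = minimize_scalar(nll, bounds=(lo, 5.0), method="bounded")
    return {"muhat": res.x, "lmax": math.exp(-res.fun)}

# toy Gaussian NLL with its minimum at mu = 1.2
toy_nll = lambda mu: 0.5 * (mu - 1.2) ** 2
result = find_mu_hat(toy_nll)
```

With allow_negative=True the search range is extended below zero, mirroring the allowNegativeSignals option above.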

tools.asciiGraph module

tools.asciiGraph.asciidraw(element, labels=True, html=False, border=False)[source]

Draw a simple ASCII graph on the screen.

tools.basicStats module

tools.basicStats.CLsfromNLL(nllA: float, nll0A: float, nll: float, nll0: float, return_type: str = 'CLs-alpha') float[source]

compute the CLs - alpha from the NLLs. TODO: the following needs explanation

Parameters
  • nllA

  • nll0A

  • nll

  • nll0

  • return_type – (Text) can be “CLs-alpha”, “1-CLs”, “CLs” CLs-alpha: returns CLs - 0.05 1-CLs: returns 1-CLs value CLs: returns CLs value

Returns

CLs-type value, see above
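One standard asymptotic recipe for turning negative log-likelihoods into a CLs value follows the CCGV paper (arXiv:1007.1727). The sketch below illustrates that technique; it is not necessarily identical to the SModelS implementation, and the interpretation of the four NLL arguments (observed vs. Asimov, signal hypothesis vs. best fit) is an assumption:

```python
import math

def cls_from_nll(nllA, nll0A, nll, nll0, return_type="CLs-alpha", alpha=0.05):
    """Asymptotic CLs from NLLs. Assumed meaning: nll/nll0 are the observed
    NLLs at the signal hypothesis and at the best fit; nllA/nll0A are the
    same for the Asimov data set. Illustrative sketch only."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # normal CDF
    qmu = max(0.0, 2.0 * (nll - nll0))     # observed test statistic q_mu
    qA = max(0.0, 2.0 * (nllA - nll0A))    # Asimov test statistic
    clsb = 1.0 - phi(math.sqrt(qmu))                  # p-value of s+b
    clb = 1.0 - phi(math.sqrt(qmu) - math.sqrt(qA))   # 1 - p_b
    cls = clsb / clb if clb > 0 else 0.0
    if return_type == "CLs":
        return cls
    if return_type == "1-CLs":
        return 1.0 - cls
    return cls - alpha                     # default: "CLs-alpha"
```

A larger observed NLL relative to the best fit (larger q_mu) yields a smaller CLs, i.e. a stronger exclusion.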

tools.basicStats.chi2FromLmax(llhd, lmax)[source]

compute the chi2 from likelihood and lmax
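The quantity being computed is the likelihood-ratio test statistic chi2 = -2 ln(llhd/lmax); a minimal sketch (generic, function name is illustrative):

```python
import math

def chi2_from_lmax(llhd, lmax):
    """Likelihood-ratio statistic chi2 = -2 ln(llhd / lmax).
    Assumes llhd and lmax are positive likelihood values."""
    return 2.0 * (math.log(lmax) - math.log(llhd))

# example: a likelihood e^-2 relative to lmax = 1 gives chi2 = 4
val = chi2_from_lmax(math.exp(-2.0), 1.0)
```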

tools.basicStats.determineBrentBracket(mu_hat, sigma_mu, rootfinder, allowNegative=True)[source]

find a, b for brent bracketing

Parameters
  • mu_hat – mu that maximizes likelihood

  • sigma_mu – error on mu_hat (not too reliable)

  • rootfinder – function that finds the root (usually root_func)

  • allowNegative – if False, then do not allow a or b to become negative

Returns

the interval a,b
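The bracketing step can be sketched generically: widen an interval around mu_hat in steps of sigma_mu until the root function changes sign, then hand the bracket to Brent's method. This is a hypothetical sketch of the idea, not the SModelS code:

```python
from scipy.optimize import brentq

def brent_bracket(mu_hat, sigma_mu, root_func, allow_negative=True, max_widen=20):
    """Find an interval [a, b] with root_func(a) * root_func(b) < 0 by
    widening symmetrically around mu_hat in steps of sigma_mu."""
    a, b = mu_hat, mu_hat + sigma_mu
    for _ in range(max_widen):
        if root_func(a) * root_func(b) < 0:
            return a, b
        a -= sigma_mu
        b += sigma_mu
        if not allow_negative:
            a = max(a, 0.0)   # keep the lower edge non-negative
    raise ValueError("could not bracket a root")

# usage with a toy stand-in for f(mu) = CLs(mu) - 0.05, root at mu = 2.5
f = lambda mu: mu - 2.5
a, b = brent_bracket(mu_hat=1.0, sigma_mu=0.5, root_func=f)
root = brentq(f, a, b)
```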

tools.caching module

class tools.caching.Cache[source]

Bases: object

a class for storing results from interpolation

static add(key, value)[source]
n_stored = 1000
static reset()[source]

completely reset the cache

static size()[source]
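A minimal sketch of such a class-level, bounded cache (illustrative only; the class and attribute names here are hypothetical, not the SModelS implementation):

```python
class SimpleCache:
    """Bounded result cache with a class-level store and static methods,
    mirroring the add / size / reset interface above."""
    n_stored = 1000   # maximum number of cached entries
    _store = {}

    @staticmethod
    def add(key, value):
        if len(SimpleCache._store) >= SimpleCache.n_stored:
            # evict the oldest entry (dicts preserve insertion order)
            SimpleCache._store.pop(next(iter(SimpleCache._store)))
        SimpleCache._store[key] = value
        return value

    @staticmethod
    def size():
        return len(SimpleCache._store)

    @staticmethod
    def reset():
        """completely reset the cache"""
        SimpleCache._store.clear()
```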

tools.colors module

class tools.colors.Colors[source]

Bases: object

property blue
property cyan
property debug
property error
property green
property info
property magenta
property red
property reset
property warn
property yellow

tools.coverage module

class tools.coverage.GeneralElement(el, missingX, smFinalStates, bsmFinalStates)[source]

Bases: object

This class represents a simplified (general) element which holds information only about its even particles and decay type. The even particles are replaced/grouped by the particles defined in smFinalStates.

sortBranches()[source]

Sort branches. The smallest branch is the first one. If branches are equal, sort according to decayType.

class tools.coverage.Uncovered(topoList, sqrts=None, sigmacut=0.00E+00 [fb], groupFilters={'missing (all)': <function <lambda>>, 'missing (displaced)': <function <lambda>>, 'missing (prompt)': <function <lambda>>, 'outsideGrid (all)': <function <lambda>>}, groupFactors={'missing (all)': <function <lambda>>, 'missing (displaced)': <function <lambda>>, 'missing (prompt)': <function <lambda>>, 'outsideGrid (all)': <function <lambda>>}, groupdDescriptions={'missing (all)': 'missing topologies', 'missing (displaced)': 'missing topologies with displaced decays', 'missing (prompt)': 'missing topologies with prompt decays', 'outsideGrid (all)': 'topologies outside the grid'}, smFinalStates=None, bsmFinalSates=None)[source]

Bases: object

Wrapper object for defining and holding a list of coverage groups (UncoveredGroup objects).

The class builds a series of UncoveredGroup objects and stores them.

Initialize the object.

Parameters
  • topoList – TopologyList object used to select elements from.

  • sqrts – Value (with units) for the center of mass energy used to compute the missing cross sections. If not specified the largest value available will be used.

  • sigmacut – Minimum cross-section/weight value (after applying the reweight factor) for an element to be included. The value should be in fb (unitless)

  • groupFilters – Dictionary containing the groups’ labels and the method for selecting elements.

  • groupFactors – Dictionary containing the groups’ labels and the method for reweighting cross sections.

  • groupdDescriptions – Dictionary containing the groups’ labels and strings describing the group (used for printout)

  • smFinalStates – List of (inclusive) Particle or MultiParticle objects used for grouping Z2-even particles when creating GeneralElements.

  • bsmFinalSates – List of (inclusive) Particle or MultiParticle objects used for grouping Z2-odd particles when creating GeneralElements.

getGroup(label)[source]

Returns the group with the required label. If not found, returns None.

Parameters

label – String corresponding to the specific group label

Returns

UncoveredGroup object which matches the label

class tools.coverage.UncoveredGroup(label, elementFilter, reweightFactor, smFinalStates, bsmFinalStates, sqrts, sigmacut=0.0)[source]

Bases: object

Holds information about a single coverage group: criteria for selecting and grouping elements, function for reweighting cross sections, etc.

Parameters
  • label – Group label

  • elementFilter – Function which takes an element as argument and returns True (False) if the element should (not) be selected.

  • reweightFactor – Function which takes an element as argument and returns the reweighting factor to be applied to the element weight.

  • smFinalStates – List of Particle/MultiParticle objects used to group Z2-even particles appearing in the final state

  • bsmFinalStates – List of Particle/MultiParticle objects used to group Z2-odd particles appearing in the final state

  • sqrts – Value (with units) for the center of mass energy used to compute the missing cross sections. If not specified the largest value available will be used.

  • sigmacut – Minimum cross-section/weight value (after applying the reweight factor) for an element to be included. The value should be in fb (unitless)

addToGeneralElements(el, missingX)[source]

Adds an element to the list of missing topologies (= general elements). If the element contributes to a missing topology that is already in the list, the element and its weight are added to that topology.

Parameters
  • el – element to be added

  • missingX – missing cross-section for the element (in fb)

getMissingX(element)[source]

Calculate the total missing cross section of an element, by recursively checking if its mothers already appear in the list.

Parameters

element – Element object

Returns

missing cross section without units (in fb)

getToposFrom(topoList)[source]

Select the elements from topoList according to self.elementFilter and build GeneralElements from the selected elements. The GeneralElement weights correspond to the missing cross-section, with double counting from compressed elements already accounted for.

getTotalXSec(sqrts=None)[source]

Calculate the total missing topology cross section at sqrts. If no sqrts is given, use self.sqrts.

Parameters

sqrts – center-of-mass energy (with units)

tools.crashReport module

class tools.crashReport.CrashReport[source]

Bases: object

Class that handles all crash report information.

createCrashReportFile(inputFileName, parameterFileName)[source]

Create a new SModelS crash report file.

A SModelS crash report file contains:

  • a timestamp

  • SModelS version

  • platform information (CPU architecture, operating system, …)

  • Python version

  • stack trace

  • input file name

  • input file content

  • parameter file name

  • parameter file content

Parameters
  • inputFileName – relative location of the input file

  • parameterFileName – relative location of the parameter file

createUnknownErrorMessage()[source]

Create a message for an unknown error.

tools.crashReport.createStackTrace()[source]

Return the stack trace.

tools.crashReport.readCrashReportFile(crashReportFileName)[source]

Read a crash report file to use its input and parameter file sections for a SModelS run.

Parameters

crashReportFileName – relative location of the crash report file

tools.databaseBrowser module

class tools.databaseBrowser.Browser(database, force_txt=False)[source]

Bases: object

Browses the database; exits if the given path does not point to a valid smodels-database. The Browser can be restricted to a specified run or experiment.

Parameters
  • force_txt – If True forces loading the text database.

  • database – Path to the database or Database object

getAttributes(showPrivate=False)[source]

Checks for all the fields/attributes it contains as well as the attributes of its objects if they belong to smodels.experiment.

Parameters

showPrivate – if True, also returns the protected fields (_field)

Returns

list of field names (strings)

getEfficiencyFor(expid, dataset, txname, massarray)[source]

Get an efficiency for the given experimental id, the dataset name, the txname, and the massarray. Can only be used for EfficiencyMap-type experimental results. Interpolation is done, if necessary.

Parameters
  • expid – experimental id (string)

  • dataset – dataset name (string)

  • txname – txname (string).

  • massarray – list of masses with units, e.g. [[ 400.*GeV, 100.*GeV],[400.*GeV, 100.*GeV]]

Returns

efficiency

getULFor(expid, txname, massarray, expected=False)[source]

Get an upper limit for the given experimental id, the txname, and the massarray. Can only be used for UL experimental results. Interpolation is done, if necessary.

Parameters
  • expid – experimental id (string)

  • txname – txname (string). ONLY required for upper limit results

  • massarray – list of masses with units, e.g. [[ 400.*GeV, 100.*GeV],[400.*GeV, 100.*GeV]]

  • expected – If true, return expected upper limit, otherwise return observed upper limit.

Returns

upper limit [fb]

getULForSR(expid, datasetID)[source]

Get an upper limit for the given experimental id and dataset (signal region). Can only be used for efficiency-map results.

Parameters
  • expid – experimental id (string)

  • datasetID – string defining the dataset id, e.g. ANA5-CUT3.

Returns

upper limit [fb]

getValuesFor(attribute, expResult=None)[source]

Returns a list for the possible values appearing in the database for the required attribute (sqrts,id,constraint,…).

Parameters
  • attribute – name of a field in the database (string).

  • expResult – if defined, restricts the list to the corresponding expResult. Must be an ExpResult object.

Returns

list of values

loadAllResults()[source]

Saves all the results from the database to _selectedExpResults. Can be used to restore all results to _selectedExpResults.

selectExpResultsWith(**restrDict)[source]

Loads the list of the experimental results (pair of InfoFile and DataFile) satisfying the restrictions into _selectedExpResults. The restrictions are specified as a dictionary.

Parameters

restrDict – selection fields and their allowed values, e.g. lumi = [19.4/fb, 20.3/fb], txName = ‘T1’, …. The values can be single entries or a list of values. For the fields not listed, all values are assumed to be allowed.
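The selection logic can be sketched on plain dictionaries: each restriction value may be a single entry or a list of allowed values, and unrestricted fields match everything. A hypothetical helper, not the Browser API:

```python
def select_with(results, **restr):
    """Keep only the entries whose fields match the restrictions.
    Each restriction value may be a single entry or a list of allowed
    values; fields not listed in restr are unconstrained."""
    selected = []
    for res in results:
        ok = True
        for field, allowed in restr.items():
            # normalize a scalar restriction to a one-element list
            allowed = allowed if isinstance(allowed, (list, tuple)) else [allowed]
            if res.get(field) not in allowed:
                ok = False
                break
        if ok:
            selected.append(res)
    return selected

records = [{"txName": "T1", "lumi": 19.4},
           {"txName": "T2", "lumi": 20.3}]
picked = select_with(records, txName="T1", lumi=[19.4, 20.3])
```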

tools.databaseBrowser.main(args)[source]

IPython interface for browsing the Database.

tools.databaseClient module

class tools.databaseClient.DatabaseClient(servername=None, port=None, verbose='info', rundir='./', logfile='@@rundir@@/dbclient.log', clientid=-1)[source]

Bases: object

clearCache()[source]
findServerStatus()[source]
getWaitingTime()[source]

compute a waiting time between attempts, from self.ntries
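A common way to compute such a waiting time is exponential backoff with jitter: the wait grows with the number of failed attempts, up to a cap. A hypothetical sketch of the idea; the actual implementation may differ:

```python
import random

def waiting_time(ntries, base=1.0, cap=60.0):
    """Exponential backoff: wait base * 2**ntries seconds, capped at
    `cap`, plus up to 10% random jitter to avoid synchronized retries."""
    wait = min(cap, base * 2 ** ntries)
    return wait + random.uniform(0.0, 0.1 * wait)
```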

initialize()[source]
log(*args)[source]
nameAndPort()[source]
pprint(*args)[source]
query(msg)[source]

query a certain result; msg is e.g. obs:ATLAS-SUSY-2016-07:ul:T1:[[5.5000E+02,4.5000E+02],[5.5000E+02,4.5000E+02]]

saveStats()[source]
send(message, amount_expected=32)[source]

send the message.

Parameters
  • message – the message to send

  • amount_expected – how many return bytes are expected

send_shutdown()[source]

send shutdown request to server

setDefaults()[source]

put in some defaults if data members don't exist

tools.databaseClient.stresstest(args)[source]

this is one process in the stress test

tools.databaseServer module

class tools.databaseServer.DatabaseServer(dbpath, servername=None, port=None, verbose='info', rundir='./', logfile='@@rundir@@/dbserver.log')[source]

Bases: object

finish()[source]
initialize()[source]
is_port_in_use(port)[source]

check if port is in use
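A common recipe for such a check is to attempt a TCP connection on the loopback interface; illustrative sketch, the actual implementation may differ:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if a TCP connection to host:port succeeds,
    i.e. something is already listening there."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# usage: bind a listener and confirm its port shows as busy
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
srv.listen(1)
busy_port = srv.getsockname()[1]
in_use = port_in_use(busy_port)
srv.close()
```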

listen()[source]
log(*args)[source]
logServerStats()[source]

log our stats upon exit

lookUpResult(data)[source]
parseData(data)[source]

parse the data packet

pprint(*args)[source]
run(nonblocking=False)[source]

run the server.

Parameters

nonblocking – run in non-blocking mode (not yet implemented)

setStatus(status)[source]

servers have a status file that tells us if they are running

shutdown(fromwhere='unknown')[source]
tools.databaseServer.shutdownAll()[source]

tools.externalPythonTools module

class tools.externalPythonTools.ExternalPythonTool(importname, optional=False)[source]

Bases: object

An instance of this class represents the installation of a python package. As it is python-only, we need this only for installation, not for running (contrary to nllfast or pythia).

Initializes the ExternalPythonTool object. Useful for installation.

Parameters

optional – optional package, not needed for core SModelS.

checkInstallation()[source]

The check is basically done in the constructor

compile()[source]
installDirectory()[source]

Just returns the pythonPath variable

pathOfExecutable()[source]

Just returns the pythonPath variable

tools.inclusiveObjects module

class tools.inclusiveObjects.InclusiveList[source]

Bases: list

An inclusive list class. It will return True when compared to any other list object.

class tools.inclusiveObjects.InclusiveValue[source]

Bases: int

An inclusive number class. It will return True when compared to any other integer, float or Unum object.
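The behaviour of such an inclusive value can be sketched by overriding equality on an int subclass (illustrative class, not the SModelS implementation):

```python
class AlwaysEqualInt(int):
    """Sketch of the inclusive-value idea: an int subclass whose equality
    comparison matches any int or float, but not other types."""
    def __eq__(self, other):
        return isinstance(other, (int, float))
    def __ne__(self, other):
        return not self.__eq__(other)
    def __hash__(self):
        # defining __eq__ clears __hash__, so restore int hashing
        return int.__hash__(self)

anything = AlwaysEqualInt()
```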

tools.interactivePlots module

tools.interactivePlotsHelpers module

tools.ioObjects module

class tools.ioObjects.FileStatus[source]

Bases: object

Object to run several checks on the input file. It holds an LheStatus (SlhaStatus) object if inputType = lhe (slha)

checkFile(inputFile)[source]

Run checks on the input file.

Parameters

inputFile – path to input file

class tools.ioObjects.LheStatus(filename)[source]

Bases: object

Object to check if input lhe file contains errors.

Variables

filename – path to input LHE file

evaluateStatus()[source]

run status check

class tools.ioObjects.OutputStatus(status, inputFile, parameters, databaseVersion)[source]

Bases: object

Object that holds all status information and has a predefined printout.

Initialize output. If one of the checks failed, exit.

Parameters
  • status – status of input file

  • inputFile – input file name

  • parameters – input parameters

  • databaseVersion – database version (string)

addWarning(warning)[source]

Append warning to warnings.

Parameters

warning – warning to be appended

updateSLHAStatus(status)[source]

Update SLHA status.

Parameters

status – new SLHA status flag

updateStatus(status)[source]

Update status.

Parameters

status – new status flag

class tools.ioObjects.SlhaStatus(filename, findMissingDecayBlocks=True, findIllegalDecays=False, checkXsec=True)[source]

Bases: object

An instance of this class represents the status of an SLHA file. The output status is: 0: the file is not checked; 1: the check is ok; -1: a physical problem, e.g. a charged LSP; -2: formal problems, e.g. no cross sections.

Parameters
  • filename – path to input SLHA file

  • findMissingDecayBlocks – if True add a warning for missing decay blocks

  • findIllegalDecays – if True check if all decays are kinematically allowed

  • checkXsec – if True check if SLHA file contains cross sections

  • findLonglived – if True find stable charged particles and displaced vertices

emptyDecay(pid)[source]

Check if any decay is missing for the particle with pid

Parameters

pid – PID number of particle to be checked

Returns

True if the decay block is missing or if it is empty, None otherwise

evaluateStatus()[source]

Get status summary from all performed checks.

Returns

a status flag and a message for explanation

findIllegalDecay(findIllegal)[source]

Find decays for which the sum of the daughter masses exceeds the mother mass

Parameters

findIllegal – True if check should be run

Returns

status flag and message

findMissingDecayBlocks(findMissingBlocks)[source]

For all particles with non-SM PDG codes listed in the mass block, check if a decay block is written

Returns

status flag and message

hasXsec(checkXsec)[source]

Check if XSECTION table is present in the slha file.

Parameters

checkXsec – set True to run the check

Returns

status flag, message

read()[source]

Get pyslha output object.

tools.lheChecks module

tools.lheChecks.main(args)[source]

tools.modelTester module

tools.modelTester.checkForSemicolon(strng, section, var)[source]
tools.modelTester.getAllInputFiles(inFile)[source]

Given inFile, return list of all input files

Parameters

inFile – Path to input file or directory containing input files

Returns

List of all input files, and the directory name

tools.modelTester.getCombiner(inputFile, parameterFile)[source]

Facility for running SModelS, computing the theory predictions and returning the combination of analyses (defined in the parameterFile). Useful for plotting likelihoods. Extracts and returns the TheoryPredictionsCombiner object from the master printer, if the object is found; returns None otherwise.

Parameters
  • inputFile – path to the input SLHA file

  • parameterFile – path to parameters.ini file

Returns

TheoryPredictionsCombiner object generated by running SModelS.

tools.modelTester.getParameters(parameterFile)[source]

Read parameter file, exit in case of errors

Parameters

parameterFile – Path to parameter File

Returns

ConfigParser read from parameterFile

tools.modelTester.loadDatabase(parser, db)[source]

Load database

Parameters
  • parser – ConfigParser with path to database

  • db – binary database object. If None, then database is loaded, according to databasePath. If True, then database is loaded, and text mode is forced.

Returns

database object, database version

tools.modelTester.loadDatabaseResults(parser, database)[source]

Load database entries specified in parser

Parameters
  • parser – ConfigParser, containing analysis and txnames selection

  • database – Database object

Returns

List of experimental results

tools.modelTester.runSetOfFiles(inputFiles, outputDir, parser, databaseVersion, listOfExpRes, timeout, development, parameterFile)[source]

Loop over all input files in inputFiles with testPoint

Parameters
  • inputFiles – list of input files to be tested

  • outputDir – path to directory where the output is to be stored

  • parser – ConfigParser storing information from parameter.ini file

  • databaseVersion – Database version (printed to output file)

  • listOfExpRes – list of ExpResult objects to be considered

  • development – turn on development mode (e.g. no crash report)

  • parameterFile – parameter file, for crash reports

Returns

printers output

tools.modelTester.runSingleFile(inputFile, outputDir, parser, databaseVersion, listOfExpRes, timeout, development, parameterFile)[source]

Call testPoint on inputFile, write crash report in case of problems

Parameters
  • inputFile – path to input file

  • outputDir – path to directory where the output is to be stored

  • parser – ConfigParser storing information from parameter.ini file

  • databaseVersion – Database version (printed to output file)

  • listOfExpRes – list of ExpResult objects to be considered

  • crashReport – if True, write crash report in case of problems

  • timeout – set a timeout for one model point (0 means no timeout)

Returns

output of printers

tools.modelTester.setExperimentalFlag(parser)[source]

set the experimental flag, if options:experimental = True

tools.modelTester.testPoint(inputFile, outputDir, parser, databaseVersion, listOfExpRes)[source]

Test model point defined in input file (running decomposition, check results, test coverage)

Parameters
  • inputFile – path to input file

  • outputDir – path to directory where the output is to be stored

  • parser – ConfigParser storing information from parameters file

  • databaseVersion – Database version (printed to output file)

  • listOfExpRes – list of ExpResult objects to be considered

Returns

dictionary with input filename as key and the MasterPrinter object as value

tools.modelTester.testPoints(fileList, inDir, outputDir, parser, databaseVersion, listOfExpRes, timeout, development, parameterFile)[source]

Loop over all input files in fileList with testPoint, using ncpus CPUs defined in parser

Parameters
  • fileList – list of input files to be tested

  • inDir – path to directory where input files are stored

  • outputDir – path to directory where output is stored

  • parser – ConfigParser storing information from parameter.ini file

  • databaseVersion – Database version (printed to output files)

  • listOfExpRes – list of ExpResult objects to be considered

  • timeout – set a timeout for one model point (0 means no timeout)

  • development – turn on development mode (e.g. no crash report)

  • parameterFile – parameter file, for crash reports

Returns

printer(s) output, if not run in parallel mode

tools.nllFastWrapper module

class tools.nllFastWrapper.NllFastWrapper(sqrts, nllfastVersion, testParams, testCondition)[source]

Bases: WrapperBase

An instance of this class represents the installation of nllfast.

Parameters
  • sqrts – sqrt of s, in TeV, as an integer,

  • nllfastVersion – version of the nllfast tool

  • testParams – the test parameters we need to run the tool with

  • testCondition – the line that should be the last output line when running executable

SrcPath

the path of the source code, for compilation

getKfactorsFor(pIDs, slhafile, pdf='cteq')[source]

Read the NLLfast grid and return a pair of k-factors (NLO and NLL) for the PID pair.

Returns

k-factors, or None if NLLfast does not contain the process. The slhafile is used to obtain the SUSY spectrum.

class tools.nllFastWrapper.NllFastWrapper13[source]

Bases: NllFastWrapper

An instance of this class represents the installation of nllfast 13.

Parameters
  • sqrts – sqrt of s, in TeV, as an integer,

  • nllfastVersion – version of the nllfast tool

  • testParams – the test parameters we need to run the tool with

  • testCondition – the line that should be the last output line when running executable

SrcPath

the path of the source code, for compilation

class tools.nllFastWrapper.NllFastWrapper7[source]

Bases: NllFastWrapper

An instance of this class represents the installation of nllfast 7.

Parameters
  • sqrts – sqrt of s, in TeV, as an integer,

  • nllfastVersion – version of the nllfast tool

  • testParams – the test parameters we need to run the tool with

  • testCondition – the line that should be the last output line when running executable

SrcPath

the path of the source code, for compilation

class tools.nllFastWrapper.NllFastWrapper8[source]

Bases: NllFastWrapper

An instance of this class represents the installation of nllfast 8.

Parameters
  • sqrts – sqrt of s, in TeV, as an integer,

  • nllfastVersion – version of the nllfast tool

  • testParams – the test parameters we need to run the tool with

  • testCondition – the line that should be the last output line when running executable

SrcPath

the path of the source code, for compilation

tools.physicsUnits module

tools.printer module

class tools.printer.BasicPrinter(output, filename)[source]

Bases: object

Super class to handle the basic printing methods

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

addObj(obj)[source]

Adds object to the Printer.

Parameters

obj – A object to be printed. Must match one of the types defined in formatObj

Returns

True if the object has been added to the output. If the object does not belong to the pre-defined printing list toPrint, returns False.

property filename
flush()[source]

Format the objects added to the output, print them to the screen or file and remove them from the printer.

getTypeOfExpected()[source]

tiny convenience function for choosing which expected values to print: a priori (True) or a posteriori

mkdir()[source]

create directory to file, if necessary

openOutFile(filename, mode)[source]

creates and opens a data sink, creates path if needed

setOptions(options)[source]

Store the printer specific options to control the output of each printer. Each option is stored as a printer attribute.

Parameters

options – a list of (option,value) for the printer.

class tools.printer.MPrinter[source]

Bases: object

Master Printer class to handle the Printers (one printer/output type)

addObj(obj)[source]

Adds the object to all its Printers:

Parameters

obj – An object which can be handled by the Printers.

flush()[source]

Ask all printers to write the output and clear their cache. If the printers return anything other than None, we pass it on.

setOutPutFiles(filename, silent=False)[source]

Set the basename for the output files. Each printer will use this file name appended with the respective extension (i.e. .py for a python printer, .smodels for a summary printer, …)

Parameters
  • filename – Input file name

  • silent – don't comment on removing old files

setPrinterOptions(parser)[source]

Define the printer types and their options.

Parameters

parser – ConfigParser storing information from the parameters file

class tools.printer.PyPrinter(output='stdout', filename=None)[source]

Bases: BasicPrinter

Printer class to handle the printing of one single pythonic output

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

flush()[source]

Write the python dictionaries generated by the object formatting to the defined output

setOutPutFile(filename, overwrite=True, silent=False)[source]

Set the basename for the text printer. The output filename will be filename.py.

Parameters
  • filename – Base filename

  • overwrite – If True and the file already exists, it will be removed.

  • silent – don't comment on removing old files

class tools.printer.SLHAPrinter(output='file', filename=None)[source]

Bases: TxTPrinter

Printer class to handle the printing of slha format summary output. It uses the facilities of the TxTPrinter.

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

setOutPutFile(filename, overwrite=True, silent=False)[source]

Set the basename for the text printer. The output filename will be filename.smodels.

Parameters
  • filename – Base filename

  • overwrite – If True and the file already exists, it will be removed.

  • silent – don't comment on removing old files

class tools.printer.SummaryPrinter(output='stdout', filename=None)[source]

Bases: TxTPrinter

Printer class to handle the printing of one single summary output. It uses the facilities of the TxTPrinter.

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

setOutPutFile(filename, overwrite=True, silent=False)[source]

Set the basename for the text printer. The output filename will be filename.smodels.

Parameters
  • filename – Base filename

  • overwrite – If True and the file already exists, it will be removed.

  • silent – don't comment on removing old files

class tools.printer.TxTPrinter(output='stdout', filename=None)[source]

Bases: BasicPrinter

Printer class to handle the printing of one single text output

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

setOutPutFile(filename, overwrite=True, silent=False)[source]

Set the basename for the text printer. The output filename will be filename.log.

Parameters
  • filename – Base filename

  • overwrite – If True and the file already exists, it will be removed.

  • silent – don't comment on removing old files

class tools.printer.XmlPrinter(output='stdout', filename=None)[source]

Bases: PyPrinter

Printer class to handle the printing of one single XML output

Variables

typeofexpectedvalues (str) – what type of expected values to print, apriori or posteriori

convertToElement(pyObj, parent, tag='')[source]

Convert a python object (list, dict, string, …) to a nested XML element tree.

Parameters
  • pyObj – python object (list, dict, string, …)

  • parent – XML Element parent

  • tag – tag for the daughter element

flush()[source]

Write the python dictionaries generated by the object formatting to the defined output, converted to XML

setOutPutFile(filename, overwrite=True, silent=False)[source]

Set the basename for the text printer. The output filename will be filename.xml.

Parameters
  • filename – Base filename

  • overwrite – If True and the file already exists, it will be removed.

  • silent – don't comment on removing old files

tools.printer.getInfoFromPython(output)[source]

Retrieves information from the python output

Parameters

output – output (dictionary)

Returns

list of r-values,r-expected and analysis IDs. None if no results are found. If there are results for combined analyses, returns the largest r-value and the corresponding r-expected from the combination.

tools.printer.getInfoFromSLHA(output)[source]

Retrieves information from the SLHA output

Parameters

output – output (string)

Returns

list of r-values,r-expected and analysis IDs. None if no results are found. If there are results for combined analyses, returns the largest r-value and the corresponding r-expected from the combination.

tools.printer.getInfoFromSummary(output)[source]

Retrieves information from the summary output

Parameters

output – output (string)

Returns

list of r-values,r-expected and analysis IDs. None if no results are found. If there are results for combined analyses, returns the largest r-value and the corresponding r-expected from the combination.

tools.printer.getSummaryFrom(output, ptype)[source]

Retrieves information about the output according to the printer type (slha,python or summary)

Parameters
  • output – output (dictionary for ptype=python or string for ptype=slha/summary)

  • ptype – Printer type (slha, python or summary)

Returns

Dictionary with the output information

tools.printer.printScanSummary(outputDict, outputFile)[source]

Method for creating a simple summary of the results when running SModelS over multiple files.

Parameters
  • outputDict – A dictionary with filenames as keys and the master printer flush dictionary as values.

  • outputFile – Path to the summary file to be written.

tools.proxyDBCreator module

class tools.proxyDBCreator.ProxyDBCreater(inputfile, rundir, verbose='info')[source]

Bases: object

create(servername, serverport)[source]
pprint(*args)[source]
run(really)[source]

Run the server. :param really: if False, only write out the command instead of executing it

store(outputfile)[source]

store the outputfile

set a symlink from self.outputfile to default.pcl

tools.proxyDBCreator.main(args)[source]

tools.pyhfInterface module

class tools.pyhfInterface.PyhfData(nsignals, inputJsons, jsonFiles=None)[source]

Bases: object

Holds data for use in pyhf :ivar nsignals: signal predictions list divided into sublists, one for each json file :ivar inputJsons: list of json instances :ivar jsonFiles: optional list of json files :ivar nWS: number of workspaces = number of json files

checkConsistency()[source]

Check various inconsistencies of the PyhfData attributes

Parameters

zeroSignalsFlag – boolean identifying if all SRs of a single json are empty

getTotalYield()[source]

the total yield in all signal regions

getWSInfo()[source]

Get information from the json files

Variables

channelsInfo – list of dictionaries (one dictionary for each json file) containing useful information about the json files - :key signalRegions: list of dictionaries with ‘json path’ and ‘size’ (number of bins) of the ‘signal regions’ channels in the json files - :key otherRegions: list of strings indicating the path to the control and validation region channels

class tools.pyhfInterface.PyhfUpperLimitComputer(data, cl=0.95, includeCRs=False, lumi=None)[source]

Bases: object

Class that computes the upper limit using the json files and signal information in the data instance of PyhfData

Parameters
  • data – instance of PyhfData holding the signals information

  • cl – confidence level at which the upper limit is to be computed

Variables
  • data – created from :param data:

  • nsignals – signal predictions list divided into sublists, one for each json file

  • inputJsons – list of input json files as python json instances

  • channelsInfo – list of channels information for the json files

  • zeroSignalsFlag – list of boolean flags indicating, for each json, whether all signals are zero

  • nWS – number of workspaces = number of json files

  • patches – list of patches to be applied to the inputJsons as python dictionary instances

  • workspaces – list of workspaces resulting from the patched inputJsons

:ivar workspaces_expected: list of patched workspaces with observation yields replaced by the expected ones :ivar cl: created from :param cl: :ivar scale: scale that is applied to the signal predictions, changing dynamically throughout the upper limit calculation :ivar alreadyBeenThere: boolean flag that identifies when nsignals accidentally passes twice through two identical values

backup()[source]
checkPyhfVersion()[source]

check the pyhf version, currently we need 0.6.1+

compute_invhess(x, data, model, index, epsilon=1e-05)[source]

if inv_hess is not given by the optimiser, calculate it numerically by evaluating second order partial derivatives using the 2-point central finite differences method :param x: parameter values given to pyhf.infer.mle.twice_nll, taken from pyhf.infer.mle.fit - optimizer.x (best-fit parameter values) :param data: workspace.data(model) passed to pyhf.infer.mle.fit :param model: model passed to pyhf.infer.mle.fit :param index: index of the POI Note: if len(x) <= 5, compute the entire hessian matrix and find its inverse. Else, compute the hessian only at the index of the POI and return its inverse (diagonal approximation). Returns the inverse hessian at the index of the POI.
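The central ingredient of the diagonal approximation is the one-dimensional second derivative; a self-contained sketch of the 2-point central finite differences estimate (illustrative, not the SModelS implementation):

```python
def second_partial_derivative(f, x, index, epsilon=1e-05):
    # Central finite-difference estimate of d^2 f / dx_i^2 at x:
    # (f(x+e) - 2 f(x) + f(x-e)) / e^2 along coordinate `index`.
    x_plus = list(x)
    x_minus = list(x)
    x_plus[index] += epsilon
    x_minus[index] -= epsilon
    return (f(x_plus) - 2.0 * f(x) + f(x_minus)) / epsilon ** 2
```

The inverse of this value at the POI index gives the diagonal approximation of the inverse Hessian mentioned above.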

exponentiateNLL(twice_nll, doIt)[source]

if doIt, then compute likelihood from nll, else return nll

getBestCombinationIndex()[source]

find the index of the best expected combination

getUpperLimitOnMu(expected=False, workspace_index=None)[source]
Compute the upper limit on the signal strength modifier with:
  • by default, the combination of the workspaces contained in self.workspaces

  • if workspace_index is specified, self.workspace[workspace_index] (useful for computation of the best upper limit)

Parameters
  • expected

    • if set to True: uses expected SM backgrounds as signals

    • else: uses self.nsignals

  • workspace_index

    • if different from None: index of the workspace to use for upper limit

    • else: choose best combo

Returns

the upper limit at self.cl level (0.95 by default)

getUpperLimitOnSigmaTimesEff(expected=False, workspace_index=None)[source]
Compute the upper limit on the fiducial cross section sigma times efficiency:
  • by default, the combination of the workspaces contained in self.workspaces

  • if workspace_index is specified, self.workspace[workspace_index] (useful for computation of the best upper limit)

Parameters
  • expected

    • if set to True: uses expected SM backgrounds as signals

    • else: uses self.nsignals

  • workspace_index

    • if different from None: index of the workspace to use for upper limit

    • else: choose best combo

Returns

the upper limit on sigma times eff at self.cl level (0.95 by default)

get_position(name, model)[source]
Parameters
  • name – name of the parameter one wants to increase

  • model – the pyhf model

Returns

the position of the parameter that has to be modified in order to make the negative total yield positive

likelihood(mu=1.0, workspace_index=None, return_nll=False, expected=False)[source]

Returns the value of the likelihood. Inspired by the pyhf.infer.mle module but for the non-log likelihood :param workspace_index: supply index of workspace to use. If None, choose index of best combo :param return_nll: if true, return nll, not llhd :param expected: if False, compute observed values, if True, compute a priori expected, if “posteriori” compute a posteriori expected

lmax(workspace_index=None, return_nll=False, expected=False, allowNegativeSignals=False)[source]

Returns the maximum likelihood :param return_nll: if true, return nll, not llhd :param workspace_index: supply index of workspace to use. If None, choose index of best combo :param expected: if False, compute observed values, if True, compute a priori expected, if “posteriori” compute a posteriori expected :param allowNegativeSignals: if False, then negative nsigs are replaced with 0.

patchMaker()[source]

Method that creates the list of patches to be applied to the self.inputJsons workspaces, one for each region, given self.nsignals, the information available in self.channelsInfo and the content of self.inputJsons. NB: It seems we need to include the change of the “modifiers” in the patches as well

Returns

the list of patches, one for each workspace

rescale(factor)[source]

Rescales the signal predictions (self.nsignals) and recomputes the patches and workspaces

Returns

updated list of patches and workspaces (self.patches, self.workspaces and self.workspaces_expected)

rescaleBgYields(init_pars, workspace, model)[source]
Parameters
  • init_pars – list of initial parameter values to be increased in order to make the negative total yields positive

  • workspace – the pyhf workspace

  • model – the pyhf model

Returns

the list of initial parameters values that gives positive total yields

restore()[source]
updateWorkspace(workspace_index=None, expected=False)[source]

Small method used to return the appropriate workspace

Parameters
  • workspace_index – the index of the workspace to retrieve from the corresponding list

  • expected – if False, returns the unmodified (but patched) workspace. Used for computing observed or a posteriori expected limits. If True, returns the modified (and patched) workspace, where obs = sum(bkg). Used for computing the a priori expected limit.

welcome()[source]

greet the world

wsMaker(apriori=False)[source]

Apply each region patch (self.patches) to its associated json (self.inputJsons) to obtain the complete workspaces :param apriori: - If set to True: Replace the observation data entries of each workspace by the corresponding sum of the expected yields - Else: The observed yields put in the workspace are the ones written in the corresponding json dictionary

Returns

the list of patched workspaces

tools.pyhfInterface.getLogger()[source]

Configure the logging facility. May be adapted to fit into your framework.

tools.pythia6Wrapper module

class tools.pythia6Wrapper.Pythia6Wrapper(configFile='<install>/smodels/etc/pythia.card', executablePath='<install>/smodels/lib/pythia6/pythia_lhe', srcPath='<install>/smodels/lib/pythia6/')[source]

Bases: WrapperBase

An instance of this class represents the installation of pythia6. nevents keeps track of how many events we run. For each event we only allow a certain computation time: if self.secondsPerEvent * self.nevents > CPU time, we terminate Pythia.

Parameters
  • configFile – Location of the config file, full path; copy this file and provide tools to change its content and to provide a template

  • executablePath – Location of executable, full path (pythia_lhe)

  • srcPath – Location of source code

checkFileExists(inputFile)[source]

Check if file exists, raise an IOError if it does not.

Returns

absolute file name if file exists.

replaceInCfgFile(replacements={'NEVENTS': 10000, 'SQRTS': 8000})[source]

Replace strings in the config file by other strings, similar to setParameter.

This is introduced as a simple mechanism to make changes to the parameter file.

Parameters

replacements – dictionary of strings and values; the strings will be replaced with the values; the dictionary keys must be strings present in the config file
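The mechanism amounts to plain string substitution on the config file's content; a sketch that operates on a string rather than on the file itself (an assumption of this illustration):

```python
def replace_in_cfg(text, replacements=None):
    # Replace every dictionary key found in the config-file content
    # with its value; the real method reads and rewrites the file.
    if replacements is None:
        replacements = {"NEVENTS": 10000, "SQRTS": 8000}
    for key, value in replacements.items():
        text = text.replace(key, str(value))
    return text
```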

run(slhafile, lhefile=None, unlink=True)[source]

Execute pythia_lhe with n events, at sqrt(s)=sqrts.

Parameters
  • slhafile – input SLHA file

  • lhefile – option to write LHE output to file; if None, do not write output to disk. If lhe file exists, use its events for xsecs calculation.

  • unlink – Clean up temp directory after running pythia

Returns

List of cross sections

setParameter(param='MSTP(163)', value=6)[source]

Modifies the config file, similar to .replaceInCfgFile.

It will set param to value, overwriting possible old values.

unlink(unlinkdir=True)[source]

Remove temporary files.

Parameters

unlinkdir – remove temp directory completely

tools.pythia8Wrapper module

class tools.pythia8Wrapper.Pythia8Wrapper(configFile='<install>/smodels/etc/pythia8.cfg', executablePath='<install>/smodels/lib/pythia8/pythia8.exe', srcPath='<install>/smodels/lib/pythia8/')[source]

Bases: WrapperBase

An instance of this class represents the installation of pythia8.

Parameters
  • configFile – Location of the config file, full path; copy this file and provide tools to change its content and to provide a template

  • executablePath – Location of executable, full path (pythia8.exe)

  • srcPath – Location of source code

checkFileExists(inputFile)[source]

Check if file exists, raise an IOError if it does not.

Returns

absolute file name if file exists.

checkInstallation(compile: bool = True)[source]

Checks if the installation of the tool is correct by looking for the executable and executing it. If the check fails and compile is True, then try to compile it.

Returns

True, if everything is ok

chmod()[source]

chmod 755 on the pythia executable, if it exists. Do nothing if it doesn't exist.

getPythiaVersion()[source]

obtain the pythia version we wish to use, stored in file ‘pythiaversion’

getXmldoc()[source]

get the content of xml.doc

run(slhaFile, lhefile=None, unlink=True)[source]

Run pythia8.

Parameters
  • slhaFile – SLHA file

  • lhefile – option to write LHE output to file; if None, do not write output to disk. If lhe file exists, use its events for xsecs calculation.

  • unlink – clean up temporary files after run?

Returns

List of cross sections

unlink(unlinkdir=True)[source]

Remove temporary files.

Parameters

unlinkdir – remove temp directory completely

tools.pythia8particles module

tools.reweighting module

tools.reweighting.calculateProbabilities(width, Leff_inner, Leff_outer)[source]

The fraction of prompt and displaced decays are defined as:

F_long = exp(-totalwidth*l_outer/gb_outer)
F_prompt = 1 - exp(-totalwidth*l_inner/gb_inner)
F_displaced = 1 - F_prompt - F_long

Parameters
  • Leff_inner – is the effective inner radius of the detector, given in meters

  • Leff_outer – is the effective outer radius of the detector, given in meters

  • width – particle width for which probabilities should be calculated (in GeV)

Returns

Dictionary with the probabilities for the particle not to decay (in the detector), to decay promptly or displaced.
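A self-contained sketch of these fractions, converting the width in GeV to a proper decay length via hbar*c; the effective gamma*beta factors gb_inner and gb_outer are illustrative placeholders, not SModelS defaults:

```python
import math

HBAR_C = 1.9733e-16  # GeV * m; converts a width in GeV to a length

def calculate_probabilities(width, leff_inner, leff_outer,
                            gb_inner=1.3, gb_outer=1.43):
    # Proper decay length in meters for a particle of the given width.
    ctau = HBAR_C / width
    # Fraction decaying outside the detector, promptly, or in between.
    f_long = math.exp(-leff_outer / (gb_outer * ctau))
    f_prompt = 1.0 - math.exp(-leff_inner / (gb_inner * ctau))
    f_displaced = 1.0 - f_prompt - f_long
    return {"F_long": f_long, "F_prompt": f_prompt,
            "F_displaced": f_displaced}
```

A width of 1 GeV gives an essentially prompt decay, while a width of order 1e-30 GeV leaves the detector before decaying.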

tools.reweighting.defaultEffReweight(element, Leff_inner=None, Leff_outer=None, minWeight=1e-10)[source]

Computes the lifetime reweighting factor for the element efficiency based on the lifetimes of all intermediate particles and the last stable odd-particle appearing in the element. The fraction corresponds to the fraction of decays corresponding to prompt decays to all intermediate BSM particles and to a long-lived decay (outside the detector) to the final BSM state.

Parameters
  • element – Element object or nested list of widths

  • minWeight – Lower cut for the reweighting factor. Any value below this will be taken to be zero.

  • Leff_inner – is the effective inner radius of the detector, given in meters. If None, use default value.

  • Leff_outer – is the effective outer radius of the detector, given in meters. If None, use default value.

Returns

Reweight factor (float)

tools.reweighting.defaultULReweight(element, Leff_inner=None, Leff_outer=None)[source]

Computes the lifetime reweighting factor for the element upper limit based on the lifetimes of all intermediate particles and the last stable odd-particle appearing in the element. The fraction corresponds to the fraction of decays corresponding to prompt decays to all intermediate BSM particles and to a long-lived decay (outside the detector) to the final BSM state.

Parameters
  • element – Element object

  • Leff_inner – is the effective inner radius of the detector, given in meters. If None, use default value.

  • Leff_outer – is the effective outer radius of the detector, given in meters. If None, use default value.

Returns

Reweight factor (float)

tools.reweighting.reweightFactorFor(element, resType='prompt', Leff_inner=None, Leff_outer=None)[source]

Compute the reweighting factor for the element according to the experimental result type. Currently only two result types are supported: ‘prompt’ and ‘displaced’. If resType = ‘prompt’, returns the reweighting factor for all decays in the element to be prompt and the last odd particle to be stable. If resType = ‘displaced’, returns the reweighting factor for ANY decay in the element to be displaced, with no long-lived decays and the last odd particle stable. Note that the fraction of “long-lived (meta-stable) decays” is usually included in topologies where the meta-stable particle appears in the final state. Hence it should not be included in the prompt or displaced fractions.

Parameters
  • element – Element object

  • resType – Type of result to compute the reweight factor for (either ‘prompt’ or ‘displaced’)

  • Leff_inner – is the effective inner radius of the detector, given in meters. If None, use default value.

  • Leff_outer – is the effective outer radius of the detector, given in meters. If None, use default value.

Returns

probabilities (depending on types of decay within branch), branches (with different labels depending on type of decay)

tools.runSModelS module

tools.runSModelS.main()[source]
tools.runSModelS.run(inFile, parameterFile, outputDir, db, timeout, development)[source]

Provides a command line interface to basic SModelS functionalities.

Parameters
  • inFile – input file name (either a SLHA or LHE file) or directory name (path to directory containing input files)

  • parameterFile – File containing the input parameters (default = smodels/etc/parameters_default.ini)

  • outputDir – Output directory to write a summary of results to

  • db – supply a smodels.experiment.databaseObj.Database object, so the database doesn’t have to be loaded anymore. Will render a few parameters in the parameter file irrelevant. If None, load the database as described in parameterFile, If True, force loading the text database.

  • timeout – set a timeout for one model point (0 means no timeout)

  • development – turn on development mode (e.g. no crash report)

tools.runtime module

tools.runtime.experimentalFeatures()[source]

a simple boolean flag to turn experimental features on/off, can be turned on and off via options:experimental in parameters.ini.

tools.runtime.filetype(filename)[source]

obtain information about the filetype of an input file, currently only used to discriminate between slha and lhe files.

Returns

filetype as string(“slha” or “lhe”), None if file does not exist, or filetype is unknown.

tools.runtime.nCPUs()[source]

obtain the number of available CPU cores on the machine, for several platforms and python versions.

tools.simplifiedLikelihoods module

class tools.simplifiedLikelihoods.Data(observed, backgrounds, covariance, third_moment=None, nsignal=None, name='model', deltas_rel=0.2, lumi=None)[source]

Bases: object

A very simple observed container to collect all the data needed to fully define a specific statistical model

Parameters
  • observed – number of observed events per dataset

  • backgrounds – expected bg per dataset

  • covariance – uncertainty in background, as a covariance matrix

  • nsignal – number of signal events in each dataset

  • name – give the model a name, just for convenience

  • deltas_rel – the assumed relative error on the signal hypotheses. The default is 20%.

  • lumi – luminosity of dataset in 1/fb, or None

convert(obj)[source]

Convert object to numpy arrays. If object is a float or int, it is converted to a one element array.

correlations()[source]

Correlation matrix, computed from covariance matrix. Convenience function.
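The computation is the textbook relation rho_ij = cov_ij / (sigma_i * sigma_j); a pure-python sketch, not the internal implementation:

```python
import math

def correlations(cov):
    # Correlation matrix from a covariance matrix:
    # divide each entry by the product of the two standard deviations.
    sigma = [math.sqrt(cov[i][i]) for i in range(len(cov))]
    return [[cov[i][j] / (sigma[i] * sigma[j]) for j in range(len(cov))]
            for i in range(len(cov))]
```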

diagCov()[source]

Diagonal elements of covariance matrix. Convenience function.

isLinear()[source]

The statistical model is linear, i.e. there is no quadratic term in the Poissonians

isScalar(obj)[source]

Determine if obj is a scalar (float or int)

nsignals(mu)[source]

Returns the number of expected signal events, for all datasets, given total signal strength mu.

Parameters

mu – Total number of signal events summed over all datasets.

rel_signals(mu)[source]

Returns the number of expected relative signal events, for all datasets, given total signal strength mu. For mu=1, the sum of the numbers = 1.

Parameters

mu – Total number of signal events summed over all datasets.

sandwich()[source]

Sandwich product

totalCovariance(nsig)[source]

get the total covariance matrix, taking into account also signal uncertainty for the signal hypothesis <nsig>. If nsig is None, the predefined signal hypothesis is taken.

var_s(nsig=None)[source]

The signal variances. Convenience function.

Parameters

nsig – If None, it will use the model expected number of signal events, otherwise will return the variances for the input value using the relative signal uncertainty defined for the model.

zeroSignal()[source]

Is the total number of signal events zero?

class tools.simplifiedLikelihoods.LikelihoodComputer(data)[source]

Bases: object

Parameters

data – a Data object.

chi2()[source]

Computes the chi2 for the number of observed events nobs, given the predicted background nb, the error on this background deltab, the expected number of signal events nsig, and the relative error on the signal (deltas_rel). :param nsig: number of signal events :return: chi2 (float)

d2NLLdMu2(mu, theta_hat, allowZeroHessian=True)[source]

the hessian of the likelihood of mu, at mu, which is the Fisher information which is approximately the inverse of the covariance :param allowZeroHessian: if false and sum(observed)==0, then replace observed with expected

d2NLLdTheta2(theta)[source]

the Hessian of nll as a function of the thetas. Makes it easier to find the maximum likelihood.

dNLLdMu(mu, theta_hat=None)[source]

d (- ln L)/d mu, if L is the likelihood. The function whose root gives us muhat, i.e. the mu that maximizes the likelihood.

Parameters
  • mu – total number of signal events

  • theta_hat – array with nuisance parameters, if None then compute them

dNLLdTheta(theta)[source]

the derivative of nll as a function of the thetas. Makes it easier to find the maximum likelihood.

debug_mode = False
extendedOutput(extended_output, default=None)[source]
findAvgr(theta_hat)[source]

from the difference observed - background, find good initial values for the lower and upper bounds

findMuHat(allowNegativeSignals=False, extended_output=False, return_nll=False)[source]

Find the most likely signal strength mu via gradient descent given the relative signal strengths in each dataset (signal region).

Parameters
  • allowNegativeSignals – if true, then also allow for negative values

  • extended_output – if true, return also sigma_mu, the estimate of the error of mu_hat, and lmax, the likelihood at mu_hat

  • return_nll – if true, return nll instead of lmax in the extended output

Returns

mu_hat, i.e. the maximum likelihood estimate of mu, if extended output is requested, it returns mu_hat, sigma_mu – the standard deviation around mu_hat, and llhd, the likelihood at mu_hat

findMuHatViaBracketing(allowNegativeSignals=False, extended_output=False, nll=False)[source]

Find the most likely signal strength mu via a brent bracketing technique given the relative signal strengths in each dataset (signal region).

Parameters
  • allowNegativeSignals – if true, then also allow for negative values

  • extended_output – if true, return also sigma_mu, the estimate of the error of mu_hat, and lmax, the likelihood at mu_hat

  • nll – if true, return nll instead of lmax in the extended output

Returns

mu_hat, i.e. the maximum likelihood estimate of mu, if extended output is requested, it returns a dictionary with mu_hat, sigma_mu – the standard deviation around mu_hat, and lmax, i.e. the likelihood at mu_hat

findThetaHat(mu: float)[source]

Compute nuisance parameters theta that maximize our likelihood (poisson*gauss).

getSigmaMu(mu, theta_hat)[source]

Get an estimate for the standard deviation of mu at <mu>, from the inverse hessian

getThetaHat(nobs, nb, mu, covb, max_iterations)[source]

Compute nuisance parameter theta that maximizes our likelihood (poisson*gauss) – by setting dNLL/dTheta to zero :param mu: signal strength :returns: theta_hat

likelihood(mu: float, return_nll: bool = False)[source]

compute the profiled likelihood for mu. :param mu: parameter of interest, the signal strength (float) :param return_nll: if true, return nll instead of likelihood Returns profile likelihood and error code (0=no error)
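For a single signal region, the poisson*gauss likelihood referred to throughout this class can be sketched as follows (the nuisance theta is held fixed here rather than profiled, so this is illustrative only):

```python
import math

def single_bin_likelihood(mu, nsig, nobs, nb, deltab,
                          theta=0.0, return_nll=False):
    # L = Pois(nobs | mu*nsig + nb + theta) * Gauss(theta | 0, deltab)
    lam = mu * nsig + nb + theta
    nll = lam - nobs * math.log(lam) + math.lgamma(nobs + 1)  # -log Pois
    nll += 0.5 * (theta / deltab) ** 2 \
        + math.log(deltab * math.sqrt(2.0 * math.pi))  # -log Gauss
    return nll if return_nll else math.exp(-nll)
```

The real computer maximizes over theta (see findThetaHat) before reporting the profiled value.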

llhdOfTheta(theta, nll=True)[source]

likelihood for nuisance parameters theta, given signal strength self.mu. Notice that by default it returns the nll :param theta: nuisance parameters :param nll: if True, compute negative log likelihood

lmax(return_nll=False, allowNegativeSignals=False)[source]

convenience function, computes likelihood for nsig = nobs-nbg, :param return_nll: return nll instead of likelihood :param allowNegativeSignals: if False, then negative nsigs are replaced with 0.

transform(expected: Union[str, bool])[source]

replace the actual observations with backgrounds, if expected is True or “posteriori”

class tools.simplifiedLikelihoods.UpperLimitComputer(cl: float = 0.95)[source]

Bases: object

Parameters

cl – desired quantile for limits

computeCLs(model: Data, expected: Union[bool, str] = False, trylasttime: bool = False, return_type: str = '1-CLs') float[source]

Compute the exclusion confidence level of the model (1-CLs) :param model: statistical model :param expected: if false, compute observed, true: compute a priori expected, “posteriori”: compute a posteriori expected

Parameters
  • trylasttime – if True, then dont try extra

  • return_type – (Text) can be “CLs-alpha”, “1-CLs”, “CLs” CLs-alpha: returns CLs - 0.05 (alpha) 1-CLs: returns 1-CLs value CLs: returns CLs value
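The three return_type options amount to a simple post-processing of the raw CLs value, sketched here (alpha = 0.05 corresponds to the default 95% confidence level):

```python
def convert_cls(cls_value, return_type="1-CLs", alpha=0.05):
    # Map a raw CLs value to the requested return_type.
    if return_type == "CLs":
        return cls_value
    if return_type == "1-CLs":
        return 1.0 - cls_value
    if return_type == "CLs-alpha":
        return cls_value - alpha
    raise ValueError(f"unknown return_type: {return_type}")
```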

debug_mode = False
getCLsRootFunc(model: Data, expected: Optional[Union[bool, str]] = False, trylasttime: Optional[bool] = False) Tuple[source]

Obtain the function “CLs-alpha[0.05]” whose root defines the upper limit, plus mu_hat and sigma_mu :param model: statistical model :param expected: false: compute observed, true: compute a priori expected, “posteriori”: compute a posteriori expected :param trylasttime: if True, then dont try extra :return: mu_hat, sigma_mu, CLs-alpha

getUpperLimitOnMu(model, expected=False, trylasttime=False)[source]
upper limit on the signal strength multiplier mu

obtained from the defined Data (using the signal prediction

for each signal region/dataset), by using the q_mu test statistic from the CCGV paper (arXiv:1007.1727).

Params expected

if false, compute observed, true: compute a priori expected, “posteriori”: compute a posteriori expected

Params trylasttime

if True, then dont try extra

Returns

upper limit on the signal strength multiplier mu
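Conceptually, the upper limit is the root in mu of the function “CLs - alpha” returned by getCLsRootFunc; a generic bisection sketch (SModelS uses its own bracketing and root finding, so this is illustrative only):

```python
def upper_limit_on_mu(cls_minus_alpha, lo=0.0, hi=100.0, tol=1e-8):
    # Find the signal strength mu at which CLs - alpha crosses zero,
    # i.e. the point of exclusion at exactly the 1-alpha level.
    f_lo, f_hi = cls_minus_alpha(lo), cls_minus_alpha(hi)
    if f_lo * f_hi > 0:
        return None  # no sign change in the bracket: no limit found
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cls_minus_alpha(lo) * cls_minus_alpha(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```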

getUpperLimitOnSigmaTimesEff(model, expected=False, trylasttime=False)[source]
upper limit on the fiducial cross section sigma times efficiency,

summed over all signal regions, i.e. sum_i xsec^prod_i eff_i obtained from the defined Data (using the signal prediction for each signal region/dataset), by using the q_mu test statistic from the CCGV paper (arXiv:1007.1727).

Params expected

if false, compute observed, true: compute a priori expected, “posteriori”: compute a posteriori expected

Params trylasttime

if True, then dont try extra

Returns

upper limit on fiducial cross section

tools.slhaChecks module

tools.slhaChecks.main(args)[source]

tools.smodelsLogging module

class tools.smodelsLogging.ColorizedStreamHandler(stream=None)[source]

Bases: StreamHandler

Initialize the handler.

If stream is not specified, sys.stderr is used.

format(record)[source]

Format the specified record.

If a formatter is set, use it. Otherwise, use the default formatter for the module.

should_color()[source]
tools.smodelsLogging.getLogLevel(asString=False)[source]

obtain the current log level. :param asString: return string, not number.

tools.smodelsLogging.getLogger()[source]
tools.smodelsLogging.setLogLevel(level)[source]

set the log level of the central logger. can either be directly an integer ( e.g. logging.DEBUG ), or “debug”, “info”, “warning”, or “error”.

tools.smodelsTools module

tools.smodelsTools.main()[source]

tools.statsTools module

class tools.statsTools.StatsComputer(dataObject: Union[DataSet, CombinedDataSet, list], dataType: str, nsig: Union[None, float, List] = None, deltas_rel: Union[None, float] = None, allowNegativeSignals: bool = False)[source]

Bases: object

Initialise.

Parameters
  • dataObject – a smodels (combined)dataset or a list of theory predictions (for combination of analyses)

  • nsig – signal yield, either as float or as list

  • deltas_rel – relative error on signal. currently unused

allowNegativeSignals

if True, negative values for the signal (mu) are allowed.

CLs(poi_test: float = 1.0, expected: Union[bool, str] = False) Optional[float][source]

compute CLs value for a given value of the poi

allowNegativeSignals
data
dataObject
dataType
deltas_sys
classmethod forAnalysesComb(theoryPredictions, deltas_rel)[source]

get a statscomputer for combination of analyses :param theoryPredictions: list of TheoryPrediction objects :param deltas_rel: relative error for the signal :returns: a StatsComputer

classmethod forMultiBinSL(dataset, nsig, deltas_rel)[source]

get a statscomputer for simplified likelihood combination.

Parameters
  • dataset – CombinedDataSet object

  • nsig – Number of signal events for each SR

deltas_rel

Relative uncertainty for the signal

Returns

a StatsComputer

classmethod forPyhf(dataset, nsig, deltas_rel)[source]

get a statscomputer for pyhf combination.

Parameters
  • dataset – CombinedDataSet object

  • nsig – Number of signal events for each SR

deltas_rel

Relative uncertainty for the signal

Returns

a StatsComputer

classmethod forSingleBin(dataset, nsig, deltas_rel)[source]

get a statscomputer for an efficiency map (single bin).

Parameters
  • dataset – DataSet object

  • nsig – Number of signal events for each SR

deltas_rel

Relative uncertainty for the signal

Returns

a StatsComputer

classmethod forTruncatedGaussian(theorypred, corr: float = 0.6)[source]

get a statscomputer for truncated gaussians :param theorypred: TheoryPrediction object :param corr: correction factor: ULexp_mod = ULexp / (1. - corr*((ULobs-ULexp)/(ULobs+ULexp))) a factor of corr = 0.6 is proposed. :returns: a StatsComputer

getComputerAnalysesComb()[source]

Create computer for a combination of analyses.

getComputerMultiBinSL()[source]

Create computer from a multi bin SL result

getComputerPyhf()[source]

Create computer for a pyhf result

getComputerSingleBin()[source]

Create computer from a single bin

getComputerTruncGaussian(**kwargs)[source]

Create computer for truncated gaussians

get_five_values(expected: Union[bool, str], return_nll: bool = False, check_for_maxima: bool = False) Dict[source]

return the Five Values: l(bsm), l(sm), muhat, l(muhat), sigma(mu_hat) :param check_for_maxima: if true, then check lmax against l(sm) and l(bsm) and correct, if necessary

likelihood(poi_test: float, expected: Union[bool, str], return_nll: bool) float[source]

simple frontend to individual computers

likelihoodComputer
maximize_likelihood(expected: Union[bool, str], return_nll: bool = False) dict[source]

simple frontend to the individual computers, later spey :param return_nll: if True, return negative log likelihood :returns: Dictionary of llhd (llhd at mu_hat), muhat, sigma_mu (sigma of mu_hat), optionally also theta_hat

nsig
poi_upper_limit(expected: Union[bool, str], limit_on_xsec: bool = False) float[source]
simple frontend to the upper limit computers, later to spey::poi_upper_limit

Parameters

limit_on_xsec – if True, then return the limit on the cross section

transform(expected)[source]

SL only. transform the data to expected or observed

upperLimitComputer

tools.stringTools module

tools.stringTools.cleanWalk(topdir)[source]

perform os.walk, but ignore all hidden files and directories

tools.stringTools.concatenateLines(oldcontent)[source]

of all lines in the list “oldcontent”, concatenate the ones that end with a continuation character or a comma

tools.timeOut module

exception tools.timeOut.NoTime(value=None)[source]

Bases: Exception

The time out exception. Raised when the running time exceeds timeout

class tools.timeOut.Timeout(sec)[source]

Bases: object

Timeout class using ALARM signal.

raise_timeout(*args)[source]
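
The mechanism is the standard SIGALRM pattern. A minimal self-contained sketch (Unix only; NoTime is redefined locally so the snippet runs standalone, and the context-manager form is an assumption about usage, not a copy of the SModelS class):

```python
import signal
import time

class NoTime(Exception):
    """Raised when the running time exceeds the timeout."""

class Timeout:
    """Abort the enclosed block after `sec` seconds, using SIGALRM (Unix only)."""
    def __init__(self, sec):
        self.sec = sec
    def raise_timeout(self, *args):
        raise NoTime(f"ran for more than {self.sec} seconds")
    def __enter__(self):
        signal.signal(signal.SIGALRM, self.raise_timeout)
        signal.alarm(self.sec)  # schedule SIGALRM in `sec` seconds
        return self
    def __exit__(self, *args):
        signal.alarm(0)  # cancel the pending alarm
        return False

# usage: cut off a computation that runs too long
try:
    with Timeout(1):
        time.sleep(5)
except NoTime:
    pass  # the sleep was interrupted after ~1 second
```

Since the handler raises from within whatever code is executing, the timeout also interrupts blocking calls such as time.sleep.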

tools.toolBox module

class tools.toolBox.ToolBox[source]

Bases: object

A singleton-like class that keeps track of all external tools. Intended to make installation and deployment easier.

Constructor creates the singleton.

add(instance)[source]

Adds a tool by passing an instance to this method.

checkInstallation(make=False, printit=True, longL=False)[source]

Checks if all tools listed are installed properly, returns True if everything is ok, False otherwise.

compile()[source]

Tries to compile and install tools that are not yet marked as ‘installed’.

get(tool, verbose=True)[source]

Gets instance of tool from the toolbox.

initSingleton()[source]

Initializes singleton instance (done only once for the entire class).

installationOk(ok)[source]

Returns color coded string to signal installation issues.

listOfTools()[source]

Returns a simple list with the tool names.

tools.toolBox.main(args)[source]

tools.truncatedGaussians module

class tools.truncatedGaussians.TruncatedGaussians(upperLimitOnMu: float, expectedUpperLimitOnMu: float, corr: Optional[float] = 0.6, cl=0.95)[source]

Bases: object

likelihood computer based on the truncated Gaussian approximation, see arXiv:1202.3415

Parameters
  • upperLimitOnMu – observed upper limit on signal strength mu

  • expectedUpperLimitOnMu – expected upper limit on signal strength mu

  • corr – correction factor: ULexp_mod = ULexp / (1. - corr*((ULobs-ULexp)/(ULobs+ULexp))) When comparing with likelihoods constructed from efficiency maps, a factor of corr = 0.6 has been found to result in the best approximations.

  • cl – confidence level

cl
corr
denominator
expectedUpperLimitOnMu
likelihood(mu: Optional[float], return_nll: Optional[bool] = False, allowNegativeSignals: Optional[bool] = True, corr: Optional[float] = 0.6, expected: Union[str, bool] = False) Union[None, float][source]

return the likelihood, as a function of mu :param mu: number of signal events, if None then mu = muhat :param return_nll: if True, return negative log likelihood :param allowNegativeSignals: if True, then allow muhat to become negative, else demand that muhat >= 0. In the presence of underfluctuations in the data, setting this to True results in more realistic approximate likelihoods.

Returns

likelihood (float)

lmax(return_nll: Optional[bool] = False, allowNegativeSignals: Optional[bool] = True, corr: Optional[float] = 0.6, expected: Union[bool, str] = False) Dict[source]

return the maximum likelihood and the signal strength at which it is attained :param return_nll: if True, return negative log likelihood :param allowNegativeSignals: if True, then allow muhat to become negative, else demand that muhat >= 0. In the presence of underfluctuations in the data, setting this to True results in more realistic approximate likelihoods.

Returns

dictionary with likelihood (float), muhat, and sigma_mu

newCorrectionType = False
sigma_mu
upperLimitOnMu
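
The construction can be sketched in plain Python. The correction factor is taken verbatim from the constructor docstring; the width and central value below (sigma_mu = ULexp_mod / 1.96, muhat = ULobs - ULexp_mod, truncated at zero) are a simplified reading of the approximation of arXiv:1202.3415, not the exact SModelS formulas, and the numbers are purely illustrative:

```python
import math

def corrected_expected_ul(ul_obs: float, ul_exp: float, corr: float = 0.6) -> float:
    # correction factor as given in the constructor docstring:
    # ULexp_mod = ULexp / (1. - corr*((ULobs-ULexp)/(ULobs+ULexp)))
    return ul_exp / (1.0 - corr * (ul_obs - ul_exp) / (ul_obs + ul_exp))

def trunc_gauss_nll(mu: float, ul_obs: float, ul_exp: float, corr: float = 0.6) -> float:
    """Toy truncated-Gaussian negative log likelihood in the signal strength mu."""
    ul_exp_mod = corrected_expected_ul(ul_obs, ul_exp, corr)
    sigma_mu = ul_exp_mod / 1.96          # width from the 95% CL expected limit
    muhat = max(0.0, ul_obs - ul_exp_mod)  # truncate the best-fit point at mu >= 0
    return 0.5 * ((mu - muhat) / sigma_mu) ** 2 \
        + math.log(sigma_mu * math.sqrt(2.0 * math.pi))

# illustrative: observed limit slightly above expected -> small positive muhat
nll0 = trunc_gauss_nll(0.0, ul_obs=1.2, ul_exp=1.0)
nll1 = trunc_gauss_nll(1.0, ul_obs=1.2, ul_exp=1.0)
```

With ULobs only slightly above ULexp, the best-fit muhat sits near zero and the likelihood penalizes mu = 1 much more than mu = 0, which is the qualitative behaviour the upper-limit-based computer has to reproduce.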

tools.wrapperBase module

class tools.wrapperBase.WrapperBase[source]

Bases: object

An instance of this class represents the installation of an external tool.

An external tool encapsulates a tool that is executed via commands.getoutput. The wrapper defines how the tool is tested for proper installation and how the tool is executed.

absPath(path)[source]

Get the absolute path of <path>, replacing <install> with the installation directory.

basePath()[source]

Get the base installation path.

checkInstallation(compile=True)[source]

Checks if the installation of the tool is correct by looking for the executable and executing it. If the check fails and compile is True, then try to compile it.

Returns

True, if everything is ok

chmod()[source]

chmod 755 on executable, if it exists. Do nothing if it doesn't exist.

compile()[source]

Try to compile the tool.

complain()[source]
defaulttempdir = '/tmp/'
installDirectory()[source]
Returns

the installation directory of the tool

pathOfExecutable()[source]
Returns

path of executable

tempDirectory()[source]

Return the temporary directory name.

tools.wrapperBase.ok(b)[source]
Returns

‘ok’ if b is True, else ‘error’.

tools.xsecBase module

class tools.xsecBase.ArgsStandardizer[source]

Bases: object

simple class to collect all argument manipulators

checkAllowedSqrtses(order, sqrtses)[source]

check if the sqrtses are ‘allowed’

checkNCPUs(ncpus, inputFiles)[source]
checkXsec_limit(args)[source]
getInputFiles(args)[source]

get the names of the slha files to run over

getOrder(args)[source]

retrieve the order in perturbation theory from argument list

getParticles(args)[source]

extract the particles from the argument list; defaults to None, in which case the channels are chosen by the json file

getPythiaVersion(args)[source]
getSSMultipliers(multipliers)[source]
getSqrtses(args)[source]

extract the sqrtses from argument list

getjson(args)[source]

retrieve the path to the json file from argument list

queryCrossSections(filename)[source]
tempDir(args)[source]
writeToFile(args)[source]
class tools.xsecBase.XSecBase(maxOrder, slha_folder_name, maycompile=True)[source]

Bases: object

cross section computer class, what else?

Parameters
  • maxOrder – maximum order to compute the cross section, given as an integer: if maxOrder == LO, compute only LO pythia xsecs; if maxOrder == NLO, apply NLO K-factors from NLLfast (if available); if maxOrder == NLL, apply NLO+NLL K-factors from NLLfast (if available)

  • maycompile – if True, then tools can get compiled on-the-fly

addXSecToFile(xsecs, slhafile, comment=None, complain=True)[source]

Write cross sections to an SLHA file.

Parameters
  • xsecs – a XSectionList object containing the cross sections

  • slhafile – target file for writing the cross sections in SLHA format

  • comment – optional comment to be added to each cross section block

  • complain – complain if there are already cross sections in file

xsecToBlock(xsec, inPDGs=(2212, 2212), comment=None, xsecUnit=1.00E+00[pb])[source]

Generate a string for a XSECTION block in the SLHA format from a XSection object.

Parameters
  • inPDGs – defines the PDGs of the incoming states (default = 2212,2212)

  • comment – is added at the end of the header as a comment

  • xsecUnit – unit of cross sections to be written (default is pb). Must be a Unum unit.
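
The shape of such a block can be sketched with a small formatter. The header line carries sqrt(s), the incoming PDGs and the outgoing PDGs; the body line carries order flags, the value and a code tag. The exact column meanings below are assumptions from the SLHA cross-section write-up (and `xsec_to_block` is a hypothetical helper, not the SModelS method), so verify against the format definition before relying on them:

```python
def xsec_to_block(sqrts_gev, final_pdgs, value_pb, order=0, comment=None,
                  in_pdgs=(2212, 2212), code="SModelSv2"):
    """Format one XSECTION block in the style of the SLHA cross-section proposal."""
    # header: sqrt(s) in GeV, incoming PDGs, number of and list of outgoing PDGs
    header = "XSECTION  %1.2E  %d %d %d %s" % (
        sqrts_gev, in_pdgs[0], in_pdgs[1], len(final_pdgs),
        " ".join(str(p) for p in final_pdgs))
    if comment:
        header += " # %s" % comment
    # body: (scale scheme, QCD order, EW order, kappa_f, kappa_r, PDF id, value, code)
    # -- field meanings assumed, check the SLHA cross-section write-up
    body = "  0  %d  0  0  0  0  %1.4E %s" % (order, value_pb, code)
    return header + "\n" + body

print(xsec_to_block(13000.0, (1000021, 1000021), 5.5e-3, order=2,
                    comment="13 TeV, pp -> gluino gluino"))
```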

tools.xsecComputer module

class tools.xsecComputer.XSecComputer(maxOrder, nevents, pythiaVersion, maycompile=True, defaulttempdir: str = '/tmp/')[source]

Bases: XSecBase

cross section computer class, what else?

Parameters
  • maxOrder – maximum order to compute the cross section, given as an integer: if maxOrder == LO, compute only LO pythia xsecs; if maxOrder == NLO, apply NLO K-factors from NLLfast (if available); if maxOrder == NLL, apply NLO+NLL K-factors from NLLfast (if available)

  • nevents – number of events for pythia run

  • pythiaVersion – pythia6 or pythia8 (integer)

  • maycompile – if True, then tools can get compiled on-the-fly

  • defaulttempdir – the default temp directory

addCommentToFile(comment, slhaFile)[source]

add the optional comment to file

addHigherOrders(sqrts, slhafile)[source]

add higher order xsecs

addMultipliersToFile(ssmultipliers, slhaFile)[source]

add the signal strength multipliers to the SLHA file

applyMultipliers(xsecs, ssmultipliers)[source]

apply the given multipliers to the cross sections

compute(sqrts, slhafile, lhefile=None, unlink=True, loFromSlha=None, pythiacard=None, ssmultipliers=None)[source]

Run pythia and compute SUSY cross sections for the input SLHA file.

Parameters
  • sqrts – sqrt{s} to run Pythia, given as a unum (e.g. 7.*TeV)

  • slhafile – SLHA file

  • lhefile – LHE file. If None, do not write pythia output to file. If file does not exist, write pythia output to this file name. If file exists, read LO xsecs from this file (does not run pythia).

  • unlink – Clean up temp directory after running pythia

  • loFromSlha – If True, uses the LO xsecs from the SLHA file to compute the higher order xsecs

  • pythiaCard – Optional path to pythia.card. If None, uses smodels/etc/pythia.card

  • ssmultipliers – optionally supply signal strength multipliers, given as dictionary of the tuple of the mothers’ pids as keys and multipliers as values, e.g. { (1000001,1000021):1.1 }.

Returns

XSectionList object

computeForBunch(sqrtses, inputFiles, unlink, lOfromSLHA, tofile, pythiacard=None, ssmultipliers=None)[source]

compute xsecs for a bunch of slha files

computeForOneFile(sqrtses, inputFile, unlink, lOfromSLHA, tofile, pythiacard=None, ssmultipliers=None, comment=None)[source]

Compute the cross sections for one file.

Parameters
  • sqrtses – list of sqrt{s} to run pythia, as a unum (e.g. [7*TeV])

  • inputFile – input SLHA file to compute xsecs for

  • unlink – if False, keep temporary files

  • lofromSLHA – try to obtain LO xsecs from SLHA file itself

  • tofile – False, True, “all”: write results to file, if “all” also write lower xsecs to file.

  • pythiacard – optionally supply your own runcard

  • ssmultipliers – optionally supply signal strength multipliers, given as dictionary of the tuple of the mothers’ pids as keys and multipliers as values, e.g. { (1000001,1000021):1.1 }.

  • comment – an optional comment that gets added to the slha file.

Returns

number of xsections that have been computed

getPythia()[source]

returns the pythia tool that is configured to be used

match(pids, theorypid)[source]

do the pids given by the user match the pids of the theorypred?

tools.xsecComputer.main(args)[source]

tools.xsecResummino module

class tools.xsecResummino.XSecResummino(maxOrder, slha_folder_name, sqrt=13, ncpu=1, maycompile=True, type_writting=None, verbosity='', json=None, particles=None, xsec_limit=None)[source]

Bases: XSecBase

cross section computer class (for resummino), what else?

Parameters
  • maxOrder – maximum order to compute the cross section, given as an integer: if maxOrder == LO, compute only LO resummino xsecs; if maxOrder == NLO, compute NLO resummino xsecs; if maxOrder == NLL, compute NLO+NLL resummino xsecs

  • sqrt – Center of mass energy to consider for the cross section calculation

  • xsec_limit – value below which, if mode == “check”, cross sections at NLO are not calculated

  • type – if “all”, write all orders to the slha file; if “highest”, only the highest order

  • json – path to the json file with all the relevant information concerning the resummino calculation

  • resummino_bin – Path to resummino executable

  • input_file_original – Path to the template input file of resummino

  • ncpu – Number of cpu used in parallel for the calculation

  • verbosity – type of information written to the logger file

are_crosssection(slha_file, order)[source]

check if the cross sections are already written, and remove cross sections that appear twice.

calculate_one_slha(particles, input_file, slha_file, output_file, num_try, order, log)[source]

log file management and launching of the resummino command. Also prepares the cross section list used to write the cross sections to the slha file.

checkInstallation(compile: bool = True) bool[source]

check if resummino is already compiled. :param compile: if true, then attempt a compilation if not installed :returns: true if we have a compiled executable

create_routine_files(order, slha_folder_name)[source]

Prepare all the paths and everything else before going parallel. resummino.py is called here to avoid multi-tasking on one file. Also creates tempfiles to store all data needed by resummino.

create_xsection(result, particle_1, particle_2, order, Xsections)[source]

Create cross section list filled with cross section objects, corresponding to all the channels calculated.

determine_channels()[source]

function to find channels using a set of particles

Returns: tuple of:

string: mode of writing for the slha cross section; list: the daughter particles to consider in the calculation of the cross section

extract_N1_N2_C1(file_path)[source]

function to extract the electroweakino masses (SUSY) from an slha file.

Parameters
  • file_path – path of the slha file

Returns

float: N1, mass of the neutralino 1; float: N2, mass of the neutralino 2; float: C1, mass of the chargino 1; float: C2, mass of the chargino 2

extract_json()[source]

function to extract all the information from the resummino.py file

Returns: tuple of:

string: mode of writing for the slha cross section; list: the daughter particles to consider in the calculation of the cross section

extract_m1_m2_mu(file_path: PathLike) dict[source]

function to extract the breaking terms of the electroweakino sector (SUSY) from an slha file.

Parameters
  • file_path – path of the slha file

Returns

dictionary with int: M1 breaking term in SUSY models; int: M2 breaking term in SUSY models; int: mu breaking term in SUSY models
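
In SLHA files these breaking terms conventionally live in the EXTPAR block (entry 1 = M1, entry 2 = M2, entry 23 = mu). A hypothetical parser along the lines the docstring describes, operating on the file contents as a string rather than a path for the sake of a self-contained example:

```python
def extract_m1_m2_mu_from_text(slha_text: str) -> dict:
    """Pull M1, M2 and mu out of the EXTPAR block of an SLHA file's contents."""
    wanted = {1: "M1", 2: "M2", 23: "mu"}  # EXTPAR entry -> name, per SLHA conventions
    result = {}
    in_block = False
    for line in slha_text.splitlines():
        stripped = line.split("#")[0].strip()  # drop SLHA comments
        if stripped.upper().startswith("BLOCK"):
            in_block = stripped.upper().split()[1] == "EXTPAR"
            continue
        if in_block and stripped:
            fields = stripped.split()
            key = int(fields[0])
            if key in wanted:
                result[wanted[key]] = float(fields[1])
    return result
```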

find_channels(slha_file)[source]
getVersion()[source]

retrieve the version from version_path and set self.version; if the file doesn't exist, fall back to the default of 3.1.2

launch_all()[source]

Launch all the calculations of the slha files in parallel (limited by ncpu), after first creating every path needed for the calculations.

launch_command(resummino_bin, input_file, output_file, order)[source]

use resummino at the order asked by the user (order variable).

launch_resummino(input_file, slha_file, output_file, particle_1, particle_2, num_try, order, Xsections, log)[source]

Check everything before launching resummino.

modify_outgoing_particles(input_file, output_file, new_particle1, new_particle2)[source]

modify the outgoing particles (mother particles) in the resummino .in file. First reads the template (input_file), then writes the resummino .in file (output_file). Can also write directly onto the output_file.
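
A minimal sketch of such a template rewrite, assuming the template uses `particle1 = <pdg>` / `particle2 = <pdg>` lines as resummino input files do (the function name and string-based interface here are illustrative, not the SModelS method):

```python
def set_outgoing_particles(template_text: str, pdg1: int, pdg2: int) -> str:
    """Rewrite the two outgoing-particle lines of a resummino-style input template.

    Everything other than the 'particle1'/'particle2' lines is copied unchanged.
    """
    out = []
    for line in template_text.splitlines():
        key = line.split("=")[0].strip()
        if key == "particle1":
            line = "particle1 = %d" % pdg1
        elif key == "particle2":
            line = "particle2 = %d" % pdg2
        out.append(line)
    return "\n".join(out)
```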

modify_slha_file(file_before, file_after, slha_file)[source]

Change all the information in the .in files before launching the calculations

Parameters
  • file_before – template input file for resummino

  • file_after – input file ready for resummino

search_in_output(output_file)[source]

Search the .out files of resummino (in tempfiles) to get the cross section requested by the user, then extract the LO, NLO and NLO+NLL values. If you want the uncertainties reported by resummino, everything is available here at LO, NLO and NLL.

write_in_slha(output_file, slha_file, order, particle_1, particle_2, type_writing, Xsections, log)[source]

Organize here the way cross sections are written into the file (highest, all) and then create cross_section object to let smodels take care of the writing itself with the create_xsection method.

tools.xsecResummino.main(args: Namespace)[source]

the central entry point

Module contents