columnflow.tasks.framework.base#

Generic tools and base tasks that are defined along typical objects in an analysis.

Classes:

Requirements(*others, **kwargs)

BaseTask(*args, **kwargs)

OutputLocation(value)

Output location flag.

AnalysisTask(*args, **kwargs)

ConfigTask(*args, **kwargs)

ShiftTask(*args, **kwargs)

DatasetTask(*args, **kwargs)

CommandTask(*args, **kwargs)

A task that provides convenience methods to work with shell commands, i.e., printing them on the command line and executing them with error handling.

Functions:

wrapper_factory(base_cls, require_cls, enable)

Return type:

law.task.base.Register

class Requirements(*others, **kwargs)[source]#

Bases: DotDict

class BaseTask(*args, **kwargs)[source]#

Bases: Task

Attributes:

task_namespace

This value can be overridden to set the namespace that will be used.

task_namespace = 'cf'#

This value can be overridden to set the namespace that will be used. (See Task.namespaces_families_and_ids.) If it is not specified and you try to read this value anyway, it will return garbage. Please use get_task_namespace() to read the namespace.

Note that setting this value with @property will not work, because this is a class level value.

class OutputLocation(value)[source]#

Bases: Enum

Output location flag.
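Flag enums like this are typically looked up by value when dispatching target creation. A minimal, self-contained sketch; the member names here are illustrative and not necessarily those defined by columnflow:

```python
import enum

# Illustrative stand-in for an output location flag enum; the member
# names used by columnflow may differ.
class OutputLocation(enum.Enum):
    local = "local"
    wlcg = "wlcg"

# members can be looked up by value, e.g. from a parsed task parameter
loc = OutputLocation("wlcg")
print(loc)  # OutputLocation.wlcg
```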

class AnalysisTask(*args, **kwargs)[source]#

Bases: BaseTask, SandboxTask

Attributes:

sandbox

Classes:

output_collection_cls

alias of SiblingFileCollection

Methods:

modify_param_values(params)

Hook to modify command line arguments before instances of this class are created.

req_params(inst, **kwargs)

Returns parameters that are jointly defined in this class and another task instance of some other class.

get_known_shifts(config_inst, params)

Returns two sets of shifts in a tuple: shifts implemented by _this_ task, and dependent shifts implemented by upstream tasks.

find_config_objects(names, container, object_cls)

Returns all names of objects of type object_cls known to a container (e.g.

resolve_config_default(task_params, param[, ...])

Resolves a given parameter value param, checks if it should be replaced with a default value when empty, and in this case, does the actual default value resolution.

resolve_config_default_and_groups(...[, ...])

This method is similar to resolve_config_default() in that it checks if a parameter value param is empty and should be replaced with a default value.

store_parts()

Returns a law.util.InsertableDict whose values are used to create a store path.

local_path(*path[, store, fs])

Joins path fragments from store (defaulting to default_store), store_parts() and path and returns the joined path.

local_target(*path[, dir, store, fs])

Creates either a local file or directory target, depending on dir, forwarding all path fragments, store and fs to local_path() and all kwargs the respective target class.

wlcg_path(*path)

Joins path fragments from store_parts() and path and returns the joined path.

wlcg_target(*path[, dir, fs])

Creates either a remote WLCG file or directory target, depending on dir, forwarding all path fragments to wlcg_path() and all kwargs the respective target class.

target(*path[, location])

sandbox = None#
output_collection_cls#

alias of SiblingFileCollection

classmethod modify_param_values(params)[source]#

Hook to modify command line arguments before instances of this class are created.

Return type:

dict
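The mechanics of such a hook can be sketched with a plain function (in a task it would be a classmethod, and the parameter names below are purely illustrative):

```python
# Hypothetical sketch of a modify_param_values() override: receive the
# parsed command line parameters as a dict and return an updated copy.
def modify_param_values(params):
    params = dict(params)  # avoid mutating the caller's dict
    # fill a derived parameter when it was not given explicitly
    params.setdefault("shift", "nominal")
    return params

print(modify_param_values({"config": "my_config"}))
# {'config': 'my_config', 'shift': 'nominal'}
```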

classmethod req_params(inst, **kwargs)[source]#

Returns parameters that are jointly defined in this class and another task instance of some other class. The parameters are used when calling Task.req(self).

Return type:

dict

classmethod get_known_shifts(config_inst, params)[source]#

Returns two sets of shifts in a tuple: shifts implemented by _this_ task, and dependent shifts implemented by upstream tasks.

Return type:

tuple[set[str], set[str]]
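An override can be sketched as follows (written as a plain function for brevity; the shift names are purely illustrative):

```python
# Hypothetical get_known_shifts() sketch: the first set holds shifts the
# task itself implements, the second those expected from upstream tasks.
def get_known_shifts(config_inst, params):
    shifts = {"jec_up", "jec_down"}                  # implemented here
    upstream_shifts = {"pileup_up", "pileup_down"}   # implemented upstream
    return shifts, upstream_shifts

shifts, upstream_shifts = get_known_shifts(None, {})
```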

classmethod find_config_objects(names, container, object_cls, object_groups=None, accept_patterns=True, deep=False)[source]#

Returns all names of objects of type object_cls known to a container (e.g. od.Analysis or od.Config) that match names. A name can also be a pattern to match if accept_patterns is True, or, when given, a key of the object_groups mapping that maps group names to object names. When deep is True, the lookup of objects in the container is recursive. Example:

find_config_objects(["st_tchannel_*"], config_inst, od.Dataset)
# -> ["st_tchannel_t", "st_tchannel_tbar"]
Return type:

list[str]

classmethod resolve_config_default(task_params, param, container='config_inst', default_str=None, multiple=False)[source]#

Resolves a given parameter value param, checks if it should be replaced with a default value when empty, and in this case, does the actual default value resolution.

This resolution is triggered only in case param refers to RESOLVE_DEFAULT, a 1-tuple containing this attribute, or None. If so, the default is identified via the default_str from an order.AuxDataMixin container and points to an auxiliary value that can be either a string or a function. In the latter case, it is called with the task class, the container instance, and all task parameters. Note that when no container is given, param is returned unchanged.

When multiple is True, a tuple is returned. If multiple is False and the resolved parameter is an iterable, the first entry is returned.

Example:

def resolve_param_values(params):
    params["producer"] = AnalysisTask.resolve_config_default(
        params,
        params.get("producer"),
        container=params["config_inst"],
        default_str="default_producer",
        multiple=True,
    )

config_inst = od.Config(
    id=0,
    name="my_config",
    aux={"default_producer": ["my_producer_1", "my_producer_2"]},
)

params = {
    "config_inst": config_inst,
    "producer": RESOLVE_DEFAULT,
}
resolve_param_values(params)  # sets params["producer"] to ("my_producer_1", "my_producer_2")

params = {
    "config_inst": config_inst,
    "producer": "some_other_producer",
}
resolve_param_values(params)  # sets params["producer"] to "some_other_producer"

Example where the default points to a function:

params = {
    "config_inst": config_inst,
    "ml_model": "some_ml_model",
    "inference_model": "my_inference_model",
}
resolve_param_values(params)  # sets params["ml_model"] to "some_ml_model"

Return type:

str | tuple | Any | None

classmethod resolve_config_default_and_groups(task_params, param, container='config_inst', default_str=None, groups_str=None)[source]#

This method is similar to resolve_config_default() in that it checks if a parameter value param is empty and should be replaced with a default value. See the referenced method for documentation on task_params, param, container and default_str.

What this method does in addition is check if the values contained in param (after default value resolution) refer to a group of values identified via the groups_str from the order.AuxDataMixin container that maps a string to a tuple of strings. If so, each value in param that refers to a group is expanded into the actual group values.

Example:

config_inst = od.Config(
    id=0,
    name="my_config",
    aux={
        "default_producer": ["features_1", "my_producer_group"],
        "producer_groups": {"my_producer_group": ["features_2", "features_3"]},
    },
)

params = {"producer": RESOLVE_DEFAULT}

AnalysisTask.resolve_config_default_and_groups(
    params,
    params.get("producer"),
    container=config_inst,
    default_str="default_producer",
    groups_str="producer_groups",
)
# -> ("features_1", "features_2", "features_3")
Return type:

tuple[str]

store_parts()[source]#

Returns a law.util.InsertableDict whose values are used to create a store path. For instance, the parts {"keyA": "a", "keyB": "b", 2: "c"} lead to the path “a/b/c”. The keys can be used by subclassing tasks to overwrite values.
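The composition can be sketched with a plain dict standing in for law.util.InsertableDict (which additionally supports inserting entries before or after existing keys):

```python
# Sketch of how store_parts() values compose a store path; a plain,
# insertion-ordered dict stands in for law.util.InsertableDict.
parts = {"keyA": "a", "keyB": "b", 2: "c"}
store_path = "/".join(str(value) for value in parts.values())
print(store_path)  # a/b/c
```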

local_path(*path, store=None, fs=None)[source]#

Joins path fragments from store (defaulting to default_store), store_parts() and path and returns the joined path. In case a fs is defined, it should refer to the config section of a local file system, and consequently, store is not prepended to the returned path as the resolution of absolute paths is handled by that file system.
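The resulting composition can be illustrated with hypothetical values (the store root and parts below are made up for the example):

```python
import posixpath

# Hypothetical illustration of the composition performed by local_path():
# store root + store_parts() values + the given path fragments.
store = "/data/store"
parts = ["cf.MyTask", "my_config", "v1"]
full_path = posixpath.join(store, *parts, "result.json")
print(full_path)  # /data/store/cf.MyTask/my_config/v1/result.json
```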

local_target(*path, dir=False, store=None, fs=None, **kwargs)[source]#

Creates either a local file or directory target, depending on dir, forwarding all path fragments, store and fs to local_path() and all kwargs the respective target class.

wlcg_path(*path)[source]#

Joins path fragments from store_parts() and path and returns the joined path.

The full URI to the target is not considered as it is usually defined in [wlcg_fs] sections in the law config and hence subject to wlcg_target().

wlcg_target(*path, dir=False, fs=default_wlcg_fs, **kwargs)[source]#

Creates either a remote WLCG file or directory target, depending on dir, forwarding all path fragments to wlcg_path() and all kwargs the respective target class. When None, fs defaults to the default_wlcg_fs class level attribute.

target(*path, location=None, **kwargs)[source]#
class ConfigTask(*args, **kwargs)[source]#

Bases: AnalysisTask

Methods:

store_parts()

Returns a law.util.InsertableDict whose values are used to create a store path.

store_parts()[source]#

Returns a law.util.InsertableDict whose values are used to create a store path. For instance, the parts {"keyA": "a", "keyB": "b", 2: "c"} lead to the path “a/b/c”. The keys can be used by subclassing tasks to overwrite values.

class ShiftTask(*args, **kwargs)[source]#

Bases: ConfigTask

Methods:

modify_param_values(params)

When "config" and "shift" are set, this method evaluates them to set the global shift.

store_parts()

Returns a law.util.InsertableDict whose values are used to create a store path.

classmethod modify_param_values(params)[source]#

When “config” and “shift” are set, this method evaluates them to set the global shift. For that, it takes the shifts stored in the config instance and compares it with those defined by this class.

store_parts()[source]#

Returns a law.util.InsertableDict whose values are used to create a store path. For instance, the parts {"keyA": "a", "keyB": "b", 2: "c"} lead to the path “a/b/c”. The keys can be used by subclassing tasks to overwrite values.

class DatasetTask(*args, **kwargs)[source]#

Bases: ShiftTask

Methods:

get_known_shifts(config_inst, params)

Returns two sets of shifts in a tuple: shifts implemented by _this_ task, and dependent shifts implemented by upstream tasks.

store_parts()

Returns a law.util.InsertableDict whose values are used to create a store path.

create_branch_map()

Define the branch map for when this task is used as a workflow.

htcondor_destination_info(info)

Hook to modify the additional info printed along logs of the htcondor workflow.

Attributes:

file_merging_factor

Returns the number of files that are handled in one branch.

classmethod get_known_shifts(config_inst, params)[source]#

Returns two sets of shifts in a tuple: shifts implemented by _this_ task, and dependent shifts implemented by upstream tasks.

Return type:

tuple[set[str], set[str]]

store_parts()[source]#

Returns a law.util.InsertableDict whose values are used to create a store path. For instance, the parts {"keyA": "a", "keyB": "b", 2: "c"} lead to the path “a/b/c”. The keys can be used by subclassing tasks to overwrite values.

property file_merging_factor#

Returns the number of files that are handled in one branch. Consecutive merging steps are not handled yet.

create_branch_map()[source]#

Define the branch map for when this task is used as a workflow. By default, use the merging information provided by file_merging_factor to return a dictionary which maps branches to one or more input file indices. E.g. 1 -> [3, 4, 5] would mean that branch 1 is simultaneously handling input file indices 3, 4 and 5.
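The default grouping can be sketched with a self-contained stand-in (not the actual implementation):

```python
# Stand-in sketch of the default branch map construction: consecutive
# input file indices are grouped according to the file merging factor.
def make_branch_map(n_files, merging_factor):
    return {
        branch: list(range(start, min(start + merging_factor, n_files)))
        for branch, start in enumerate(range(0, n_files, merging_factor))
    }

print(make_branch_map(8, 3))  # {0: [0, 1, 2], 1: [3, 4, 5], 2: [6, 7]}
```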

htcondor_destination_info(info)[source]#

Hook to modify the additional info printed along logs of the htcondor workflow.

class CommandTask(*args, **kwargs)[source]#

Bases: AnalysisTask

A task that provides convenience methods to work with shell commands, i.e., printing them on the command line and executing them with error handling.

Methods:

run(**kwargs)

The task run method, to be overridden in a subclass.

run(**kwargs)[source]#

The task run method, to be overridden in a subclass.

See Task.run

wrapper_factory(base_cls, require_cls, enable, cls_name=None, attributes=None, docs=None)[source]#
Return type:

law.task.base.Register

Factory function creating wrapper task classes, inheriting from base_cls and WrapperTask, that do nothing but require multiple instances of require_cls.

Unless cls_name is defined, the name of the created class defaults to the name of require_cls plus “Wrapper”. Additional attributes are added as class-level members when given.

The instances of require_cls to be required in the requires() method can be controlled by task parameters. These parameters can be enabled through the string sequence enable, which currently accepts:

  • configs, skip_configs

  • shifts, skip_shifts

  • datasets, skip_datasets

This makes it easy to build wrapper tasks that loop over (combinations of) parameters that are defined in either the analysis or config, which would otherwise lead to mostly redundant code. Example:

class MyTask(DatasetTask):
    ...

MyTaskWrapper = wrapper_factory(
    base_cls=ConfigTask,
    require_cls=MyTask,
    enable=["datasets", "skip_datasets"],
)

# this allows to run (e.g.)
# law run MyTaskWrapper --datasets st_* --skip-datasets *_tbar

When building the requirements, the full combinatorics of parameters is considered. However, certain conditions apply depending on enabled features. For instance, in order to use the "configs" feature (adding a parameter "--configs" to the created class, allowing one to loop over a list of config instances known to an analysis), require_cls must be at least a ConfigTask accepting "--config" (mind the singular form), whereas base_cls explicitly must not be.