Massive Operations

Ophidia allows users to apply the same operation to multiple data cubes by submitting a single command: a massive operation.

Basic usage

A massive operation consists of one or more single operations having the same parameters, except for the value of the parameter that identifies the cube to be processed or the file to be imported. An example of a massive operation is

oph_reduce cube=[container=foo];operation=avg;

The rationale of this command is to apply data reduction to every data cube in container foo. In fact, the command is internally converted into a list of single operations such as

oph_reduce cube=PID1;operation=avg;
oph_reduce cube=PID2;operation=avg;
...
oph_reduce cube=PIDn;operation=avg;

where PID1, PID2, ..., PIDn are the Ophidia identifiers of the data cubes stored within the container foo.
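The expansion step can be illustrated with a short sketch. This is plain Python, not Ophidia code; the container contents and PIDs below are hypothetical examples, not real identifiers:

```python
# Sketch of how a massive operation expands into a list of single operations.
# The cube PIDs are made-up placeholders.

def expand_massive(template: str, pids: list) -> list:
    """Replace the bracketed cube filter with each matching cube PID."""
    prefix, _, rest = template.partition("cube=[")
    _, _, suffix = rest.partition("]")
    return [f"{prefix}cube={pid}{suffix}" for pid in pids]

# Pretend the container "foo" holds these two cubes.
cubes_in_foo = ["http://server/1/1", "http://server/1/2"]

commands = expand_massive("oph_reduce cube=[container=foo];operation=avg;",
                          cubes_in_foo)
for c in commands:
    print(c)
```

Each generated command is then executed as an independent sub-task.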

Most of the filters for massive operations can also be applied with negation. For example, the previous command can be applied to all the cubes not belonging to container foo:

oph_reduce cube=[container!=foo];operation=avg;

Refer to the massive commands reference appendix for the complete list of filters and whether each one can be negated.

Cube operations

The command associated with a cube operation follows the format of the specific operation to be executed, except for the parameters cube and cubes, which have to be set to a string of semicolon-separated filters enclosed by square brackets. The filters are key-value pairs and can be used to identify the set of cubes to which the massive operation has to be applied. There are several filters to finely select the cubes to be processed. For example, the massive operation

oph_publish cube=[container=foo;level=4]

can be adopted to publish every level-4 cube in container foo; the following string

oph_apply cube=[metadatakey=model;metadatavalue=foo];query=somequery;

can be used to apply the query somequery to all the cubes having the attribute model set to foo.

Import operations

The command associated with an import operation follows the format of an import operation, except for the parameter src_path, which has to be set to a string of semicolon-separated filters enclosed by square brackets. In this case the filters can be used to select the files to be imported. The massive operation

oph_importnc src_path=[path=/path/to/files;file=foo*.nc];measure=foo;

imports all the files whose names match the pattern foo*.nc contained in the folder /path/to/files. By default, the path folder is BASE_SRC_PATH (see oph_configuration for additional information).
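The file filter behaves like a shell-style wildcard match, as the foo*.nc example suggests. A rough Python analogue, assuming glob semantics (the directory listing below is invented for illustration):

```python
import fnmatch

# Hypothetical listing of the folder /path/to/files.
listing = ["foo_2020.nc", "foo_2021.nc", "bar_2020.nc", "foo_notes.txt"]

# Keep only the files matching the pattern given in the 'file' filter.
selected = [name for name in listing if fnmatch.fnmatch(name, "foo*.nc")]
print(selected)  # foo_2020.nc and foo_2021.nc would be imported
```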

Filters for massive operations

The complete filter list can be obtained by typing

man oph_massive

in the Ophidia Terminal. The same information is available in the massive commands reference appendix.

Output of a massive operation

The output of a massive operation reports some information regarding each (single) sub-task associated with the operation. In particular, the output consists of two objects:

  1. a text object Massive Operation Status, which reports the final status of the operation;
  2. a table object Massive Operation Task List, which reports a list with some information about each sub-task: job identifier, Marker ID and Exit Status.

The job identifier is a URL that can be used to access the session web resources related to the task. See the Session Management section for additional information on these web resources.

The table also includes a Parent Marker ID that is associated with the massive task and can be used to retrieve its output later by using the command view.

Note that a massive operation is considered Successful only if all the sub-tasks succeeded.
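The overall-status rule above can be sketched in a few lines of Python. The sub-task records and status strings are illustrative, not Ophidia's actual data structures:

```python
# A massive operation is "Successful" only when every sub-task succeeded.
# The records and status labels below are hypothetical examples.
subtasks = [
    {"marker_id": 101, "exit_status": "SUCCESS"},
    {"marker_id": 102, "exit_status": "SUCCESS"},
    {"marker_id": 103, "exit_status": "ERROR"},
]

overall = ("Successful"
           if all(t["exit_status"] == "SUCCESS" for t in subtasks)
           else "Failed")
print(overall)  # one failed sub-task makes the whole operation fail
```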

Advanced features

If the user is not sure which objects (cubes or files) a massive operation will be applied to, the user can append the key-value pair run=no to the filter string and submit the resulting command, thus retrieving the list without actually executing the massive operation.

If the user wishes to set only the filter path, the key path can be omitted.

If the user wishes to set only the filter cube_filter, the key cube_filter can be omitted.
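The shorthand rule can be sketched with a toy parser: inside the brackets, a token without an explicit key is assigned to the default filter (path for imports, cube_filter for cube operations). This parser is illustrative only, not Ophidia's actual implementation:

```python
def parse_filters(filter_string: str, default_key: str) -> dict:
    """Parse a semicolon-separated filter string enclosed in brackets;
    a token without '=' is assigned to default_key."""
    filters = {}
    for item in filter_string.strip("[]").split(";"):
        if not item:
            continue
        if "=" in item:
            key, _, value = item.partition("=")
            filters[key] = value
        else:
            filters[default_key] = item
    return filters

print(parse_filters("[/path/to/files;file=foo*.nc]", "path"))
print(parse_filters("[100:200]", "cube_filter"))
```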

Finally, the following strings are some interesting filters for massive operations:

[*]

means all the cubes in the current working virtual directory;

[recursive=yes]

means all the cubes in the current working virtual directory or related sub-directories.

Commands at a glance

How to show the cubes which a massive operation will be applied to without executing it?

oph_foo cube=[...other filters...;run=no]

How to filter cubes whose numerical identifiers (the last numbers of PIDs) are between 100 and 200?

oph_foo cube=[100:200]

How to delete all the cubes in the current working directory?

oph_delete cube=[*]

How to delete all the cubes of a session? (from root directory)

oph_delete cube=[recursive=yes]

How to import CMIP5 files without specifying the related measure name? (the measure name is extracted from the file names)

oph_importnc src_path=[path=/path/to/files;convention=cmip5]

How to limit the folder depth in case of recursive importation of files?

oph_importnc src_path=[path=/path/to/files;recursion=yes;depth=3]