Tokens

The GRID_LRT.token module is responsible for interactions with CouchDB using the PiCaS token framework. It contains a TokenHandler object which manages a single _design document on CouchDB, intended for a set of jobs that are logically linked together; in the LOFAR Surveys case, this holds the jobs of a single observation. Additionally, a TokenSet object can create tokens in batches, upload attachments to them in bulk, and change token fields in bulk. This module is used in combination with the srmlist class to automatically create sets of jobs with N files each.

Token.py

Location: GRID_LRT/token.py Imports:

>>> from GRID_LRT.token import TokenHandler
>>> from GRID_LRT.token import TokenSet

TokenHandler

class GRID_LRT.token.TokenHandler(t_type='token', srv='https://picas-lofar.grid.surfsara.nl:6984', uname='', pwd='', dbn='')[source]

The TokenHandler class uses CouchDB to create, modify and delete tokens and views, to attach files to tokens or download attachments from them, and to easily modify token fields. It is initialized with the token type, server, username, password and name of the database.

__init__(t_type='token', srv='https://picas-lofar.grid.surfsara.nl:6984', uname='', pwd='', dbn='')[source]
>>> # Example creation of a token of token_type 'test'
>>> from GRID_LRT.auth.get_picas_credentials import picas_cred
>>> pc = picas_cred()  # Gets PiCaS credentials
>>>
>>> th = TokenHandler(t_type="test",
...                   srv="https://picas-lofar.grid.surfsara.nl:6984",
...                   uname=pc.user, pwd=pc.password, dbn=pc.database)
>>> th.add_overview_view()
>>> th.add_status_views()  # Adds 'todo', 'done', 'locked' and 'error' views
>>> th.load_views()
>>> th.views.keys()
>>> th.reset_tokens(view_name='error')  # Resets all tokens in the 'error' view
>>> th.set_view_to_status(view_name='done', status='processed')
add_attachment(token, filehandle, filename='test')[source]

Uploads an attachment to a token

Parameters:
  • token (str) – The token _id as recorded in the CouchDB database
  • filehandle (file) – The file handle to the file to upload, e.g. open(filepath, 'r')
  • filename (str) – The name of the attachment
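A minimal sketch of preparing an attachment upload. The token ID and file contents below are made up for illustration, and the upload call itself is commented out because it needs a connected TokenHandler `th` and a live CouchDB database:

```python
import io

# Hypothetical token _id; in practice this is the string returned by create_token()
token_id = "test_token_L123456"

# Any open file-like handle works; an in-memory handle here stands in
# for open(filepath, 'r') on a real file
filehandle = io.BytesIO(b"srm://srm.grid.sara.nl/some/file.tar\n")

# The upload itself requires a live CouchDB connection:
# th.add_attachment(token_id, filehandle, filename="srm.txt")
```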
add_mapreduce_view(view_name='test_mapred_view', cond='doc.PIPELINE_STEP == "pref_cal1" ')[source]

While the overview view is applied to all the tokens in the design document, this 'mapreduce' view is useful if, instead of a regular view, you want to filter the tokens and present the user with a 'mini-overview'. This way you can check the status of a subset of the tokens.

add_overview_view()[source]

Helper function that creates the map-reduce view which makes it easy to count the number of jobs in the 'locked', 'todo', 'downloading', 'error' and 'running' states

add_status_views()[source]

Adds the 'todo', 'locked', 'done' and 'error' views. The 'todo' view is necessary for the worker node to find an unlocked token

add_view(view_name='test_view', cond='doc.lock > 0 && doc.done > 0 && doc.output < 0 ', emit_value='doc._id', emit_value2='doc._id')[source]

Adds a view to the database; needs a view name and a condition. Emits all tokens with the type of TokenHandler.t_type that also match the condition

Parameters:
  • view_name (str) – The name of the new view to be created
  • cond (str) – A string containing the condition which all tokens of the view must match. It can include the boolean operators '>', '<' and '&&'. The token's fields are referred to as 'doc.field'
  • emit_value (str) – The (first) emit value that is returned by the view. If you look on CouchDB or request the tokens from the view, you'll get two values; this will be the first (typically the token's ID)
  • emit_value2 (str) – The second emit value. You can thus return the token's ID and its status, for example
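As a sketch, a hypothetical view selecting tokens that are locked but not yet done could be built like this (the view name is illustrative, and the call is commented out since it needs a live connection):

```python
# View conditions are JavaScript expressions evaluated against each token document
cond = 'doc.lock > 0 && doc.done == 0 '

# With a connected TokenHandler `th`, emit each matching token's ID and status:
# th.add_view(view_name="running", cond=cond,
#             emit_value="doc._id", emit_value2="doc.status")
```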

archive_a_token(tokenid, delete=False)[source]

Dumps the token data into a YAML file and saves the attachments. Returns a list of archived files

archive_tokens(delete_on_save=False, compress=True)[source]

Archives all tokens and attachments into a folder

archive_tokens_from_view(viewname, delete_on_save=False)[source]

Archives all tokens in the view viewname, optionally deleting each token from the database once it is saved
clear_all_views()[source]

Iterates over all views in the design document and deletes all tokens from those views. Finally, removes the views from the database

create_token(**kw)[source]

Creates a token, appends a string to the token ID if requested, and adds user-requested keys through the keys dict

Parameters:
  • keys (dict) – A dictionary of keys which will be uploaded to the CouchDB document. The supported value types for a key are str, int, float and dict
  • append (str) – A string which is appended to the end of the token ID, useful for adding an OBSID, for example
  • attach (list) – A 2-item list describing a file to be attached to the token. The first item is the file handle and the second is a string with the attachment name, e.g. [open('/home/apmechev/file.txt', 'r'), "file.txt"]

Returns: A string with the token ID
Return type: str
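Putting the parameters together, a hedged sketch of a create_token call follows. The field names and OBSID are hypothetical, and the call itself is commented out since it needs a live database:

```python
# Hypothetical user-defined fields to upload into the CouchDB document
keys = {
    "OBSID": "L123456",           # observation ID this job belongs to
    "PIPELINE_STEP": "pref_cal1",
    "scrub_count": 0,             # int values are supported as well as str
}

# With a connected TokenHandler `th`, this returns the new token's ID:
# token_id = th.create_token(keys=keys, append="L123456",
#                            attach=[open("/tmp/srm.txt", "r"), "srm.txt"])
```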

del_view(view_name='test_view')[source]

Deletes the view with the given name from the _design/${token_type} document and from the TokenHandler's dict of views

Parameters: view_name (str) – The name of the view which should be removed
delete_tokens(view_name='test_view', key=None)[source]

Deletes tokens from the view view_name. Exits if the view doesn't exist. The user can select which tokens within the view to delete.

>>> t1.delete_tokens("todo",["OBSID","L123456"])
>>> #Only deletes tokens with OBSID key = L123456
>>> t1.delete_tokens("error") # Deletes all error tokens
Parameters:
  • view_name (str) – Name of the view from which to delete tokens
  • key (list) – Key-value pair that selects which tokens to delete (by default empty == delete all tokens)

get_attachment(token, filename, savename=None)[source]

Downloads an attachment from a CouchDB token. Optionally a save name can be specified.

list_attachments(token)[source]

Lists all of the filenames attached to a CouchDB token
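A sketch combining the two attachment calls above, assuming a connected TokenHandler `th` (the token ID is hypothetical and the live calls are commented out):

```python
# Hypothetical token _id
token_id = "test_token_L123456"

# Prefix local save names with the token ID so downloads from
# different tokens don't collide on disk
savename = "{0}_{1}".format(token_id, "srm.txt")

# With a live CouchDB connection:
# for name in th.list_attachments(token_id):
#     th.get_attachment(token_id, name,
#                       savename="{0}_{1}".format(token_id, name))
```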

list_tokens_from_view(view_name)[source]

Returns all tokens from the view view_name

load_views()[source]

Helper function to get the current views on the database. Updates the internal self.views variable

purge_tokens()[source]

Deletes ALL tokens associated with this token_type and removes all views. Also removes the design document from the database

remove_error()[source]

Removes all tokens in the error view

reset_tokens(view_name='test_view', key=None, del_attach=False)[source]

Resets all tokens in a view. Optionally resets only those tokens in the view whose key-value pairs match key[0], key[1]

>>> t1.reset_tokens("error")
>>> t1.reset_tokens("error", key=["OBSID", "L123456"])
>>> t1.reset_tokens("error", key=["scrub_count", 6])
set_view_to_status(view_name, status)[source]

Sets the status of all tokens in view_name to status, e.g. sets all 'locked' tokens to 'error', or all 'error' tokens to 'todo'. Note that it also locks the tokens!

TokenSet

class GRID_LRT.token.TokenSet(th=None, tok_config=None)[source]

The TokenSet object can automatically create a group of tokens from a yaml configuration file and a dictionary. It keeps track internally of the set of tokens and allows users to batch attach files to the entire TokenSet or alter fields of all tokens in the set.

__init__(th=None, tok_config=None)[source]

The TokenSet object is created with a TokenHandler Object, which is responsible for the interface to the CouchDB views and Documents. This also ensures that only one job type is contained in a TokenSet.

Parameters:
  • th (GRID_LRT.token.TokenHandler) – The TokenHandler associated with the job tokens
  • tok_config (str) – Location of the token YAML file on the host filesystem

Raises: AttributeError, KeyError
add_attach_to_list(attachment, tok_list=None, name=None)[source]

Adds an attachment to all the tokens in the TokenSet, or to another list of tokens if explicitly specified.

add_keys_to_list(key, val, tok_list=None)[source]

Adds the key-value pair key: val to all the tokens in the TokenSet, or to another list of tokens if explicitly specified.
create_dict_tokens(iterable={}, id_prefix='SB', id_append='L000000', key_name='STARTSB', file_upload=None)[source]

A function that accepts a dictionary and creates a set of tokens equal to the number of entries (keys) of the dictionary. The values of the dict are a list of strings that may be attached to each token if the ‘file_upload’ argument exists.

Parameters:
  • iterable (dict) – The dictionary which determines how many tokens will be created; the values are attached to each token
  • id_append (str) – Option to append the OBSID to each token
  • key_name (str) – The token field which will hold the value of the dictionary's keys for each token
  • file_upload (str) – The name of the file to upload to the tokens (typically srm.txt)
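For example, to make one token per subband, you might build a dictionary keyed by subband number, with each value holding the srm strings to attach. The data below is made up, and the create_dict_tokens call is commented out since it needs a connected TokenSet `ts` and a live database:

```python
# One entry per subband; each value is the list of srm strings
# that would be attached to that subband's token
subband_files = {
    "000": ["srm://srm.grid.sara.nl/obs/L123456_SB000.tar"],
    "001": ["srm://srm.grid.sara.nl/obs/L123456_SB001.tar"],
}

# With a live connection this creates one token per key, each with its
# STARTSB field set to the subband number:
# ts.create_dict_tokens(iterable=subband_files, id_prefix="SB",
#                       id_append="L123456", key_name="STARTSB",
#                       file_upload="srm.txt")
```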

tokens
update_local_tokens()[source]

Updates the TokenSet's internal dictionary of tokens from the CouchDB database