cmd2: a framework for building command line interpreters

Version: 0.4.0
Date: 2008-07-14
Author: Michele Simionato
E-mail: michele.simionato@gmail.com
URL: http://www.phyast.pitt.edu/~micheles/python/cmd2.html
Installation: easy_install cmd2
License: BSD license

Abstract

cmd2 is a framework for building command line interpreters. It is intended as a replacement for the standard library module cmd and improves over it in a number of ways. It provides two classes: cmd2.Cmd, which is intended for compatibility with cmd.Cmd, and cmd2.CLI, which is the recommended way to write new command line interfaces. Both Cmd and CLI are meant to be used primarily via composition, as wrappers, not via inheritance. They provide many features, such as a general mechanism to send commands to the command loop, a general mechanism to manage exceptions, and a builtin resource management mechanism. Moreover, cmd2.CLI provides support for multiple-argument commands, query string arguments, scripting and automatic testing. A few things could change in the near future, so the current version should be considered in alpha status (which does not mean it is of alpha quality). Also notice that I did not test this version under Windows.

Contents

cmd2 vs cmd

Python ships with a standard library module named cmd which provides a simple framework for building command line interpreters. cmd2 does the same, only better. Since cmd2 is intended to appeal to users familiar with cmd (if you do not know cmd already, you MUST have a look at its documentation before continuing [1]), 99% of code using cmd should work unchanged with cmd2. However, since I wanted to fix some perceived shortcomings of cmd, after much reflection I decided to break full compatibility with it. Actually, there is only a single change: whereas the base class cmd.Cmd has the signature

cmd.Cmd(self, completekey='tab', stdin=None, stdout=None)

cmd2.Cmd has the signature

cmd2.Cmd(self, innerobj=None, **kwargs)

All non-None keyword arguments are copied into the cmd2.Cmd instance. As a consequence, the keyword arguments completekey, stdin and stdout are treated as in cmd.Cmd (i.e. stdin=None means sys.stdin and stdout=None means sys.stdout). Still, compatibility is broken for code passing completekey as the first positional argument. However, since changing the traditional tab completion key is rather uncommon (I never did it in my life), I convinced myself that the breakage was acceptable and I decided to turn the nearly useless completekey default argument into a keyword argument. The only default argument now is innerobj. This change allows a very neat trick: if innerobj is an object with do_ and/or help_ methods, then cmd2.Cmd is able to dispatch on those methods. This allows using composition instead of inheritance [2] as the main mechanism for developing with cmd2.Cmd.
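To make the keyword-copying rule concrete, here is a rough sketch of the behaviour just described; Wrapper is a hypothetical stand-in, not cmd2's actual code:

```python
class Wrapper(object):
    """Hypothetical sketch of cmd2.Cmd's keyword handling: every
    non-None keyword argument becomes an instance attribute."""
    def __init__(self, innerobj=None, **kwargs):
        self.innerobj = innerobj
        for name, value in kwargs.items():
            if value is not None:  # None means "use the default"
                setattr(self, name, value)

w = Wrapper(completekey='tab', stdin=None)
print(w.completekey)        # prints: tab
print(hasattr(w, 'stdin'))  # prints: False (stdin=None was skipped)
```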

Let me give a concrete example. Consider the following dummy Job class:

class Job(object):
    def __init__(self, name=''):
        self.name = name
        self.state = 'not started'
    def do_run(self, *args):
        'Start the job'
        self.state = 'started'
        print self
    def do_stop(self, *args):
        'End the job'
        self.state = 'ended'
        print self
    def do_pause(self, *args):
        'Pause the job'
        self.state = 'paused'
        print self
    def do_resume(self, *args):
        'Resume the job'
        self.state = 'resumed'
        print self
    def do_raise_exc(self, *args):
        'Simulate a method raising an exception'
        raise RuntimeError('some error')
    def __str__(self):
        return 'job %(name)s %(state)s' % vars(self)

Here the Job class is the application and the do_ methods are the public methods, the ones for which we want to write a (command line) interface. Having at our disposal only the traditional cmd.Cmd class, we would be tempted to use multiple inheritance. However, that would be a bad idea, both for technical and philosophical reasons. One technical issue is that the signatures of cmd.Cmd and Job are different, so one has to define a specific __init__ method for the child class: this is inconvenient and fragile with respect to future extensions. Moreover, you get the usual drawbacks of multiple inheritance, i.e. methods coming from everywhere (spaghetti inheritance) and coupling of the code, which makes maintenance difficult. But the biggest issue is a philosophical one: an application is not the same as its interface, so using inheritance (the is-a relationship) is just plain wrong; still, with cmd.Cmd it is very easy to end up with some mixture of application and interface. On the contrary, the automatic dispatching mechanism of cmd2.Cmd on the inner object allows you to define a command line interface (CLI) for a job object without requiring inheritance at all, simply by wrapping the job object:

>>> jobCLI = cmd2.Cmd(Job())

In other words, unlike cmd.Cmd, cmd2.Cmd is not just an abstract base class: it can actually be usefully instantiated and you can do quite a lot without subclassing. Here is an example of usage:

>> jobCLI.cmdloop()
Cmd> help

Documented commands (type help <topic>):
========================================

 help  pause  raise_exc  resume  run  stop

Cmd> help pause
Pause the job
Cmd> help raise_exc
Simulate a method raising an exception
Cmd> help resume
Resume the job
Cmd> help run
Start the job
Cmd> help stop
End the job
Cmd> run
job  started
Cmd> pause
job  paused
Cmd> resume
job  resumed
Cmd> stop
job  ended
Cmd> EOF

cmd.Cmd is an old-style class, whereas cmd2.Cmd is a new-style class, so you can use properties on it.

Internally cmd2.Cmd gets lots of its functionality from cmd.Cmd, but it does so by copying methods from cmd.Cmd, not by inheriting from it. This is on purpose, since it would be too nasty to inherit from a base class and change its __init__ method in a subtly incompatible way. Also, I do not believe in long inheritance hierarchies.

Command loop objects are designed to be composable: if clo is a command loop object and C1, C2, ..., CN are command loop classes/factories, then C1(C2(... CN(clo))) is a command loop object. So instead of adding methods to a single gigantic class, it is possible to compose many small classes.
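The contract can be sketched with plain Python; EchoLoop and Lowercase below are hypothetical toy classes, not part of cmd2, but they show the shape of the composition:

```python
class EchoLoop(object):
    "Innermost command loop object: a stub that records the lines it receives"
    def __init__(self):
        self.lines = []
    def send(self, line):
        self.lines.append(line)
        return line

class Lowercase(object):
    "A wrapper: it normalizes the line, then delegates to the inner loop"
    def __init__(self, inner):
        self.inner = inner
    def send(self, line):
        return self.inner.send(line.lower())

loop = Lowercase(EchoLoop())  # C1(clo) in the notation above
print(loop.send('RUN'))  # prints: run
```

Since the outer object exposes the same .send interface as the inner one, wrappers can be stacked to any depth.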

[1] http://www.python.org/doc/lib/module-cmd.html
[2] I cannot stress too much the advantages of composition over inheritance. Inheritance just does not scale.

The .send method

The major enhancement present in cmd2.Cmd is the introduction of a .send method: it is now possible to send commands to the command loop directly, without going through a file-like object or the terminal. This has many advantages, the most important one being that it is easier to write custom command loops. Internally, all commands are sent to the command loop via .send, therefore if you override .send for any reason (for instance for logging purposes) you can be sure that the commands will be logged both when read from a file-like object and when read from the terminal. Notice that calling .send automatically invokes the usual hooks precmd and postcmd, which are run before and after the execution of the command respectively (postcmd only runs if the command executed correctly, as usual).

Another advantage of the .send method is that you can doctest your command interpreter for free:

>>> send = cmd2.Cmd(Job(), stderr=sys.stdout).send
>>> send('run')
job  started
<Command run('') FINISHED>
>>> send('pause')
job  paused
<Command pause('') FINISHED>
>>> send('resume')
job  resumed
<Command resume('') FINISHED>
>>> send('stop')
job  ended
<Command stop('') FINISHED>

If you enter a non-existent command .send returns None:

>>> send('resme')
*** Unknown syntax: resme

If you enter a valid input, .send returns a Command object with attributes .line (the passed line), .lineno (the line number), .name (the name of the command), .args (the passed arguments) and .state (one of NOT STARTED, STARTED, FINISHED, ABORTED). Moreover, if the command failed for some reason, the attributes .exctype, .exc and .tb store the exception class, the exception instance and the traceback respectively (they are all None if the command completed successfully). By default, exceptions are not trapped in cmd2.Cmd, for compatibility with cmd.Cmd:

>>> send('raise_exc')
Traceback (most recent call last):
   ...
RuntimeError: some error

If you want to trap the exceptions, you must do it explicitly, by providing an onfailure hook, or you can use the CLI class, which overrides the onfailure hook to eat all exceptions except KeyboardInterrupt and SystemExit.

>>> send = cmd2.CLI(Job()).send
>>> send('raise_exc')
<Command raise_exc() ABORTED: some error>

Notice that even if the CLI class eats all the exceptions, they are not lost: they are stored in the command object, so client code can re-raise the original exception if needed:

>>> cmd = send('raise_exc')
>>> if cmd.exc: raise cmd.exctype, cmd.exc, cmd.tb
Traceback (most recent call last):
   ...
RuntimeError: some error

Managing exceptions

Since Python does not allow resuming a computation after an exception has occurred, the only way to make the command loop robust against exceptions is to trap them in place. This is the job of the onfailure hook, which by default re-raises the exception. If you need finer control, you may subclass the Cmd class by overriding the onfailure(self, cmd) method or, if you dislike inheritance, you may just pass an onfailure(cmd) function to the Cmd object. You can even attach do_ functions to the Cmd object directly, as in this example:

def do_invert(n):
    print 1 / float(n)

def trapZeroDivisionError(cmd):
    if isinstance(cmd.exc, ZeroDivisionError):
        print 'ZeroDivisionError on line %s: %s' % (cmd.lineno, cmd.line)
    else:
        raise cmd.exctype, cmd.exc, cmd.tb

>>> inverter = Cmd(do_invert=do_invert, onfailure=trapZeroDivisionError)

Inverter objects trap only ZeroDivisionError, not other errors:

>>> cmd = inverter.send('invert 2')
0.5
>>> cmd = inverter.send('invert 0')
ZeroDivisionError on line 2: invert 0
>>> cmd
<Command invert('0') ABORTED: float division>
>>> inverter.send('invert x')
Traceback (most recent call last):
   ...
ValueError: invalid literal for float(): x

Conceptually, the managing of exceptions should be left to client code, not to the framework. If you want to be able to recover from unexpected errors, a good approach is to silence all the exceptions in the inner loop and to process them outside: the caller of .send can then perform additional processing of the exception, if needed. This approach makes more sense than overriding onfailure with an uber-general hook able to figure out all possible usages of your code. The class cmd2.CLI implements this approach and provides many other facilities too, so it is the recommended way of writing command line interfaces.

Using Python 2.5 extended generators it is very easy to wrap command loops. Here is an example of a consumer postprocessing the exceptions coming from a CLI object:

@consumer
def error_processor(cli):
    """
    For simplicity this is implemented as an infinite loop, but you could
    add a termination condition and put this inside a 'with' block.
    """
    line = yield # wait until the client sends a line
    while True:
        cmd = cli.send(line) # send the line to the inner loop
        if cmd.exc: # the command failed
            cmd.excname = cmd.exctype.__name__
            print '%(excname)s on line %(lineno)s: %(line)s\n%(exc)s' % \
                  vars(cmd)
        line = yield cmd # return the processed command and wait for input

The consumer decorator should be familiar to Python 2.5 programmers since it is referenced in the extended generators PEP (http://www.python.org/dev/peps/pep-0342) and can be defined as follows:

import functools

def consumer(gen):
    "Convert a Python 2.5 generator into a consumer"
    def newgen(*args, **kw):
        g = gen(*args, **kw)
        g.next()
        return g
    return functools.update_wrapper(newgen, gen)
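As a toy illustration of the decorator at work, here is a consumer that keeps a running total; the sketch calls the next(g) builtin (available from Python 2.6) instead of g.next(), so it also runs on later Pythons:

```python
import functools

def consumer(gen):
    "Prime a generator so it is immediately ready to receive values via .send"
    def newgen(*args, **kw):
        g = gen(*args, **kw)
        next(g)  # advance to the first yield
        return g
    return functools.update_wrapper(newgen, gen)

@consumer
def accumulator():
    "Toy consumer: yields the running total of the values sent in"
    total = 0
    while True:
        value = yield total
        total += value

acc = accumulator()
print(acc.send(1))  # prints: 1
print(acc.send(2))  # prints: 3
```

Without the priming step performed by the decorator, the first .send call would fail, since a just-created generator can only receive None.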

Let me show that everything works:

>>> i = error_processor(CLI(do_invert=do_invert, onfailure=lambda cmd: None))
>>> cmd = i.send('invert x') # send an invalid instruction
ValueError on line 1: invert x
invalid literal for float(): x
>>> cmd = i.send('invert 2') # send a valid instruction
0.5

Managing resources

cmd2.Cmd provides many features over cmd.Cmd. In particular, it provides the methods close, __enter__ and __exit__, so that its instances satisfy the resource protocol introduced in Python 2.5. By default, the __enter__ method invokes the .preloop hook and returns self, whereas the __exit__ method invokes the .postloop hook if no exception occurred and the .close method in any case. The default close method calls the .close method of the inner object (if any) and sets the .finished attribute to True. When writing a custom command loop, if you put your finalization code in the close method and you use the with statement, the finalization code is guaranteed to run even in case of exceptions. Contrast that with the code in the usual .postcmd and .postloop hooks, which will not be run automatically in case of exceptions. You are supposed to write your own command loops more or less as follows:

with cmd2.Cmd(application) as cli:
    for line in commands:
        cli.send(line) # possibly add error processing here
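To see why finalization is guaranteed, here is a minimal stand-in implementing the same protocol (Resource is hypothetical, not part of cmd2): close runs even when the body of the with block raises:

```python
class Resource(object):
    "Minimal sketch of the resource protocol: __exit__ always calls close"
    def __init__(self):
        self.finished = False
    def __enter__(self):
        return self
    def __exit__(self, exctype, exc, tb):
        self.close()  # runs whether or not an exception occurred
    def close(self):
        self.finished = True

r = Resource()
try:
    with r:
        raise RuntimeError('boom')  # simulate a failing command loop
except RuntimeError:
    pass
print(r.finished)  # prints: True
```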

For instance, if you have a Logger object like the following

class Logger(object):
    def __init__(self, name):
        self.logfile = file(name, 'a')
    def do_write(self, msg):
        self.logfile.write(msg + '\n')
    def close(self):
        self.logfile.close()

the log file will be closed if you wrap it with a Cmd object and you call the .close method:

>>> loggerCLI = Cmd(Logger('/tmp/x.log'))
>>> loggerCLI.send('write something')
<Command write('something') FINISHED>
>>> loggerCLI.close()

As one would expect, if you try to send a command after exiting the command loop, send raises a StopIteration error:

>>> loggerCLI.send('write something else')
Traceback (most recent call last):
  ...
StopIteration: The command loop was closed!

The choice of the StopIteration exception was made to make Cmd instances more similar to Python 2.5 consumers, for duck typing purposes. The idea is that you can treat a consumer like the error_processor defined in the previous paragraph as a Cmd object, and that you can wrap many of them one inside another, to extend the set of recognized commands or just to extend the managing of exceptions.

Multiple arguments

Another nice enhancement, present in cmd2.Cmd (where it is disabled by default) and in cmd2.CLI (where it is enabled by default), is the ability to define multi-argument commands. Traditionally, cmd.Cmd allows defining only single-argument commands, where the argument is everything in the command line except the command name. For compatibility reasons, by default cmd2.Cmd behaves the same way. It is possible to define multi-argument commands by overriding the .splitargs method or by passing a specific splitargs function. Typically, you want to split the arguments the same way the shell does; then you should pass shlex.split as the splitargs function [3] (this is what cmd2.CLI does for you). Here is an example of usage. Suppose you want to manage a list of jobs; then you need to dispatch on the job number. The following JobManager class does the job:

class JobManager(cmd2.CLI):

    nulljob = NullObject()

    def getjob(self, jobno):
        try:
            jobno = int(jobno)
            if jobno < 0 or jobno >= len(self.innerobj):
                raise ValueError
        except ValueError:
            self.out('jobno %r is invalid\n', jobno)
            return self.nulljob
        else:
            return self.innerobj[jobno]

    def do_run(self, jobno, *args):
        self.getjob(jobno).do_run(*args)

    def do_pause(self, jobno, *args):
        self.getjob(jobno).do_pause(*args)

    def do_resume(self, jobno, *args):
        self.getjob(jobno).do_resume(*args)

    def do_stop(self, jobno, *args):
        self.getjob(jobno).do_stop(*args)

    def do_status(self, jobno=None):
        if not jobno:
            for job in self.innerobj:
                self.out('%s\n', job)
        else:
            self.out('%s\n', self.getjob(jobno))

    def do_raise_exc(self, *args):
        'Simulate a method raising an exception'
        raise RuntimeError('some error')

Here we took advantage of two utility methods of cmd2.CLI, out and err: they write on .stdout and .stderr respectively, flushing the buffer. Moreover, we inherited the CLI.splitargs method, defined as follows:

def splitargs(self, arg):
    'Split the argument string using shlex.split'
    return shlex.split(arg, comments=self.comment_chars)
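As a reminder of what shlex.split buys you over a plain str.split: it tokenizes like a POSIX shell, so whitespace separates arguments but quoting groups words together:

```python
import shlex

# quoting keeps "first arg" as a single argument
print(shlex.split('run 1 "first arg" second'))
# prints: ['run', '1', 'first arg', 'second']
```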

Finally, we have used the Null Object pattern: if the user enters an invalid job number, we return an object whose methods are all do-nothing operations:

class NullObject(object):
    '''Implements the NullObject pattern.

    >>> n = NullObject()
    >>> n.dosomething(1,2,3)
    '''
    def __getattr__(self, name):
        return lambda *a, **k: None
    def __repr__(self):
        return 'None'
    def __nonzero__(self):
        return False
    def __iter__(self):
        return iter(())  # must return an iterator, not a bare tuple
    def __call__(self, *a, **k):
        return None

This trick avoids the need to check explicitly in every do_ method that the user has entered a valid job number. Here is how it works:

>>> cli = JobManager([Job('0'), Job('1'), Job('2')], stdout=sys.stdout)
>>> cli.send('run 1 arg1')
job 1 started
<Command run('1', 'arg1') FINISHED>
>>> cli.send('run 2 arg2')
job 2 started
<Command run('2', 'arg2') FINISHED>
>>> cli.send('pause 1')
job 1 paused
<Command pause('1') FINISHED>
>>> cli.send('status')
job 0 not started
job 1 paused
job 2 started
<Command status() FINISHED>

Sometimes you may want to use a different splitting mechanism. For instance, suppose you want to offer an interface over a sequence object (i.e. a list or a tuple) by exposing a getslice multiple-argument method which accepts three string arguments (start, end, [step]); in such a case the natural, Pythonic separator is a colon:

class SequenceCLI(cmd2.Cmd):
    # assumes innerobj is a sequence
    def splitargs(self, arg):
        return arg.split(':')
    def do_getslice(self, start, end, step='1'):
        print self.innerobj[int(start):int(end):int(step)]
>>> cli = SequenceCLI(range(10))
>>> cmd = cli.send('getslice 0:5:2')
[0, 2, 4]
>>> cmd = cli.send('getslice 0:5')
[0, 1, 2, 3, 4]
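The colon convention itself is just string manipulation; here is the same logic as a standalone function, independent of cmd2, with the step defaulting to 1:

```python
def getslice(seq, arg):
    "Split a 'start:end[:step]' argument on colons and apply the slice"
    parts = arg.split(':')
    start, end = int(parts[0]), int(parts[1])
    step = int(parts[2]) if len(parts) == 3 else 1
    return seq[start:end:step]

print(getslice(list(range(10)), '0:5:2'))  # prints: [0, 2, 4]
print(getslice(list(range(10)), '0:5'))    # prints: [0, 1, 2, 3, 4]
```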

Passing the wrong number of arguments displays a convenient error message:

>> cmd = cli.send('getslice 0')
Wrong arguments: expected ['start', 'end', 'step'], passed ['0']
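cmd2's actual implementation of this check is not shown in this document, but a message of that shape can be produced by introspecting the signature of the do_ method; here is a hypothetical sketch using inspect.getfullargspec (the modern equivalent of the 2.5-era inspect.getargspec):

```python
import inspect

def check_arity(method, passed):
    """Hypothetical sketch: compare the do_ method's positional
    parameters (minus self) with the arguments actually passed."""
    spec = inspect.getfullargspec(method)
    expected = spec.args[1:]  # drop self
    required = expected[:len(expected) - len(spec.defaults or ())]
    too_few = len(passed) < len(required)
    too_many = len(passed) > len(expected) and not spec.varargs
    if too_few or too_many:
        return 'Wrong arguments: expected %s, passed %s' % (expected, passed)

class DummyCLI(object):
    "Stub with the same do_getslice signature as the class above"
    def do_getslice(self, start, end, step='1'):
        pass

print(check_arity(DummyCLI.do_getslice, ['0']))
# prints: Wrong arguments: expected ['start', 'end', 'step'], passed ['0']
```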
[3] The documentation for the shlex module is here: http://docs.python.org/dev/lib/module-shlex.html

Integration with asynchronous frameworks

One can take advantage of the .send method to integrate command line interfaces with asynchronous frameworks: the callback functions of the framework can send commands to the command loop. I will give just an example involving the asyncore framework (which is not the best asynchronous framework in the world, but has the advantage of being included in the standard library). Suppose we want to send commands to an application via the good old telnet protocol. We need a CLI handler, which instantiates a cmd2.CLI object for each client:

class CLIHandler(asynchat.async_chat):
    """
    Handle a telnet connection using a CLI instantiated at each new connection.
    """
    terminator = '\r\n' # the standard one for telnet

    def __init__(self, socket, objfactory, CLI=CLI):
        asynchat.async_chat.__init__(self, socket)
        self.set_terminator(self.terminator)
        self.cli = CLI(objfactory(), stdout=self, stderr=self,
                       onfailure=self.manage_exc)
        self.cli.out(self.cli.prompt)
        self.data = []

    def manage_exc(self, cmd):
        tb_lines = traceback.format_exception(cmd.exctype, cmd.exc, cmd.tb)
        self.log_info(''.join(tb_lines)) # on the server
        self.cli.err('%s on line %s, %s%s', # back to the client
            cmd.exctype.__name__, cmd.lineno, cmd.line, self.terminator)

    def write(self, data):
        if data.endswith('\n') and not data.endswith(self.terminator):
            data = data[:-1] + self.terminator # fix newlines
        self.push(data)

    def flush(self): # for file-like compatibility
        pass

    def collect_incoming_data(self, data):
        self.data.append(data)

    def found_terminator(self):
        command = ''.join(self.data)
        self.log('Received command %r from %s' % (command, self.addr))
        if command == 'EOF':
            self.handle_close()
        else:
            self.cli.send(command)
            self.data = [] # be prepared for the next command
            self.cli.out(self.cli.prompt)

Here the found_terminator callback sends the command received from the client to the command loop, which in turn writes the response back on the client socket, using the .out method. When the client sends an 'EOF' command, the handler closes the connection.

The handler, in turn, is instantiated by a server class:

class CLIServer(asyncore.dispatcher):
    """
    Asynchronous socket server dispatching to CLIHandler.
    """

    CLIHandler = CLIHandler

    def __init__(self, port, factory, CLI=CLI):
        self.port = port
        self.factory = factory
        self.CLI = CLI
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.bind(('', port))
        self.listen(5)

    def handle_accept(self):
        clientsock, clientaddr = self.accept()
        self.log('Connected from %s' % str(clientaddr))
        self.CLIHandler(clientsock, self.factory, self.CLI)

    def serve(self):
        try:
            asyncore.loop()
        finally:
            asyncore.close_all()

For instance, suppose you want to make the Job interface available to many clients over the telnet protocol, and suppose all the clients can manage the same jobs:

def exampleCLIServer():
    def jobmanager():
        "A factory of JobManagers"
        return JobManager([Job('1'), Job('2')])
    CLIServer(8022, jobmanager).serve()

You may test that everything works by starting the server

$ python -c'import doc; doc.exampleCLIServer()'

and by connecting to the server with

$ telnet localhost 8022

Notice that the approach discussed here works only if the do_ methods execute very fast: if for some reason they block, a client invoking a slow method can block the server and all the other clients. cmd2 does not provide any facility out of the box to solve this problem. If you have blocking methods, you are supposed to convert them into nonblocking methods yourself. One possibility is to use the processing module [4] (which has been accepted for inclusion in the standard library in Python 2.6) to run the blocking methods in a separate process. This can be implemented with a run_in_process decorator:

import time
import functools
import processing

def run_in_process(proc):
    "A simple decorator to convert blocking functions into nonblocking functions"
    def wrapper(*args):
        processing.Process(target=proc, args=args).start()
    return functools.update_wrapper(wrapper, proc)

class Async(CLI):
    @run_in_process
    def do_wait2(self):
        time.sleep(2)
        self.out('Slow operation terminated\n')

Try to run Async().cmdloop() and you will see that the command line interface stays responsive even if you invoke the blocking method, since it runs in a separate process; you will get your feedback after a while, when the method terminates.
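The same decorator pattern also works with threads from the standard library threading module; the variant below (run_in_thread is my name, not part of cmd2) additionally returns the Thread object so that callers can wait for completion:

```python
import functools
import threading

def run_in_thread(func):
    "Convert a blocking function into one that runs in a background thread"
    def wrapper(*args):
        t = threading.Thread(target=func, args=args)
        t.start()
        return t  # callers may .join() to wait for the result
    return functools.update_wrapper(wrapper, func)

results = []

@run_in_thread
def slow_append(x):
    results.append(x)  # stands in for a slow, blocking operation

slow_append(42).join()  # wait for the background thread to finish
print(results)  # prints: [42]
```

Threads share memory with the server, which avoids the communication problem but gives up the isolation a separate process provides.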

The approach suggested here is extremely primitive, since we did not provide facilities to inspect the commands running as separate processes, to stop/resume/kill them, to manage exceptions occurring in them, etc., but all that is beyond the scope of the cmd2 module.

[4] http://pyprocessing.berlios.de/

Query string arguments

By default, the arguments of the do_ methods are plain strings; however, you may optionally input dictionaries using a query string notation. In order to do so, you must set the parse_qs flag to True on a cmd2.Cmd instance, or set the parse_qs class attribute to True. Arguments containing an equal sign are interpreted as query strings and converted into a dictionary of lists of strings (for instance, the string 'a=1' is converted into the dictionary {'a': ['1']} and the string 'a=1&a=2' into {'a': ['1', '2']}).
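The notation is the same one understood by the standard library's query string parser, so a quick illustration with it should be faithful (parse_qs lived in the cgi module in 2.5-era Python and lives in urllib.parse in modern Python); keep_blank_values=True reproduces the handling of empty values such as 'd=':

```python
try:
    from urllib.parse import parse_qs  # modern Python
except ImportError:
    from cgi import parse_qs  # Python 2.5, current when cmd2 was written

print(parse_qs('a=1&a=2&d=', keep_blank_values=True))
# prints: {'a': ['1', '2'], 'd': ['']}
```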

The class cmd2.CLI supports parse_qs as well as many other features, so you should subclass cmd2.CLI if you want to enable the query string argument feature.

Here is a simple example:

class QS_CLI(cmd2.CLI):
    parse_qs = True

    def do_print(self, *args):
        for arg in args:
            print '-----------------'
            if hasattr(arg, 'items'): # dict-like argument
                for n, v in arg.items():
                    print n, v
            else:
                print 'simple arg:', arg
>>> send = QS_CLI().send
>>> cmd = send('print 1 a=1 b=1&c=2&c=3 d=')
-----------------
simple arg: 1
-----------------
a ['1']
-----------------
b ['1']
c ['2', '3']
-----------------
d ['']

To show a less trivial example, suppose we want to extend the job manager with the ability to run/pause/resume/stop many jobs with a single command. Here is a possible implementation:

class MultiJobManager(cmd2.CLI):
    "An interface built over a JobManager interface"
    # form is a dictionary with job names as keys and job arguments as values
    def jobs_n_args(self, form):
        for jobname, args in form.iteritems():
            jobno = jobname[3:]
            yield self.innerobj.getjob(jobno), args
    def do_run(self, form):
        for job, args in self.jobs_n_args(form):
            job.do_run(*args)
    def do_pause(self, form):
        for job, args in self.jobs_n_args(form):
            job.do_pause(*args)
    def do_resume(self, form):
        for job, args in self.jobs_n_args(form):
            job.do_resume(*args)
    def do_stop(self, form):
        for job, args in self.jobs_n_args(form):
            job.do_stop(*args)
>>> cli = MultiJobManager(JobManager([Job('0'), Job('1'), Job('2')]))

For instance, suppose we want to run job '1' with argument 'arg1' and job '2' with arguments 'arg2' and 'arg3'. We may do so by using a query string notation:

>>> cli.send('run job1=arg1&job2=arg2&job2=arg3')
job 1 started
job 2 started
<Command run({'job2': ['arg2', 'arg3'], 'job1': ['arg1']}) FINISHED>

This example is also interesting since it is a case of a CLI object wrapping another CLI object: since the do_ (and help_) methods of the inner object are visible to the outer object, the do_status method of the JobManager is accessible to the MultiJobManager instance, even though MultiJobManager does not define a do_status method. This is similar to inheritance, but it is not inheritance, because of two important differences:

  1. in inheritance, all methods are inherited whereas here only the do_ (and help_) methods are transmitted;
  2. the transmitted methods receive as first argument the inner object, not the outer object, whereas in inheritance inherited methods receive as first argument the subclass instance, not an instance of the original class.

Scripting and testing support

The nice thing about command line interfaces is that they are naturally well suited for scripting and testing. Suppose for instance you have defined an interpreter like the following:

# multijobmanager.py
import sys
from doc import *

if __name__ == '__main__':
    jm = JobManager([Job('0'), Job('1'), Job('2')])
    CLI(MultiJobManager(jm), prompt='').cmdloop()

You can run it both interactively and in batch mode. To run it in batch mode, just write the set of commands you want to execute in a file:

$ cat example.script
# multijobmanager.py script
# start 0,1,2 with no args
run job0=&job1=&job2=
pause job0=
resume job0=
stop job2=
# start 1 with arg1 and 2 with arg2
run job1=arg1&job2=arg2
stop job1=

Now you can run them together

>>> print os.popen('python multijobmanager.py < ./example.script').read()
job 0 started
job 1 started
job 2 started
job 0 paused
job 0 resumed
job 2 ended
job 1 started
job 2 started
job 1 ended
<BLANKLINE>

and check that the actual output is consistent with the expected output. Here I did just that in a doctest.

cmd2.CLI improves the support for scripting by adding recognition of comments. For compatibility with cmd, comment recognition is disabled by default in cmd2.Cmd, but you can enable it by setting the comment_chars attribute, which in cmd2.CLI is set to '#', i.e. lines starting with # are not executed. Notice that commented lines do increment the line counter, so that you get the correct lineno in the error message, should your script contain an error.
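Incidentally, the comment handling is consistent with the splitargs method shown earlier, since shlex.split itself drops comments when the comments flag is enabled:

```python
import shlex

# everything after '#' on the line is ignored
print(shlex.split('run job1=arg1  # start job 1', comments=True))
# prints: ['run', 'job1=arg1']
```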

Integration with web frameworks

A bonus of cmd2.CLI-based interfaces is that they can very easily be integrated with Web interfaces. For that purpose, cmd2.CLI provides a .call_cmd method which is able to dispatch automatically to commands taking a query string argument.

Consider for instance the following scenario: you want to write a web application for a small intranet, where you have a predefined (small) set of users with different permissions. You want each user to be able to send commands to his own command line interface via the Web. The easiest way to do that is by keeping all the command line interfaces in memory, into a dictionary {<username> : <CLI object>}. Here is a simple example of a WSGI-compatible web interface which uses solely functionalities available in the standard library (you can do much better by using a Web framework, but these notes are intended to be pedagogical and framework-agnostic):

class JobManagerApp(object):
    """A stoppable WSGI application."""
    def __init__(self, cli_dict):
        self.cli_dict = cli_dict

    def stop(self):
        self.running = False

    def __call__(self, env, resp):
        input_ = env.get('wsgi.input')
        user = env.get('REMOTE_USER', 'pippo') # blatantly insecure
        cli = self.cli_dict[user]
        cli.joblist = cli.innerobj.innerobj
        cli.user = user
        resp('200 OK', [('Content-type', 'text/html')])
        fs = cgi.FieldStorage(input_, environ=env, keep_blank_values=1)
        formdict = dict((k, fs.getlist(k)) for k in fs.keys())
        if not formdict:
            return self.inputpage(cli)
        elif 'STOP' in formdict:
            self.running = False
            return ['Stopped']
        else:
            cmd = cli.call_cmd(formdict)
            if cmd.exc:
                return self.errorpage(cli)
            else:
                return self.outputpage(cli)

    def close(self):
        for cli in self.cli_dict.values():
            cli.close()

    def inputpage(self, cli): # poor man template
        yield '<form method="post">'
        for jobno, job in enumerate(cli.joblist):
            name = "job%s" % jobno
            bt = ('<div>%(name)s <input type="checkbox" name="%(name)s"/>'
                  '<input type="text" name="%(name)s"/></div>'
                  % locals())
            yield bt
        for meth in ('run', 'pause', 'resume', 'stop', 'raise_exc'):
            yield ('<input type="submit" name="cmd" value="%(meth)s"/>'
                   '<br/>') % locals()
        yield '</form>'

    def outputpage(self, cli): # poor man template
        yield 'Current status of the jobs for user %s' % cli.user
        yield '<ol>'
        for job in cli.joblist:
            yield '<li>%s</li>' % job
        yield '</ol>'
        yield 'Click <a href="/">here</a> to run another command'

    def errorpage(self, cli): # poor man template
        yield '<pre>'
        yield cgi.escape(cli.traceback_str)
        yield '</pre>'

You may test it by using the simple WSGI server in the standard library and some support code like the following:

def serve_app(app, host, port):
    from wsgiref.simple_server import make_server
    server = make_server(host, port, app)
    app.running = True
    try: # exit with CTRL-C
        while app.running:
            server.handle_request()
    finally:
        app.running = False
        if hasattr(app, 'close'): # run finalization code, if any
            app.close()
        server.server_close()

def run_job_manager_app():
    '''
    Initializes the application by defining a dictionary
    {username : command line interface} and starts it.
    '''
    pippo_jobs = [Job('admin-1'), Job('admin-2'), Job('admin-3')]
    lippo_jobs = [Job('anonymous-1')]
    cli_dict = dict(
        pippo = MultiJobManager(JobManager(pippo_jobs)),
        lippo = MultiJobManager(JobManager(lippo_jobs)),
        )
    serve_app(JobManagerApp(cli_dict), host='', port=8000)

You may test the application by hand by running the command

$ python -c'import doc; doc.run_job_manager_app()'

and by pointing your browser at http://localhost:8000.