smiley – Python Application Tracer

Smiley is a tool for spying on your Python programs and recording their activities. It can be used for post-mortem debugging, performance analysis, or simply understanding what parts of a complex program are actually used in different code paths.

Quick Start

Installing

Install with pip:

$ pip install smiley

Using

This quick start is not a complete reference for the command-line program and its options. Use the help subcommand for more details.

In one terminal window, run the monitor command:

$ smiley monitor

In a second terminal window, use smiley to run an application. This example uses test.py from the test_app directory in the smiley source tree.

$ smiley run ./test.py
args: ['./test.py']
input = 10
Leaving c() [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
Leaving b()
Leaving a()

The monitor session will show the execution path and local variables for the app.

Starting new run: ./test.py
test.py:   1: import test_funcs
test.py:   1: import test_funcs
test_funcs.py:   1: import sys
test_funcs.py:   1: import sys
test_funcs.py:   3: def gen(m):
test_funcs.py:   8: def c(input):
test_funcs.py:  14: def b(arg):
test_funcs.py:  21: def a():
test_funcs.py:  21: return>>> None
test.py:   3: if __name__ == '__main__':
test.py:   4:     test_funcs.a()
test_funcs.py:  21: def a():
test_funcs.py:  22:     print 'args:', sys.argv
test_funcs.py:  23:     b(2)
test_funcs.py:  14: def b(arg):
                    arg = 2
test_funcs.py:  15:     val = arg * 5
                    arg = 2
test_funcs.py:  16:     c(val)
                    arg = 2
                    val = 10
test_funcs.py:   8: def c(input):
                    input = 10
test_funcs.py:   9:     print 'input =', input
                    input = 10
test_funcs.py:  10:     data = list(gen(input))
                    input = 10
test_funcs.py:   3: def gen(m):
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    m = 10
test_funcs.py:   5:         yield i
                    i = 0
                    m = 10
test_funcs.py:   5: return>>> 0
test_funcs.py:   5:         yield i
                    i = 0
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 0
                    m = 10
test_funcs.py:   5:         yield i
                    i = 1
                    m = 10
test_funcs.py:   5: return>>> 1
test_funcs.py:   5:         yield i
                    i = 1
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 1
                    m = 10
test_funcs.py:   5:         yield i
                    i = 2
                    m = 10
test_funcs.py:   5: return>>> 2
test_funcs.py:   5:         yield i
                    i = 2
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 2
                    m = 10
test_funcs.py:   5:         yield i
                    i = 3
                    m = 10
test_funcs.py:   5: return>>> 3
test_funcs.py:   5:         yield i
                    i = 3
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 3
                    m = 10
test_funcs.py:   5:         yield i
                    i = 4
                    m = 10
test_funcs.py:   5: return>>> 4
test_funcs.py:   5:         yield i
                    i = 4
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 4
                    m = 10
test_funcs.py:   5:         yield i
                    i = 5
                    m = 10
test_funcs.py:   5: return>>> 5
test_funcs.py:   5:         yield i
                    i = 5
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 5
                    m = 10
test_funcs.py:   5:         yield i
                    i = 6
                    m = 10
test_funcs.py:   5: return>>> 6
test_funcs.py:   5:         yield i
                    i = 6
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 6
                    m = 10
test_funcs.py:   5:         yield i
                    i = 7
                    m = 10
test_funcs.py:   5: return>>> 7
test_funcs.py:   5:         yield i
                    i = 7
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 7
                    m = 10
test_funcs.py:   5:         yield i
                    i = 8
                    m = 10
test_funcs.py:   5: return>>> 8
test_funcs.py:   5:         yield i
                    i = 8
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 8
                    m = 10
test_funcs.py:   5:         yield i
                    i = 9
                    m = 10
test_funcs.py:   5: return>>> 9
test_funcs.py:   5:         yield i
                    i = 9
                    m = 10
test_funcs.py:   4:     for i in xrange(m):
                    i = 9
                    m = 10
test_funcs.py:   4: return>>> None
test_funcs.py:  11:     print 'Leaving c()', data
                    data = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
                    input = 10
test_funcs.py:  11: return>>> None
test_funcs.py:  17:     print 'Leaving b()'
                    arg = 2
                    val = 10
test_funcs.py:  18:     return val
                    arg = 2
                    val = 10
test_funcs.py:  18: return>>> 10
test_funcs.py:  24:     print 'Leaving a()'
test_funcs.py:  24: return>>> None
test.py:   4: return>>> None
Finished run
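
For reference, here is roughly what the traced program looks like, reconstructed from the source lines shown in the trace above. The actual test.py and test_funcs.py in the test_app directory of the smiley source tree may differ slightly (blank lines, comments), so treat this as a sketch rather than an exact copy.

# test.py
import test_funcs

if __name__ == '__main__':
    test_funcs.a()

# test_funcs.py
import sys

def gen(m):
    # yield the integers 0 .. m-1
    for i in xrange(m):
        yield i

def c(input):
    print 'input =', input
    data = list(gen(input))
    print 'Leaving c()', data

def b(arg):
    val = arg * 5
    c(val)
    print 'Leaving b()'
    return val

def a():
    print 'args:', sys.argv
    b(2)
    print 'Leaving a()'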

Passing Arguments to Traced Programs

The arguments to run are interpreted as a new command to be executed as though it were run directly, but with tracing enabled.

A simple command without options can be run directly:

$ smiley run ./test.py

If the command takes its own options, use -- to separate the command sequence from the options for run so that run's argument parser does not try to interpret them:

$ smiley run -- ./test.py -e

Command Reference

The main program for Smiley is smiley. It includes several sub-commands.

run

Run an application and trace its execution.

monitor

Listen for trace data from an application running under the run command.

record

Listen for trace data from an application running under the run command and write it to a database for later analysis.

list

Show the runs previously recorded in the database.

delete

Delete runs from the database.

replay

Given a single run id, dump the data from that run in the same format as the monitor command.

server

Run a web server for browsing the pre-recorded run data collected by the record command.

report

Export a set of HTML files describing the pre-recorded run data collected by the record command.

stats show

Show the profiling data from a run.

stats export

Dump the profiling data from a run to a local file.
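
If the exported file is in the standard cProfile/pstats format (an assumption worth verifying against your version of smiley), it can be inspected with the pstats module from the standard library. The filename below is only a placeholder for whatever stats export writes.

import pstats

# Load the profile data exported by "smiley stats export" and show the
# 20 entries with the largest cumulative time.
stats = pstats.Stats('exported-stats.prof')
stats.sort_stats('cumulative').print_stats(20)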

help

Get help for the smiley command or a subcommand.

Server Mode

The server command starts a web server on a local port to provide a user interface for browsing through the run data captured by record and run. It connects to the same database, so as new runs are captured they appear in the user interface.

Run List

The server listens on http://127.0.0.1:8080 by default. Visiting that page in a browser causes the server to return a list of the runs found in the database in reverse chronological order. For each run the list shows its id, “description”, start and end times, and any final error message. Clicking the “X” link in the far left column will permanently delete that run from the database.

[Screenshot: List of several runs for Smiley's test app]

Run Details

Clicking on one of the run id values opens the detailed information recorded for that run. The details page shows the state of the program line-by-line as it runs, including where the program control is (filename, line number, and source line) as well as local variables and the return values from functions. This is the same information reported by monitor and replay, in a format that is easier to read.

[Screenshot: Line-by-line information about what the test application did as it ran]

Source Files

Each of the filenames in the run details view links to a page showing the full source of the Python file as it was at the time of the program’s execution.

[Screenshot: The contents of test_funcs.py from the demonstration run]

File List

For an application with many source files, it may be more convenient to examine the source by navigating to the file list view and choosing the file from the list.

[Screenshot: The test application only contains two files]

Profiler Statistics

The stats view shows the profiler output for the run, sorted by cumulative time. As with the run details, each file name links to the full source for the module.

[Screenshot: Profiler statistics from the test application]

Call Graph

The call graph view uses gprof2dot and graphviz to produce a tree diagram showing how much time is spent in different parts of the program, making it easier to focus on the areas that consume the most time.

[Screenshot: The call tree from the test application]

Note

For this page to work, the dot command from graphviz must be installed. Installing smiley should install gprof2dot automatically.

Frequently Asked Questions

What’s with the name?

George Smiley is a character in the popular spy novels of John le Carré.

Release History

dev

  • Add delete to delete runs.
  • Add delete link to the web view’s runs page.
  • Add report to produce static HTML output for a run.
  • Fix a problem with the “ignore” path management under virtualenv.

0.6

  • Update the web view to only show changes in variables. The calculation of changes is very rough, and just compares the current set of variables to the previous set, which might be in a completely unrelated scope.
  • Update the web view to show consecutive lines executed together as a single block. A new block is started for each call into a function or when the value of a previously-seen local variable changes.
  • Update the web view to show comments near the source line being executed as further context.
  • Simplify calculation of local variable changes.
  • Tighten up the run view output to allow for wider lines and reduce clutter.
  • Make the tests pass under Python 3.3. Still not doing any live testing with Python 3 apps, but this is a start.
  • Add an option to run to include modules from the standard library. This is disabled by default.
  • Add an option to run to include modules from the site-packages directory (for third-party installed modules). This is enabled by default.
  • Add an option to run to include a specific package in the trace output by name on the command line.
  • Updated to Bootstrap 3 CSS framework.
  • Add pagination support to the detailed trace report.

0.5

  • Add a call graph image, built with gprof2dot and graphviz.
  • Add Server Mode documentation.
  • Clean up template implementations.
  • Clean up navigation and breadcrumbs.

0.4

0.3

  • Add record command.
  • Add list command.
  • Add web UI and server command.
  • Add mode option to run to allow writing results directly to a database file.

0.2

Use the script runner code from coverage instead of reinventing it.

0.1

First public release. Includes basic functionality of runner and monitor.
