benchmarks/README, revision 1932, Wed Jun 27 12:48:08 2012 UTC, by jhr
Instructions for the Diderot benchmark suite.


You must have Diderot and TEEM installed.  See the Diderot INSTALL instructions for
more details.


Run autoconf:

  % autoconf -Iconfig

To configure the benchmark tree, you will need to provide both a path to the TEEM
installation and a path to the Diderot compiler (diderotc).  The invocation is

  % ./configure --with-teem=/path/to/teem --with-diderotc=/path/to/diderotc

where "/path/to/teem" is the full path of the directory whose "lib" and "include"
subdirectories contain "libteem.{a,so,dylib}" and "teem/*.h", respectively.
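Before running configure, it can help to confirm that the candidate TEEM prefix has the layout described above.  The sketch below is an illustration, not part of the suite; the check_teem helper name is invented:

```shell
# Sanity-check a candidate --with-teem prefix before running configure.
# check_teem is a hypothetical helper, not part of the benchmark suite.
check_teem() {
  prefix="$1"
  [ -d "$prefix/lib" ] || { echo "missing $prefix/lib"; return 1; }
  [ -d "$prefix/include/teem" ] || { echo "missing $prefix/include/teem"; return 1; }
  echo "TEEM layout looks ok: $prefix"
}
```

If this check fails, configure would likewise be unable to find the TEEM headers and library.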




To run a benchmark, use the command

	% ./scripts/run-one.sh bmark nruns nprocs nworkers

from the root of the benchmark tree.  The command-line arguments are

	bmark	 -- the name of the benchmark program (i.e., its directory name)
	nruns	 -- the number of runs of each instance
	nprocs	 -- the maximum number of processors for the parallel versions;
		    specifying 0 will omit the parallel tests.
	nworkers -- the maximum number of workers per compute unit for the GPU
		    version; specifying 0 will omit the GPU tests.

Running this script will produce two files: a report file and a log file, whose
names include the benchmark name ("bmark") and a timestamp ("date").  The
report file contains the timing results, while the log file contains a
transcript of the build process.
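As a concrete illustration of the argument order, the small wrapper below validates the numeric arguments and echoes the command it would run.  Both the run_bench helper and the benchmark name "mybench" are hypothetical:

```shell
# Hypothetical wrapper around run-one.sh that checks the argument order
# described above (bmark nruns nprocs nworkers) before kicking off a run.
run_bench() {
  bmark="$1"; nruns="$2"; nprocs="$3"; nworkers="$4"
  for n in "$nruns" "$nprocs" "$nworkers"; do
    case "$n" in ''|*[!0-9]*) echo "expected a number, got '$n'"; return 1;; esac
  done
  # Echo instead of executing, so this doubles as a dry run.
  echo "./scripts/run-one.sh $bmark $nruns $nprocs $nworkers"
}
```

For example, "run_bench mybench 3 8 0" would request three runs with up to eight processors and no GPU tests.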


Adding a benchmark is straightforward.  Create a directory for the benchmark in the
benchmarks/programs directory.  Any required data files should be placed in the
benchmarks/data directory.  The benchmark should have two source files with the
following names:

        bmark-teem.c            The TEEM version implemented in C
        bmark-diderot.diderot   The Diderot version

Each benchmark has a Makefile, but these are generated as part of the configuration
process, so once a benchmark has been added you will need to rerun configure.
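The steps above can be sketched as a small scaffolding script.  The add_benchmark helper and the name "mybench" are illustrations of the naming convention, not part of the suite:

```shell
# Sketch of scaffolding a new benchmark under benchmarks/programs, following
# the naming convention above.  add_benchmark and "mybench" are hypothetical.
add_benchmark() {
  root="$1"; name="$2"
  dir="$root/benchmarks/programs/$name"
  mkdir -p "$dir" "$root/benchmarks/data"
  : > "$dir/$name-teem.c"          # the TEEM version, implemented in C
  : > "$dir/$name-diderot.diderot" # the Diderot version
  echo "created $dir; rerun ./configure to generate its Makefile"
}
```

After creating the files, fill in the two implementations and rerun configure to generate the benchmark's Makefile.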
