Instructions for the Diderot benchmark suite.

===============
PREREQUISITES
===============

You must have Diderot and TEEM installed.  See the Diderot INSTALL instructions for
more details.

===============
CONFIGURATION
===============

Run autoconf:

    % autoconf -Iconfig

To configure the benchmark tree, you will need to provide both a path to the TEEM
installation and a path to the Diderot compiler (diderotc).  The invocation will be

    % ./configure --with-teem=/path/to/teem --with-diderotc=/path/to/diderotc

where "/path/to/teem" is the full path of the directory whose "lib" and "include"
subdirectories contain "libteem.{a,so,dylib}" and "teem/*.h", respectively.

===============
INSTALLATION
===============

TBA

===============
RUNNING BENCHMARKS
===============

To run a benchmark, use the command

    % ./scripts/run-one.sh bmark nruns nprocs

from the root of the benchmark tree.  The command-line arguments are

    bmark  -- the name of the benchmark program (i.e., its directory name)
    nruns  -- the number of runs of each instance
    nprocs -- the maximum number of processors for the parallel versions;
              specifying 0 will omit the parallel tests.
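
For example, to time a hypothetical benchmark in the benchmarks/programs/mybench
directory with 10 runs per instance and parallel versions on up to 8 processors:

    % ./scripts/run-one.sh mybench 10 8

Passing 0 as the last argument runs only the sequential versions.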

Running this script will produce two files.

    bmark-report.date
    bmark-log.date

(where "bmark" is the name of the benchmark and "date" is a timestamp).  The
report file contains the timing results, while the log file contains the
transcript of the build process.

===============
ADDING BENCHMARKS
===============

Adding a benchmark is straightforward.  Create a directory for the benchmark in the
benchmarks/programs directory.  Any required data files should be placed in the
benchmarks/data directory.  The benchmark should have two source files with the
names

    bmark-teem.c           The TEEM version implemented in C
    bmark-diderot.diderot  The Diderot version

Each benchmark has a Makefile, but the Makefiles are generated as part of the
configuration process, so once a benchmark has been added you will need to rerun
configure.
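
As a sketch, adding a hypothetical benchmark named "mybench" might look like the
following, run from the root of the benchmark tree (the source and data file
names are illustrative):

    % mkdir programs/mybench
    % cp ~/work/mybench-teem.c ~/work/mybench-diderot.diderot programs/mybench/
    % cp ~/work/mybench-data.nrrd data/
    % ./configure --with-teem=/path/to/teem --with-diderotc=/path/to/diderotc
    % ./scripts/run-one.sh mybench 10 8

Rerunning configure generates the Makefile for the new benchmark, after which
run-one.sh can build and time it as described above.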