18-344 software tools readme
----------------------------

This is the readme for the extra software packages for 18-344. For this course you will need to use Pin, Destiny, and the SPEC2017 benchmark suite.

Setting up environment variables:
---------------------------------

We have already compiled all the software packages you will need for this class, and the executables reside in the `/afs/ece.cmu.edu/class/ece344/opt` directory. We provide a script that sets up the PATH and environment variables for the software we will be using, which you can run by executing the following command:

```
source /afs/ece/class/ece344/bin/setup344
```

You may add this line to the file called .bashrc in your home directory ($HOME/.bashrc), which is executed automatically every time you log in. Alternatively, you can run the command directly from the command line. At this point, you should be able to run the software packages from your $HOME directory. We describe next how to execute each software package.

Running Pin:
------------

Pin is a binary instrumentation framework that you will be using to build computer architecture simulators. To run pin, you will need two components: the workload binary that you want to instrument, and your pintool that dictates how the binary is instrumented. The exact command to run pin is as follows:

```
pin -t <path to pintool>/<pintool>.so -- <workload binary> <workload arguments>
```

Here's an example:

`pin -t /home/18344student/work/cachesim.so -- /home/18344student/work/testprog -a13 -b44`

Note that the binary to be instrumented (along with its own command line arguments) comes after `--` in the pin command. If you have not set up the $PATH environment variable, you will need to specify the full path to the pin command in the 344 AFS space.

Pintools also provide a way for you to configure your pintool at runtime, through command-line arguments. Specifically, Pin provides command line argument handling through a mechanism called Knobs. Let's look at an example Knob that reads in the name of the file that the pintool will store its output results in. The code for defining this Knob is as follows (this goes in the pintool code, not on the command line):

```
KNOB<string> KnobOutputFile(KNOB_MODE_WRITEONCE, "pintool", "o",
    "memcount.stats", "output filename to store results in");
// the parsed value can be read at runtime with KnobOutputFile.Value()
```

Here, the pintool takes a string Knob argument, which is specified on the command line preceded by '-o'. This argument provides the path and name of the output file where the pintool stores its results. Here's how to specify the value for this Knob in the previous example:

`pin -t /home/18344student/work/cachesim.so -o /home/18344student/work/output.stats -- /home/18344student/work/testprog -a13 -b44`

We provide basic Knobs in the starter code for each lab; you can add your own Knobs if you want to expose more configurable parameters than what we provide (e.g. cache/TLB sizes). Knobs enable you to configure the pintool dynamically at runtime, allowing you to sweep through a wide range of configurations quickly.

Running SPEC2017:
-----------------

Once you have set up the SPEC2017 environment variables as specified above, you need to follow an additional step before you can run these benchmarks. You will need to modify a configuration file that tells SPEC2017 how it should execute the benchmarks. You will find a template configuration file at:

/afs/ece.cmu.edu/class/ece344/opt/spec2017/config/18344-f24-template.cfg

Copy the configuration file to your working directory, for example: $HOME/private/ece344/config/18344-f24-<name>.cfg
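Concretely, the copy might look like this (the destination directory is just a suggestion, and <name> is a placeholder for your own identifier):

```
mkdir -p $HOME/private/ece344/config
cp /afs/ece.cmu.edu/class/ece344/opt/spec2017/config/18344-f24-template.cfg \
   $HOME/private/ece344/config/18344-f24-<name>.cfg
```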
You may also record the config file's location in an environment variable by adding the following line to your .bashrc:

```
export SPEC_CONFIG="<path to your config file>"
```

Then you can use $SPEC_CONFIG instead of the full path when typing a command. Note that every time you run the `runcpu` command, a clone of your 18344-f24-<name>.cfg file is created, which may cause the config directory to get messy.

You need to make two edits to the configuration file before you are ready to run the SPEC2017 benchmarks. Open the configuration file in your favorite editor, and navigate to the `Global Settings` section, which defines the `output_root` and `submit` options.

The `output_root` option is the directory in which the output of each of your runs of the SPEC benchmarks is stored by the SPEC infrastructure. The best thing to do is to set it to /tmp/ece344-<name> or something similar. If you need to use any of the data stored here, copy it to a local directory; the /tmp folder is periodically cleaned up (i.e. your files might get deleted). We provide a different way to store the SPEC2017 results when running with Pin, which bypasses this output_root directory. If you follow the convention we provide, simply set output_root to the /scratch/ece344-<name> directory. We provide the convention below, after describing the second option -- `submit`.

The `submit` option takes a command to use to run your benchmark program, which can only use fully qualified paths. To run SPEC2017 with Pin, this is where you provide the pin command, as:

```
submit = /afs/ece.cmu.edu/class/ece344/opt/pin/inst/pin -t /home/18344student/work/cachesim.so -o /home/18344student/work/output.stats -- $command
```

You need to specify the full path to every command and file in this `submit` parameter. Note that we replace the workload binary with the variable $command. SPEC2017 then replaces this $command with the actual binary for each different workload that you select.

To actually execute the SPEC2017 benchmark suite, you run the following command:

```
runcpu -c <path to config>/18344-f24-<name>.cfg --action=onlyrun --noreportable --size=test intspeed
```

The `runcpu` command reads in the config file specified with the `-c` flag, and runs the selected SPEC2017 benchmarks with the command specified in the `submit` field of the config file. The `intspeed` parameter in this example selects the SPECspeed Integer suite of benchmarks, which is a subset of all the SPEC2017 benchmarks. The options for SPEC2017 suites are:

  `intspeed`   SPECspeed Integer
  `fpspeed`    SPECspeed Floating Point
  `intrate`    SPECrate Integer
  `fprate`     SPECrate Floating Point
  `all`        All the suites

These suites provide collections of integer/floating point benchmarks that report either execution speed or concurrent throughput (rate). For our labs, we expect you to restrict your evaluation to the `intspeed` suite.

The `runcpu` command will first build the binaries for the selected suite and then run each benchmark with the specified `submit` command. To perform a quick test that validates your commands, you can replace `intspeed` with a single benchmark (e.g. `gcc_s`). This builds much faster, which is useful while debugging errors.
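For instance, assuming you exported SPEC_CONFIG as described above, a quick single-benchmark test looks like this:

```
runcpu -c $SPEC_CONFIG --action=onlyrun --noreportable --size=test gcc_s
```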
As you will notice, if you specify a fixed pin command in the `submit` option, it takes a fixed argument for the output file. As a result, when you run the entire `intspeed` suite, the results of each benchmark are overwritten by the following benchmark in the specified output file. We propose a convention that sidesteps this issue.

Instead of directly specifying the pin command in the `submit` field, we recommend that you specify a call to an intermediate script, which consumes the $command from its own command line and in turn passes it on to the pin command. For example, you can write a shell script called $HOME/run.sh. You add this to the `submit` field as:

```
submit = <path to script>/run.sh ${benchmark} $command
```

When you now run the `runcpu` command, SPEC2017 calls the run.sh script for each benchmark with $command as a command line argument. Note that we've added the ${benchmark} argument, which provides run.sh with the name of the benchmark that is currently running. Also, surround `benchmark` with curly braces, while `command` can be used without the braces.

You can now add logic in run.sh that reads in ${benchmark} as the first command line argument, and $command as the remaining arguments. We suggest that you use the value of ${benchmark} to generate a unique output filename that can be passed to the pin command. The results for every benchmark are thus saved to a unique file. An example script would look like this:

```
#!/bin/bash
BENCHMARK=$1     # name of the current benchmark, from ${benchmark}
COMMAND=${@:2}   # the workload binary and its arguments, from $command
PINTOOL="<path to pintool>/<pintool>.so"
RESULT_FILE="<path to results directory>/${BENCHMARK}.stats"
pin -t ${PINTOOL} -o ${RESULT_FILE} -- ${COMMAND}
```

You should only have to edit the configuration file once at the beginning of the class, and never again; for each lab, simply reconfigure the run.sh file. You will observe that you can pass any Knob arguments to the pintool this way. You can also write a for-loop over different configuration parameters in the script file, such that for each SPEC2017 invocation, your script runs the benchmark on the pintool multiple times, with different parameter values each time (e.g. different cache/TLB sizes).

If you are more comfortable with scripting in Python, you can replace the `submit` command with this:

```
submit = python3 <path to script>/run.py ${benchmark} $command
```

This passes the ${benchmark} and $command arguments to your Python script. The rest of the logic should be the same as the shell script. Note that the script can be located wherever you want in your $HOME directory, with any name, as long as you point the `submit` field to the correct script.

If you set up the configuration file to point to your script, and execute the pin commands in your script, you are now ready to run the SPEC2017 benchmarks! Use the `runcpu` command explained above. It will call your script, which in turn will call pin. You should be able to view your results in the output files that you specify.

Running Destiny:
----------------

Destiny models different memories, reporting several useful pieces of information about the modeled memory (e.g. area, access latencies/energies, etc.). Destiny takes in a configuration file which specifies parameters such as size, block size, associativity, etc. To run Destiny, you use the following command:

```
destiny <config file>.cfg
```

Again, if you haven't set up the $PATH environment variable, you will have to specify the full path to destiny. We will provide template configuration files across the labs, and will highlight the parameters that you should edit. Upon successful execution, Destiny reports information about the modeled memory, which you can pipe to an output file so you can view/parse it using a technique of your choosing.
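For example, a run that saves the report for later parsing might look like this (the filenames are placeholders, and the field names in the report may differ, so adjust the search pattern accordingly):

```
destiny <config file>.cfg > <config file>.out
grep -i "latency" <config file>.out
```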
Destiny is persnickety: if your config file references a technology file (like an SRAM configuration file), you need to specify the fully qualified path to that file inside the config file that you pass to Destiny, or Destiny will complain.
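For instance, a technology-file reference inside your Destiny config might look like the sketch below. The field name follows the NVSim-style syntax that Destiny builds on, so defer to the field names actually used in the course templates; the path is a placeholder:

```
-MemoryCellInputFile: <fully qualified path>/sample_SRAM.cell
```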