CS 450 Homework Assignment 7

Due Date: Wednesday, November 14, 9 AM

Important note: The bake-off will begin at 9 AM on Wednesday, November 16. Homework 7 should be complete by Monday so that we can do preliminary testing, but you will be able to make modifications based on this testing until the bake-off. Your submission must be in your BAKEOFF directory before lab at 9 AM. Furthermore, if you object to any tests in the class test suite, raise your objections early enough to give everyone time to consider whether the test should be removed.

Part 1: Performance Tuning

In this assignment, you will be tuning the performance of your code. Create a file TUNING in your hw6 directory.

First, compare the performance of your code built with the debugging flag (-g) and with compiler optimizations (-O). Report the running time of each build on the bible.txt benchmark in TUNING.

Second, run gprof on your code. Save the output to a file gprof.out. In your TUNING file, report what you learned: which of the functions you wrote consume the most time.

Third, try at least three performance optimizations and report the running time on bible.txt with and without each optimization. Use the gprof output to guide your tuning. Also consider some of the techniques we mentioned in class, such as reordering the delimiter checks, avoiding branches in critical loops, and buffering I/O. You may also want to consider using inline functions or macros for key routines.

Fourth, note the memory usage of your program when running on bible.txt.
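One way to capture peak memory is GNU time's verbose mode, if it is installed on your system. In this sketch, `wc` and a small sample file stand in for your word count program and bible.txt.

```shell
# Sketch: peak resident memory via GNU time (the stand-alone /usr/bin/time,
# not the shell builtin). `wc -w` is a stand-in for your own program.
printf 'some sample input\n' > /tmp/mem_sample.txt
if command -v /usr/bin/time >/dev/null; then
    /usr/bin/time -v wc -w /tmp/mem_sample.txt 2>&1 \
        | grep -i 'maximum resident' || true
else
    # Alternative if GNU time is unavailable:
    echo "try: valgrind --tool=massif ./wordcount < bible.txt"
fi
```

The "Maximum resident set size" line (in kilobytes) is the figure to record in TUNING.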

Part 2: Cleanup for Final Submission and Bake-off

It is time to polish up your word count program and submit it once and for all.

Part 3: Use the automated script to test your program with your own testcases

Run the automated script, runTests.sh.

Usage: ./runTests.sh fullPath/codeDirectory fullPath/testcaseDirectory

Example: ./runTests.sh /afs/clarkson.edu/class/cs450/students/YOUR_USERNAME/hw7 /afs/clarkson.edu/class/cs450/students/YOUR_USERNAME/hw6

Start a file hw7/TEST_LOG. Describe your experience running the automated script with your test cases, and include the generated results file in TEST_LOG.

Part 4: Use an automated script to test your program with the class test suite

The point of this test is both to debug your code *and* to debug everyone else's test cases. If your program fails on a test case, determine whether your program is in error *or* the test is in error. If your program is in error, add an entry to your BUG_LOG and fix your program. If you believe the test case is in error, send e-mail to the cs450 mailing list. If an error is reported against your test case, fix the problem and put the new tests in your TESTCASES.FINAL directory. If you believe your test case is correct, respond to the cs450 list explaining why.

Use the automated script to run the testcases found in /afs/clarkson.edu/class/cs450/public/tests.

Example: ./runTests.sh /afs/clarkson.edu/class/cs450/students/YOUR_USERNAME/hw7 /afs/clarkson.edu/class/cs450/public/tests

Add the results file and any comments to the TEST_LOG file.

The "correctness" suite is the set of tests that will be used in the first phase of the bake-off, so it is very important that your code pass these tests, or that you publicly contest any test you believe to be in error.

Part 5: White Box Testing

Review the internals of your own code with a focus on testing.

1. Execute all your code

Try to execute every line of code, including all your error handling code. Use gcov to track how much of your code you have executed. Record all the cases you ran and the final percentage of your code executed as reported by gcov. Food for thought: If there were lines of code that you were unable to execute, try to explain why.

2. Document your efforts

DELIVERABLE: Add a new file TESTING.WHITEBOX.USERNAME. In this file, describe your whitebox testing efforts. I would recommend a journal-style narrative with three sections, one for each of the activities above. Describe your experience stepping through your code in gdb, and paste a stack trace taken from within one of your more deeply nested functions. Describe the tests you ran to exercise your program's error-handling capabilities and any changes you've made to your code as a result. Describe your experience running gcov, along with a list of the tests you ran (the "history" command might prove useful) and the final gcov output showing the percentage of your code you were able to execute.

Continue to add new bugs to the BUG_LOG file and note when and how they are fixed as well as how they could have been prevented.