Projects

Work on the projects proceeds in several phases; you will receive feedback on your work at each phase.

Each project includes a) benchmarking of some existing code and b) a technical problem and/or tool evaluation, with the potential to extend existing work.

Benchmarks:

Project Topics:

  1. Anwar Ali, Nhon Van Nguyen, Annika Edwards
    mpiP adaptation for i32/Linux, evaluation, and enhancements.
    Benchmark: IRS
    Project Web: http://www4.ncsu.edu:8030/~aredward/csc591c/

  2. Anubhav Dhoot, Kunal Shah
    DPCL installation for i32/Linux, evaluation and demonstration on ASCI benchmarks.
    Benchmark: smg2000
    Project Web: http://www4.ncsu.edu:8030/~kdshah/report.htm

  3. Vikram S. Poojary, Raj Kumar Nagarajan
    Paradyn installation for i32/Linux, evaluation and demonstration on ASCI benchmarks.
    Benchmark: AZTEC
    Project Web: http://www4.ncsu.edu:8030/~rknagara/work/cluster_project.htm

  4. Harini Ramaprasad, Salil Pant, Sibin Mohan
    Tau installation for i32/Linux, evaluation and demonstration on ASCI benchmarks.
    Benchmark: ParBenCCh
    Project Web: http://www4.ncsu.edu:8030/~smohan/ClusterWebPage.htm

  5. Frank Castaneda, Nikola Vouk
    PAPI evaluation for threading and hardware-counter multiplexing: limits in MPI and OpenMP frameworks as well as mixed codes; design (and, time permitting, implementation) of methods that extend the functionality to circumvent these problems.
    Benchmark: Info Needed
    Project Web: http://www4.ncsu.edu/~nvouk/exploitinghyper.html

  6. Jaydeep Marathe
    Address Trace Generation for OpenMP Programs using Dynamic Instrumentation
    Project Web: http://www4.ncsu.edu/~jpmarath/index.html

  7. FT-MPI installation for i32/Linux, evaluation and demonstration on ASCI benchmarks.

More details:

  1. ASCI benchmarks: The benchmarking work is independent of the tool projects. The objective is to gain experience with larger MPI/OpenMP applications.
  2. You have to install the tool and demonstrate its use on your chosen ASCI benchmark (in addition to smaller tests). This requires that you
    1. write some test programs yourself to understand how the tools work
    2. integrate similar code into the ASCI benchmark to trigger the functionality of the tool
    3. evaluate and interpret the results
    4. find ways to improve the performance of the application and try them out (again evaluating results with the tool)
    5. suggest ways of improving the tool (wish list concentrating not on GUIs but on new functionality or runtime support)
    6. if there is time, prototype an improvement (this should definitely be done for mpiP)

Other Unassigned Ideas:

Other Pointers:

You may find that the benchmark codes require some of the libraries below. Note that the first two are already installed (under /opt), while the last would have to be compiled by you. Other libraries can be added upon request if you provide a compiled version ready to install.