The Crunchy HPC Cluster

About the cluster

The Crunchy HPC Cluster is a 6-node (16-CPU) UltraSPARC III cluster with 48 GB of total memory, running Solaris 10. Sun HPC ClusterTools 6 can be used to execute both serial and parallel jobs on the cluster. Sun MPI is Sun's implementation of the Message Passing Interface (MPI) library. To use the cluster, log in to any one of its nodes: crunchy{2,6,12}.cims.nyu.edu.

Useful ClusterTools commands

% mprun [ options ] [ - ] prog-name [ prog-args ]

For example, to execute a program with 8 processes on the cluster:

% mprun -np 8 myprog
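
Program arguments are placed after the program name. For instance, to run the same program as 4 processes with a (hypothetical) input-file argument:

% mprun -np 4 myprog input.dat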

See the mprun(1) man page for more options.

Compiling MPI programs

Programs that use Sun MPI routines must have the following include directives at the top:

For C and C++, use

    #include <mpi.h>

For Fortran, use

    INCLUDE 'mpif.h'

All MPI include files and libraries are under /opt/SUNWhpc. To compile Fortran 77, Fortran 90, C, and C++ MPI programs, use mpf77, mpf90, mpcc, and mpCC respectively. These utilities invoke the corresponding Sun compilers with the compiler options needed to locate the MPI include files and libraries.
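
For reference, the following minimal MPI program sketch (standard MPI calls only) can be saved as myprog.c and built with the commands shown below:

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size;

        /* Initialize the MPI runtime */
        MPI_Init(&argc, &argv);

        /* Determine this process's rank and the total process count */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        printf("Hello from process %d of %d\n", rank, size);

        /* Shut down the MPI runtime */
        MPI_Finalize();
        return 0;
    }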

For example, the following command can be used to compile a C MPI program:

% mpcc -fast -xarch=v9 -o myprog myprog.c -lmpi

The -fast option selects an aggressive set of optimizations, and -xarch=v9 generates 64-bit (SPARC V9) code. For multithreaded programs, replace -lmpi with -lmpi_mt.
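
For example, the compile command above for a program linked against the multithreaded MPI library becomes:

% mpcc -fast -xarch=v9 -o myprog myprog.c -lmpi_mt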

For Fortran code, the -dalign option is necessary to avoid bus errors. For example,

% mpf95 -o myprog -dalign myprog.f -lmpi

Sample codes can be found in /opt/SUNWhpc/examples/mpi.

Further information on how to use Sun HPC ClusterTools, including Sun MPI and the MPProf MPI profiling utility, can be found in the following documentation: