[Home](./shapepipe.md) | [Environments](./environment.md)

# CCIN2P3 Set Up

## Contents

1. [Introduction](#introduction)
1. [Installation](#installation)
1. [Execution](#execution)
1. [Troubleshooting](#troubleshooting)

## Introduction

[CC-IN2P3](https://doc.cc.in2p3.fr/en-index.html) is the computing centre (CC) of the French National Institute of Nuclear and Particle Physics (IN2P3), based in Lyon.

### CCIN2P3 Account

To get an account on CCIN2P3 you need to follow [their instructions](https://doc.cc.in2p3.fr/en/Getting-started/access.html). In summary:

- Fill in the [application form](http://cctools.in2p3.fr/cclogon/) and print the resulting PDF.
- Get this document signed by your local "czar". For CEA this should be [Georgette Zoulikha](mailto:zou.georgette@cea.fr).
- [Submit a ticket](https://cc-usersupport.in2p3.fr/otrs/customer.pl) with a scan of the signed document.
- If required, indicate `Euclid` as the project.

### SSH

Once you have an account on CCIN2P3 you can connect via SSH as follows:

```bash
$ ssh <username>@cca.in2p3.fr
```

## Installation

CCIN2P3 uses an internal tool called `ccenv` to manage various software packages. You can view the packages currently available on the system by running:

```bash
$ ccenv --list
```
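
The listing can be long, so it is often easier to filter it for the package you are after, for example (a simple sketch using standard shell tools):

```bash
# Check whether a given package, e.g. anaconda, appears in the ccenv listing
ccenv --list | grep -i anaconda
```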
ShapePipe requires `conda`, which on CCIN2P3 is provided via `anaconda`. To load this package simply run:

```bash
$ ccenv anaconda
```

> You can add this command to your `.bash_profile` to ensure that this module is available when you log in.
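
For example, the module load can be appended to your login profile in one line (a minimal sketch; adapt it if you already manage your `.bash_profile` by hand):

```bash
# Load the anaconda module automatically at login
echo "ccenv anaconda" >> ~/.bash_profile
```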
### With MPI

To install ShapePipe with MPI enabled on CCIN2P3 you also need to load the `openmpi` module. To do so, run:

```bash
$ ccenv openmpi
```

You can also specify a particular version of OpenMPI to use:

```bash
$ ccenv openmpi <VERSION>
```

Then you need to identify the root directory of the OpenMPI installation. An easy way to get this information is by running:

```bash
$ which mpiexec
```

which should reveal something like `/pbs/software/centos-7-x86_64/openmpi/<VERSION>`. Provide this path to the `mpi-root` option of the installation script as follows:

```bash
$ ./install_shapepipe --mpi-root=/pbs/software/centos-7-x86_64/openmpi/<VERSION>
```
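
If you prefer not to copy the path by hand, the root directory can also be derived from the location of `mpiexec` itself (a sketch, assuming the usual `<root>/bin/mpiexec` layout):

```bash
# Derive the OpenMPI root from the mpiexec location and pass it to the installer
MPI_ROOT=$(dirname "$(dirname "$(which mpiexec)")")
./install_shapepipe --mpi-root="$MPI_ROOT"
```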
> Be sure to check the output of the **Installing MPI** section, as the final check only tests if the `mpiexec` command is available on the system.

You can rebuild the MPI component at any time by doing the following:

```bash
$ pip uninstall mpi4py
$ ./install_shapepipe --no-env --no-exe --mpi-root=/pbs/software/centos-7-x86_64/openmpi/<VERSION>
```
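
After rebuilding, you can confirm that `mpi4py` is linked against the loaded OpenMPI installation (a quick check, assuming the ShapePipe conda environment is active):

```bash
# Print the MPI library version that mpi4py was built against
python -c "from mpi4py import MPI; print(MPI.Get_library_version())"
```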
### Without MPI

To install ShapePipe without MPI enabled, simply pass the `no-mpi` option to the installation script as follows:

```bash
$ ./install_shapepipe --no-mpi
```
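
Whichever variant you install, the resulting conda environment can be activated in the same way as in the job scripts below (a sketch, assuming the default environment location `$HOME/.conda/envs/shapepipe`):

```bash
# Load anaconda and activate the ShapePipe environment
ccenv anaconda
source activate $HOME/.conda/envs/shapepipe
```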
## Execution

CCIN2P3 uses [UGE](https://en.wikipedia.org/wiki/Univa_Grid_Engine) to handle distributed jobs.

UGE uses standard [Portable Batch System (PBS) commands](https://www.cqu.edu.au/eresearch/high-performance-computing/hpc-user-guides-and-faqs/pbs-commands), such as the following (a short usage sketch is given after the list):

- `qsub` - To submit jobs to the queue.
- `qstat` - To check the status of jobs in the queue.
- `qdel` - To kill jobs in the queue.
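
A typical submit-monitor-cancel cycle looks like the following (the job ID is whatever `qsub` and `qstat` report for your job):

```bash
qsub cc_smp.sh      # submit the job script to the queue
qstat -u $USER      # list your jobs and their current states
qdel <job_id>       # cancel a job by its ID if needed
```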
Jobs should be submitted as bash scripts, *e.g.*:

```bash
$ qsub cc_smp.sh
```

In this script you can specify:

- The number of cores to use for SMP (*e.g.* `#$ -pe multicores 4`)
- The number of cores to use for MPI (*e.g.* `#$ -pe openmpi 8`)
- The maximum computing time for your script (*e.g.* `#$ -l h_cpu=10:00:00`)

### Example SMP Script

[`cc_smp.sh`](../../example/pbs/cc_smp.sh)

```bash
#!/bin/bash

##########################
# SMP Script for ccin2p3 #
##########################

# Receive email when job finishes or aborts
#$ -M <name>@cea.fr
#$ -m bea
# Set a name for the job
#$ -N shapepipe_smp
# Set a group for the job
#$ -P P_euclid_sci
# Join output and errors in one file
#$ -j y
# Set maximum computing time (e.g. 5min)
#$ -l h_cpu=00:05:00
# Request multiprocessing resources
#$ -l os=cl7
# Request number of cores
#$ -pe multicores 4

# Full path to environment
export SPENV="$HOME/.conda/envs/shapepipe"
export SPDIR="$HOME/shapepipe"

# Activate conda environment
ccenv anaconda
source activate $SPENV

# Run ShapePipe using full paths to executables
$SPENV/bin/shapepipe_run -c $SPDIR/example/pbs/config_smp.ini

# Return exit code
exit 0
```

> Make sure the number of cores requested matches the `SMP_BATCH_SIZE` in the config file.
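
A quick way to check this is to compare the `-pe` request in the submission script with the value in the configuration file (a sketch, assuming ShapePipe is installed in `$HOME/shapepipe`):

```bash
# Compare the requested number of cores with the batch size in the config file
grep '^#\$ -pe' cc_smp.sh
grep SMP_BATCH_SIZE $HOME/shapepipe/example/pbs/config_smp.ini
```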
### Example MPI Script

[`cc_mpi.sh`](../../example/pbs/cc_mpi.sh)

```bash
#!/bin/bash

##########################
# MPI Script for ccin2p3 #
##########################

# Receive email when job finishes or aborts
#$ -M <name>@cea.fr
#$ -m bea
# Set a name for the job
#$ -N shapepipe_mpi
# Set a group for the job
#$ -P P_euclid_sci
# Join output and errors in one file
#$ -j y
# Set maximum computing time (e.g. 5min)
#$ -l h_cpu=00:05:00
# Request multiprocessing resources
#$ -l os=cl7
# Request number of cores
#$ -pe openmpi 4

# Full path to environment
export SPENV="$HOME/.conda/envs/shapepipe"
export SPDIR="$HOME/shapepipe"

# Activate conda environment
ccenv anaconda
ccenv openmpi
source activate $SPENV

# Run ShapePipe
$SPENV/bin/mpiexec -n $NSLOTS $SPENV/bin/shapepipe_run -c $SPDIR/example/pbs/config_mpi.ini

# Return exit code
exit 0
```
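
Since both scripts join standard output and error (`-j y`), the full log of a run can be inspected in the submission directory once the job finishes (a sketch, assuming UGE's default log name `<job_name>.o<job_id>`):

```bash
qsub cc_mpi.sh                 # submit the MPI job
qstat -u $USER                 # check progress until the job leaves the queue
cat shapepipe_mpi.o<job_id>    # combined output/error log written by UGE
```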

## Troubleshooting

The Euclid-SDC France Slack workspace ([euclid-sdc-france.slack.com](https://euclid-sdc-france.slack.com)) is very helpful in case of problems with job submission, environment set-up, library and program versions, etc. You can also send a direct message to the Euclid system administrators, Quentin Le Boulc'h and Gabriele Mainetti (in English or French); they are quick to help with any issue.