# Supported HPC systems

AI4HPC currently supports 5 HPC systems:

1. [JUWELS](https://apps.fz-juelich.de/jsc/hps/juwels/) at FZJ.
2. [JURECA](https://apps.fz-juelich.de/jsc/hps/jureca/) at FZJ.
3. [DEEP-EST](https://deeptrac.zam.kfa-juelich.de:8443/trac/wiki/Public/User_Guide) at FZJ.
4. [LUMI](https://docs.lumi-supercomputer.eu/firststeps/getstarted/) at CSC.
5. [CTE-AMD](https://www.bsc.es/support/CTE_AMD-ug.pdf) at BSC.

To define a new HPC system, add its configuration to the `class HPC_ini():` in the `setup.py` file. In parallel, add an individual `setup_.sh` script to the [./Scripts/Setup](https://gitlab.jsc.fz-juelich.de/CoE-RAISE/FZJ/ai4hpc/ai4hpc/-/blob/main/Scripts/Setup/) folder. Examples are given for JUWELS, JURECA, DEEP-EST, LUMI, and CTE-AMD.

#### CTE-AMD preinstallation

As CTE-AMD does not allow incoming/outgoing communication, a workaround is to use a local system and mount BSC's dedicated Data Transfer Machine (dt01.bsc.es or dt02.bsc.es). With this method, compiling AI4HPC first requires issuing the following commands on a local machine:

```bash
$ wget https://gitlab.jsc.fz-juelich.de/CoE-RAISE/FZJ/ai4hpc/ai4hpc/Scripts/CTE-AMD/{getDeps_amdlogin.sh,reqs_dep.txt,reqs_ind.txt}
$ sh getDeps_amdlogin.sh
```

and type in your CTE-AMD username when prompted. This script transfers AI4HPC and its dependencies to the user's directory on CTE-AMD, in folders named `ai4hpc` and `wheels_ai4hpc`, respectively. From that point, the user can freely move the `ai4hpc` folder to their working directory and continue with the standard [installation steps](https://ai4hpc.readthedocs.io/en/latest/AI4HPC/install.html). Note that the current CTE-AMD software stack does not include the necessary Python version; to install it, use the [./Scripts/CTE-AMD/installPython_amdlogin.sh](https://gitlab.jsc.fz-juelich.de/CoE-RAISE/FZJ/ai4hpc/ai4hpc/-/blob/main/Scripts/CTE-AMD/installPython_amdlogin.sh) script.
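To illustrate the `HPC_ini` configuration step for adding a new system, the following is a minimal Python sketch. The attribute names (`modules`, `partition`) and the system name `MYNEWHPC` are hypothetical assumptions for illustration only; match the fields and structure used by the existing entries in AI4HPC's actual `setup.py`.

```python
# Hypothetical sketch of extending class HPC_ini() in setup.py with a
# new system. All attribute names and values below are illustrative
# assumptions, not AI4HPC's actual fields.
class HPC_ini():
    def __init__(self, system):
        if system == "JUWELS":
            # Existing system: its modules and batch partition.
            self.modules = ["Stages/2023", "GCC", "OpenMPI"]
            self.partition = "booster"
        elif system == "MYNEWHPC":
            # New system added here: list the modules to load and the
            # partition to submit jobs to on that machine.
            self.modules = ["gcc/11", "openmpi/4"]
            self.partition = "gpu"
        else:
            raise ValueError(f"Unknown HPC system: {system}")

cfg = HPC_ini("MYNEWHPC")
print(cfg.partition)  # → gpu
```

A matching `setup_.sh` script under `./Scripts/Setup` would then perform the environment setup (e.g. loading those modules) on the new machine, following the existing JUWELS, JURECA, DEEP-EST, LUMI, and CTE-AMD examples.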