Frontier Hackathon 2023
The Hackathon is described here.
- Thursday June 22, 11:00 EST: Brief meeting to define expectations and tasks, to help everyone get started compiling and running, and to understand the machine.
Zoom link here.
- Friday June 23, 13:00 to 14:00 EST: Preparation session by OLCF. Not everybody needs to attend, but those not attending will need to catch up later.
Zoom link here.
- Wednesday June 28 - Friday June 30, 11:00 EST to 17:00 EST: Workshop.
Zoom link here.
- Steve Brandt
- Lorenzo Ennoggi
- Roland Haas
- Liwei Ji
- Jay Kalinani
- Lucas Timotheo Sanches
- Erik Schnetter
- John Holmen: OLCF POC
- Weiqun Zhang: AMReX support
Please join the Frontier Hackathon - June 2023 Slack workspace using the following link: https://join.slack.com/t/frontier-hack-2023-06/shared_invite/zt-1wwlh59sg-1XgbThZmmkTnxByejw0aoQ
After joining the workspace, please search for and join the #team-asterx channel.
Erik's Simfactory settings for Crusher are here. (Erik is busy this week and doesn't have time to clean this up – these files should be moved into a proper repository, possibly Simfactory itself, after some testing by others.)
- Create an `ET` folder in the home directory:
```
cd ~/
mkdir ET
cd ET
```
- Download the code via the following commands:
```
curl -kLO https://raw.githubusercontent.com/gridaphobe/CRL/master/GetComponents
chmod a+x GetComponents
./GetComponents --root Cactus --parallel --no-shallow https://raw.githubusercontent.com/jaykalinani/AsterX/frontier/Docs/thornlist/asterx_frontier.th
```
- Note that you need to use the `eschnett/crusher` branch of the flesh (the above thornlist already accounts for this), but you still need to run `git merge master` locally to make it work with the current version of CarpetX; see the sketch below.
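A minimal sketch of that step, assuming the flesh checkout is the `Cactus` root directory created by GetComponents:
```
cd Cactus
git branch          # should show eschnett/crusher, as selected by the thornlist
git merge master    # merge master locally to work with the current CarpetX
```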
- For `ExternalLibraries-ADIOS2`, we have to use commit `c1d6397` for the moment.
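A sketch of pinning that commit, assuming GetComponents checked the thorn out under `Cactus/repos/ExternalLibraries-ADIOS2`:
```
cd Cactus/repos/ExternalLibraries-ADIOS2
git checkout c1d6397
```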
- Simfactory files for Frontier are available in the folder `Cactus/repos/AsterX/Docs/compile-notes/frontier`. (There is a PR to Simfactory.) The copy steps are collected as shell commands after this list.
  - Copy `frontier.ini` to `Cactus/simfactory/mdb/machines/`.
  - Copy `frontier.cfg` to `Cactus/simfactory/mdb/optionlists/`.
  - Copy `frontier.run` to `Cactus/simfactory/mdb/runscripts/`.
  - Copy `frontier.sub` to `Cactus/simfactory/mdb/submitscripts/`.
  - Copy `defs.local.ini` to `Cactus/simfactory/etc/`, and edit the user account details and the source and base directory paths accordingly.
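The copy steps above as shell commands (paths exactly as listed; run from the directory containing `Cactus`):
```
SRC=Cactus/repos/AsterX/Docs/compile-notes/frontier
cp $SRC/frontier.ini   Cactus/simfactory/mdb/machines/
cp $SRC/frontier.cfg   Cactus/simfactory/mdb/optionlists/
cp $SRC/frontier.run   Cactus/simfactory/mdb/runscripts/
cp $SRC/frontier.sub   Cactus/simfactory/mdb/submitscripts/
cp $SRC/defs.local.ini Cactus/simfactory/etc/
```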
- Return to the Cactus directory and compile using the following command (`<config_name>` is a configuration name of your choice):
```
./simfactory/bin/sim build -j16 <config_name> --thornlist=./thornlists/asterx_frontier.th --machine=frontier
```
- Example command to create and submit a job for a shocktube test via Simfactory:
```
./simfactory/bin/sim submit B1 --parfile=./arrangements/AsterX/AsterX/test/Balsara1_shocktube.par --config=<config_name> --machine=frontier --allocation=ast182 --procs=64 --num-threads=1 --ppn-used=8 --queue=batch --walltime 00:01:00
```
- For a magnetized TOV test evolving spacetime, an example submit command via Simfactory:
```
./simfactory/bin/sim submit magTOV_unigrid --parfile=./arrangements/AsterX/AsterX/par/magTOV_unigrid_frontier.par --config=<config_name> --machine=frontier --allocation=AST182 --procs=64 --num-threads=1 --ppn-used=8 --queue=batch --walltime 00:03:00
```
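After submission, the job can be monitored with Simfactory's standard commands (a sketch; these are not part of the original notes):
```
./simfactory/bin/sim list-simulations --machine=frontier
./simfactory/bin/sim show-output magTOV_unigrid
```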
- Frontier User Guide: https://docs.olcf.ornl.gov/systems/frontier_user_guide.html
- AsterX Team introduction slides here
Jay
- Compile the newest version of AMReX (23.06)
- Use `-munsafe-fp-atomics` (see the sketch after this list)
- Test ROCm 5.2: did not see any performance difference.
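One way to enable `-munsafe-fp-atomics` is via the compiler flags in the optionlist; a sketch, assuming the flag is appended to the C++ flags in `frontier.cfg` (existing flags shown as a placeholder):
```
# frontier.cfg (sketch): allow hardware floating-point atomics on AMD GPUs
CXXFLAGS = <existing C++ flags> -munsafe-fp-atomics
```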
- Cheatsheet here
- Getting started on Frontier here
- To output measurements, in the runscript, prefix the call to the executable with:
```
hpcrun \
  -o hpctoolkit-measurements \
  -e CPUTIME \
  -e gpu=amd \
  -t \
```
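In a Simfactory runscript this might look as follows (a sketch: the `srun` options are elided, and `@EXECUTABLE@`/`@PARFILE@` are Simfactory's usual runscript placeholders):
```
srun <srun options> hpcrun \
  -o hpctoolkit-measurements \
  -e CPUTIME \
  -e gpu=amd \
  -t \
  @EXECUTABLE@ @PARFILE@
```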
- Add to the optionlist:
```
MPI_LIB_DIRS = /opt/cray/pe/mpich/8.1.23/ofi/crayclang/10.0/lib /opt/cray/xpmem/2.5.2-2.4_3.45__gd0f7936.shasta/lib64 /opt/cray/pe/mpich/8.1.23/gtl/lib
MPI_LIBS = mpi xpmem mpi_gtl_hsa
```
- Path to an AMReX library built with GPU-aware MPI enabled:
```
/lustre/orion/csc308/world-shared/amrex-rocm5.3-gpumpi
```
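To build against this prebuilt AMReX, the optionlist can point at it; a sketch, assuming the conventional `ExternalLibraries` option name `AMREX_DIR`:
```
AMREX_DIR = /lustre/orion/csc308/world-shared/amrex-rocm5.3-gpumpi
```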