# SIG/HPC meeting 2023-05-04

## Attendees:

* Neil Hanlon
* Matt Bidwell
* Stack
* Sherif
* Nick Eggleston
* Gregory Kurtzer
* Forrest Burt

## Package lists

* Slurm - EPEL
* Apptainer - EPEL
* Lustre - lustre.org; no server packages for el9
* Warewulf - HPCNG GitHub, el8 only
* EasyBuild
* OpenHPC
* Spack
* openmpi *with Slurm support*
* glusterfs-server, gluster-selinux
* NIS - ypserv, ypbind, yptools, nss_nis
* fail2ban
* Lmod
* conda
* sstack

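As a point of reference for which of these already ship in the enabled repos, here is a minimal sketch using the dnf Python API; it assumes a Rocky host with python3-dnf installed and EPEL enabled, and the queried names are just examples taken from the list above.

```python
import dnf

# Minimal sketch: query the enabled repos for a few of the
# packages listed above, to see what already ships in EPEL or
# base before deciding to rebuild anything ourselves.
base = dnf.Base()
base.read_all_repos()
base.fill_sack()

for name in ("slurm", "apptainer", "Lmod", "fail2ban"):
    for pkg in base.sack.query().available().filter(name=name):
        print(f"{pkg.name}-{pkg.evr} (repo: {pkg.reponame})")
```
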
## Discussions:

Greg: suggested having our own Slurm, Apptainer, Singularity, and Warewulf packages.

Greg: we can reach out to DDN about anything related to Lustre.

Sherif: suggested starting to build packages.

Nick: to build the community, we need to start looking into documentation and forums.

Stack: we need to be careful and have strong justification for rebuilding anything that already exists in EPEL.

Greg: asked how HPC centres prefer to manage, or are already managing, their Slurm setups.

A few members mentioned one of the following two methods:

* keep upgrading along minor versions of Slurm
* keep upgrading along minor versions of Slurm, then do a major upgrade in a scheduled maintenance window

Greg and Nick: suggested adding the major.minor version to the package name, along the lines of python2/python3 (see the sketch below).

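As an illustration of that naming idea (the helper and the version strings below are hypothetical, not an agreed convention):

```python
def versioned_name(base: str, version: str) -> str:
    """Fold the major.minor version into the package name,
    in the spirit of python2/python3 side-by-side packages."""
    major, minor = version.split(".")[:2]
    return f"{base}{major}.{minor}"

# Hypothetical: two Slurm streams could then be installed as
# distinct packages instead of conflicting upgrades.
print(versioned_name("slurm", "22.05.9"))  # slurm22.05
print(versioned_name("slurm", "23.02.1"))  # slurm23.02
```
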
Sherif: asked the testing team about their testing methodology.

Stack: they hope to be able to test all SIGs at some point and are working on getting an OpenQA build for this.

## Action items:

* Start building Slurm
* Start building Apptainer
* Start building Singularity
* Start building Warewulf
* Greg: reach out to OpenHPC - done
* Sherif: check about forums

## Old business:

* List of applications “thread on MM to post pkgs” - we now have an idea of which packages we need to build -
* Building blocks, treating each pkg (lustre, OpenHPC, slurm, etc.) as a building block - we have an idea of what we need to do -
* Reach out to other communities “Greg” - ongoing -
* Reaching out to different sites that use Rocky for HPC “Stack will ping a few of them, and others as well - group effort” -
* Reaching out to hardware vendors - nothing done yet -
* Statistics / public registry where HPC sites can add themselves if they want - nothing done yet -
* Meetings will be bi-weekly “tentatively Thursdays 9:00 PM UTC” - agreed -
* Documentation - wiki is in place but still needs some work -