generated from sig_core/wiki-template

# Contact Us

We hang out in our [SIG/HPC Mattermost channel](https://chat.rockylinux.org/rocky-linux/channels/sig-hpc) and in #rockylinux-sig-hpc on irc.libera.chat (bridged to our Mattermost channel). Our [SIG forums are located here](https://forums.rockylinux.org/c/sig/hpc/61).

# SIG/HPC Meeting

We meet twice a month, every other Thursday at 9:00 PM UTC, on [Google Meet](https://meet.google.com/hsy-qnoe-dxx) - for now.

# SIG/HPC meeting 2023-06-01

## Attendees:

* Chris Simmons
* Nick Eggleston
* Forrest Burt
* Stack
* David DeBonis
* Jeremy Siadal
* Greg Kurtzer
* Sherif

## Discussions:

Chris gave a quick demo / presentation about openHPC.

Jeremy sent the packages.

Greg asked how compatible the SIG's Slurm is with openHPC's.

Sherif needs to look at the openHPC Slurm packages.

Chris: we need to look at how to build EasyBuild and into how to improve it.

Chris and Greg discussed whether there is any standard that describes how to build systems compatible with each other; openHPC follows best practices from different entities.

Chris shared https://github.com/holgerBerger/hpc-workspace, which is now part of openHPC.

Sherif mentioned that the forums category is now in place: https://forums.rockylinux.org/c/sig/hpc/61

## Action items:

* Sherif to look into the openHPC Slurm spec file
* Gather a list of centres and HPC sites that are moving to Rocky, for a blog post and PR

## Old business:

## 2023-06-01:

* Get a list of packages from Jeremy to pick up from openHPC - Done
* Greg / Sherif to talk in Rocky / RESF about a generic SIG for common packages such as chaintools
* Plan the openHPC demo, Chris / Sherif - Done
* Finalise the Slurm package naming / configuration - Done

## 2023-05-18:

* Get a demo / technical talk after 4 weeks "Sherif can arrange that with Chris" - Done
* Get a list of packages that openHPC would like to move into the distros "Jeremy will be the point of contact if we need those in a couple of weeks" - Done

## 2023-05-04

* Start building Slurm - ongoing; slowed down a bit by the R9.2 and R8.8 releases, but packages are built and some minor configuration still needs to be fixed
* Start building Apptainer - on hold
* Start building Singularity - on hold
* Start building Warewulf - on hold
* Sherif: check about forums - done; we can have our own section if we want, to be discussed over chat

## 2023-04-20

* Reach out to other communities "Greg" - ongoing
* Reach out to different sites that use Rocky for HPC "Stack will ping a few of them, and others as well - group effort"
* Reach out to hardware vendors - nothing done yet
* Statistics / public registry where sites / HPC clusters can add themselves if they want - nothing done yet

# SIG/HPC Packages

These are some of the packages that we are considering maintaining and supporting within this SIG:

* Lustre server and client
* Slurm
* Apptainer
* EasyBuild
* Spack
* openmpi built with Slurm support
* Lmod
* conda
* sstack
* fail2ban - in EPEL, not sure if it fits in this SIG
* glusterfs-server - better suited under SIG/Storage
* glusterfs-selinux - better suited under SIG/Storage
* Cython
* genders
* pdsh
* gcc (latest releases, parallel install)
* autotools
* cmake
* hwloc (this really needs to support parallel versions)
* libtool
* valgrind (maybe)
* charliecloud
* Warewulf (if all config options are runtime instead of pre-compiled)
* magpie
* openpbs
* pmix
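
Several items above (gcc, hwloc) call for parallel, side-by-side version installs. One common way to expose such installs, assuming Lmod from the list above is used for this, is one modulefile per version. A minimal hypothetical sketch - the version and install prefix are illustrative, not the SIG's actual layout:

```lua
-- Hypothetical Lmod modulefile: gcc/13.2.0.lua
-- Assumes versioned installs under /opt/sig-hpc (illustrative path).
local version = "13.2.0"
local base    = "/opt/sig-hpc/gcc/" .. version

help("GCC " .. version .. " (SIG/HPC parallel install)")
family("compiler")  -- Lmod allows only one module per family to be loaded

prepend_path("PATH",            base .. "/bin")
prepend_path("LD_LIBRARY_PATH", base .. "/lib64")
prepend_path("MANPATH",         base .. "/share/man")
setenv("CC",  base .. "/bin/gcc")
setenv("CXX", base .. "/bin/g++")
```

With one such file per version, `module avail gcc` lists the parallel installs and `module load gcc/13.2.0` selects one without conflicting with the system compiler.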