Compare commits


4 Commits

| SHA1 | Message | Date |
| --- | --- | --- |
| 211aec1f0d | Merge pull request 'init-pages' (#7) from init-pages into main (Reviewed-on: #7) | 2023-06-19 10:03:04 +00:00 |
| bcec901f3f | Adding packages file | 2023-06-19 11:00:01 +01:00 |
| c6dbf49aac | Updating contact us with forums link | 2023-06-19 10:34:54 +01:00 |
| 077427b34d | Adding meeting's notes 2023-06-15 and fixing date on older ones | 2023-06-19 10:32:18 +01:00 |
6 changed files with 105 additions and 22 deletions


@@ -1,2 +1,2 @@
 # Contact US
-We hang out in our [SIG/HPC Mattermost channel](https://chat.rockylinux.org/rocky-linux/channels/sig-hpc) and #rockylinux-sig-hpc on irc.libera.chat "bridged to our MatterMost channel"
+We hang out in our [SIG/HPC Mattermost channel](https://chat.rockylinux.org/rocky-linux/channels/sig-hpc) and in #rockylinux-sig-hpc on irc.libera.chat (bridged to our Mattermost channel); our [SIG forums are located here](https://forums.rockylinux.org/c/sig/hpc/61)


@@ -1,3 +1,3 @@
-## SIG/HPC Meeting
+# SIG/HPC Meeting
 We meet bi-weekly on Thursdays at 9:00 PM UTC, here on [Google meet](https://meet.google.com/hsy-qnoe-dxx) - for now -


@@ -11,15 +11,15 @@
* Sherif
## Discussions:
Chris: Are we willing to support the whole openHPC stack or just the modules, and how do we imagine achieving this?
Jeremy: Clearing some distro-related packages out of openHPC, such as automake / autoconf, would be great
Stack: We need a baseline so people can start using Rocky for HPC, and we should make Rocky accessible
Chris: A demo / technical talk in 4 weeks
Chris: Are we going to focus on 8 and 9?
Stack and Chris: it would be great if we could focus on 9
Sherif: I hope we can do both, but with 9 in the spotlight "this needs to be a SIG decision"
Stack: Question: if we start moving openHPC into the HPC SIG, are they going to support more distros? We don't want to break packages for other EL distros
Chris: so far, testing is on Rocky as the only supported EL distro
## Action items:
* Get a demo / technical talk after 4 weeks "Sherif can arrange that with Chris"


@@ -8,17 +8,17 @@
* Chris Simmons
## Discussions:
Getting toolchains, such as automake, outside of openHPC
Greg: We need to discuss whether we need a generic SIG for toolchains
Greg: We need to look into adding more release packages, such as the Intel compiler
Brainstorming ideas about optimizing binaries
David: What would be the interest in having a lightweight kernel for HPC?
Jeremy: mentioned the Intel lightweight kernel, https://github.com/intel/mos
Chris: asked if there are any benchmarks or hard numbers comparing the shipped kernel and a lightweight kernel; so far, nothing solid
Sherif: Slurm is now built, but not in the standard path; we agreed we are going to move it to the standard path
Greg: make sure you have the right Provides set
Chris: also make sure that downgrades work (a quick verification sketch follows these notes)
Greg and Chris: we can also contribute to the openHPC documentation
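For reference, the Provides and downgrade points above are easy to check on a built package before it ships. A minimal sketch using standard rpm / dnf queries; the package file name and version pattern are placeholders, not the SIG's actual artifacts:

```sh
# List what the rebuilt Slurm package Provides (the names that consumers
# such as an openmpi-with-slurm build would Require):
rpm -qp --provides slurm-*.el9.x86_64.rpm

# Confirm files land in standard paths rather than a custom prefix:
rpm -qpl slurm-*.el9.x86_64.rpm | head

# Verify that stepping back to the previous release resolves cleanly:
sudo dnf downgrade slurm
```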
## Action items:
* Get a list of packages from Jeremy to pick up from openHPC


@@ -0,0 +1,52 @@
# SIG/HPC meeting 2023-06-15
## Attendees:
* Chris Simmons
* Nick Eggleston
* Forrest Burt
* Stack
* David DeBonis
* Jeremy Siadal
* Greg Kurtzer
* Sherif
## Discussions:
Chris gave a quick demo / presentation about openHPC
Jeremy sent the packages
Greg: asked how the SIG's Slurm is compatible with openHPC
Sherif needs to look at the openHPC Slurm packages
Chris: we need to look at how to build EasyBuild and into how to improve it
Chris and Greg discussed whether there is any standard that explains how to build systems compatible with each other; openHPC does follow best practices from different entities
Chris provided https://github.com/holgerBerger/hpc-workspace which is now part of openHPC
Sherif mentioned that the forums category is now in place: https://forums.rockylinux.org/c/sig/hpc/61
## Action items:
* Sherif to look into the openHPC Slurm spec file
* We need to get lists of centres and HPC sites that are moving to Rocky, to make a blog post and PR
## Old business:
## 2023-06-01:
* Get a list of packages from Jeremy to pick up from openHPC - Done
* Greg / Sherif to talk in Rocky / RESF about a generic SIG for common packages such as toolchains
* Plan the openHPC demo Chris / Sherif - Done
* Finalise the Slurm package naming / configuration - Done
## 2023-05-18:
* Get a demo / technical talk after 4 weeks "Sherif can arrange that with Chris" - Done
* Getting a list of packages that openHPC would like to move to distros "Jeremy will be the point of contact if we need those in a couple of weeks" - Done
## 2023-05-04
* Start building Slurm - ongoing; slowed down a bit by the R9.2 and R8.8 releases, however packages are built and some minor configuration needs to be fixed -
* Start building apptainer - on hold -
* Start building singularity - on hold -
* Start building warewulf - on hold -
* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -
## 2023-04-20
* Reach out to other communities “Greg” - ongoing -
* Reaching out to different sites that use Rocky for HPC “Stack will ping a few of them, and others as well -group effort-”
* Reaching out to hardware vendors - nothing done yet -
* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -

docs/packages.md (new file, 31 lines)

@@ -0,0 +1,31 @@
# SIG/HPC Packages
These are some of the packages that we are thinking of maintaining and supporting within this SIG:
* Lustre server and client
* Slurm
* Apptainer
* Easybuild
* Spack
* openmpi built with Slurm support
* Lmod
* conda
* sstack
* fail2ban - in EPEL, not sure if it fits in this SIG -
* glusterfs-server - Better suited under SIG/Storage -
* glusterfs-selinux - Better suited under SIG/Storage -
* Cython
* genders
* pdsh
* gcc (latest releases, parallel install; see the sketch after this list)
* autotools
* cmake
* hwloc (this really needs to support parallel versions)
* libtool
* valgrind (maybe)
* charliecloud
* Warewulf (if all config options are runtime instead of pre-compiled)
* magpie
* openpbs
* pmix
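
As a rough illustration of the parallel-install items above (gcc, hwloc), one common approach is a versioned prefix per release selected through Lmod; the paths and version numbers below are hypothetical:

```sh
# Hypothetical layout: each release gets its own prefix, so several
# versions can coexist instead of overwriting one system install:
#   /opt/rocky/hpc/gcc/12.3.0
#   /opt/rocky/hpc/gcc/13.1.0
#   /opt/rocky/hpc/hwloc/2.9.1

# Users then select a version through Lmod instead of hard-coded paths:
module avail gcc
module load gcc/13.1.0 hwloc/2.9.1
gcc --version   # resolves to the toolchain just loaded
```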