* Sherif: Reach out to jose-d about pmix - Done, no feedback yet -
+* Greg: To reach out to openPBS and Charliecloud
+* Sherif: To update slurm23 to latest - Done -
+
* Sherif to look into openHPC slurm spec file - Pending on Sherif
+* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR
+
* Get a list of packages from Jeremy to pick up from openHPC - Done
+* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools
+* Plan the openHPC demo Chris / Sherif - Done
+* Finalise the slurm package with naming / configuration - Done -
+
* Get a demo / technical talk after 4 weeks "Sherif can arrange that with Chris" - Done
+* Getting a list of packages that openHPC would like to move to distros "Jeremy will be point of contact if we need those in a couple of weeks" - Done
+
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -
+* Start building apptainer - on hold -
+* Start building singularity - on hold -
+* Start building warewulf - on hold -
+* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -
+
* Reach out to other communities “Greg” - ongoing -
+* Reaching out to different sites that use Rocky for HPC “Stack will ping a few of them and others as well -Group effort-”
+* Reaching out to hardware vendors - nothing done yet -
+* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -
+
+
+
+
+
+
+ Last update:
+ August 14, 2023
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/index.html b/index.html
index 3f3d4ae..07d40c3 100644
--- a/index.html
+++ b/index.html
@@ -445,6 +445,8 @@
+
+
@@ -572,6 +574,20 @@
+
+
+
+
+
+
Developing and maintaining various HPC related packages; this may include porting, optimizing and contributing to upstream sources to support the HPC initiative
We hang out in our SIG/HPC Mattermost channel and #rockylinux-sig-hpc on irc.libera.chat \"bridged to our MatterMost channel\" also our SIG forums are located here
Those are some of the packages that we are thinking of maintaining and supporting within this SIG
* Lustre server and client\n* Slurm\n* Apptainer\n* Easybuild\n* Spack\n* openmpi built with slurm support\n* Lmod\n* conda\n* sstack\n* fail2ban - in EPEL, not sure if it fits in this SIG -\n* glusterfs-server - Better suited under SIG/Storage -\n* glusterfs-selinux - Better suited under SIG/Storage -\n* Cython\n* genders\n* pdsh\n* gcc (latest releases, parallel install)\n* autotools\n* cmake\n* hwloc (this really needs to support parallel versions)\n* libtool\n* valgrind (maybe)\n* charliecloud\n* Warewulf (if all config options are runtime instead of pre-compiled)\n* magpie\n* openpbs\n* pmix\n* NIS : ypserv, ypbind, yptools and a corresponding nss_nis (took the source rpms from Fedora and recompiled them for R9)\n
* Alan Marshall\n* Nje\n* Neil Hanlon\n* Matt Bidwell\n* David (NezSez)\n* Jonathan Anderson\n* Stack\n* Balaji\n* Sherif\n* Gregory Kurtzer\n* David DeBonis\n
"},{"location":"events/meeting-notes/2023-04-20/#quick-round-of-introduction","title":"Quick round of introduction","text":"
Everyone introduced themselves
"},{"location":"events/meeting-notes/2023-04-20/#definition-of-stakeholders","title":"Definition of stakeholders","text":"
\"still needs lots of clarification and classification since those are very wide terms\"
* HPC End-user ?maybe?\n* HPC Systems admins and engineers, to provide them with tools and know-how to build HPC clusters using Rocky Linux\n* HPC Vendors, however the SIG has to be vendor neutral and agnostic\n
Stack: we need to make sure that we are not redoing efforts that are already done by other groups. Greg engaged with the OpenHPC community, which provides some core packages such as apptainer, mpi and openHPC
Sherif: we need to have one hat that fits most, but we can't have one hat that fits all. Stack: Feedback regarding Sherif's idea: generic ideas are not a great idea and there is a performance cost. Greg: we need to put building blocks in this repo; that will make life easier and lower the barriers, like Spack, slurm and easybuild
David (NezSez): Some end users won't understand / know anything about HPC and just need to use the HPC, for things such as Maya or fluid dynamics
Neil: some tools can very easily be an entry point for organizations and teams to use HPC, like Jupyter notebooks
Stack: HPC is usually tuned to different needs; we can reach out to other HPC sites that are running Rocky to ask them to promote Rocky and establish a dialog to get an idea of what things they are running on Rocky
Matt: there are a few projects doing HPC out of the box already, and we don't need to run in circles redoing what they are doing
Balaji: a SIG for scientific applications should focus on supporting the applications and their optimization, while HPC suggests the architecture to reach maximum capabilities
Greg: Agreeing with Stack, we don't want to provide applications when there are already tools that do that
Gregory Kurtzer (Chat): A simple strategy might be just to start assembling a list of packages we want to include as part of SIG/HPC, and be open minded as this list expands.
Neil Hanlon(Chat): actually have to leave now, but, if we make some sort of toolkit, it has to be quite unopinionated... OpenStack-Ansible is a good example of being unopinionated about how you run your openstack cluster(s), but give you all the tools to customize and tune to your unique situation, too
* A point raised: should we rebuild some packages that are already in EPEL or not, and shall our repo have a higher priority or not\n* We need to think more about conflicts with other SIGs, like lustre and SIG/Storage\n
* List of applications \u201cThread on MM to post pkgs\u201d\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026\n* Reach out to other communities \u201cGreg\u201d\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors\n* Statistics / public registry for sites / HPC to add themselves if they want\n* Meeting will be bi-weekly \u201cTentative Thursday 9:00PM UTC\u201d\n* Documentation\n
Greg: suggesting to have our own slurm, apptainer, singularity, Warewulf
Greg: We can reach out to DDN about anything related to Lustre
Sherif: Suggesting to start building packages
Nick: To build the community we need to start looking into documentation and forums
Stack: we need to be careful and have strong justification for rebuilding stuff that exists in EPEL
Greg: asked how HPC centres prefer to manage / or are already managing their slurm setup
A few members mentioned one of the following two methods: * Keep upgrading on minor versions of slurm * Keep upgrading on minor versions of slurm, then a major upgrade in a scheduled maintenance window
Greg and Nick: adding the major-minor version in the package name, something like python2/3
Sherif: Asking about the testing methodology with the testing team
Stack: They hope at some point they are able to test all SIGs, and they are working on getting OpenQA built for this
* Start building slurm\n* Start building apptainer\n* Start building singularity\n* Start building warewulf\n* Greg reach out for OpenHPC - done\n* Sherif: check about forums\n
* List of applications \u201cThread on MM to post pkgs\u201d - We have an idea now of which packages we need to build -\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026 - We have an idea of what we need to do -\n* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n* Meeting will be bi-weekly \u201cTentative Thursday 9:00PM UTC\u201d - Agreed -\n* Documentation - Wiki is in place but still needs some work -\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\"\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Get a list of packages from Jeremy to pick up from openHPC\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif\n* Finalise the slurm package with naming / configuration\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\"\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
Chris gave a quick demo about openHPC / presentation
Jeremy sent the packages
Greg: asked how the SIG's slurm is compatible with openHPC
Sherif needs to look at openHPC slurm packages
Chris: we need to look at how to build easybuild and look into how to improve it
Chris and Greg talked about whether there is any standard that explains how to build systems compatible with each other; openHPC does follow best practices from different entities
Chris provided https://github.com/holgerBerger/hpc-workspace which is now a part of openHPC
Sherif mentioned, forums category is now in place https://forums.rockylinux.org/c/sig/hpc/61
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif: Reach out to jose-d about pmix - Done, no feedback yet -\n* Greg: to reach out to openPBS and Charliecloud\n* Sherif: To update slurm23 to latest\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
"}]}
\ No newline at end of file
+{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"SIG/HPC Wiki","text":"
This SIG is aiming to provide various HPC packages to support building HPC clusters using Rocky Linux systems
Developing and maintaining various HPC related packages; this may include porting, optimizing and contributing to upstream sources to support the HPC initiative
We hang out in our SIG/HPC Mattermost channel and #rockylinux-sig-hpc on irc.libera.chat \"bridged to our MatterMost channel\" also our SIG forums are located here
Those are some of the packages that we are thinking of maintaining and supporting within this SIG
* Lustre server and client\n* Slurm\n* Apptainer\n* Easybuild\n* Spack\n* openmpi built with slurm support\n* Lmod\n* conda\n* sstack\n* fail2ban - in EPEL, not sure if it fits in this SIG -\n* glusterfs-server - Better suited under SIG/Storage -\n* glusterfs-selinux - Better suited under SIG/Storage -\n* Cython\n* genders\n* pdsh\n* gcc (latest releases, parallel install)\n* autotools\n* cmake\n* hwloc (this really needs to support parallel versions)\n* libtool\n* valgrind (maybe)\n* charliecloud\n* Warewulf (if all config options are runtime instead of pre-compiled)\n* magpie\n* openpbs\n* pmix\n* NIS : ypserv, ypbind, yptools and a corresponding nss_nis (took the source rpms from Fedora and recompiled them for R9)\n
* Alan Marshall\n* Nje\n* Neil Hanlon\n* Matt Bidwell\n* David (NezSez)\n* Jonathan Anderson\n* Stack\n* Balaji\n* Sherif\n* Gregory Kurtzer\n* David DeBonis\n
"},{"location":"events/meeting-notes/2023-04-20/#quick-round-of-introduction","title":"Quick round of introduction","text":"
Everyone introduced themselves
"},{"location":"events/meeting-notes/2023-04-20/#definition-of-stakeholders","title":"Definition of stakeholders","text":"
\"still needs lots of clarification and classification since those are very wide terms\"
* HPC End-user ?maybe?\n* HPC Systems admins and engineers, to provide them with tools and know-how to build HPC clusters using Rocky Linux\n* HPC Vendors, however the SIG has to be vendor neutral and agnostic\n
Stack: we need to make sure that we are not redoing efforts that are already done by other groups. Greg engaged with the OpenHPC community, which provides some core packages such as apptainer, mpi and openHPC
Sherif: we need to have one hat that fits most, but we can't have one hat that fits all. Stack: Feedback regarding Sherif's idea: generic ideas are not a great idea and there is a performance cost. Greg: we need to put building blocks in this repo; that will make life easier and lower the barriers, like Spack, slurm and easybuild
David (NezSez): Some end users won't understand / know anything about HPC and just need to use the HPC, for things such as Maya or fluid dynamics
Neil: some tools can very easily be an entry point for organizations and teams to use HPC, like Jupyter notebooks
Stack: HPC is usually tuned to different needs; we can reach out to other HPC sites that are running Rocky to ask them to promote Rocky and establish a dialog to get an idea of what things they are running on Rocky
Matt: there are a few projects doing HPC out of the box already, and we don't need to run in circles redoing what they are doing
Balaji: a SIG for scientific applications should focus on supporting the applications and their optimization, while HPC suggests the architecture to reach maximum capabilities
Greg: Agreeing with Stack, we don't want to provide applications when there are already tools that do that
Gregory Kurtzer (Chat): A simple strategy might be just to start assembling a list of packages we want to include as part of SIG/HPC, and be open minded as this list expands.
Neil Hanlon(Chat): actually have to leave now, but, if we make some sort of toolkit, it has to be quite unopinionated... OpenStack-Ansible is a good example of being unopinionated about how you run your openstack cluster(s), but give you all the tools to customize and tune to your unique situation, too
* A point raised: should we rebuild some packages that are already in EPEL or not, and shall our repo have a higher priority or not\n* We need to think more about conflicts with other SIGs, like lustre and SIG/Storage\n
* List of applications \u201cThread on MM to post pkgs\u201d\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026\n* Reach out to other communities \u201cGreg\u201d\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors\n* Statistics / public registry for sites / HPC to add themselves if they want\n* Meeting will be bi-weekly \u201cTentative Thursday 9:00PM UTC\u201d\n* Documentation\n
Greg: suggesting to have our own slurm, apptainer, singularity, Warewulf
Greg: We can reach out to DDN about anything related to Lustre
Sherif: Suggesting to start building packages
Nick: To build the community we need to start looking into documentation and forums
Stack: we need to be careful and have strong justification for rebuilding stuff that exists in EPEL
Greg: asked how HPC centres prefer to manage / or are already managing their slurm setup
A few members mentioned one of the following two methods: * Keep upgrading on minor versions of slurm * Keep upgrading on minor versions of slurm, then a major upgrade in a scheduled maintenance window
Greg and Nick: adding the major-minor version in the package name, something like python2/3
Sherif: Asking about the testing methodology with the testing team
Stack: They hope at some point they are able to test all SIGs, and they are working on getting OpenQA built for this
* Start building slurm\n* Start building apptainer\n* Start building singularity\n* Start building warewulf\n* Greg reach out for OpenHPC - done\n* Sherif: check about forums\n
* List of applications \u201cThread on MM to post pkgs\u201d - We have an idea now of which packages we need to build -\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026 - We have an idea of what we need to do -\n* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n* Meeting will be bi-weekly \u201cTentative Thursday 9:00PM UTC\u201d - Agreed -\n* Documentation - Wiki is in place but still needs some work -\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\"\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Get a list of packages from Jeremy to pick up from openHPC\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif\n* Finalise the slurm package with naming / configuration\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\"\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
Chris gave a quick demo about openHPC / presentation
Jeremy sent the packages
Greg: asked how the SIG's slurm is compatible with openHPC
Sherif needs to look at openHPC slurm packages
Chris: we need to look at how to build easybuild and look into how to improve it
Chris and Greg talked about whether there is any standard that explains how to build systems compatible with each other; openHPC does follow best practices from different entities
Chris provided https://github.com/holgerBerger/hpc-workspace which is now a part of openHPC
Sherif mentioned, forums category is now in place https://forums.rockylinux.org/c/sig/hpc/61
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif: Reach out to jose-d about pmix - Done, no feedback yet -\n* Greg: to reach out to openPBS and Charliecloud\n* Sherif: To update slurm23 to latest\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
* Sherif: Reach out to jose-d about pmix - Done, no feedback yet -\n* Greg: To reach out to openPBS and Charliecloud\n* Sherif: To update slurm23 to latest - Done -\n
* Sherif to look into openHPC slurm spec file - Pending on Sherif\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in a couple of weeks\" - Done\n
* Start building slurm - Ongoing, a bit slowed down by the R9.2 and R8.8 releases; however, packages are built and some minor configuration needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out to different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
"}]}
\ No newline at end of file
diff --git a/sitemap.xml b/sitemap.xml
index 7a92ab7..3034063 100644
--- a/sitemap.xml
+++ b/sitemap.xml
@@ -2,72 +2,77 @@
https://SIG/HPC.rocky.page/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/about/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/contact/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/installation/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/packages/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-04-20/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-05-04/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-05-18/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-06-01/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-06-15/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-06-29/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-07-13/
- 2023-08-04
+ 2023-08-14
daily
https://SIG/HPC.rocky.page/events/meeting-notes/2023-07-27/
- 2023-08-04
+ 2023-08-14
+ daily
+
+
+ https://SIG/HPC.rocky.page/events/meeting-notes/2023-08-10/
+ 2023-08-14
+ daily
\ No newline at end of file
diff --git a/sitemap.xml.gz b/sitemap.xml.gz
index 122320a..4af7f72 100644
Binary files a/sitemap.xml.gz and b/sitemap.xml.gz differ