From 5e8c24917fc0b966dddb6cecad033a203948152f Mon Sep 17 00:00:00 2001
From: <>
Date: Mon, 19 Jun 2023 10:08:59 +0000
Subject: [PATCH] Deployed 843bbbc with MkDocs version: 1.4.3

---
 404.html                                   |  2 +-
 about/index.html                           |  2 +-
 contact/index.html                         |  2 +-
 events/index.html                          |  2 +-
 events/meeting-notes/2023-04-20/index.html |  2 +-
 events/meeting-notes/2023-05-04/index.html | 24 +++++++++---------
 events/meeting-notes/2023-05-18/index.html | 20 +++++++--------
 events/meeting-notes/2023-06-01/index.html | 24 +++++++++---------
 events/meeting-notes/2023-06-15/index.html | 28 ++++++++++-----------
 index.html                                 |  2 +-
 packages/index.html                        |  2 +-
 search/search_index.json                   |  2 +-
 sitemap.xml.gz                             | Bin 283 -> 283 bytes
 13 files changed, 56 insertions(+), 56 deletions(-)

diff --git a/404.html b/404.html
index 970583e..2d5c540 100644
--- a/404.html
+++ b/404.html
@@ -445,7 +445,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/about/index.html b/about/index.html
index 72b66dc..dbcb216 100644
--- a/about/index.html
+++ b/about/index.html
@@ -459,7 +459,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/contact/index.html b/contact/index.html
index e65f482..5163ee7 100644
--- a/contact/index.html
+++ b/contact/index.html
@@ -466,7 +466,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/events/index.html b/events/index.html
index d803383..b173bc8 100644
--- a/events/index.html
+++ b/events/index.html
@@ -466,7 +466,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/events/meeting-notes/2023-04-20/index.html b/events/meeting-notes/2023-04-20/index.html
index 612d803..060234a 100644
--- a/events/meeting-notes/2023-04-20/index.html
+++ b/events/meeting-notes/2023-04-20/index.html
@@ -532,7 +532,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/events/meeting-notes/2023-05-04/index.html b/events/meeting-notes/2023-05-04/index.html
index dfc3679..c61d3bf 100644
--- a/events/meeting-notes/2023-05-04/index.html
+++ b/events/meeting-notes/2023-05-04/index.html
@@ -525,7 +525,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15
@@ -595,18 +595,18 @@
 * sstack

    Discussions:

    -

    Greg: suggesting to have our own slurm, apptainer, singulatory, Warewulf -Greg: We can reach to DDN about anything related to Luster -Sherif: Suggesting to start building packages -Nick: To build the community we need to start looking into documentation and forums -Stack: we need to be careful and have strong justification for rebuilding stuff that exists in Epel -Greg: asked how HPC centre prefer to manage / or already managing their slurm setup -Few members mentioned one of the following two methods: +

Greg: suggesting to have our own slurm, apptainer, singularity, Warewulf

    +

Greg: We can reach out to DDN about anything related to Lustre

    +

    Sherif: Suggesting to start building packages

    +

    Nick: To build the community we need to start looking into documentation and forums

    +

Stack: we need to be careful and have strong justification for rebuilding stuff that exists in EPEL

    +

Greg: asked how HPC centres prefer to manage / or are already managing their slurm setup

    +

A few members mentioned one of the following two methods:
* Keep upgrading on minor versions of slurm
* Keep upgrading on minor versions of slurm, then a major upgrade in a scheduled maintenance window

    -

    Greg and Nick: adding major-minor version in package name something like python2/3 -Sherif: Asking about Testing methodology with testing team -Stack: They hope at some point they are able to test all sigs and working on getting OpenQA build for this

    +

Greg and Nick: adding the major-minor version in the package name, something like python2/3

    +

Sherif: Asking about testing methodology with the testing team

    +

Stack: They hope at some point to be able to test all SIGs and are working on getting an OpenQA build for this
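A minimal sketch of what the python2/3-style naming idea above could look like in an RPM spec — the package name, version numbers, and Provides line are hypothetical, not the SIG's actual packaging:

    # Hypothetical spec excerpt: version-suffixed package name, python2/3 style.
    # Name and version numbers are illustrative only.
    Name:           slurm23.02
    Version:        23.02.3
    Release:        1%{?dist}
    Summary:        Slurm workload manager (23.02 series)

    # A generic virtual provide lets dependent packages keep requiring "slurm"
    # no matter which version-suffixed package is installed.
    Provides:       slurm = %{version}-%{release}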

    Action items:

    * Start building slurm
     * Start building apptainer
    @@ -631,7 +631,7 @@ Stack: They hope at some point they are able to test all sigs and working on get
       
         
           Last update:
    -      May 18, 2023
    +      June 19, 2023
           
         
       
    diff --git a/events/meeting-notes/2023-05-18/index.html b/events/meeting-notes/2023-05-18/index.html
    index 214e38f..cb0c976 100644
    --- a/events/meeting-notes/2023-05-18/index.html
    +++ b/events/meeting-notes/2023-05-18/index.html
    @@ -532,7 +532,7 @@
       
         
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15
@@ -587,15 +587,15 @@
 * Sherif

    Discussions:

    -

    Chris: Are we willing to support all openHPC stack or just the modules and how we imagine achieving this? -Jeremy: Clear a bit of distro related stuff from openHPC would be great such as automake / autoconf -Stack: We need to have a base line so people can start use rocky on HPC and make Rocky accessible -Chris: A Demo / technical talk in 4 weeks -Chris: Are we going to focus on 8 and 9? -Stack and Chris, would be great if we can focus on 9 -Sherif: I hope we can do both but with 9 in the spotlight "this needs to be a SIG decision" -Stack: Question, if we start moving openHPC within HPC sig are they going support more distros, we don't want to break packages for other EL distros -Chris: so far testing on Rocky as the only supported EL distro

    +

Chris: Are we willing to support the whole openHPC stack or just the modules, and how do we imagine achieving this?

    +

Jeremy: Clearing a bit of the distro-related stuff, such as automake / autoconf, out of openHPC would be great

    +

Stack: We need to have a baseline so people can start using Rocky on HPC, and make Rocky accessible

    +

    Chris: A Demo / technical talk in 4 weeks

    +

    Chris: Are we going to focus on 8 and 9?

    +

Stack and Chris: it would be great if we can focus on 9

    +

    Sherif: I hope we can do both but with 9 in the spotlight "this needs to be a SIG decision"

    +

Stack: Question - if we start moving openHPC within the HPC SIG, are they going to support more distros? We don't want to break packages for other EL distros

    +

    Chris: so far testing on Rocky as the only supported EL distro

    Action items:

    * Get a demo / technical talk after 4 weeks "Sherif can arrange that with Chris"
 * Getting a list of packages that openHPC would like to move to distros "Jeremy will be the point of contact if we need those in a couple of weeks"
    diff --git a/events/meeting-notes/2023-06-01/index.html b/events/meeting-notes/2023-06-01/index.html
    index 763eb8f..3b22034 100644
    --- a/events/meeting-notes/2023-06-01/index.html
    +++ b/events/meeting-notes/2023-06-01/index.html
    @@ -539,7 +539,7 @@
       
         
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15
@@ -591,17 +591,17 @@
 * Chris Simmons

    Discussions:

    -

    Getting toolchains outside of openHPC such as automake -Greg: We need to talk if we need to have a generic SIG for toolchains -Greg: We need to look into adding more release packages such as intel compiler -Brain storm ideas about optimizing binaries -David: What would be the interest of having a light weight kernel for HPC -Jeremy: mentioning intel light weight kernel https://github.com/intel/mos -Chris: asking if there is any benchmark, hard numbers between shipped kernel and light weight kernel, so far, nothing solid -Sherif: Slurm now is build but not in standard path and we agreed we are going to move standard path -Greg: make sure you have the provide type -Chris: also make sure that downgrade works -Greg and Chris, we can also contribute to openHPC documentation

    +

    Getting toolchains outside of openHPC such as automake

    +

Greg: We need to discuss whether we need to have a generic SIG for toolchains

    +

Greg: We need to look into adding more release packages such as the Intel compiler

    +

Brainstorming ideas about optimizing binaries

    +

David: What would be the interest in having a lightweight kernel for HPC?

    +

Jeremy: mentioning Intel's lightweight kernel, https://github.com/intel/mos

    +

Chris: asking if there are any benchmarks or hard numbers comparing the shipped kernel and a lightweight kernel; so far, nothing solid

    +

Sherif: Slurm now builds, but not into the standard path, and we agreed we are going to move to the standard path

    +

Greg: make sure you have the Provides set correctly

    +

    Chris: also make sure that downgrade works

    +

Greg and Chris: we can also contribute to openHPC documentation
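A rough illustration of the downgrade check Chris mentions — the package name is a placeholder, and this assumes two builds of the package are published in a test repo:

    # On a scratch box with both the new and previous builds in the repo:
    dnf -y install slurm       # installs the newest build
    dnf -y downgrade slurm     # should roll back cleanly to the previous EVR
    dnf history undo last      # the downgrade itself should also be undoable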

    Action items:

    * Get a list of packages from Jeremy to pick up from openHPC
 * Greg / Sherif talk in Rocky / RESF about a generic SIG for common packages such as toolchains
    diff --git a/events/meeting-notes/2023-06-15/index.html b/events/meeting-notes/2023-06-15/index.html
    index 136a8de..ac7f36d 100644
    --- a/events/meeting-notes/2023-06-15/index.html
    +++ b/events/meeting-notes/2023-06-15/index.html
    @@ -21,7 +21,7 @@
         
         
           
    -        SIG/HPC meeting 2023-06-01 - SIG/HPC Wiki
    +        SIG/HPC meeting 2023-06-15 - SIG/HPC Wiki
           
         
         
    @@ -73,7 +73,7 @@
         
-
+
 Skip to content
@@ -107,7 +107,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15
@@ -463,12 +463,12 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15
@@ -587,7 +587,7 @@
-

    SIG/HPC meeting 2023-06-01

    +

    SIG/HPC meeting 2023-06-15

    Attendees:

    * Chris Simmons
     * Nick Eggleston
    @@ -599,14 +599,14 @@
     * Sherif
     

    Discussions:

    -

    Chris gave a quick demo about openHPC / presentation -Jeremy sent the packages -Greg: asked how the SIG's slurm is compatible with openHPC -Sherif needs to look at openHPC slurm packages -Chris we need to look on how to build easybuild and look into how to improve it -Chris and Greg talking about if there is any standard that explains how to build systems compatible with each others, openHPC does follow best practices from different entities -Chris provided https://github.com/holgerBerger/hpc-workspace which now a part of openHPC -Sherif mentioned, forums category is now in place https://forums.rockylinux.org/c/sig/hpc/61

    +

Chris gave a quick demo / presentation about openHPC

    +

    Jeremy sent the packages

    +

    Greg: asked how the SIG's slurm is compatible with openHPC

    +

    Sherif needs to look at openHPC slurm packages

    +

Chris: we need to look at how to build easybuild and look into how to improve it

    +

Chris and Greg: talking about whether there is any standard that explains how to build systems compatible with each other; openHPC does follow best practices from different entities

    +

Chris provided https://github.com/holgerBerger/hpc-workspace, which is now part of openHPC

    +

Sherif mentioned the forums category is now in place: https://forums.rockylinux.org/c/sig/hpc/61

    Action items:

* Sherif to look into the openHPC slurm spec file
 * We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR
    diff --git a/index.html b/index.html
    index 96c18a0..f2628f5 100644
    --- a/index.html
    +++ b/index.html
    @@ -502,7 +502,7 @@
       
         
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/packages/index.html b/packages/index.html
index ea21caa..1308f5b 100644
--- a/packages/index.html
+++ b/packages/index.html
@@ -466,7 +466,7 @@
-  SIG/HPC meeting 2023-06-01
+  SIG/HPC meeting 2023-06-15

diff --git a/search/search_index.json b/search/search_index.json
index d2e7373..91c6cba 100644
--- a/search/search_index.json
+++ b/search/search_index.json
@@ -1 +1 @@
-{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"SIG/HPC Wiki","text":"

    This SIG is aiming to provide various HPC packages to support building HPC cluster using Rocky Linux systems

    "},{"location":"#responsibilities","title":"Responsibilities","text":"

    Developing and maintaining various HPC related packages, this may include porting, optimized and contributing to upstream sources to support HPC initiative

    "},{"location":"#meetings-communications","title":"Meetings / Communications","text":"

    We are meeting on bi-weekly bases on Google meet for now and you may check RESF community calendar here also check Contact US page to reach us

    "},{"location":"about/","title":"About","text":"

    TBD

    "},{"location":"contact/","title":"Contact US","text":"

    We hang out in our SIG/HPC Mattermost channel and #rockylinux-sig-hpc on irc.libera.chat \"bridged to our MatterMost channel\" also our SIG forums are located here

    "},{"location":"events/","title":"SIG/HPC Meeting","text":"

    We are meeting twice a month on bi-weekly bases on Thursday at 9:00 PM UTC here on Google meet - for now -

    "},{"location":"packages/","title":"SIG/HPC Packages","text":"

    Those are some of the packages that we are thinking to maintain and support within this SIG

    * Lustre server and client\n* Slurm\n* Apptainer\n* Easybuild\n* Spack\n* opempi build slurm support\n* Lmod\n* conda\n* sstack\n* fail2ban - in EPEL not sure if it's fit in this SIG -\n* glusterfs-server - Better suited under SIG/Storage -\n* glusterfs-selinux - Better suited under SIG/Storage -\n* Cython\n* genders\n* pdsh\n* gcc (latest releases, parallel install)\n* autotools\n* cmake\n* hwloc (this really needs to support parallel versions)\n* libtool\n* valgrind (maybe)\n* charliecloud\n* Warewulf (if all config options are runtime instead of pre-compiled)\n* magpie\n* openpbs\n* pmix\n
    "},{"location":"events/meeting-notes/2023-04-20/","title":"SIG/HPC meeting 2023-04-20","text":""},{"location":"events/meeting-notes/2023-04-20/#attendees","title":"Attendees:","text":"
    * Alan Marshall\n* Nje\n* Neil Hanlon\n* Matt Bidwell\n* David (NezSez)\n* Jonathan Andreson\n* Stack\n* Balaji\n* Sherif\n* Gregorgy Kurzer\n* David DeBonis\n
    "},{"location":"events/meeting-notes/2023-04-20/#quick-round-of-introduction","title":"Quick round of introduction","text":"

    Everyone introduced themselves

    "},{"location":"events/meeting-notes/2023-04-20/#definition-of-stakeholders","title":"Definition of stakeholders","text":"

    \"still needs lots to clarification and classification since those are very wide terms\"

    * HPC End-user ?maybe?\n* HPC Systems admins and engineers, to provide them with tools and know how to build HPC clusters using Rocky linux\n* HPC Vendors, however the SIG has to be vendor neutral and agnostic\n
    "},{"location":"events/meeting-notes/2023-04-20/#discussions","title":"Discussions:","text":"

    Stack: we need to make sure that we are not redoing efforts that already done with other groups Greg engaged with Open HPC community and providing some core packages such as apptainer, mpi, openHPC

    Sherif: we need to have one hat to fit most of all but we can't have one hat that fit all Stack: Feedback regarding Sherif's idea that generic idea's are not great idea and there is a bad performance Greg: we need to put building blocks in the this repo and will make life easiest and lower the barriers like Spack, slurm and easybuild

    Devid (NezSez): Some end users won't understand / know anything about HPC and just needs to use the HPC, such as Maya or dynamic fluids

    Neil: some tools can be very easily an entry point for organization and teams to use HPC like jupiter playbook

    Stack: HPC is usually tuned to different needs, we can reach to other HPC that are running Rocky to ask them to promate rocky and establish a dialog to get an idea of what things that they are running into rocky

    Matt: HPC out of the box there are few projects that doing that and we don't need to run in circles of what we are going to

    Balaji: SIG for scientific application that focus on support the application and optimization, and HPC suggest the architecture to reach max capabilities

    Greg: Agreeing with stack we don't want to provide application that there are tools that do that

    Gregory Kurtzer (Chat): A simple strategy might be just to start assembling a list of packages we want to include as part of SIG/HPC, and be open minded as this list expands.

    Neil Hanlon(Chat): actually have to leave now, but, if we make some sort of toolkit, it has to be quite unopinionated... OpenStack-Ansible is a good example of being unopinionated about how you run your openstack cluster(s), but give you all the tools to customize and tune to your unique situation, too

    "},{"location":"events/meeting-notes/2023-04-20/#remarks","title":"Remarks:","text":"
    * A point raised, should be rebuild some packages that area already in Epel or not and if we shall have a higher priority on our repo or not\n* We need to think more about conflicts with other SIGs like lustre and sig storage\n
    "},{"location":"events/meeting-notes/2023-04-20/#action-items","title":"Action items:","text":"
    * List of applications \u201cThread on MM to post pkgs\u201d\n* Building blocks which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026\n* Reach out to other communities \u201cGreg\u201d\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors\n* Statistic / public registry for sites / HPC to add themselves if they want\n* Meeting will be bi-weekly \u201cTantive Thursday 9:00PM UTC\u201d\n* Documentations\n
    "},{"location":"events/meeting-notes/2023-05-04/","title":"SIG/HPC meeting 2023-05-04","text":""},{"location":"events/meeting-notes/2023-05-04/#attendees","title":"Attendees:","text":"
    * Neil Hanlon\n* Matt Bidwell\n* Stack\n* Sherif\n* Nick Eggleston\n* Gregory Kurtzer\n* Forrest Burt\n
    "},{"location":"events/meeting-notes/2023-05-04/#package-lists","title":"Package lists","text":"
    * Slurm - Epel\n* Apptainer - Epel\n* Lustre - lustre.org , no server for el9 \n* Warewulf - HPCNG github only el8\n* Easybuild \n* OpenHPC\n* Spack\n* openmpi *with slurm support*\n* glusterfs-server gluster-selinux\n* NIS, ypserv , ypbind, yptools nss_nis\n* fail2ban\n* Lmod\n* conda\n* sstack\n
    "},{"location":"events/meeting-notes/2023-05-04/#discussions","title":"Discussions:","text":"

    Greg: suggesting to have our own slurm, apptainer, singulatory, Warewulf Greg: We can reach to DDN about anything related to Luster Sherif: Suggesting to start building packages Nick: To build the community we need to start looking into documentation and forums Stack: we need to be careful and have strong justification for rebuilding stuff that exists in Epel Greg: asked how HPC centre prefer to manage / or already managing their slurm setup Few members mentioned one of the following two methods: * Keep upgrading on minor version of slurm * Keep upgrading on minor version of slurm then a major upgrade in a scheduled maintains window

    Greg and Nick: adding major-minor version in package name something like python2/3 Sherif: Asking about Testing methodology with testing team Stack: They hope at some point they are able to test all sigs and working on getting OpenQA build for this

    "},{"location":"events/meeting-notes/2023-05-04/#action-items","title":"Action items:","text":"
    * Start building slurm\n* Start building apptainer\n* Start building singulartiry\n* Start building warewulf\n* Greg reach out for OpenHPC - done\n* Sherif: check about fourms\n
    "},{"location":"events/meeting-notes/2023-05-04/#old-business","title":"Old business:","text":"
    * List of applications \u201cThread on MM to post pkgs\u201d - We have an idea now of which packages we need to build -\n* Building blocks which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026 - We have an idea of what we need to do -\n* Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet - \n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n* Meeting will be bi-weekly \u201cTantive Thursday 9:00PM UTC\u201d - Agreed -\n* Documentations - Wiki is in place but still need some work -\n
    "},{"location":"events/meeting-notes/2023-05-18/","title":"SIG/HPC meeting 2023-05-18","text":""},{"location":"events/meeting-notes/2023-05-18/#attendees","title":"Attendees:","text":"
    * Stack\n* Forrest Burt\n* Nick Eggleston\n* David H\n* Jeremy Siadal\n* Al Bowles\n* Chris Simmons\n* Sherif\n
    "},{"location":"events/meeting-notes/2023-05-18/#discussions","title":"Discussions:","text":"

    Chris: Are we willing to support all openHPC stack or just the modules and how we imagine achieving this? Jeremy: Clear a bit of distro related stuff from openHPC would be great such as automake / autoconf Stack: We need to have a base line so people can start use rocky on HPC and make Rocky accessible Chris: A Demo / technical talk in 4 weeks Chris: Are we going to focus on 8 and 9? Stack and Chris, would be great if we can focus on 9 Sherif: I hope we can do both but with 9 in the spotlight \"this needs to be a SIG decision\" Stack: Question, if we start moving openHPC within HPC sig are they going support more distros, we don't want to break packages for other EL distros Chris: so far testing on Rocky as the only supported EL distro

    "},{"location":"events/meeting-notes/2023-05-18/#action-items","title":"Action items:","text":"
    * Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in couple of weeks\"\n
    "},{"location":"events/meeting-notes/2023-05-18/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-05-18/#2023-05-04","title":"2023-05-04","text":"
    * Start building slurm - On going , a bit slowing down with R9.2 and R8.8 releases, however packages are built, some minor configurations needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singulartiry - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-05-18/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "},{"location":"events/meeting-notes/2023-06-01/","title":"SIG/HPC meeting 2023-06-01","text":""},{"location":"events/meeting-notes/2023-06-01/#attendees","title":"Attendees:","text":"
    * Jeremy Siadal\n* Sherif\n* Gregory Kurtzer\n* David DeBonis\n* Chris Simmons\n
    "},{"location":"events/meeting-notes/2023-06-01/#discussions","title":"Discussions:","text":"

    Getting toolchains outside of openHPC such as automake Greg: We need to talk if we need to have a generic SIG for toolchains Greg: We need to look into adding more release packages such as intel compiler Brain storm ideas about optimizing binaries David: What would be the interest of having a light weight kernel for HPC Jeremy: mentioning intel light weight kernel https://github.com/intel/mos Chris: asking if there is any benchmark, hard numbers between shipped kernel and light weight kernel, so far, nothing solid Sherif: Slurm now is build but not in standard path and we agreed we are going to move standard path Greg: make sure you have the provide type Chris: also make sure that downgrade works Greg and Chris, we can also contribute to openHPC documentation

    "},{"location":"events/meeting-notes/2023-06-01/#action-items","title":"Action items:","text":"
    * Get a list of packages from Jeremy to pick up from openHPC\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif\n* Finlise the slurm package with naming / configuration\n
    "},{"location":"events/meeting-notes/2023-06-01/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-06-01/#2023-05-18","title":"2023-05-18:","text":"
    * Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in couple of weeks\"\n
    "},{"location":"events/meeting-notes/2023-06-01/#2023-05-04","title":"2023-05-04","text":"
    * Start building slurm - On going , a bit slowing down with R9.2 and R8.8 releases, however packages are built, some minor configurations needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singulartiry - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-06-01/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "},{"location":"events/meeting-notes/2023-06-15/","title":"SIG/HPC meeting 2023-06-01","text":""},{"location":"events/meeting-notes/2023-06-15/#attendees","title":"Attendees:","text":"
    * Chris Simmons\n* Nick Eggleston\n* Forrest Burt\n* Stack\n* David DeBonis\n* Jeremy Siadal\n* Greg Kurtzer\n* Sherif\n
    "},{"location":"events/meeting-notes/2023-06-15/#discussions","title":"Discussions:","text":"

    Chris gave a quick demo about openHPC / presentation Jeremy sent the packages Greg: asked how the SIG's slurm is compatible with openHPC Sherif needs to look at openHPC slurm packages Chris we need to look on how to build easybuild and look into how to improve it Chris and Greg talking about if there is any standard that explains how to build systems compatible with each others, openHPC does follow best practices from different entities Chris provided https://github.com/holgerBerger/hpc-workspace which now a part of openHPC Sherif mentioned, forums category is now in place https://forums.rockylinux.org/c/sig/hpc/61

    "},{"location":"events/meeting-notes/2023-06-15/#action-items","title":"Action items:","text":"
    * Sherif to look int openHPC slurm spec file\n* We need to get lists of centres and HPC that are moving to Rocky to make a blog post and PR\n
    "},{"location":"events/meeting-notes/2023-06-15/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-06-15/#2023-06-01","title":"2023-06-01:","text":"
    * Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about generic SIG for common packages such as chaintools\n* Plan the openHPC demo Chris / Sherif - Done\n* Finlise the slurm package with naming / configuration - Done\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-05-18","title":"2023-05-18:","text":"
    * Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be point of contact if we need those in couple of weeks\" - Done\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-05-04","title":"2023-05-04","text":"
    * Start building slurm - On going, a bit slowing down with R9.2 and R8.8 releases, however packages are built, some minor configurations needs to be fixed -\n* Start building apptainer - on hold -\n* Start building singulartiry - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"SIG/HPC Wiki","text":"

This SIG aims to provide various HPC packages to support building HPC clusters using Rocky Linux systems

    "},{"location":"#responsibilities","title":"Responsibilities","text":"

Developing and maintaining various HPC-related packages; this may include porting, optimizing, and contributing to upstream sources to support the HPC initiative

    "},{"location":"#meetings-communications","title":"Meetings / Communications","text":"

We are meeting on a bi-weekly basis on Google Meet for now; you may check the RESF community calendar here, and also check the Contact Us page to reach us

    "},{"location":"about/","title":"About","text":"

    TBD

    "},{"location":"contact/","title":"Contact US","text":"

We hang out in our SIG/HPC Mattermost channel and #rockylinux-sig-hpc on irc.libera.chat \"bridged to our Mattermost channel\"; our SIG forums are located here

    "},{"location":"events/","title":"SIG/HPC Meeting","text":"

We are meeting twice a month, on a bi-weekly basis, on Thursdays at 9:00 PM UTC here on Google Meet - for now -

    "},{"location":"packages/","title":"SIG/HPC Packages","text":"

These are some of the packages that we are thinking of maintaining and supporting within this SIG

* Lustre server and client\n* Slurm\n* Apptainer\n* Easybuild\n* Spack\n* openmpi built with slurm support\n* Lmod\n* conda\n* sstack\n* fail2ban - in EPEL, not sure if it fits in this SIG -\n* glusterfs-server - Better suited under SIG/Storage -\n* glusterfs-selinux - Better suited under SIG/Storage -\n* Cython\n* genders\n* pdsh\n* gcc (latest releases, parallel install)\n* autotools\n* cmake\n* hwloc (this really needs to support parallel versions)\n* libtool\n* valgrind (maybe)\n* charliecloud\n* Warewulf (if all config options are runtime instead of pre-compiled)\n* magpie\n* openpbs\n* pmix\n
    "},{"location":"events/meeting-notes/2023-04-20/","title":"SIG/HPC meeting 2023-04-20","text":""},{"location":"events/meeting-notes/2023-04-20/#attendees","title":"Attendees:","text":"
* Alan Marshall\n* Nje\n* Neil Hanlon\n* Matt Bidwell\n* David (NezSez)\n* Jonathan Andreson\n* Stack\n* Balaji\n* Sherif\n* Gregory Kurtzer\n* David DeBonis\n
    "},{"location":"events/meeting-notes/2023-04-20/#quick-round-of-introduction","title":"Quick round of introduction","text":"

    Everyone introduced themselves

    "},{"location":"events/meeting-notes/2023-04-20/#definition-of-stakeholders","title":"Definition of stakeholders","text":"

    \"still needs lots to clarification and classification since those are very wide terms\"

* HPC end-users ?maybe?\n* HPC systems admins and engineers, to provide them with tools and know-how to build HPC clusters using Rocky Linux\n* HPC vendors, however the SIG has to be vendor-neutral and agnostic\n
    "},{"location":"events/meeting-notes/2023-04-20/#discussions","title":"Discussions:","text":"

Stack: we need to make sure that we are not redoing efforts that are already done by other groups. Greg has engaged with the OpenHPC community, which is providing some core packages such as apptainer, mpi, openHPC

Sherif: we need to have one hat to fit most, but we can't have one hat that fits all. Stack: feedback regarding Sherif's idea that generic ideas are not a great idea and there is a performance cost. Greg: we need to put building blocks in this repo; that will make life easier and lower the barriers, like Spack, slurm and easybuild

David (NezSez): Some end users won't understand / know anything about HPC and just need to use the HPC, such as Maya or fluid dynamics

Neil: some tools can very easily be an entry point for organizations and teams to use HPC, like a Jupyter playbook

Stack: HPC is usually tuned to different needs; we can reach out to other HPC sites that are running Rocky to ask them to promote Rocky and establish a dialog to get an idea of what things they are running into on Rocky

Matt: for HPC out of the box there are a few projects already doing that, and we don't need to run in circles over what we are going to do

Balaji: a SIG for scientific applications that focuses on supporting the applications and optimization, while HPC suggests the architecture to reach max capabilities

Greg: Agreeing with Stack, we don't want to provide applications when there are tools that do that

    Gregory Kurtzer (Chat): A simple strategy might be just to start assembling a list of packages we want to include as part of SIG/HPC, and be open minded as this list expands.

    Neil Hanlon(Chat): actually have to leave now, but, if we make some sort of toolkit, it has to be quite unopinionated... OpenStack-Ansible is a good example of being unopinionated about how you run your openstack cluster(s), but give you all the tools to customize and tune to your unique situation, too

    "},{"location":"events/meeting-notes/2023-04-20/#remarks","title":"Remarks:","text":"
* A point raised: should we rebuild some packages that are already in EPEL or not, and shall our repo have a higher priority or not\n* We need to think more about conflicts with other SIGs, like lustre and SIG/Storage\n
    "},{"location":"events/meeting-notes/2023-04-20/#action-items","title":"Action items:","text":"
* List of applications \u201cThread on MM to post pkgs\u201d\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026\n* Reach out to other communities \u201cGreg\u201d\n* Reaching out for different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors\n* Statistics / public registry for sites / HPC to add themselves if they want\n* Meeting will be bi-weekly \u201cTentative: Thursday 9:00PM UTC\u201d\n* Documentation\n
    "},{"location":"events/meeting-notes/2023-05-04/","title":"SIG/HPC meeting 2023-05-04","text":""},{"location":"events/meeting-notes/2023-05-04/#attendees","title":"Attendees:","text":"
    * Neil Hanlon\n* Matt Bidwell\n* Stack\n* Sherif\n* Nick Eggleston\n* Gregory Kurtzer\n* Forrest Burt\n
    "},{"location":"events/meeting-notes/2023-05-04/#package-lists","title":"Package lists","text":"
    * Slurm - Epel\n* Apptainer - Epel\n* Lustre - lustre.org , no server for el9 \n* Warewulf - HPCNG github only el8\n* Easybuild \n* OpenHPC\n* Spack\n* openmpi *with slurm support*\n* glusterfs-server gluster-selinux\n* NIS, ypserv , ypbind, yptools nss_nis\n* fail2ban\n* Lmod\n* conda\n* sstack\n
    "},{"location":"events/meeting-notes/2023-05-04/#discussions","title":"Discussions:","text":"

Greg: suggesting to have our own slurm, apptainer, singularity, Warewulf

Greg: We can reach out to DDN about anything related to Lustre

    Sherif: Suggesting to start building packages

    Nick: To build the community we need to start looking into documentation and forums

Stack: we need to be careful and have strong justification for rebuilding stuff that exists in EPEL

Greg: asked how HPC centres prefer to manage / or are already managing their slurm setup

A few members mentioned one of the following two methods: * Keep upgrading on minor versions of slurm * Keep upgrading on minor versions of slurm, then a major upgrade in a scheduled maintenance window

Greg and Nick: adding the major-minor version in the package name, something like python2/3

Sherif: Asking about testing methodology with the testing team

Stack: They hope at some point to be able to test all SIGs and are working on getting an OpenQA build for this

    "},{"location":"events/meeting-notes/2023-05-04/#action-items","title":"Action items:","text":"
* Start building slurm\n* Start building apptainer\n* Start building singularity\n* Start building warewulf\n* Greg: reach out for OpenHPC - done\n* Sherif: check about forums\n
    "},{"location":"events/meeting-notes/2023-05-04/#old-business","title":"Old business:","text":"
* List of applications \u201cThread on MM to post pkgs\u201d - We have an idea now of which packages we need to build -\n* Building blocks, which are each pkg as a building block such as lustre, openHPC, slurm, etc\u2026 - We have an idea of what we need to do -\n* Reach out to other communities \u201cGreg\u201d - ongoing -\n* Reaching out for different sites that use Rocky for HPC \u201cStack will ping a few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet - \n* Statistics / public registry for sites / HPC to add themselves if they want - nothing done yet -\n* Meeting will be bi-weekly \u201cTentative: Thursday 9:00PM UTC\u201d - Agreed -\n* Documentation - Wiki is in place but still needs some work -\n
    "},{"location":"events/meeting-notes/2023-05-18/","title":"SIG/HPC meeting 2023-05-18","text":""},{"location":"events/meeting-notes/2023-05-18/#attendees","title":"Attendees:","text":"
    * Stack\n* Forrest Burt\n* Nick Eggleston\n* David H\n* Jeremy Siadal\n* Al Bowles\n* Chris Simmons\n* Sherif\n
    "},{"location":"events/meeting-notes/2023-05-18/#discussions","title":"Discussions:","text":"

Chris: Are we willing to support the whole openHPC stack or just the modules, and how do we imagine achieving this?

Jeremy: Clearing a bit of the distro-related stuff, such as automake / autoconf, out of openHPC would be great

Stack: We need to have a baseline so people can start using Rocky on HPC, and make Rocky accessible

    Chris: A Demo / technical talk in 4 weeks

    Chris: Are we going to focus on 8 and 9?

Stack and Chris: it would be great if we can focus on 9

    Sherif: I hope we can do both but with 9 in the spotlight \"this needs to be a SIG decision\"

Stack: Question - if we start moving openHPC within the HPC SIG, are they going to support more distros? We don't want to break packages for other EL distros

    Chris: so far testing on Rocky as the only supported EL distro

    "},{"location":"events/meeting-notes/2023-05-18/#action-items","title":"Action items:","text":"
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be the point of contact if we need those in a couple of weeks\"\n
    "},{"location":"events/meeting-notes/2023-05-18/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-05-18/#2023-05-04","title":"2023-05-04","text":"
* Start building slurm - Ongoing, slowed down a bit by the R9.2 and R8.8 releases; however, packages are built and some minor configurations need to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-05-18/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "},{"location":"events/meeting-notes/2023-06-01/","title":"SIG/HPC meeting 2023-06-01","text":""},{"location":"events/meeting-notes/2023-06-01/#attendees","title":"Attendees:","text":"
    * Jeremy Siadal\n* Sherif\n* Gregory Kurtzer\n* David DeBonis\n* Chris Simmons\n
    "},{"location":"events/meeting-notes/2023-06-01/#discussions","title":"Discussions:","text":"

    Getting toolchains outside of openHPC such as automake

Greg: We need to discuss whether we need to have a generic SIG for toolchains

Greg: We need to look into adding more release packages such as the Intel compiler

Brainstorming ideas about optimizing binaries

David: What would be the interest in having a lightweight kernel for HPC?

Jeremy: mentioning Intel's lightweight kernel, https://github.com/intel/mos

Chris: asking if there are any benchmarks or hard numbers comparing the shipped kernel and a lightweight kernel; so far, nothing solid

Sherif: Slurm now builds, but not into the standard path, and we agreed we are going to move to the standard path

Greg: make sure you have the Provides set correctly

    Chris: also make sure that downgrade works

Greg and Chris: we can also contribute to openHPC documentation

    "},{"location":"events/meeting-notes/2023-06-01/#action-items","title":"Action items:","text":"
* Get a list of packages from Jeremy to pick up from openHPC\n* Greg / Sherif talk in Rocky / RESF about a generic SIG for common packages such as toolchains\n* Plan the openHPC demo Chris / Sherif\n* Finalise the slurm package with naming / configuration\n
    "},{"location":"events/meeting-notes/2023-06-01/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-06-01/#2023-05-18","title":"2023-05-18:","text":"
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\"\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be the point of contact if we need those in a couple of weeks\"\n
    "},{"location":"events/meeting-notes/2023-06-01/#2023-05-04","title":"2023-05-04","text":"
* Start building slurm - Ongoing, slowed down a bit by the R9.2 and R8.8 releases; however, packages are built and some minor configurations need to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-06-01/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "},{"location":"events/meeting-notes/2023-06-15/","title":"SIG/HPC meeting 2023-06-15","text":""},{"location":"events/meeting-notes/2023-06-15/#attendees","title":"Attendees:","text":"
    * Chris Simmons\n* Nick Eggleston\n* Forrest Burt\n* Stack\n* David DeBonis\n* Jeremy Siadal\n* Greg Kurtzer\n* Sherif\n
    "},{"location":"events/meeting-notes/2023-06-15/#discussions","title":"Discussions:","text":"

Chris gave a quick demo / presentation about openHPC

    Jeremy sent the packages

    Greg: asked how the SIG's slurm is compatible with openHPC

    Sherif needs to look at openHPC slurm packages

Chris: we need to look at how to build easybuild and look into how to improve it

Chris and Greg: talking about whether there is any standard that explains how to build systems compatible with each other; openHPC does follow best practices from different entities

Chris provided https://github.com/holgerBerger/hpc-workspace, which is now part of openHPC

Sherif mentioned the forums category is now in place: https://forums.rockylinux.org/c/sig/hpc/61

    "},{"location":"events/meeting-notes/2023-06-15/#action-items","title":"Action items:","text":"
* Sherif to look into the openHPC slurm spec file\n* We need to get lists of centres and HPC sites that are moving to Rocky to make a blog post and PR\n
    "},{"location":"events/meeting-notes/2023-06-15/#old-business","title":"Old business:","text":""},{"location":"events/meeting-notes/2023-06-15/#2023-06-01","title":"2023-06-01:","text":"
* Get a list of packages from Jeremy to pick up from openHPC - Done\n* Greg / Sherif talk in Rocky / RESF about a generic SIG for common packages such as toolchains\n* Plan the openHPC demo Chris / Sherif - Done\n* Finalise the slurm package with naming / configuration - Done\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-05-18","title":"2023-05-18:","text":"
* Get a demo / technical talk after 4 weeks \"Sherif can arrange that with Chris\" - Done\n* Getting a list of packages that openHPC would like to move to distros \"Jeremy will be the point of contact if we need those in a couple of weeks\" - Done\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-05-04","title":"2023-05-04","text":"
* Start building slurm - Ongoing, slowed down a bit by the R9.2 and R8.8 releases; however, packages are built and some minor configurations need to be fixed -\n* Start building apptainer - on hold -\n* Start building singularity - on hold -\n* Start building warewulf - on hold -\n* Sherif: check about forums - done, we can have our own section if we want, can be discussed over the chat -\n
    "},{"location":"events/meeting-notes/2023-06-15/#2023-04-20","title":"2023-04-20","text":"
    * Reach out to other communities \u201cGreg\u201d - on going -\n* Reaching out for different sites that uses Rocky for HPC \u201cStack will ping few of them and others as well -Group effort-\u201d\n* Reaching out to hardware vendors - nothing done yet -\n* Statistic / public registry for sites / HPC to add themselves if they want - nothing done yet -\n
    "}]} \ No newline at end of file diff --git a/sitemap.xml.gz b/sitemap.xml.gz index a8b6f6145f05d48e11e8c7dd62ec7711fbcba3d8..76e476382bda959d0c3e89287340b4eba182a816 100644 GIT binary patch delta 15 WcmbQuG@FS{zMF%?N^>KdC?fzJGy^LD delta 15 WcmbQuG@FS{zMF&Nv-(ChQAPkEfCJ(H