Alternative compilers - simple example
GCC is the most supported compiler
The ITSR Applications Team have compiled the majority of applications available via modules on Apocrita using GCC. Whilst other compilers are available on Apocrita as outlined below, you will receive the best support from us should you choose to use GCC as well.
In all of the examples so far in this tutorial, we have used the GCC compiler. Using other compilers is also possible, but it requires a different workflow.
For the purposes of the examples below, we are going to try to compile and run a very simple C++ MPI binary, first with our recommended GCC and Open MPI stack, and then with the Intel compiler and Intel MPI.
Comparative GCC and Open MPI example
For the purposes of comparison, a spack.yaml template for compiling and running this example using our recommended GCC and Open MPI setup can be found in the spack-config-templates repository, inside the environment-templates sub-directory for the version of Spack you are using, called helloworld-gcc-ompi.yaml:
Contents of spack.yaml for Hello World GCC/Open MPI
```yaml
spack:
  compilers:
  - compiler:
      spec: gcc@=11.4.1
      paths:
        cc: /bin/gcc
        cxx: /bin/g++
        f77: /bin/gfortran
        fc: /bin/gfortran
      flags: {}
      operating_system: rocky9
      target: x86_64
      modules: []
      environment: {}
      extra_rpaths: []
  - compiler:
      spec: gcc@=12.2.0
      paths:
        cc: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc/12.2.0-6frskzg/bin/gcc
        cxx: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc/12.2.0-6frskzg/bin/g++
        f77: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc/12.2.0-6frskzg/bin/gfortran
        fc: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc/12.2.0-6frskzg/bin/gfortran
      flags: {}
      operating_system: rocky9
      target: x86_64
      modules: []
      environment: {}
      extra_rpaths: []
  concretizer:
    unify: true
  config:
    install_tree:
      root: /data/scratch/${USER}/spack/apps
      projections:
        ^mpi: '{architecture}/{compiler.name}-{compiler.version}/{name}/{version}-{^mpi.name}-{^mpi.version}-{hash:7}'
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}/{version}-{hash:7}'
    license_dir: /data/scratch/${USER}/spack/licenses
    source_cache: /data/scratch/${USER}/spack/cache
  modules:
    prefix_inspections:
      ./share/aclocal:
      - ACLOCAL_PATH
      ./lib:
      - LD_LIBRARY_PATH
      - LIBRARY_PATH
      ./lib64:
      - LD_LIBRARY_PATH
      - LIBRARY_PATH
      ./include:
      - C_INCLUDE_PATH
      - CPLUS_INCLUDE_PATH
  packages:
    all:
      target: [x86_64_v4]
    openmpi:
      buildable: false
      prefer:
      - '+gpfs'
      externals:
      - spec: "openmpi@5.0.3%gcc@12.2.0"
        prefix: /share/apps/rocky9/general/libs/openmpi/gcc/12.2.0/5.0.3
    ucx:
      buildable: false
      externals:
      - spec: "ucx@1.16.0%gcc@12.2.0"
        prefix: /share/apps/rocky9/general/libs/ucx/gcc/12.2.0/1.16.0
  specs:
  - gcc@12.2.0
  - openmpi
  upstreams:
    apocrita:
      install_tree: /share/apps/rocky9/spack/apps
  view:
    default:
      root: .spack-env/view
```
If we save this as /data/scratch/${USER}/spack-environments/helloworld-gcc-ompi/spack.yaml and then activate, spec and install the environment:
Output of activate and spec Spack commands
```console
$ spack env activate -p /data/scratch/${USER}/spack-environments/helloworld-gcc-ompi
[helloworld-gcc-ompi] $ spack spec
Input spec
--------------------------------
 -   openmpi

Concretized
--------------------------------
[e]  openmpi@5.0.3%gcc@12.2.0~atomics~cuda+gpfs~internal-hwloc~internal-libevent~internal-pmix~java~legacylaunchers~lustre~memchecker~openshmem~orterunprefix~romio+rsh~static+vt+wrapper-rpath build_system=autotools fabrics=none romio-filesystem=none schedulers=none arch=linux-rocky9-x86_64_v4

Input spec
--------------------------------
 -   gcc@12.2.0

Concretized
--------------------------------
[^]  gcc@12.2.0%gcc@11.4.1~binutils+bootstrap~graphite~nvptx~piclibs~profiled~strip build_system=autotools build_type=RelWithDebInfo languages='c,c++,fortran' arch=linux-rocky9-x86_64_v4
[^]      ^diffutils@3.10%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^gawk@5.3.0%gcc@11.4.1~nls build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^libsigsegv@2.14%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^readline@8.2%gcc@11.4.1 build_system=autotools patches=bbf97f1 arch=linux-rocky9-x86_64_v4
[^]      ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
[e]      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
[^]      ^gmp@6.2.1%gcc@11.4.1+cxx build_system=autotools libs=shared,static patches=69ad2e2 arch=linux-rocky9-x86_64_v4
[^]      ^autoconf@2.72%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^automake@1.16.5%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^m4@1.4.19%gcc@11.4.1+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7 arch=linux-rocky9-x86_64_v4
[^]      ^libtool@2.4.7%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^findutils@4.9.0%gcc@11.4.1 build_system=autotools patches=440b954 arch=linux-rocky9-x86_64_v4
[^]      ^mpc@1.3.1%gcc@11.4.1 build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]      ^mpfr@4.2.1%gcc@11.4.1 build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]      ^autoconf-archive@2023.02.20%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^perl@5.38.0%gcc@11.4.1+cpanm+opcode+open+shared+threads build_system=generic patches=714e4d1 arch=linux-rocky9-x86_64_v4
[^]      ^berkeley-db@18.1.40%gcc@11.4.1+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-rocky9-x86_64_v4
[^]      ^bzip2@1.0.8%gcc@11.4.1~debug~pic+shared build_system=generic arch=linux-rocky9-x86_64_v4
[^]      ^gdbm@1.23%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^texinfo@7.0.3%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^gettext@0.22.5%gcc@11.4.1+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^libxml2@2.10.3%gcc@11.4.1+pic~python+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^tar@1.34%gcc@11.4.1 build_system=autotools zip=pigz arch=linux-rocky9-x86_64_v4
[^]      ^pigz@2.8%gcc@11.4.1 build_system=makefile arch=linux-rocky9-x86_64_v4
[^]      ^xz@5.4.6%gcc@11.4.1~pic build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]      ^ncurses@6.5%gcc@11.4.1~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=linux-rocky9-x86_64_v4
[^]      ^pkgconf@2.2.0%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^zlib-ng@2.1.6%gcc@11.4.1+compat+new_strategies+opt+pic+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^zstd@1.5.6%gcc@11.4.1+programs build_system=makefile compression=none libs=shared,static arch=linux-rocky9-x86_64_v4
```
Output of spack install
```console
[helloworld-gcc-ompi] $ spack install -j ${NSLOTS}
==> Concretized openmpi
[e]  f6rfqov  openmpi@5.0.3%gcc@12.2.0~atomics~cuda+gpfs~internal-hwloc~internal-libevent~internal-pmix~java~legacylaunchers~lustre~memchecker~openshmem~orterunprefix~romio+rsh~static+vt+wrapper-rpath build_system=autotools fabrics=none romio-filesystem=none schedulers=none arch=linux-rocky9-x86_64_v4
==> Concretized gcc@12.2.0
[^]  6frskzg  gcc@12.2.0%gcc@11.4.1~binutils+bootstrap~graphite~nvptx~piclibs~profiled~strip build_system=autotools build_type=RelWithDebInfo languages='c,c++,fortran' arch=linux-rocky9-x86_64_v4
[^]  7zermel      ^diffutils@3.10%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  cfhpr5n      ^gawk@5.3.0%gcc@11.4.1~nls build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  m2gvtuy      ^libsigsegv@2.14%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  ea7drzj      ^readline@8.2%gcc@11.4.1 build_system=autotools patches=bbf97f1 arch=linux-rocky9-x86_64_v4
[^]  llid4hw      ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
[e]  xri56vc      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  xchit5a      ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
[^]  ibyfd4t      ^gmp@6.2.1%gcc@11.4.1+cxx build_system=autotools libs=shared,static patches=69ad2e2 arch=linux-rocky9-x86_64_v4
[^]  h32ralr      ^autoconf@2.72%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  y5sqj65      ^automake@1.16.5%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  fdljdyg      ^m4@1.4.19%gcc@11.4.1+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7 arch=linux-rocky9-x86_64_v4
[^]  c43op4r      ^libtool@2.4.7%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  62lolye      ^findutils@4.9.0%gcc@11.4.1 build_system=autotools patches=440b954 arch=linux-rocky9-x86_64_v4
[^]  cbwhyvl      ^mpc@1.3.1%gcc@11.4.1 build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]  u7r7eqd      ^mpfr@4.2.1%gcc@11.4.1 build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]  oo5zdhe      ^autoconf-archive@2023.02.20%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  eyk53wh      ^perl@5.38.0%gcc@11.4.1+cpanm+opcode+open+shared+threads build_system=generic patches=714e4d1 arch=linux-rocky9-x86_64_v4
[^]  cogxxtx      ^berkeley-db@18.1.40%gcc@11.4.1+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-rocky9-x86_64_v4
[^]  uj4wyhx      ^bzip2@1.0.8%gcc@11.4.1~debug~pic+shared build_system=generic arch=linux-rocky9-x86_64_v4
[^]  bx77xc6      ^gdbm@1.23%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  lpjte4q      ^texinfo@7.0.3%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  udcuonu      ^gettext@0.22.5%gcc@11.4.1+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  q6zmsq6      ^libxml2@2.10.3%gcc@11.4.1+pic~python+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  ivzmnos      ^tar@1.34%gcc@11.4.1 build_system=autotools zip=pigz arch=linux-rocky9-x86_64_v4
[^]  somkvv4      ^pigz@2.8%gcc@11.4.1 build_system=makefile arch=linux-rocky9-x86_64_v4
[^]  rwn7pno      ^xz@5.4.6%gcc@11.4.1~pic build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^]  4n2uzha      ^ncurses@6.5%gcc@11.4.1~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=linux-rocky9-x86_64_v4
[^]  7wg26bz      ^pkgconf@2.2.0%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  g2yruc3      ^zlib-ng@2.1.6%gcc@11.4.1+compat+new_strategies+opt+pic+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  my7tyw6      ^zstd@1.5.6%gcc@11.4.1+programs build_system=makefile compression=none libs=shared,static arch=linux-rocky9-x86_64_v4
[+] /share/apps/rocky9/general/libs/openmpi/gcc/12.2.0/5.0.3 (external openmpi-5.0.3-f6rfqovydeohhtdzs6yl2qiro2hxltpm)
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc/12.2.0-6frskzg
==> Updating view at /data/scratch/abc123/spack-environments/helloworld-gcc-ompi/.spack-env/view
```
To ensure that all required environment variables are set for Open MPI, it is best to deactivate the environment and then re-activate it:
```shell
spack env deactivate
spack env activate -p /data/scratch/${USER}/spack-environments/helloworld-gcc-ompi
```
Create a file called hello_world_mpi.cpp in a folder of your choice; for this example we will use the path /data/scratch/${USER}/hello_world_cpp/hello_world_mpi.cpp. Use the sample code from the previously mentioned example:
hello_world_mpi.cpp
```cpp
#include <stdio.h>
#include <mpi.h>

int main(int argc, char** argv){
    int process_Rank, size_Of_Cluster;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size_Of_Cluster);
    MPI_Comm_rank(MPI_COMM_WORLD, &process_Rank);

    for(int i = 0; i < size_Of_Cluster; i++){
        if(i == process_Rank){
            printf("Hello World from process %d of %d\n", process_Rank, size_Of_Cluster);
        }
        MPI_Barrier(MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}
```
Then, much as in our previous cmake example, the Spack environment enables us to compile the simple example:
hello_world_ompi.exe compile
```console
[helloworld-gcc-ompi] $ cd /data/scratch/${USER}/hello_world_cpp
[helloworld-gcc-ompi] [hello_world_cpp] $ mpic++ hello_world_mpi.cpp -o hello_world_ompi.exe
```
You can then run the compiled executable as instructed:
hello_world_ompi.exe output
```console
[helloworld-gcc-ompi] [hello_world_cpp] $ mpirun -np ${NSLOTS} ./hello_world_ompi.exe
Hello World from process 0 of 8
Hello World from process 1 of 8
Hello World from process 2 of 8
Hello World from process 3 of 8
Hello World from process 4 of 8
Hello World from process 5 of 8
Hello World from process 6 of 8
Hello World from process 7 of 8
```
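One caveat worth noting: NSLOTS is populated by the scheduler inside a parallel job, so `mpirun -np ${NSLOTS}` fails if run in a plain interactive shell. A minimal sketch of a guarded launch, where the fallback of 2 processes is our suggestion and not part of the tutorial:

```shell
# NSLOTS is set by the scheduler inside a parallel job; outside one it is
# usually unset. The ":-2" fallback is our suggestion for interactive
# testing, not part of the tutorial.
NP="${NSLOTS:-2}"
echo "would run: mpirun -np ${NP} ./hello_world_ompi.exe"
```

Inside a job, `${NSLOTS:-2}` expands to the slot count the scheduler granted, so the guarded form behaves identically to the command shown above.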
Intel and Intel MPI example
Let's run through the same thing again, but this time we will compile using the Intel compiler and Intel MPI.
A spack.yaml template for compiling and running this example using Intel and Intel MPI can be found in the spack-config-templates repository, inside the environment-templates sub-directory for the version of Spack you are using, called helloworld-intel-intelmpi.yaml:
Contents of spack.yaml for Hello World Intel/Intel MPI
```yaml
spack:
  compilers:
  - compiler:
      spec: oneapi@=2024.1.0
      paths:
        cc: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-compilers/2024.1.0-bdqsx5f/compiler/2024.1/bin/icx
        cxx: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-compilers/2024.1.0-bdqsx5f/compiler/2024.1/bin/icpx
        f77: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-compilers/2024.1.0-bdqsx5f/compiler/2024.1/bin/ifx
        fc: /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-compilers/2024.1.0-bdqsx5f/compiler/2024.1/bin/ifx
      flags: {}
      operating_system: rocky9
      target: x86_64
      modules: []
      environment: {}
      extra_rpaths: []
  concretizer:
    unify: true
  config:
    install_tree:
      root: /data/scratch/${USER}/spack/apps
      projections:
        ^mpi: '{architecture}/{compiler.name}-{compiler.version}/{name}/{version}-{^mpi.name}-{^mpi.version}-{hash:7}'
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}/{version}-{hash:7}'
    license_dir: /data/scratch/${USER}/spack/licenses
    source_cache: /data/scratch/${USER}/spack/cache
  modules:
    prefix_inspections:
      ./share/aclocal:
      - ACLOCAL_PATH
      ./lib:
      - LD_LIBRARY_PATH
      - LIBRARY_PATH
      ./lib64:
      - LD_LIBRARY_PATH
      - LIBRARY_PATH
      ./include:
      - C_INCLUDE_PATH
      - CPLUS_INCLUDE_PATH
  packages:
    all:
      target: [x86_64_v4]
  specs:
  - intel-oneapi-compilers
  - intel-oneapi-mpi
  upstreams:
    apocrita:
      install_tree: /share/apps/rocky9/spack/apps
  view:
    default:
      root: .spack-env/view
```
So, what has changed?

- The compilers section just lists oneapi@=2024.1.0, the Intel compiler
- The packages section no longer lists our external Open MPI and UCX versions...
- ...because we have replaced Open MPI with intel-oneapi-mpi in our specs
- intel-oneapi-compilers replaces the gcc@12.2.0 we had in the first GCC/Open MPI example above
- Remember, Intel packages commence with intel-oneapi-* - for more information refer back to the Custom Scopes tutorial
If we save this as /data/scratch/${USER}/spack-environments/helloworld-intel-intelmpi/spack.yaml and then activate, spec and install the environment:
Output of activate and spec Spack commands
```console
$ spack env activate -p /data/scratch/${USER}/spack-environments/helloworld-intel-intelmpi
[helloworld-intel-intelmpi] $ spack spec
Input spec
--------------------------------
 -   intel-oneapi-mpi

Concretized
--------------------------------
[^]  intel-oneapi-mpi@2021.12.1%gcc@12.2.0+envmods~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky9-x86_64_v4
[^]      ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[e]      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4

Input spec
--------------------------------
 -   intel-oneapi-compilers

Concretized
--------------------------------
[^]  intel-oneapi-compilers@2024.1.0%gcc@12.2.0+envmods build_system=generic arch=linux-rocky9-x86_64_v4
[^]      ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[e]      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^patchelf@0.17.2%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]      ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
[^]      ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
```
Output of spack install
```console
[helloworld-intel-intelmpi] [abc123@ddy58 ~]$ spack install -j $NSLOTS
==> Concretized intel-oneapi-mpi
[^]  kmurskw  intel-oneapi-mpi@2021.12.1%gcc@12.2.0+envmods~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky9-x86_64_v4
[^]  w77gg5r      ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[e]  xri56vc      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
==> Concretized intel-oneapi-compilers
[^]  bdqsx5f  intel-oneapi-compilers@2024.1.0%gcc@12.2.0+envmods build_system=generic arch=linux-rocky9-x86_64_v4
[^]  w77gg5r      ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[e]  xri56vc      ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  nlkm7ih      ^patchelf@0.17.2%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^]  xchit5a      ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
[^]  llid4hw      ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-mpi/2021.12.1-kmurskw
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/intel-oneapi-compilers/2024.1.0-bdqsx5f
==> Updating view at /data/scratch/abc123/spack-environments/helloworld-intel-intelmpi/.spack-env/view
```
To ensure that all required environment variables are set for Intel and Intel MPI, it is best to deactivate the environment and then re-activate it:
```shell
spack env deactivate
spack env activate -p /data/scratch/${USER}/spack-environments/helloworld-intel-intelmpi
```
We can use the same hello_world_mpi.cpp as above to again compile the simple example (this time we will use the name hello_world_intelmpi.exe):
Use mpiicpx instead of mpiicc
The linked example uses mpiicc to compile, which is now deprecated. As we are using version 2024.1.0 for this tutorial, you will need to use mpiicpx instead, as detailed below.
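Scripts shared between toolchain versions can guard against the wrapper rename. A minimal sketch, assuming only that `mpiicpx` (the current LLVM-based wrapper) and `mpiicc` (the wrapper used by the linked example) are the two candidates; the guard itself is our suggestion, not part of the tutorial:

```shell
# Prefer the current wrapper (mpiicpx); fall back to the deprecated one
# only if the new wrapper is absent from PATH.
# This selection logic is our suggestion, not part of the tutorial.
if command -v mpiicpx >/dev/null 2>&1; then
  MPICXX=mpiicpx
else
  MPICXX=mpiicc
  echo "warning: mpiicpx not found; falling back to deprecated mpiicc" >&2
fi
echo "selected wrapper: ${MPICXX}"
```

With the environment activated as above, `"${MPICXX}" hello_world_mpi.cpp -o hello_world_intelmpi.exe` then resolves to the compile command shown below.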
hello_world_intelmpi.exe compile
```console
[helloworld-intel-intelmpi] $ cd /data/scratch/${USER}/hello_world_cpp
[helloworld-intel-intelmpi] [hello_world_cpp] $ mpiicpx hello_world_mpi.cpp -o hello_world_intelmpi.exe
```
You can again run the compiled executable as instructed:
hello_world_intelmpi.exe output
```console
[helloworld-intel-intelmpi] [hello_world_cpp] $ mpirun -np ${NSLOTS} ./hello_world_intelmpi.exe
Hello World from process 0 of 8
Hello World from process 1 of 8
Hello World from process 2 of 8
Hello World from process 3 of 8
Hello World from process 4 of 8
Hello World from process 5 of 8
Hello World from process 6 of 8
Hello World from process 7 of 8
```