Wednesday, September 22, 2010

Turn Vim into a bash IDE

http://www.linux.com/archive/articles/114359

By Joe 'Zonker' Brockmeier on June 11, 2007 (9:01:00 PM)

By itself, Vim is one of the best editors for shell scripting. With a little tweaking, however, you can turn Vim into a full-fledged IDE for writing scripts. You could do it yourself, or you can just install Fritz Mehner's Bash Support plugin.
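
A minimal installation sketch, assuming the plugin archive was downloaded from vim.org and saved as ~/Downloads/bash-support.zip (the archive name and path are assumptions, not part of the article):

# unpack the plugin into the personal Vim runtime directory
mkdir -p ~/.vim
unzip ~/Downloads/bash-support.zip -d ~/.vim
# enable filetype plugins so the plugin's mappings are loaded for shell scripts
echo 'filetype plugin on' >> ~/.vimrc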



Friday, June 4, 2010

How To Set Environment Variables in PBS Batch Scripts?

In order to use certain software packages on the Shared Computing Clusters, it is necessary to set one or more environment variables. This is accomplished automatically with the module command. It is often necessary, however, for these environment variables to be visible from within a PBS batch script, yet the module command does not work from within a batch script on a compute node. The following document describes the preferred way of setting environment variables inside a PBS batch script:
Inheriting Environment Variables
The preferred way of setting environment variables in a PBS batch script is to simply have the script inherit the environment of the parent shell from the login node. For example, you might execute the following command on a login node: module load mpich. This will set the necessary environment variables to use the mpich package. In order for these variables to be inherited in the PBS batch script that will execute on a compute node, include the #PBS -V option in the batch script as follows:
#!/bin/bash
#PBS -N Test
#PBS -l nodes=2:ppn=1
#PBS -S /bin/bash
#PBS -V
#PBS -m abe
#PBS -o /home/username/workingdir
#PBS -e /home/username/workingdir
echo "I ran on:"
cat $PBS_NODEFILE
mpiexec ./myjob.exe
When the job is submitted, it will inherit all environment variables that were set in the parent shell on the login node, not just those set by the module command. Care should be taken to make sure that there are no other environment variables in the parent shell that could cause problems in the batch script.
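
A minimal sketch of the corresponding commands on the login node (the script name test.pbs is an assumption):

# load the software environment first, then submit; #PBS -V carries the
# resulting environment variables over to the compute node
module load mpich
qsub test.pbs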

Thursday, June 3, 2010

Script to launch MOLCAS in an MVAPICH/Torque/Maui env.

***********************************************************
#!/bin/bash

#PBS -l nodes=1:ppn=8
#PBS -l walltime=00:10:00
#PBS -q batch
#PBS -j oe
#PBS -o err.out

echo "Execution host: $PBS_O_HOST"
echo "User ID that ran qsub: $PBS_O_LOGNAME"
echo "Home directory where qsub was run: $PBS_O_HOME"
echo "Working directory where qsub was run: $PBS_O_WORKDIR"
echo "Original queue submitted to: $PBS_O_QUEUE"
echo "Queue the job is executing from: $PBS_QUEUE"
echo "The job's PBS identifier: $PBS_JOBID"
echo "The job's name: $PBS_JOBNAME"
echo "The contents of the nodefile are:"
cat $PBS_NODEFILE
NP=`(wc -l < $PBS_NODEFILE) | awk '{print $1}'`
echo "Number of processors to run the job:"
echo $NP
export Project=CH4
export MOLCAS=/data1/aldo/Source-Code/molcas/molcas74
echo ' ---------------------------------------'
echo ' Job:' $Project
echo ' ' `cat $MOLCAS/.molcashome`
echo ' MolCASMem=' $MOLCASMEM
echo ' Date:' `date`
echo ' ---------------------------------------'
export MOLCAS_PROJECT=CH4
echo "Molcas MOLCAS_PROJECT = " $Project export MOLCAS_WORKDIR=$PBS_O_WORKDIR/tmp_$Project
echo "Molcas MOLCAS_WORKDIR = " $MOLCAS_WORKDIR
# Number of processors the job is going to run on
NP=`(wc -l < $PBS_NODEFILE) | awk '{print $1}'`
export CPUS=$NP
cd $PBS_O_WORKDIR
$MOLCAS/sbin/molcas.driver $Project.input
echo ' ---------------------------------------'
echo ' Job done: ' $Project
echo ' Date: ' `date`
echo ' ---------------------------------------'
************************************************************
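
To use the script above, save it under some name (molcas_ch4.pbs here is only an example) and hand it to the queue system:

qsub molcas_ch4.pbs
qstat -u $USER      # check the state of the job in the queue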

Improved script to launch MOLCAS in an MVAPICH/NoPBS env.

#!/bin/bash
export Project=project_name
export MOLCAS=/data1/aldo/Source-Code/molcas/molcas74

#An example of using a /scratch area as the parent for WorkDir, keeping (NO) or
#removing (YES) the WorkDir before a calculation, and keeping/removing it when the
#calculation has finished; the directory name is generated from the project name:
CurrDir=`pwd`
export MOLCAS_WORKDIR=$CurrDir/scr_$Project
export MOLCAS_NEW_WORKDIR=YES
export MOLCAS_KEEP_WORKDIR=YES
echo ' ---------------------------------------'
echo ' Job:' $Project
echo ' ' `cat $MOLCAS/.molcashome`
echo ' MolCASMem=' $MOLCASMEM
echo ' Date:' `date`
echo ' ---------------------------------------'
export PBS_NODEFILE=$CurrDir/nodelist
export CPUS=8
cd $CurrDir
# run the calculation, keeping the output and error files next to the input file
$MOLCAS/sbin/molcas.driver $Project.input > $Project.out 2> $Project.err
echo ' ---------------------------------------'
echo ' Job done: ' $Project
echo ' Date: ' `date`
echo ' ---------------------------------------'
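
A possible way to run this version, assuming it was saved as run_molcas.sh in the same directory as the nodelist file and the $Project.input file (the script name is an assumption):

chmod +x run_molcas.sh
./run_molcas.sh
tail -f project_name.out      # follow the MOLCAS output as it is written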


Wednesday, June 2, 2010

MOLCAS script to launch MVAPICH/NoPBS job

This script launches MOLCAS on N CPUs (controlled by the variable $CPUS) using the MVAPICH parallel paradigm.
It is useful for testing a MOLCAS installation on a fully occupied cluster!

*************************************************************************
#!/bin/bash
MOLCAS=/data1/aldo/Source-Code/molcas/molcas74
export MOLCAS

#creates the temporary output files inside the working directory
MOLCAS_WORKDIR=
export MOLCAS_WORKDIR

#How many CPUS do you want for the calculation?
CPUS=2
export CPUS

# As I'm not using the PBS queue system at this time, I have to write a file with
# the names of the hosts that will be used for the calculation (nodelist)
# and put it inside the working area, so molcas can find it!
PBS_NODEFILE=/data1/aldo/molcas/test-ch4/nodelist
export PBS_NODEFILE

#Execute the application by calling the driver script, here specified with its full path
# because it is not in a system-wide location... yet!

/data1/aldo/Source-Code/molcas/molcas74/sbin/molcas.driver molcas-input

# Ready to rock !!
***************************************************************
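
For reference, the nodelist file is just a machinefile with one hostname per line; repeating a hostname is the usual way to place more than one process on the same node. A hypothetical nodelist for a 2-CPU run on two nodes (the hostnames are placeholders) could look like:

node01
node02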

MOLCAS command line " molcas help basis XX"

** type  molcas help basis A  to list the basis sets for the element A
*********************************************************************
$> molcas help basis Pd
Recommended basis sets for  Pd
Pd.ANO-RCC-MB                            Pd.ANO-RCC...5s4p2d.                   
Pd.ANO-RCC-VDZ                           Pd.ANO-RCC...6s5p3d.                   
Pd.ANO-RCC-VDZP                          Pd.ANO-RCC...6s5p3d1f.                 
Pd.ANO-RCC-VTZP                          Pd.ANO-RCC...7s6p4d2f1g.               
Pd.ANO-RCC-VQZP                          Pd.ANO-RCC...8s7p5d3f2g1h.               

 Other basis sets for  Pd
Pd.ECP.HW.5s6p4d.3s3p2d.18e-LANL2DZ.
Pd.ANO-DK3.Tsuchiya.23s19p12d.4s3p2d.
Pd.ano-rcc.Roos.21s18p13d6f4g2h.10s9p9d6f4g2h.
Pd.ECP.Dolg.8s7p6d.6s5p3d.18e-MWB.                                                    
Pd.ECP.Barandiaran.11s8p7d.1s2p2d.16e-CG-AIMP.                                
Pd.ECP.Barandiaran.11s8p7d3f.1s2p2d1f.16e-CG-AIMP.                            
Pd.ECP.Barandiaran.11s8p7d.1s2p2d.16e-NR-AIMP.                                
Pd.ECP.Rakowitz.11s8p7d3f.5s4p4d1f.18e-NP-AIMP.                               
Pd.ECP.Stoll.8s7p6d.6s5p3d.18e-MWB.
Pd.Raf-r.Wahlgren.17s13p9d2f.7s6p4d2f.                                   
Pd.ECP.Hay-Wadt.5s6p4d.3s3p2d.18e-LANL2DZ.
*******************************************************************
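
As a rough sketch (written from memory, so the exact formatting may differ from the manual), one of the labels listed above could be used in a native SEWARD input more or less like this; the single-atom coordinates are illustrative only:

 &SEWARD &END
Basis set
Pd.ANO-RCC-VDZP
Pd   0.000000   0.000000   0.000000
End of basis
End of input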

MOLCAS command line " molcas help"

Useful Commands
******************************************************************
$> molcas help
Welcome to molcas help!

type  molcas help module  to get the list of keywords
type  molcas help module keyword  to get the description of keyword
type  molcas help -t text  to locate text
type  molcas help ENVIRONMENT  to list ENV variables
type  molcas help EMIL  to list EMIL commands
type  molcas help basis A  to list basis set for the element A

Installed modules:
ALASKA CASPT2 CASVB CCSDT CHCC CHECK CHT3 CIISCMNG CPF EMIL ESPF ESPF
EXPBAS FFPT GATEWAY GENANO GRID_IT GUESSORB GUGA LOCALISATION LOPROP M2SO
MBPT2 MCKINLEY MCLR MOTRA MRCI MULA NUMERICAL_GRADIENT PARALLELTEST RASSCF
RASSI SCF SEWARD SLAPAF VIBROT

Utilities:
edit verify installpkg help find copy install timing defdoc snooper revert
getpatch uninstallpkg checksum uninstall gv
*********************************************************************
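
Following the syntax above, a few concrete invocations (SCF is taken from the module list, and "workdir" is just an example search string):

molcas help scf              # list the keywords of the SCF module
molcas help ENVIRONMENT      # list the MOLCAS environment variables
molcas help -t workdir       # locate the text "workdir" in the documentation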