ATLAS Software

From LHEP Wiki

Revision as of 09:49, 18 March 2015

Interactive machines at LHEP

 lheppc1.unibe.ch
 lheppc7.unibe.ch


Basic SVN

 svn mkdir -m "Add OQMaps dir" http://svn.lhep.unibe.ch/subv/atlas/SUSY2011/cutflow/OQMaps
 svn co http://svn.lhep.unibe.ch/subv/atlas/SUSY2011/cutflow/OQMaps
 svn add * # All files you have in OQMaps. Directories must be created like above.
 svn ci -m "Add OQMaps"
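
The same round-trip can be rehearsed locally before touching the server. A minimal sketch using a hypothetical scratch repository under /tmp in place of svn.lhep.unibe.ch (all paths and the map.txt file are placeholders):

```shell
# create a throwaway repository and repeat the mkdir / co / add / ci cycle
rm -rf /tmp/oqmaps-repo /tmp/oqmaps-wc
svnadmin create /tmp/oqmaps-repo
svn mkdir -m "Add OQMaps dir" file:///tmp/oqmaps-repo/OQMaps
svn co file:///tmp/oqmaps-repo/OQMaps /tmp/oqmaps-wc
touch /tmp/oqmaps-wc/map.txt          # stand-in for your real files
svn add /tmp/oqmaps-wc/map.txt
svn ci -m "Add OQMaps" /tmp/oqmaps-wc
```

After the commit the repository head is at revision 2 (one revision for the mkdir, one for the commit).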


LCG Grid Access

To set up the LCG grid environment, type the following:

  source /terabig/atlsoft/LocalSoft/glite/glite-UI-current/external/etc/profile.d/grid-env.sh && unset GT_PROXY_MODE


Nordugrid Grid Access

To send jobs to the Swiss clusters, you need to set up the NorduGrid standalone client:

  cd /terabig/atlsoft/LocalSoft/nordugrid-arc/nordugrid-arc-ui-current
  source setup.sh

Beware 1: You have to cd into the NorduGrid directory and source the setup script from that location; otherwise it does not work.

Beware 2: Both LCG and NorduGrid use the same instance of your grid certificate, so you do not have to change or renew your grid proxy; however, you cannot set up both environments in the same xterm.


Prepare and check your grid certificate

To prepare your grid certificate, follow this tutorial (you can adapt the procedure to our interactive nodes):

 https://twiki.cern.ch/twiki/bin/viewauth/Atlas/FullGangaAtlasTutorial#3_1_Preparing_your_Grid_Certific

To verify the expiry date of the grid certificate installed in the .globus subdirectory of your home directory (on the interactive nodes and/or lxplus):

 cd ~/.globus
 openssl x509 -in usercert.pem -noout -enddate
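
If you want to try the check without your real certificate, the same openssl x509 invocations work on any certificate. A self-contained sketch with a throwaway self-signed certificate standing in for usercert.pem (the /tmp paths are placeholders):

```shell
# generate a disposable self-signed certificate valid for 365 days
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" -days 365 \
    -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem

# print the expiry date, exactly as for ~/.globus/usercert.pem
openssl x509 -in /tmp/demo-cert.pem -noout -enddate

# exit status 0 if the certificate is still valid 30 days from now
openssl x509 -in /tmp/demo-cert.pem -noout -checkend 2592000
```

The -checkend form is handy in cron jobs or login scripts, since it reports validity via the exit status instead of a date you have to parse.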

Follow this link[1] for a summary of useful openssl commands.


Use DQ2

To be able to use the dq2 commands, you first need to set up grid access. Then just run:

  source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
  export DQ2_LOCAL_SITE_ID=UNIBE-LHEP_LOCALGROUPDISK

You are then able to search for datasets, look at the number of files in a dataset, etc. (follow this link[2] if you do not know the dq2 commands).

However, if you want to download a dataset locally, you also have to set up the Athena environment, to work around the Python release mismatch between dq2 and the grid environment. You can set up Athena either just before or just after the dq2 setup, as you prefer.

Not sure the above is still relevant...


Athena setup with CVMFS

It is highly recommended to use CVMFS instead of AFS when using an Athena kit on one of the SLC5 machines at LHEP. CVMFS is a caching file system which hosts all Athena releases and conditions-data flat files. It was initially developed for the CernVM [3] virtualization software.

Athena Setup

To set up a particular release and test area, use the following commands, replacing 17.0.2 and /path/to/my/testarea appropriately:

 source /cvmfs/atlas.cern.ch/repo/sw/software/i686-slc5-gcc43-opt/17.0.2/cmtsite/asetup.sh 17.0.2
 asetup 17.0.2 --testarea /path/to/my/testarea

Alternative Athena Setup (suitable for start-up scripts, e.g. .bash_profile)

 export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
 source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
 asetup 17.0.2 --single --testarea /path/to/my/testarea

Conditions DB files

The conditions DB flat files can be found in:

 /cvmfs/atlas-condb.cern.ch/repo/conditions/ 

Development Nightly Builds (64 bit)

 source /cvmfs/atlas-nightlies.cern.ch/repo/sw/nightlies/x86_64-slc5-gcc43-opt/17.X.0/rel_4/cmtsite/setup.sh -tag=AtlasProduction,rel_4,opt,gcc43,slc5,64
 asetup AtlasProduction,rel_4,64 --gccversion 4.3.5 --gcclocation /cvmfs/atlas-nightlies.cern.ch/repo/sw/nightlies/x86_64-slc5-gcc43-opt/17.X.0/atlas-gcc-435/slc5/ 

Bugfix Nightly Builds

 source /cvmfs/atlas-nightlies.cern.ch/repo/sw/nightlies/i686-slc5-gcc43-opt/17.0.X/rel_4/cmtsite/setup.sh -tag=AtlasProduction,rel_4,opt,gcc43,slc5,32 


How to use ROOT libraries from gcc

CINT, the C++ interpreter included with ROOT, is not fully reliable, in my view for at least two reasons. First, it tolerates some "bad code writing" that is hard to debug once you change your ROOT release. Second, CINT, like Python, is not a real compiler: it interprets your code for each event. So if your code becomes sophisticated (kinematic fits, ...), CINT spends much more time interpreting the code for each event than actually running on it. Instead, you can use ROOT as libraries from gcc, in a CMT environment, as follows:

  source /terabig/atlsoft/LocalSoft/tools/config/setup.sh

Then go into your package/cmt directory (an example can be found here [4]). This package, written for the SUSY/Trigger analysis, has been documented using Doxygen; the documentation can be found here [5] (page reachable only from the unibe.ch domain).

  cmt config
  source setup.sh
  gmake    # or make
  cd ../run

Finally, run your code by calling your program (the name defined in the cmt/requirements file).


Obsolete pages

Version 13 Bern ATLAS Cluster - SLC4 (OBSOLETE)

  • ATLAS Software 13.0.x uses a ROOT version which wants to get the home directory of the user. With the LDAP setup on lheppc*, nsswitch (I think) calls the nss_ldap library, which is not installed on standard SLC4; ROOT crashes because of this. You need to install it (yum install nss_ldap) on the SLC4 nodes of the LHEP cluster.
  • MooEventCompilation needs crti.o, which is provided by the package glibc-devel.i386, so this needs to be installed as well.
  • There are a few symlinks missing, which cause MooEventCompile to crash. Thanks to Szymon for the fix.
  #as root, execute the following lines to make ATLAS software release 13 work
  yum install nss_ldap 
  yum install glibc-devel.i386
  cd /usr/lib   
  ln -s /usr/lib/libjpeg.so.62 libjpeg.so
  ln -s /usr/lib/libfreetype.so.6 libfreetype.so
  ln -s /usr/lib/libg2c.so.0 libg2c.so
  cd /usr/X11R6/lib/
  ln -s /usr/X11R6/lib/libXm.so.3 libXm.so

If you want to automate the process, copy this code into a file fixnode.sh and execute it as root:

  yes | sh fixnode.sh

Version 14 Bern ATLAS Cluster - SLC4 (OBSOLETE)

  • ATLAS Software 14.x needs a few packages. They are listed in /ngsys/slcrequiredpackages/reqpack.txt
  #to install, run as root:
  cd /ngsys/slcrequiredpackages
  sh install.sh

Version 13 UBELIX Cluster - Gentoo Linux (OBSOLETE)

Changes with every release. Follow the runtime environment script to see all the tweaks made. (/home/ubelix/lhep/nordugrid/software/APPS/HEP)