ATLAS Software

Latest revision as of 15:19, 23 June 2016

Interactive machines at LHEP

 ui01.lhep.unibe.ch
 ui02.lhep.unibe.ch
 ui03.lhep.unibe.ch

Basic SVN

 svn mkdir -m "Add OQMaps dir" http://svn.lhep.unibe.ch/subv/atlas/SUSY2011/cutflow/OQMaps
 svn co http://svn.lhep.unibe.ch/subv/atlas/SUSY2011/cutflow/OQMaps
 svn add * # All files you have in OQMaps. Directories must be created as shown above.
 svn ci -m "Add OQMaps"


LCG Grid Access

To set up the LCG grid environment, set up CVMFS first:

  export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
  source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
  alias asetup='source $AtlasSetup/scripts/asetup.sh'

Then type:

  lsetup emi
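Once the emi environment is loaded, a typical next step is to create a VOMS proxy for the ATLAS VO. The commands below are standard VOMS client calls; the snippet is guarded so it is harmless on a machine where the tools are not yet set up.

```shell
# Create and inspect a VOMS proxy for the atlas VO (requires a valid grid
# certificate in ~/.globus and ATLAS VO membership). Guarded so nothing
# happens where the VOMS clients are not available.
if command -v voms-proxy-init >/dev/null 2>&1; then
    voms-proxy-init -voms atlas   # create a proxy with ATLAS VO attributes
    voms-proxy-info -all          # show lifetime, VO attributes, proxy path
else
    echo "VOMS clients not found - run 'lsetup emi' first"
fi
```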


Prepare and check your grid certificate

To prepare your grid certificate, follow this tutorial (you can adapt the procedure to our interactive nodes):

 https://twiki.cern.ch/twiki/bin/view/AtlasComputing/WorkBook
 https://twiki.cern.ch/twiki/bin/view/AtlasComputing/WorkBookStartingGrid (Preparing your certificate)

To verify the expiry date of the grid certificate installed in the .globus subdirectory of your home directory (on the interactive nodes and/or lxplus):

 cd ~/.globus
 openssl x509 -in usercert.pem -noout -enddate

Follow this link[1] for a summary of useful openssl commands.
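The expiry check above can be wrapped in a small helper that warns well before the certificate lapses. This is a local sketch, not part of any official setup; the 30-day threshold and the function name check_cert are arbitrary choices.

```shell
# Hypothetical helper: warn when a certificate is within 30 days of expiry.
# "openssl x509 -checkend N" exits 0 if the certificate is still valid
# N seconds from now, and non-zero otherwise.
check_cert() {
    cert="$1"
    days=30
    if openssl x509 -in "$cert" -noout -checkend $((days * 86400)) >/dev/null; then
        echo "OK: certificate valid for at least $days more days"
    else
        echo "WARNING: certificate expires within $days days - renew it"
    fi
}

# Typical use on the interactive nodes:
# check_cert ~/.globus/usercert.pem
```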


Use Rucio

To use the rucio commands, you need to set up CVMFS first:

  export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
  source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
  alias asetup='source $AtlasSetup/scripts/asetup.sh'

Then type:

  lsetup rucio
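With the client loaded, the most common operations look like the following. The `user.yourname:...` dataset names are placeholders, not real datasets; the commands themselves are standard Rucio client calls, and the snippet is guarded so it is harmless where the client is absent.

```shell
# Common Rucio operations (dataset names are placeholders).
if command -v rucio >/dev/null 2>&1; then
    rucio whoami                                # account you are mapped to
    rucio list-dids 'user.yourname:*'           # search datasets by pattern
    rucio list-files user.yourname:mydataset    # list files in a dataset
    rucio download user.yourname:mydataset      # fetch the dataset locally
else
    echo "rucio client not available - run 'lsetup rucio' first"
fi
```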


Athena setup with CVMFS

CVMFS is a caching file system that hosts all Athena releases and conditions-data flat files. It was initially developed for the CernVM [2] virtualization software.

Athena Setup

To set up a particular release and test area, use the following commands, replacing 17.0.2 and /path/to/my/testarea as appropriate:

  export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
  source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
  alias asetup='source $AtlasSetup/scripts/asetup.sh'
  asetup 17.0.2 --testarea /path/to/my/testarea


Conditions DB files

The conditions DB flat files can be found in:

 /cvmfs/atlas-condb.cern.ch/repo/conditions/ 


How to use ROOT libraries from gcc

CINT, the C++ interpreter shipped with ROOT, is not fully reliable, for at least two reasons. First, it tolerates sloppy code that becomes hard to debug once you change ROOT release. Second, CINT, like Python, is not a real compiler: it interprets your code for every event. If your code becomes sophisticated (kinematic fits, ...), CINT then spends far more time interpreting the code than actually processing the event. You can instead use ROOT as a set of libraries from gcc, in a CMT environment, as follows:

  source /terabig/atlsoft/LocalSoft/tools/config/setup.sh

Then go into your package/cmt directory (an example can be found here [3]). This package, written for the SUSY/Trigger analysis, has been documented using Doxygen. The documentation can be found here [4] (page reachable only from the unibe.ch domain).
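For reference, a minimal cmt/requirements file for such a standalone ROOT package might look like the sketch below. The package and application names are hypothetical, and the exact `use` statement should be checked against an existing LHEP package, since it depends on the local CMT configuration.

```
package MyRootAnalysis

# Pull in ROOT through the LCG interface glue (assumed to be provided by
# the setup script sourced above).
use ROOT v* LCG_Interfaces

# Build an executable named MyAnalysis from the package sources; this is
# the "program name" referred to in the text below.
application MyAnalysis ../src/*.cxx
```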

  cmt config
  source setup.sh
  gmake          # or make
  cd ../run

Finally, run your code by calling your program (the name defined in the cmt/requirements file).