
25 March 2013

370. Compiling ECCE 6.4 on Scientific Linux 6.3 (Carbon; Red Hat; CentOS)

!NOTE! If you provide ECCE with 'localhost' as the hostname, be aware that this will block outside access: http://www.nwchem-sw.org/index.php/Special:AWCforum/st/id858/#post_3178

This build is almost identical to the one on Fedora 18. I'm posting it separately since I suspect a fair number of nwchem/ecce users are on Scientific Linux/CentOS/Red Hat. The main difference between Fedora 18 and Scientific Linux 6.3 is that you don't have to worry about getopts.pl errors on Scientific Linux. Yes, no manual patching, which probably reflects the more conservative approach in SL/RHEL compared with Fedora.


1. Get the files
Go to http://ecce.emsl.pnl.gov/using/download.shtml, fill out the form, download ecce-v6.4-src.tar.bz2 and put it in ~/tmp.

2. Install dependencies and build
sudo yum install vim csh gcc gcc-c++ gcc-gfortran java-1.7.0-openjdk-devel python-devel ant gtk2-devel libjpeg-turbo-devel libtool ImageMagick libXt-devel xterm mesa-libGLU-devel
cd ~/tmp
tar xvf ecce-v6.4-src.tar.bz2
cd ecce-v6.4/
export ECCE_HOME=`pwd`
cd build/
./build_ecce
Do you want to skip these checks for future build_ecce invocations (y/n)?
Answer yes

./build_ecce
Xerces built
./build_ecce
Mesa OpenGL built
./build_ecce
wxWidgets built
./build_ecce
wxPython built
./build_ecce
Apache HTTP server built
./build_ecce
Copying JMS server distribution jms_server.tar.bz2
Making server tar file ecce_server.tar
Copying WebHelp tar file eccewebhelp.tar.bz2
Making combined tar file ecce.v6.4.tar
Copying NWChem binary distribution nwchem-6.1.1-binary-rhel5-gcc4.1.2-m32.tar.bz2
Copying NWChem common distribution nwchem-6.1.1-binary-common.tar.bz2
Concatenating install script and combined tar file ecce.v6.4.tar
create_ecce_bin finished
ECCE built and distribution created in /home/me/tmp/ecce-v6.4

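If you'd rather not type ./build_ecce over and over, you could wrap the repeated invocations in a small loop -- a rough sketch, assuming the later runs don't prompt for anything, stopping at the first failure:
cd $ECCE_HOME/build
for n in 1 2 3 4 5 6; do ./build_ecce || break; done
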
3. Install
cp ~/tmp/ecce-v6.4/install_ecce.v6.4.csh ~/
cd ~
./install_ecce.v6.4.csh
Extracting ECCE distribution from ./install_ecce.v6.4.csh...
hostname: Unknown host

Main ECCE installation menu
===========================
1) Help on main menu options
2) Prerequisite software check
3) Full install
4) Full upgrade
5) Application software install
6) Application software upgrade
7) Server install
8) Server upgrade

IMPORTANT: If you are uncertain about any aspect of installing or
running ECCE at your site, please refer to the detailed ECCE
Installation and Administration Guide at
http://ecce.pnl.gov/docs/installation/2864B-Installation.pdf

Hit <return> at prompts to accept the default value in brackets.

Selection: [1] 3
Host name: [localhost (could not successfully ping science!)] localhost
Application installation directory: [/home/me/ecce-v6.4/apps]
Server installation directory: [/home/me/ecce-v6.4/server]

ECCE v6.4 will be installed using the settings:
  Installation type: [full install]
  Host name: [localhost]
  Application installation directory: [/home/me/ecce-v6.4/apps]
  Server installation directory: [/home/me/ecce-v6.4/server]
Are these choices correct (yes/no/quit)? [yes] yes

Put the following in your ~/.bashrc:
export ECCE_HOME=/home/me/ecce-v6.4/apps
export PATH=${ECCE_HOME}/scripts:${ECCE_HOME}/scripts/parsers:$PATH
alias stopecce='/home/me/ecce-v6.4/server/ecce-admin/stop_ecce_server'
alias startecce='/home/me/ecce-v6.4/server/ecce-admin/start_ecce_server'
and do
source ~/.bashrc
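
As a quick sanity check that the new settings are picked up (paths as above):
echo $ECCE_HOME
which ecce
The first should print /home/me/ecce-v6.4/apps and the second a path under the scripts directory.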

Make sure that your hostname is in your /etc/hosts file, e.g.
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4 science!
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
where science! is my hostname (I regret using ! in the hostname, but oddly I haven't actually had issues with it yet. Still, don't do it). If your hostname isn't in the hosts file, you'll get a ton of errors.
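
A quick way of checking that the name actually resolves (getent ships with SL/CentOS/RHEL):
hostname
getent hosts `hostname`
If the second command returns nothing, fix /etc/hosts as shown above before starting ECCE.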

4. Run.
You can now start ecce by doing
startecce
ecce

I've tested this on a headless box (a small Intel Atom with 512 MB RAM) and it worked fine. I tried running calculations locally as well, and everything worked.
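
Since the box is headless, the GUI still needs an X display somewhere; presumably the usual SSH X forwarding does the job, e.g. (atombox is just a placeholder for your host):
ssh -X me@atombox
startecce
ecce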

03 June 2012

171. Building ECCE on ROCKS/CentOS


I've installed ECCE in a couple of places: locally on a single workstation running ROCKS, and remotely on a 40-core cluster running ROCKS. The local, workstation install worked fine. I never bothered much about the cluster install, and only recently looked closer at it. Well, I can launch the 'gateway' but nothing else -- when I click on e.g. the organizer button I get the ROCKS version of an hourglass that never goes away -- and I don't get any error messages. Turning on logging doesn't yield anything either.

Ergo, I figured that building it myself might yield a different result. It didn't on the ROCKS cluster, but everything worked just fine on the single-node ROCKS training box I keep in my office.


CentOS is a bit dated, so you'll need to build your own apr and apr-util. Build apr:
cd /share/apps/utils/
wget http://mirror.mel.bkb.net.au/pub/apache//apr/apr-1.4.6.tar.gz
wget http://mirror.mel.bkb.net.au/pub/apache//apr/apr-util-1.4.1.tar.gz
tar xvf apr-1.4.6.tar.gz
cd apr-1.4.6/
./configure --prefix=/share/apps/utils/apr
make
make install
cd ../
tar xvf apr-util-1.4.1.tar.gz
cd apr-util-1.4.1/
./configure --prefix=/share/apps/utils/apr-util --with-apr=/share/apps/utils/apr/
make
make install


Time for ecce.
First download the ECCE source (ecce-v6.3-src.tar.bz2) and put it in /share/apps/ecce/, then:
cd /share/apps/ecce/
tar xvf ecce-v6.3-src.tar.bz2
cd ecce-v6.3/
export ECCE_HOME=/share/apps/ecce/ecce-v6.3
cd build/

Edit build_ecce
889       ./configure --prefix=$ECCE_HOME/${ECCE_SYSDIR}3rdparty/httpd --enable-rewrite --enable-dav --enable-ss-compression
to
889       ./configure --prefix=$ECCE_HOME/${ECCE_SYSDIR}3rdparty/httpd --enable-rewrite --enable-dav --enable-ss-compression --with-apr=/share/apps/utils/apr/bin/apr-1-config --with-apr-util=/share/apps/utils/apr-util/bin/apu-1-config
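
Before re-running build_ecce it's worth checking that the two wrapper scripts are actually there and respond (the paths follow from the --prefix values used above):
/share/apps/utils/apr/bin/apr-1-config --version
/share/apps/utils/apr-util/bin/apu-1-config --version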

./build_ecce
Just follow the instructions, i.e. hit return over and over again. Answer no to running the tests again. Then run build_ecce again:
./build_ecce
Now stuff should be building. Do this another six times. From the README:
"At this stage the script will build one 3rd party package per invocation,
exiting after each package is built.  In order the 3rd party packages that
will be built are:
1. Apache Xerces XML parser
2. Mesa OpenGL
3. wxWidgets C++ GUI toolkit
4. wxPython GUI toolkit
5. Apache HTTP web server"
The httpd build ends with a minor error about "lib" missing. It's fine.

The sixth time ECCE itself is built, and that's the step that takes by far the longest. It finishes with:
 ECCE built and distribution created in /share/apps/ecce/ecce-v6.3
On a single-node desktop it seemed I could run it a seventh time, but that last step still finished with the message above.

Go to your /share/apps/ecce/ecce-v6.3/ dir where you'll find install_ecce.v6.3.csh
Do the install
csh -f install_ecce.v6.3.csh
Follow the instructions.

You may also want to
sudo mv /etc/csh.* ~/
to get rid of the crappy csh config files.

Edit your ~/.bashrc:

alias startecceserver='csh -f /share/apps/ecce/ecce-v6.3/server/ecce-admin/start_ecce_server'
alias stopecceserver='csh -f /share/apps/ecce/ecce-v6.3/server/ecce-admin/stop_ecce_server'
export ECCE_HOME=/share/apps/ecce/ecce-v6.3/apps
export PATH=$PATH:${ECCE_HOME}/scripts

and your ~/.cshrc:

setenv ECCE_HOME /share/apps/ecce/ecce-v6.3/apps
set path = (/share/apps/nwchem/nwchem-6.1/bin/LINUX64 $path)

On my single-node box I had to edit the apps/siteconfig/DataServers and replace eccetera.emsl.pnl.gov with localhost (two instances), as well as the apps/siteconfig/jndi.properties file (one instance).
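
If you'd rather script that, something along these lines should do it (sed -i edits in place, so back the files up first; paths assume the install locations above):
cd /share/apps/ecce/ecce-v6.3
cp apps/siteconfig/DataServers{,.bak}
cp apps/siteconfig/jndi.properties{,.bak}
sed -i 's/eccetera.emsl.pnl.gov/localhost/g' apps/siteconfig/DataServers apps/siteconfig/jndi.properties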

In spite of the hassle on the single-node box, everything works there -- the builder, organizer etc. all open just fine. The ROCKS cluster looks fine, but doesn't work.

The ROCKS Cluster:
Everything seems to work fine -- starting ecce launches the gateway, but clicking on anything just makes the CentOS version of the hourglass churn over and over for all eternity. Nothing happens.

I looked through these two threads, and I also tried the pre-built 32-bit binary. All without luck.

I've also tried editing the site_runtime file:
ECCE_MESA_OPENGL true
ECCE_MESA_EXCEPT x86_64:RedHat:Fedora:CentOS
(matches the lsb_release -is output)
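
Two quick diagnostics that might be worth running on the cluster (glxinfo comes from the glx-utils package; this only shows what the distro and GL setup report, it isn't a fix):
lsb_release -is
glxinfo | grep -i 'opengl renderer'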