Automation of Level0 Data Load in Essbase


I was loading Level0 data into Essbase and was forced to wait for a lonnnnng time, and I'm a lot impatient these days... probably because of more work...

So I thought, why not automate it instead of waiting for each single file to finish loading?


Level0 export is tuned in such a way that if Essbase anticipates that an export file will exceed 2 GB, it creates two or more export files as needed. The requested file name is used for the main file, and an underscore and a sequential number are appended to the names of the additional files, starting with _1.
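For example, if the export was requested as mCERA.dat (the naming used later in this post), a large export could leave files like these in the application directory:

mCERA.dat      (main export file)
mCERA_1.dat    (first additional file)
mCERA_2.dat    (second additional file, and so on)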

Either I have to hard code the file names in my Maxl or I have to generate a dynamic Maxl.


I didn't like the first option, because I wouldn't be able to use it for long. And Maxl doesn't like wildcards: e.g., if I give it Cera*.dat in the load data statement, it won't accept it.

So I created a UNIX script (forgive me, Windows guys...) to generate a dynamic Maxl. After generating the Maxl, add a line to call it. 🙂


Here you go.



#!/bin/sh


################################################################################
#
# Script      : CERALoadLevel0.sh
# Author      : Celvin Kattookaran - Original Version
# Description : Load Level0 data into Essbase
# Date        : June 2010
# ------------------------------------------------------------------------------
# Outputs     : Returns 0 if successful, 1 if Error occurred
# ------------------------------------------------------------------------------
# Updates     :
# Author                 Date          Description
# ------------------------------------------------------------------------------
# celvin.v.kattookaran   10-Jun-2010   Creation
#
################################################################################




#####################################################
########## GLOBAL VARIABLES DEFINITION ##############
#####################################################


# ------------------------------------------------------------------------------
# -- Main directories definition --
# ------------------------------------------------------------------------------


BASE_DIR=/essbase/CERA; export BASE_DIR
CERARD_APP_DIR=/essbase/Hyperion/AnalyticServices/app/CERA_RD; export CERARD_APP_DIR


# ------------------------------------------------------------------------------
# -- Directories definition --
# ------------------------------------------------------------------------------
# -- Under {BASE_DIR}


ESS_SCRIPTS_DIR=${BASE_DIR}/scripts; export ESS_SCRIPTS_DIR
ESS_LOG_DIR=${BASE_DIR}/logs


# ------------------------------------------------------------------------------
# -- File Definitions
# ------------------------------------------------------------------------------
# -- Script file definitions
# ------------------------------------------------------------------------------
# -- Main log file for script


SCRIPTLOG="${ESS_LOG_DIR}/$(basename $0 .sh)_`date '+%Y%m%d_%H%M'`.txt"; export SCRIPTLOG


########################################
# Please don't update below this point #
########################################


# Initialize Variables
EC=0      # returned value of script
echo "" > ${SCRIPTLOG}


#####################################################
############# FUNCTIONS DEFINITION ##################
#####################################################


exit_process()
{
# exit process with returned value=1 (process failed)
msg_screen_and_log ">>>>> Exit process with code=1"
EC=1
cd ${BASE_DIR}/logs
chmod 660 *
echo
exit $EC
}


msg_screen_and_log()
{
# Echo message with formatted time stamp to LOG file and command line
DTSTMP=`date '+%Y%m%d %T'`
echo "$DTSTMP: $*"
echo "$DTSTMP: $*" >> $SCRIPTLOG
}


msg_log_only()
{
# Echo message with formatted time stamp to LOG file only
DTSTMP=`date '+%Y%m%d %T'`
echo "$DTSTMP: $*" >> $SCRIPTLOG
}




get_level0_Filenames()
{
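  # Collect the names of all Level 0 export files (mCERA.dat, mCERA_1.dat, ...) into the file "level0filename"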
  cd $CERARD_APP_DIR
  find ${CERARD_APP_DIR} -name "mCERA*.dat" -type f -print > level0filename

  if [ -s level0filename ]; then
    msg_screen_and_log "* Found Level 0 export files and they will be loaded to Essbase"
  else
    msg_screen_and_log "* There are no Level 0 export files"
    exit_process
  fi
}


create_dataload_maxl()
{
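# Generate a MaxL script with one import statement per Level 0 export file found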
printf "spool on to '/essbase/CERA/logs/level0load.txt';\n" > ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl
printf "login PMSAdmin identified by password on localhost;\n\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl

for i in `cat level0filename`
do
   printf "import database sample.basic data from data_file '$i' on error write to '/essbase/CERA/logs/level0loaderr.err';\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl
done

printf "logout;\n\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl
printf "spool off;\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl
printf "exit;\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl
}




#####################################################
############# MAIN PART OF SCRIPT ###################
#####################################################


echo


msg_screen_and_log "*** Start Level 0 load Process...."
get_level0_Filenames
create_dataload_maxl
msg_screen_and_log "*** END of Level 0 load Process...."

# Exit
msg_screen_and_log "Processing Completed Successfully with exit code=$EC"
exit $EC

##################################################################################################################
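For reference, the generated CERA_Load_Level0.mxl would look something like this (assuming two export files, mCERA.dat and mCERA_1.dat, were found under CERARD_APP_DIR):

spool on to '/essbase/CERA/logs/level0load.txt';
login PMSAdmin identified by password on localhost;

import database sample.basic data from data_file '/essbase/Hyperion/AnalyticServices/app/CERA_RD/mCERA.dat' on error write to '/essbase/CERA/logs/level0loaderr.err';
import database sample.basic data from data_file '/essbase/Hyperion/AnalyticServices/app/CERA_RD/mCERA_1.dat' on error write to '/essbase/CERA/logs/level0loaderr.err';
logout;

spool off;
exit;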

Additional Info

If you would like to make use of this script:

CERARD_APP_DIR - this should point to the place where the data files are residing.
printf "login PMSAdmin identified by password on localhost;\n\n" >> ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl - change the user name, password and server accordingly.
Change the base directory accordingly, and create "scripts" and "logs" folders under BASE_DIR.
I've not included the commands for encrypting the Maxl and running it; you can use the encrypted strings in the above printf statements (then change the extension mxl to mxls). A minimal sketch of the call is shown after this list.
Change the file name pattern accordingly (mCERA*.dat).
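To actually run the load, you could add something like this at the end of the script, right after create_dataload_maxl (a minimal sketch, assuming essmsh is on the PATH and the Maxl is left unencrypted):

# Run the generated Maxl and capture its output in the script log
essmsh ${ESS_SCRIPTS_DIR}/CERA_Load_Level0.mxl >> ${SCRIPTLOG} 2>&1
if [ $? -ne 0 ]; then
   msg_screen_and_log "* Level 0 data load failed, check ${SCRIPTLOG}"
   exit_process
fi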
Hope this helps.

