EPMAutomate, PBCS, Backups and You!

G’day All,

So Cameron Lackpour suckered (sorry, engaged) another extremely talented EPM specialist – Chris Rothermel – to write a guest post.

It walks through a simple yet elegant code set to perform Automated Daily Timestamped Backups on PBCS. Given the consolidation of Oracle Cloud products, this process could easily be extended to any Oracle Cloud application that utilises LCM.

Absolutely brilliant, worth the read. It might just be me, but reading any cleanly documented, clearly written, optimised code is always a soothing experience. Errr.

Reaching out

Anyways, because we are all tragic and in dire need of better hobbies (read: technical enthusiasts) – I got in touch with Chris to discuss it further.

To quickly run through his process (a rough sketch in code follows the list):

  1. Login to PBCS using EPMAutomate
  2. Download the latest Artifact Snapshot
  3. Rename the snapshot on the local machine with a datestamp
  4. Upload the renamed snapshot to PBCS
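
In EPMAutomate terms, a minimal sketch of that flow might look like the following. This is my PowerShell-flavoured paraphrase rather than Chris's actual code, and it assumes epmautomate is on the path and the login variables are set elsewhere:

# A paraphrase of the download-rename-upload flow, not Chris's actual code
$datestamp = Get-Date -Format 'yyyyMMdd'            # locale-safe datestamp
& epmautomate login $user $pass $url $domain         # credentials set elsewhere
& epmautomate downloadfile "Artifact Snapshot"       # pull down the maintenance snapshot
Rename-Item 'Artifact Snapshot.zip' "${datestamp}_Backup.zip"
& epmautomate uploadfile "${datestamp}_Backup.zip"   # push the renamed copy back up
& epmautomate logout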

My only problem with the process is literally the bandwidth cost. Even backing up a single Prod and Test environment pair, you could be downloading and uploading gigabytes per day. If I want to extend this functionality at my day job to our large (and growing quickly!) PBCS client base, we could be looking at 50GB+ of file movement a day – which is problematic.

What to do…

So the first and easiest option – instead of downloading the Artifact Snapshot to rename it, why don’t we just rename the snapshot on the PBCS server?

Well, unfortunately it’s not that simple.

EPMAutomate doesn’t have that feature.

Even looking through the REST API documentation, it doesn't seem possible:

Available options are limited to:
Possible values: self, import, export, upload, download, or delete depending on the operation permitted on an application snapshot

So what to do?

Okay, so we can’t rename an LCM Export. And downloading and uploading them costs too much time/bandwidth. But maybe we can trick it.

Let’s examine some of the other EPMAutomate functions – specifically exportsnapshot.

exportsnapshot
Repeats a previously performed export operation to create a snapshot of Migration content. You can download the exported snapshot from the default location.
Usage: epmautomate exportsnapshot SNAPSHOT_NAME where SNAPSHOT_NAME is the name of an existing snapshot in Migration. This snapshot is replaced by the new snapshot.
Example: epmautomate exportsnapshot October16FullApp

Maybe we could use this function to create a 'new' snapshot with our datestamp – all we have to do is convince PBCS that the snapshot has already occurred.

Easiest way to do that? Upload something that looks like an LCM export.

First crack 2 eggs…

Interestingly, PBCS fundamentally uses the same filesystem and file structure as on-premises Hyperion. Specifically, it uses the same 'get everything' Export.xml format – only a few fields are specific to an application\environment.

This is available from the Artifact Snapshot in the top level of the zip file.

Having done a bit of work with LCM backups in the past, I figured it would be a similar process within PBCS. The Export.xml file defines the export process – so all we should need to do is zip up a default Export.xml, upload it to the server, and Robert is your mother’s brother.

Classic Remix

So, let’s look at some basic code:

for /f %%g in ('powershell get-date -format yyyyMMdd') do set datestamp=%%g
Set SnapshotName=%datestamp%_Backup
7za.exe a %SnapshotName%.zip C:\Test\Backup\Export.xml
Call %workingdir%\epmautomate login %user% %pass% %url% %domain% >> %Log%
Call %workingdir%\epmautomate uploadfile C:\Test\Backup\%SnapshotName%.zip >> %Log%
Call %workingdir%\epmautomate exportsnapshot %SnapshotName% >> %Log%
Call %workingdir%\epmautomate downloadfile %SnapshotName% >> %Log%
Call %workingdir%\epmautomate logout >> %Log% 

Breaking it Down

for /f %%g in ('powershell get-date -format yyyyMMdd') do set datestamp=%%g

Firstly, a quick way to build a fully formatted datestamp in DOS batch (via PowerShell) that copes with international locales. Note the capital MM: in .NET format strings MM is the month, while lowercase mm would give you minutes.
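
If you ever lift this into a pure PowerShell script, the equivalent one-liner (with the same case-sensitivity gotcha) is:

# .NET format strings are case-sensitive: MM = month, mm = minutes
$datestamp = Get-Date -Format 'yyyyMMdd'
$snapshotName = "${datestamp}_Backup"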

Set SnapshotName=%datestamp%_Backup
7za.exe a %SnapshotName%.zip C:\Test\Backup\Export.xml

Setting the snapshot name, and creating a zip file with just the Export.xml in it. The eagle-eyed amongst you will note that there is no username and password in the Export.xml file.


Next:

Call %workingdir%\epmautomate login %user% %pass% %url% %domain% >> %Log%
Call %workingdir%\epmautomate uploadfile C:\Test\Backup\%SnapshotName%.zip >> %Log%

Login to EPMAutomate (the variables have been set earlier) and then upload the new zip file.

Call %workingdir%\epmautomate exportsnapshot %SnapshotName% >> %Log%

Trigger the LCM export of the uploaded snapshot.

And after a few minutes, the export job completes – our uploaded shell has been replaced with a genuine snapshot.

Finally:

Call %workingdir%\epmautomate downloadfile %SnapshotName% >> %Log%

The last (and completely optional!) step of downloading the snapshot to the local machine.

A quick check of the downloaded file confirms it: the file size has ballooned, and the contents are a complete artifact snapshot rather than our lone Export.xml.

Wrapping Up

So, let's talk about the new process:

  1. Create a datestamped zip file containing only a generic Export.xml
  2. Login to PBCS using EPMAutomate
  3. Upload your empty snapshot
  4. Trigger exportsnapshot against the uploaded shell, replacing it with a full export
  5. (Optional) Download the updated snapshot to the local machine

Key advantage? Because you're only uploading a shell file, your bandwidth costs drop to around 2KB per backup!

What’s next?

So the only item really left to cover off is how to remove the old files from the server. One of the key notes Chris made in his post was that the EPMAutomate listfiles command does not show the LCM files.

In my trials and tribulations (ahem, seamless process) of writing this all out, I realised that the LCM files are accessible through the REST API – so it should be easy (cough) to write a process to parse and delete them. A hedged sketch of the idea is below.
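
Consider this a teaser rather than the finished Part 2 – the base URL, API version and identity-domain credential format here are assumptions to verify against the Migration REST API documentation for your own pod:

# Assumed base URL and basic-auth credentials - adjust for your own pod
$base = "https://planning-test-mydomain.pbcs.ap1.oraclecloud.com"
$auth = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("mydomain.user@example.com:password"))
$hdr  = @{ Authorization = "Basic $auth" }

# List every snapshot LCM can see (including the ones listfiles hides)
$snaps = Invoke-RestMethod -Method Get -Headers $hdr -Uri "$base/interop/rest/11.1.2.3.600/applicationsnapshots"

# Delete datestamped backups older than a week, per our naming convention
$cutoff = (Get-Date).AddDays(-7).ToString('yyyyMMdd')
$snaps.items |
    Where-Object { $_.name -match '^(\d{8})_Backup$' -and $Matches[1] -lt $cutoff } |
    ForEach-Object {
        Invoke-RestMethod -Method Delete -Headers $hdr -Uri "$base/interop/rest/11.1.2.3.600/applicationsnapshots/$($_.name)"
    }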

And with that segue to Part 2 (Peter tries to work out Groovy, scripting and the REST APIs), I'll leave it here.

Cheers
Pete

LCM Backups for Fun and Profit…

G’day,

So – from my first few posts you'll see I'm exceptionally lazy (read: efficient) when it comes to back end processes. If it can be scripted/automated, then I'll generally do that rather than keep a manual step.

http://www.xkcd.com/1319/
Err. Where was I.

Right – as part of an I've-got-a-few-hours-spare-this-week piece, I rewrote an overnight process. Primary goals for the rewrite were to remove hard coding where possible, segment the overnight process into discrete tasks\components, and add a hefty backup component. Obviously the Essbase side of a backup process is reasonably simple and flexible. Database copies\data exports\file system copies – you've got a plethora of options available.

What about the ‘rest of it‘?

In addition to the standard relational backups, I decided to see if I could automate an LCM export of the Planning application side – the theory being that between the LCM backup and the Essbase backup I could recreate any combination of an app.

How does this all work then?

LCM is quite an elegant tool – certainly compared to the earlier alternatives (I'll give you a moment while you recover from the nightmarish memory of dataform migration pre v9!). Basically, you first define a migration framework of the artifacts you wish to either export or import, and then trigger it.

So – a very basic export framework is shown below.

[Screenshot: a basic Export.xml export definition for the Finance application]

This is going to export the entire application (Finance), including the Dataforms, Dimensions, Global Settings, Security & Relational data. I'm building this on 11.1.2.1 – hence it's not going to include the BSO Essbase data – however, later versions can also include a data export.

Note: If you've built a fairly standard Planning application, this is going to include pretty much everything you need, including Essbase calc scripts, rules files and partitions. However! If you've used HSP_UDF to define member formulas directly in Essbase, those formulas are not going to be included and you'll need to back up the otl file separately. Don't say I didn't warn you.

If there are items you don’t want to export (Security for instance) you can simply edit the lines out of this file directly. That being said, a simpler method of building this is to define and run an LCM export of the application with exactly what you’re chasing. Then download and open the export folder, and at the very top of the folder structure should be the Export.xml file.
[Screenshot: the downloaded export folder, with Export.xml at the top of the folder structure]

One final thing to notice in the export (and import) files is that the username and password are cleared out every time they are run. This is important later when we try to automate this step.

Introducing the…UTILITY

Now you've got your export definition sorted, the next step is to look at the automation. The LCM command line utility (named, entertainingly, Utility) is held in the Foundation bin folder. On a standard install it should be Oracle\Middleware\user_projects\FOUNDATION1\bin.

To save you looking through the detail of that batch script, the string required is really basic:
Utility.bat ExportFolder\Export.xml

Unfortunately, this is one of those scripts that cannot be called directly from an external server (it internally attempts to parse the drive and folder locations), so you’ll need to use psexec to call it across servers.

You'll notice that in the string above you don't have to define an output folder. Sadly this isn't because you've got some flexibility to add that to the XML file, or to set an option somewhere in Shared Services – it's because the LCM export will be written directly to the filesystem location of the Export.xml file. Yep – wherever you've stored it is where the data will end up. On the upside, that location can be across mapped network drives, so you can store the files somewhere centrally rather than clogging up your foundation server!

Hooking it all together

So – obviously with all of these back end scripting languages available we should choose something exoti….nah. This time we’re sticking with DOS batch. They didn’t call it the Quick and Dirty Operating system for nothing!

Okay, so we’ve got to handle the following items:

  • Usernames and passwords disappearing at runtime
  • Needing PSExec to run the utility (and requiring write access back across a network share)
  • Needing some steps to zip it all up and maintain only an appropriate backup
  • Having it all parameter based and scalable for multiple apps

Taking it from the top

Firstly, folder system setup:

[Screenshot: backup folder structure containing the LCM_Exports and LCM_XML folders]

I've created a nice simple structure. The folder LCM_Exports is empty and will be built up by the batch code. In LCM_XML I've got a copy of the XML export files, updated with the username and password (encrypted works, but you can also type them unencrypted). They are renamed, prefixed with the name of the application, ie:
LCM_XML\Finance_Export.xml

Now the code:

[Screenshot: runtime variable setup]

Nice and simple – we're calling the name of the application as per the naming convention in the LCM_XML folder. In addition, the variables include a file system user and password for PSExec, to make sure the Utility can write back to wherever you had the Export.xml saved. If you're running it directly from your foundation server and saving the LCM exports locally, you should be able to skip this – but it's not going to hurt.

[Screenshot: LCM export trigger code]

Where the magic starts. Here the batch code removes and rebuilds the top level folder under LCM_Exports using the %App% variable, and copies in the export file, renaming it to Export.xml. The final step is to trigger the Utility using PSExec.

[Screenshot: 7zip archive code]

After the Utility is finished, 7zip is used to create a datestamped version of the LCM export. As discussed earlier, I'm using 11.1.2.1 in this example, so I added compression in this step because by default the export comes down uncompressed. Later versions (11.1.2.2+) are already compressed (or are they??), so compressing them further would likely only be useful for updating the naming convention. The final step in this section deletes any of the zipped files that are older than 3 days – this could be changed to 30 (or more!), just make sure you keep on top of your storage space requirements.

[Screenshot: cleanup and error handling code]

Final steps: some basic cleanup and error handling. You could also add error handling for the messages returned directly from the LCM Utility, which will return the following codes (a sketch of trapping them follows the listing).

REM - Return values -
REM - 1) 0 - Success
REM - 2) 1 - Failure in processing input parameters passed to CLU
REM - 3) 2 - Failure in CLU execution
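
If you were wiring that check up in PowerShell rather than batch, a hedged sketch might look like this – the variables mirror the batch script's parameters, and psexec passes the remote process's exit code back as its own:

# Trigger the LCM utility remotely, then react to the CLU return codes above
& psexec.exe /accepteula "\\$webserver" -u $fsuser -p $fspassword `
    $LCMUtil "$backupfolder\LCM_Exports\${app}_LCM\Export.xml"

switch ($LASTEXITCODE) {
    0       { Write-Output  "LCM export of $app completed successfully" }
    1       { Write-Warning "CLU rejected the input parameters for $app" }
    2       { Write-Warning "CLU failed during execution for $app" }
    default { Write-Warning "Unexpected return code $LASTEXITCODE from psexec/CLU" }
}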

 

Conclusion
And that’s it! A scripted and fully-automated way to take backups of LCM files. This can obviously be extended to other apps or even other artifacts (think Reporting\Shared Services\EPMA) by creating individual export.xml files, dumping them into the LCM_XML folder and simply calling them.

For bonus points – the next step is automating the import side – thus giving you the ability to script a true sandbox\QA environment every night. A topic for another post!

Full version of the code available below.

:: Name: Hyperion Planning Backup Script
:: Purpose: Perform LCM Backups of the Planning applications
:: Author: Peter Nitschke - M-Power Solutions
:: Change Control
:: When Who What
:: 19/07/2014 PN Created

:: INSTRUCTIONS
:: Script is built to backup Planning LCM Extracts
:: Script will copy the export XML files from the LCM_XML Folder to LCM_Exports Folder and run the LCM Extract Utility
:: Due to a UNC limitation in the LCM utility, requires using psexec to run remotely

:: To create LCM Export XML files, trigger a standard LCM Planning Export, this creates an Export.xml file in the defined folder
:: This export.xml file needs to have the User Name and Password updated and be renamed as follows below

:: NAMING CONVENTION
:: This script will look for files of the following format in the LCM_XML Folder
:: %AppName%_Export.xml
:: It will then copy them to the following folder and rename before triggering
:: Backup\LCM_Export\%AppName%\Export.xml

:: Similar to the other overnight scripts, this script requires run time parameters
:: Variables
:: $1 = fsUser
:: $2 = fspassword
:: $3 = Env
:: $4 = Application

@echo off
SetLocal

:: If no parameters exist, fail and return an error code of 1
if [%1]==[] goto NoParam

:: Define Runtime variables
set fsuser=%1
set fspassword=%2
set essbaseserver=hyp-es-%3
set webserver=hyp-web-%3
Set app=%4

:: Set the Batch variables
for /F "tokens=1-4 delims=/ " %%a in ('date /t') do (set datestamp=%%d%%c%%b)
Set LCMUtil=d:\Oracle\Middleware\user_projects\FOUNDATION1\bin\Utility.bat
set backupfolder=\\%essbaseserver%\d$\overnight\backup

:: Remove and recreate the LCM Exports Folder
RD ..\Backup\LCM_Exports\%app%_LCM\ /s /q
MD ..\Backup\LCM_Exports\%app%_LCM\

:: Copy the %AppName%_Export.xml file and rename it to Export.xml
Copy "..\Backup\LCM_XML\%app%_Export.xml" "..\Backup\LCM_Exports\%app%_LCM\Export.xml"

:: Trigger an LCM Export
Call "psexec.exe" /accepteula \\%webserver% -u %fsuser% -p %fspassword% %LCMUtil% "%backupfolder%\LCM_Exports\%app%_LCM\Export.xml"

:: Zipping Process
:: Delete the backup file if it already exists - shouldn't happen unless script is run twice
IF exist ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z del ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z

:: Run the 7z command line
:: Add the LCM Export
7za.exe a -t7z -r ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z ..\Backup\LCM_Exports\%app%_LCM -m0=LZMA2 -mx1 -mmt=8 -md=64k -mfb=16

:: Delete any of the 7zip files that are older than 3 days
:: Can be changed, but watch the space requirements!
forfiles /p ..\Backup\AppBackups\ /m *.7z /d -3 /c "cmd /c DEL /Q @file"

ENDLOCAL
EXIT /B

:NoParam
@echo Usage: Requires following parameters to run
Echo Runtime variables: "fsuser" "fspassword" "Env" "app"
ENDLOCAL
Pause
exit /B 1

Error Handling the Max Power(shell) way

Let’s start this off with a bang

If, like most of us, part of your day-to-day role is support, one of the key advantages of using ODI in your overnight process is having the full gamut of ODI error handling available to you. Errors are nicely caught, with email warnings for failures in batch processes basically the default state.

However, what happens if ODI isn't part of your implementation? What happens if the back end is basically a bunch of batch/MaxL scripts hanging together (no matter how elegantly)? If it's the latter, at best you're unlikely to have an easy method to trap errors and send notifications to the support team – and at worst, you may not have any automated notification at all!

So your options are:

  1. Hope against hope that you never ever have any errors in your overnight processes
  2. Write your own error handling piece and include it in the processing

Rolling Your Own

So you've decided to steal cable (read: write your own). While this process would be possible in batch, you're likely to be better off selecting a more advanced language with slightly better handling of strings (see also: Java/Perl/PowerShell/Cygwin/VBA).

I’ve gone with Powershell: slightly because I’m generally on Windows environments, somewhat because it’s fairly decent at string manipulation and mostly because it’s one of the few languages I know.

Code Overview

So – to start with, basic documentation and variable structure. I've written this to poll through a folder containing log files (rules file errors, MaxL return errors and an archive copy of all of the Essbase log files from the previous day). These have been moved to this folder by earlier MaxL/batch steps, mostly by using STDERR outputs.

[Screenshot: documentation and variable setup]

Loop-de-Loop

For each of the files in the log folder it scans for any lines containing errors and pushes the entire line to a central log file.

I’ve added suppression to deal with a majority of ‘false positive’ errors – primarily related to errors in the Essbase log for Smartview user-side errors like ‘you’ve-got-suppression-on-by-default-and-your-top-level-retrieve-is-blank’ (man, if I had a nickel for that one…).

[Screenshot: the file loop and error scan]

Letting you Know

Finally it checks if the log file isn’t blank and emails it through to the email addresses listed. The primary reason I’ve got the error suppression above is to make sure you’re only receiving automated emails when something has gone wrong. I find too many false positives in your inbox mean that the actual errors occasionally get lost in the underflow.
[Screenshot: the email notification step]

Code

Finally – the full code in all its glory. 20 lines of working code all up!

# Name: Hyperion Mail Error script
# Purpose: Sends an email to the support team if there are any errors in the overnight
# Author: Peter Nitschke - M-Power Solutions

# Change Control
# When Who What
# 19/07/2014 PN Created

# INSTRUCTIONS
# Script is built as a powershell command to be called at the end of the overnight
# Script will poll through the available files in the log locations folder searching for errors
# Some errors are suppressed (var SUPPRESSEDERROR), additional error codes to be suppressed can be added to this variable
# If any errors are found, it will automatically send an email through to the support team

Set-Variable ERRORFOLDER "..\Errors\OvernightErrors"
Set-Variable ERRORLOG "..\Logs\OvernightErrors.log"

#ERRORS TO BE SUPPRESSED
#1020010 No data was generated: Suppress Missing = [%s], Zeros = [%s]. Sheet not overwritten.
#1020011 Maximum number of rows [%s] exceeded [%s].
#1020012 Maximum number of columns [%s] exceeded [%s].
Set-Variable SUPPRESSEDERROR "1020010","1020011","1020012"

# Create or clear the status file
new-item $ERRORLOG -type file -force

#For all of the error log files
(Get-ChildItem $ERRORFOLDER |

#Create array of objects
Foreach-object{
#Define the name and content
$name = $_.name
$content = Get-Content $_.FullName
#look for errors in the file
switch(($content | Select-string "Error" | Select-string $SUPPRESSEDERROR -NotMatch | measure | select -exp count) -gt 0) {
#if errors found, print the filename and the error
True {Echo $name >> $ERRORLOG
$content | Select-string "Error" | Select-string $SUPPRESSEDERROR -NotMatch >> $ERRORLOG
Echo "" >> $ERRORLOG}
}
}
)
#Once the loop is finished, check if the status file has any content and email if true
If ((Get-Content $ERRORLOG) -eq $Null) {
}
Else{
#Calling send email function
$smtp = "mail.gmail"
$emailto = "HyperionSupport@EssbaseDownUnder.com"
$emailfrom = "HyperionSupport@EssbaseDownUnder.com"
$subject = "Hyperion Overnight - Errors " + (get-date -f "dd/MM/yyyy")
$body = (get-content $ERRORLOG | Out-String)

send-mailmessage -to $emailto -from $emailfrom -SmtpServer $smtp -subject $subject -body $body
}
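
To wire it into the overnight, save the above as something like Hyperion_Mail_Error.ps1 (the name and path here are mine – call it whatever you like) and make it the last step of the batch process:

# Called as the last step of the overnight batch process
powershell.exe -NoProfile -ExecutionPolicy Bypass -File D:\Overnight\Scripts\Hyperion_Mail_Error.ps1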

As an aside, I'm aiming to provide full code snippets as often as possible. Most of my learning over the years has been picking up code from everywhere and jerry-rigging it all together (subnote: greatest Wikipedia 'Not to be confused with' ever!). Please feel free to use it, send it to colleagues, share it with friends, or claim it as your own to help you pick up ladies.

However, the standard rule of never putting something into your environment that you haven't first tested and understood applies. I'm not to blame if it takes these lemons and burns your house down.

Conclusions

Hopefully this has given you some ideas as to adding basic error handling to your own environments, allowing you to both rest easier at night and feel like you’re always one step ahead of any problems as they occur!

Cheers
Pete