
EAS copy of a BSO Planning Cube

G’day All,

A very quick and very Essbase-hacky one today (since we're all Essbase hackers at heart!)

From 11.1.2.2 onwards you can no longer take a copy of a Planning BSO application through EAS. You can still copy ASO apps and Essbase-only BSO apps, but can no longer take a quick snapshot of a Planning app to do some modelling in. The function is greyed out when you right-click on the app.
11.1.1.1 - Copy

11.1.2.2 Onwards - No Copy

Because that is still quite useful to do, particularly when:

  • testing data changes
  • reviewing historical recalculations so you can see a before and after
  • making sure your code actually works and doesn't set fire to everything

Below is some MaxL code to get around it. This will copy the outlines, rules files, calc scripts and data across. Note the eight-character limit for both application and database names.

Create Application 'TargetAppName';
Create Database 'TargetAppName'.'SourceDB1' as 'SourceAppName'.'SourceDB1';
Create Database 'TargetAppName'.'SourceDB2' as 'SourceAppName'.'SourceDB2';
Create Database 'TargetAppName'.'SourceDB3' as 'SourceAppName'.'SourceDB3';

Best run from the MAXL editor in EAS.
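If you'd rather script it than paste into the editor, the same statements can be run through the MaxL shell (essmsh). A sketch only – the server, credentials, and app/database names below are all placeholders:

```shell
# Write the copy statements to a MaxL script file
# (server, credentials and app/database names are placeholders)
cat > copy_app.msh <<'EOF'
login 'admin' 'password' on 'essbaseserver';
create application 'TargetApp';
create database 'TargetApp'.'Plan1' as 'SourceApp'.'Plan1';
logout;
EOF

# Then execute it with the MaxL shell:
# essmsh copy_app.msh
```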

This is a staggeringly inglorious hack, so treat it with caution. It has obviously been removed for a reason – though this may simply be to avoid confusion between Planning and Essbase-only apps.

You also need to be aware of your licensing agreement to make sure you’ve got coverage for Essbase only applications.

Finally, and obviously, I strongly recommend running this only in your dev/test environments. But as a great poet once said, "there's no test like production".


ps: I’m kidding. Don’t run this in Production. Seriously. They are Planning apps remember?? That’s why you’ve got LCM. Read the previous post. Sheesh.

LCM Backups for Fun and Profit…


So – from my first few posts you’ll see I’m exceptionally lazy efficient when it comes to back end processes. If it can be scripted/automated, then I’ll generally do that rather than keep a manual step.
Err. Where was I.

Right – as part of an I've-got-a-few-hours-spare-this-week piece, I rewrote an overnight process. Primary goals for the rewrite were to remove hard coding where possible, segment the overnight process into discrete tasks\components and add a hefty backup component. Obviously the Essbase side of a backup process is reasonably simple and flexible. Database copies\data exports\file system copies – you've got a plethora of options available.

What about the 'rest of it'?

In addition to the standard relational backups, I decided to see if I could automate an LCM export of the Planning application side – the theory being that between the LCM backup and the Essbase backup I could recreate any combination of an app.
How does this all work then?
LCM is quite an elegant tool – certainly compared to the earlier alternatives (I'll give you a moment while you recover from the nightmarish memory of dataform migration pre-v9!). Basically, you first define a migration framework of the artifacts you wish to either export or import, and then trigger it.

So – a very basic export framework is shown below.
This is going to export the entire application (Finance), including the Dataforms, Dimensions, Global Settings, Security & Relational data. I'm building this on an older version – hence it's not going to include the BSO Essbase data – however later versions can also include a data export.

Note: If you’ve built a fairly standard Planning application this is going to include pretty much everything you need, including essbase calc scripts, rules files and partitions. However! If you’ve used HSP_UDF to define member formulas directly in essbase, that’s not going to be included and you’ll need to backup the otl file separately. Don’t say I didn’t warn you.

If there are items you don’t want to export (Security for instance) you can simply edit the lines out of this file directly. That being said, a simpler method of building this is to define and run an LCM export of the application with exactly what you’re chasing. Then download and open the export folder, and at the very top of the folder structure should be the Export.xml file.

One final thing to notice in the Export (and Import) files is that the username and passwords are cleared out every time they are run. This is important later when we try to automate this step.

Introducing the…UTILITY
Now you’ve got your export definition sorted – the next step is to look at the automation steps. The LCM outline extraction utility (named, entertainingly, Utility) is held in the Foundation bin folder. On a standard install it should be Oracle\Middleware\user_projects\FOUNDATION1\bin.

To save you looking through the detail of that batch script, the string required is really basic:
Utility.bat ExportFolder\Export.xml

Unfortunately, this is one of those scripts that cannot be called directly from an external server (it internally attempts to parse the drive and folder locations), so you’ll need to use psexec to call it across servers.

You'll notice that in the string above you don't have to define an output folder – just the location of the Export.xml file. Sadly this isn't because you've got some flexibility to add that to the XML file, or alternatively set an option somewhere in Shared Services; instead it's because the LCM export will be written directly to the filesystem location of the Export.xml file. Yep – wherever you've stored it is where the data will end up. On the upside, that can be across mapped network drives, so you can store the files somewhere central rather than clogging up your Foundation server!

Hooking it all together

So – obviously with all of these back end scripting languages available we should choose something exoti….nah. This time we’re sticking with DOS batch. They didn’t call it the Quick and Dirty Operating system for nothing!

Okay, so we’ve got to handle the following items:

  • Usernames and passwords disappearing at runtime
  • Needing PSExec to run the utility (and requiring write access back across a network share)
  • Needing some steps to zip it all up and maintain only an appropriate backup
  • Having it all parameter based and scalable for multiple apps

Taking it from the top
Firstly, folder system setup:
I've created a nice simple structure. The folder LCM_Exports is empty and will be built up in the batch code. In LCM_XML I've got a copy of the XML export files, updated with the username and password (encrypted works, but you can also type them unencrypted). They are renamed, prefixed with the name of the application, i.e.:

Now the code:
Nice and simple – we're calling the name of the application as per the naming convention in the LCM_XML folder. In addition, the variables include a file system user and password to use with PSExec to make sure the Utility can write back to wherever you had the Export.xml saved. If you're running it directly from your Foundation server and saving the LCM Exports there, you should be able to skip this – but it's not going to hurt.
Where the magic starts. Here the batch code removes and rebuilds the top-level folder under LCM_Exports using the %App% variable name, and copies in the export file, renaming it to Export.xml. The final step is to trigger the Utility using PSExec.


After the Utility is finished, 7zip is used to create a datestamped version of the LCM Export. As discussed earlier, I'm using an older version in this example, so I added a compression step because by default the export comes down uncompressed. Later versions are already compressed (or are they??), so compressing them further would likely only be useful to update the naming convention. The final step in this section will delete any of the zipped files that are older than 3 days, but this could be updated to 30 (or more!) – just make sure you keep on top of your storage space requirements.


Final steps: some basic cleanup and error handling – you could also add handling for messages returned directly from the LCM Utility, which uses the following return codes.

REM - Return values -
REM - 0 - Success
REM - 1 - Failure in processing input parameters passed to CLU
REM - 2 - Failure in CLU execution
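Those return codes make it easy to wrap the Utility call with a meaningful message. A minimal sketch in Python (illustrative only – the real script calls the Utility through psexec as shown below; utility_path and export_xml are placeholders):

```python
import subprocess

# Return codes documented by the LCM command-line utility
LCM_CODES = {
    0: "Success",
    1: "Failure in processing input parameters passed to CLU",
    2: "Failure in CLU execution",
}

def run_lcm_export(utility_path, export_xml):
    """Run the LCM utility and translate its exit code into a message."""
    result = subprocess.run([utility_path, export_xml])
    return result.returncode, LCM_CODES.get(result.returncode, "Unknown error")
```

That way the overnight log can record "Failure in CLU execution" rather than a bare 2.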


And that’s it! A scripted and fully-automated way to take backups of LCM files. This can obviously be extended to other apps or even other artifacts (think Reporting\Shared Services\EPMA) by creating individual export.xml files, dumping them into the LCM_XML folder and simply calling them.

For bonus points – the next step is automating the import side – thus giving you the ability to script a true sandbox\QA environment every night. A topic for another post!

Full version of the code available below.

:: Name: Hyperion Planning Backup Script
:: Purpose: Perform LCM Backups of the Planning applications
:: Author: Peter Nitschke - M-Power Solutions
:: Change Control
:: When Who What
:: 19/07/2014 PN Created

:: Script is built to backup Planning LCM Extracts
:: Script will copy the export XML files from the LCM_XML Folder to LCM_Exports Folder and run the LCM Extract Utility
:: Due to a UNC limitation in the LCM utility, requires using psexec to run remotely

:: To create LCM Export XML files, trigger a standard LCM Planning Export, this creates an Export.xml file in the defined folder
:: This export.xml file needs to have the User Name and Password updated and be renamed as follows below

:: This script will look for files of the following format in the LCM_XML Folder
:: %AppName%_Export.xml
:: It will then copy them to the following folder and rename before triggering
:: Backup\LCM_Export\%AppName%\Export.xml

:: Similar to the other overnight scripts, this script requires run time parameters
:: Variables
:: %1 = fsUser
:: %2 = fspassword
:: %3 = Env
:: %4 = Application

@echo off

:: If no parameters exist, fail and return an error code of 1
if [%1]==[] goto NoParam

:: Define Runtime variables
set fsuser=%1
set fspassword=%2
set essbaseserver=hyp-es-%3
set webserver=hyp-web-%3
Set app=%4

:: Set the Batch variables
for /F "tokens=1-4 delims=/ " %%a in ('date /t') do (set datestamp=%%d%%c%%b)
Set LCMUtil=d:\Oracle\Middleware\user_projects\FOUNDATION1\bin\Utility.bat
set backupfolder=\\%essbaseserver%\d$\overnight\backup

:: Remove and recreate the LCM Exports Folder
RD ..\Backup\LCM_Exports\%app%_LCM\ /s /q
MD ..\Backup\LCM_Exports\%app%_LCM\

:: Copy the %AppName%_Export.xml file and rename it to Export.xml
Copy "..\Backup\LCM_XML\%app%_Export.xml" "..\Backup\LCM_Exports\%app%_LCM\Export.xml"

:: Trigger an LCM Export
Call "psexec.exe" /accepteula \\%webserver% -u %fsuser% -p %fspassword% %LCMUtil% "%backupfolder%\LCM_Exports\%app%_LCM\Export.xml"

:: Zipping Process
:: Delete the backup file if it already exists - shouldn't happen unless script is run twice
IF exist ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z del ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z

:: Run the 7z command line
:: Add the LCM Export
7za.exe a -t7z -r ..\Backup\AppBackups\%Datestamp%_%App%_Backup.7z ..\Backup\LCM_Exports\%app%_LCM -m0=LZMA2 -mx1 -mmt=8 -md=64k -mfb=16

:: Delete any of the 7zip files that are older than 3 days
:: Can be changed, but watch the space requirements!
forfiles /p ..\Backup\AppBackups\ /m *.7z /d -3 /c "cmd /c DEL /Q @file"


goto :EOF

:NoParam
@echo Usage: Requires following parameters to run
echo Runtime variables: "fsuser" "fspassword" "Env" "app"
exit /B 1
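One caveat with the script above: the for /F parse of date /t is locale-dependent (it assumes a leading day-name token followed by dd/mm/yyyy). If the datestamp ever comes out scrambled on a different regional setting, building it outside of batch is more robust – a quick illustrative sketch in Python:

```python
from datetime import datetime

def datestamp(now=None):
    """Build a yyyyMMdd stamp for backup filenames, independent of regional settings."""
    now = now or datetime.now()
    return now.strftime("%Y%m%d")
```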

Smartview Magic and an Inglorious Hack (well kinda…see notes at the bottom)

G’day All,

Those who know me well know I’m not exactly prone to fits of exaggeration nor exuberance when it comes to pieces of new functionality that interest me. I definitely don’t go running around grabbing clients and workmates and saying ‘come and look at this! Isn’t it awesome!’. I also definitely never would spend a hungover Saturday writing it up and sending it out.

So with that all out of the way. This is almost magical. Certainly awesome at the very least.

History: A Smartview Curmudgeon
Backtracking a little bit first. I was one of the holdouts with Smartview. Compared to the beauty and simplicity that was the excel add-in (particularly once you tweaked it a little), the bugs and functionality issues with early versions of Smartview (having to refresh before submitting??) held me off for ages. Even after changing over, I really wondered about some of the functionality / design choices that Oracle had made.

One of the ‘functions’ that always threw me was the integration of Smartview into Word. Certainly in the first few versions, this function had little to write home about – embedding Hyperion Financial Reports into word documents ‘kinda’ worked, but had issues: occasionally losing connections, and not being able to perform basic functions like resizing / cropping the embedded report without reconnecting all the sheets. As such, I’ve not seen an awful lot of it in the wild.

That said, there is definitely a user desire to bring some Hyperion data into Word. Month end reports are generally a combination of upper level datasets and textual analytics, and generally a lot of time and effort goes into ensuring the data is verifiable and consistent – all of which can be wasted effort if the underlying data sets aren't sourced correctly. So – the ability to integrate Smartview functionality into both Excel and Word is clearly desired. Let's have a look at some of the options.

Reporting Reporting Everywhere, Reporting Reporting I Don’t Care

Really simply, this is what normally happens.

  • You have a requirement to embed some reports in an end-of-month Word reporting pack
  • You mock up the report exactly to spec, utilising sub variables where appropriate
  • You walk the user through embedding the report, all goes swimmingly for 3 months
  • The user changes their mind slightly, wants to add a new metric
  • You then spend the next 3 months fixing formatting issues in a Word document, because it’s now a slightly different size
  • You cry yourself to sleep at night due to the constant abuse of Word Styles

Right, so let's look at the other option.

Smart Queries and Dynamic Datapoints

Right, so we’ve got to get the users to do it themselves. Let’s walk through the process of building a query in excel and mapping it across.

First, build an ad hoc retrieve in Excel for the dataset. I'm not using a certain Michigan-based soda dealership – mostly because I want the report to be a bit more complex. Nonetheless, a nice simple report:

1 - Full Report
You’ll note the Year and Month in the POV, as well as a nice selection of columns.

From there, you want to select the components of the ad hoc retrieve that you wish to copy to Word.
2 - Report Selection and Copy
As can be seen, I’m only selecting the first few columns – nevertheless this is going to copy ALL of the associated metadata for the table – a fact that has very interesting repercussions later!

Okay – then open your Word document, go to the Smartview ribbon and click the Paste button
3 - Paste to Word

Ta-da! Excel ad hoc retrieve in Microsoft Word!
4 - Original Word Formatting

Okay, so the format is a little ugly…but after some basic formatting.
4 - After Word Formatting

Awesome. Okay – now for those of you who were originally HFM users, the '#NEED_REFRESH' will be of no surprise. Basically it's rebuilt the query as discrete datapoints – everything is now an HsGetValue with the full intersection. Hovering over a data item we can see further information as to what it's doing – each cell has a direct link back, so you can change formatting and add columns, rows, headers etc., and the individual queries will keep working.
6 - Cell Information

Refreshing the query will bring back all the data.
6 - Post Refresh

This even extends to the ability to copy and paste a single datapoint directly into a sentence and have it dynamically refresh as needed:
9 - All together


Do you believe in Magic
So – I promised some magic.

Earlier on I noted that when we copy the cells across we are actually copying ALL of the metadata of the table, rather than just the discrete members. The interesting repercussion of that is that if you select any of the data cells and click on the Visualize in Excel button – it will go back and dynamically rebuild the original query in its entirety!
11 - Dynamic Rebuild
12 - Rebuild in Excel

I can just imagine the strokey beard meeting where that was decided!

Anyway, as a piece of functionality it’s staggering. Basically you can build a full Management P&L report with detail in Excel – embed only the important lines in the Word document – but each of those lines contain the metadata to rebuild the entire report from scratch for further analytical queries! (And yes – even the inline text fields can do it).

That’s all Fine and Dandy….What About Next Month?

One of the nice features of Smartview in Word is the POV Manager. From here we can update the POV of the queries and it will refresh the data – this is reverse mapped to the Visualize in Excel function as well, so if you update to the next month in the POV Manager it updates in the full report. The POV Manager is very simple, only allowing you to select a new member, and not giving you access to functions like SubVariables, user variables or even UDAs.
10 - POV Manager

13 - Update to Apr

So what’s the Catch

Well – after I’d sent this out to a few people I had some feedback along the lines of

  • That’s awesome
  • Unfortunately you can't mass update the POV – every discrete report has its own setting
  • Therefore at best, it’s rather frustrating to update for big report packs
  • The risk of ‘stuffing this up’ makes this not really a viable option
  • Still – that analyse thing is pretty awesome. Imagine that strokey beard meeting where they decided to do that!

Right – so we need a workaround.

We Can’t Stop Here…this is Hack Country

So – back to Excel. One of the ‘new features’ in Smartview (v11.1.1.3) which offered a benefit over the excel add-in was the ability to use a Subvariable in an adhoc query – basically you could write a sub variable name instead of a member name (prefixing it with an “&”) and it would re-render to be the variable value.

The big issue with this was that the query would be updated at runtime, and be replaced by the member as a text field. If you went back and later updated the substitution variable through EAS, it wouldn’t then go and update the smartview retrieve (thus, basically defeating the entire purpose of the function in the first place). Fortunately, in a later version (v11.1.2.1?), Oracle added the HSGetVariable member function. This is a very useful little function, as it provides the ability to add an on demand sub variable to an ad hoc query, but have it persist while retrieving.

So – to use it within a query:

Select the member that you wish to turn into a variable, go to the Connection Panel and select Build Function
14 - Build Function

From there, you want to select the HSGetVariable Function, and then add the Variable.
15 - HSGetVariable
The yellow highlighted box will actually pull all the members into a list for you to select from.
17 - Function Arguments

Click Validate to test the variable, and then Okay to return to Excel. Back in Excel you get the very familiar '#NEED_REFRESH' text – but clicking refresh brings back the month correctly (and all the data), and you'll see that the member has not reverted to 'Apr' but is instead still showing the HsGetVariable function!
16 - Variable
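For the curious, the function the Build Function dialog generates takes a private connection name and the variable name – something along these lines (both names here are placeholders, and the exact form may vary between Smartview versions):

```
=HsGetVariable("MyConnection","CurrMonth")
```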

Now updating the sub variable will automatically update this member and therefore the retrieve. Awesome. Let’s use that function to get around the problem! We’ll define a view that uses the variable, copy that across to Word and Bob’s your mother’s brother!

And… doesn’t work

So, back to the query, Smartview tab->Copy, Open Word, Smartview tab -> Paste! And… doesn’t work.

To see why, let’s have a look at the associated SmartTag in Word (alt-f9 or right click and ‘view field codes’)
18 - Smart Tag

When it brings the member across, it redefines every member in the query as a discrete HsGetValue. This is great in one sense, because it means that each cell is fully rendered every time. You don’t have to worry about keeping all the headers etc in place, because the cell knows what value it is querying. I’ll break it down in a bit more detail below:
19 - Smart Tag Breakdown

This does give a bit of a clue as to the workaround. There is nothing stopping us from nesting into the period query the function that we used before to query the sub variable. As such, if we replace the period reference with a subvariable reference, it should all work.
20 - Smart Tag Updated

And… doesn’t work Take 2

And it does!

Or at least…it did. Until it didn't.

So yes. I've just spent about 10 hours trying every combination of string known to man, trying to replicate my testing from 6 months ago. But it turns out that they've changed the way it works, so now it…er…doesn't work. I've replicated it perfectly in the older version, but it breaks as soon as I move to a later version of Smartview.

You can still reference HsGetVariable dynamically within Excel – but not within Word.

Since I'd written all this down, I've decided to post what I've got anyway, and will come back to it once I've found another hack for it. If anyone has any ideas – please leave me a note at the bottom!




Error Handling the Max Power(shell) way

Let’s start this off with a bang

If, like most of us, part of your day to day role is support – one of the key advantages of using ODI in your overnight process is to have the full gamut of ODI error handling available to you. Errors are nicely caught, with email warnings for failures in batch processes basically a default state.

However, what happens if ODI isn’t part of your implementation? What happens if the back end is basically a bunch of batch/maxl scripts hanging together (no matter how elegant it is). If it is the latter, at best you’re unlikely to have an easy method to trap errors and send notifications to the support team and at worst, you may not have any automated notification at all!

So your options are:

  1. Hope against hope that you never ever have any errors in your overnight processes
  2. Write your own error handling piece and include it in the processing

Rolling Your Own

So you've decided to steal cable write your own. While this process would be possible in batch, you're likely to be better off selecting a more advanced language with slightly better handling of strings (see also: Java/Perl/Powershell/Cygwin/VBA).

I’ve gone with Powershell: slightly because I’m generally on Windows environments, somewhat because it’s fairly decent at string manipulation and mostly because it’s one of the few languages I know.

Code Overview

So – to start with, basic documentation and variable structure. I’ve written this to poll through a folder containing log files (rules file errors, maxl return errors and an archive copy of all of the Essbase Log files from the previous day). These have been moved to this folder by earlier maxl/batch steps, mostly by using STDERR outputs.



For each of the files in the log folder it scans for any lines containing errors and pushes the entire line to a central log file.

I’ve added suppression to deal with a majority of ‘false positive’ errors – primarily related to errors in the Essbase log for Smartview user-side errors like ‘you’ve-got-suppression-on-by-default-and-your-top-level-retrieve-is-blank’ (man, if I had a nickel for that one…).
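The scan-and-suppress logic is simple enough to sketch in any language – here's an illustrative Python rendition, using the same suppressed error codes as the Powershell script below:

```python
# Suppressed Essbase error codes (Smartview user-side noise)
SUPPRESSED = ("1020007", "1020011", "1020012")

def scan_log(lines):
    """Return lines mentioning 'Error' that don't contain a suppressed code."""
    return [
        line for line in lines
        if "error" in line.lower()
        and not any(code in line for code in SUPPRESSED)
    ]
```

Anything the function returns goes into the central log file; an empty result means a quiet night and no email.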


Letting you Know

Finally it checks if the log file isn’t blank and emails it through to the email addresses listed. The primary reason I’ve got the error suppression above is to make sure you’re only receiving automated emails when something has gone wrong. I find too many false positives in your inbox mean that the actual errors occasionally get lost in the underflow.

Finally – the full code in all its glory. 20 lines of working code all up!

# Name: Hyperion Mail Error script
# Purpose: Sends an email to the support team if there are any errors in the overnight
# Author: Peter Nitschke - M-Power Solutions

# Change Control
# When Who What
# 19/07/2014 PN Created

# Script is built as a powershell command to be called at the end of the overnight
# Script will poll through the available files in the log locations folder searching for errors
# Some errors are suppressed (var SUPPRESSEDERROR), additional error codes to be suppressed can be added to this variable
# If any errors are found, it will automatically send an email through to the support team

# Folder locations (relative to the script's working directory)
Set-Variable ERRORFOLDER "..\Errors\OvernightErrors"
Set-Variable ERRORLOG "..\Logs\OvernightErrors.log"

#1020010 No data was generated: Suppress Missing = [%s], Zeros = [%s]. Sheet not overwritten.
#1020011 Maximum number of rows [%s] exceeded [%s].
#1020012 Maximum number of columns [%s] exceeded [%s].
Set-Variable SUPPRESSEDERROR "1020007","1020011","1020012"

# Create or clear the status file
New-Item $ERRORLOG -type file -force

# For all of the error log files
Get-ChildItem $ERRORFOLDER | ForEach-Object {
    # Capture the name and content of each file
    $name = $_.Name
    $content = Get-Content $_.FullName
    # Look for non-suppressed errors in the file
    switch (($content | Select-String "Error" | Select-String $SUPPRESSEDERROR -NotMatch | Measure-Object | Select-Object -exp count) -gt 0) {
        # If errors are found, print the filename and the error lines
        True {
            Echo $name >> $ERRORLOG
            $content | Select-String "Error" | Select-String $SUPPRESSEDERROR -NotMatch >> $ERRORLOG
            Echo "" >> $ERRORLOG
        }
    }
}

# Once the loop is finished, check if the status file has any content and email if true
If ((Get-Content $ERRORLOG) -ne $Null) {
    # Build and send the notification email (server and addresses deliberately blank)
    $smtp = ""
    $emailto = ""
    $emailfrom = ""
    $subject = "Hyperion Overnight - Errors " + (Get-Date -f "dd/MM/yyyy")
    $body = (Get-Content $ERRORLOG | Out-String)

    Send-MailMessage -to $emailto -from $emailfrom -SmtpServer $smtp -subject $subject -body $body
}

As an aside, I’m aiming to provide full code snippets as often as possible. Most of my learning over the years has been picking up code from everywhere and jerry-rigging it all together. (Subnote: greatest wikipedia ‘Not to be confused with’ ever!). Please feel free to use, send to colleagues, share with friends, or claim as your own to help you pick up ladies.

However, standard rules of never putting something into your environment that you haven’t first tested and understood apply. I’m not to blame if it takes these lemons and burns your house down.


Hopefully this has given you some ideas as to adding basic error handling to your own environments, allowing you to both rest easier at night and feel like you’re always one step ahead of any problems as they occur!



So – after some Jonesing (wait, wrong link) from certain people, I've finally bitten the bullet and am writing some thoughts down.

The intent of the blog is to be full of useful insights on design principles, efficient coding and detailed optimisations to make best use of the Oracle Hyperion suite. In reality it’ll probably be full of bloody-minded hacks to make it all work. The road to hell and all.

And now Ladies and Gentlemen. Let’s start the show.