So it’s always exciting being very cloud focused – every month you have new toys, bug fixes and changes to read. However, even for 2019 (which has already had some big releases), Jun-19 is a pretty big one!
I’m writing this as part of a blog hop (organised by the incredibly enthusiastic and talented Opal) around all of the changes coming in the EPM landscape. Forewarning now – this is literally hot off the press (or more accurately, while the presses are still running!) so we’re all likely to have to come back and update as more information comes through.
So what exactly has changed?
Well on Friday 31st May, the Jun-19 SaaS release notes were pushed out, and the Oracle EPM product lineup got some new branding.
*First edit!* It got changed again on Monday!
Fundamentally, the EPM stack as a whole has been split into two separate product streams: an Enterprise level and Standard level.
What we’re hearing is that these product streams will be consolidated – so you’re purchasing an EPM license potentially across multiple products rather than purchasing separate licensing for each product (i.e. FCCS, ARCS, PBCS).
This is borne out even more by an interesting bit of new functionality in EPMAutomate – the ability to change the service type of the pod between the primary service streams!
And what does this mean for Planning specifically?
Within the EPM stack, PBCS\EPBCS was definitely the odd one out – containing two separately licensed products that were masquerading as one. But as a client you either purchased one or the other and generally went with that – very few people were shifting from PBCS to EPBCS or vice versa, mostly because the cost and time investment to shift was pretty huge.
The big push from the Oracle side has been to move people towards the out of the box options where at all possible – and this is borne out by the new licensing options.
Standard will include the out of the box applications (Workforce, Financials, Projects) and the ability to create 1 custom BSO\Hybrid and 1 custom ASO cube. This will definitely be ‘EPBCS-lite’ – the ability to configure will be minimal, particularly when it comes to updating \ editing \ creating Groovy scripts. The theory is that it will (only?) suit small clients that can make do without customisation.
For a pure PBCS build (as an awful lot of mine are) it is very disappointing, with Standard losing out on any customisation – a substantial price bump for no additional functionality.
Enterprise on the other hand is a whole ‘nother kettle of fish and then some. You get:
All of the out of the box applications available with the ability to add 3 more custom BSO and 3 more custom ASO
A complete custom setup with 6 BSO cubes and 6 ASO
Groovy \ Strategic planning \ Enterprise Data Management
Completely Free Form Planning (basically Essbase SaaS wrapped in a Planning shell). See the link to Jazmin Ribeiro’s post at the bottom.
And what about Clients?
This is a more interesting arc – even in Standard you’re no longer just a Planning and Budgeting Cloud Service user; instead you have access across more products in the EPM stack, including FCCS and ARCS (likely in their respective cut down offerings). Interesting if only because I know personally we’ve built a lot of Planning applications across the years that would have been better suited for FCCS\HFM or Profitability\PCMCS (the old: if Planning is your hammer, every business process problem looks like a nail).
As such, if we have the ability to easily and seamlessly extend the footprint into other products without having to go back to the client and start the licensing dance again – fantastic.
This also better places EPBCS \ Enterprise in the market. It was, more often than not, a tough sell. If a client’s needs didn’t cleanly fit into the Oracle EPBCS module offerings then it was a significantly greater expense for licensing for limited visible benefit.
“I can do anything with Groovy… except get a client to pay for it!”
The 64 Million Dollar Question
Licensing is going to be the kicker here. Obviously Oracle list price is a fairly arbitrary metric with the way sales and incentive discounts are done by the Oracle Sales group, but assuming a price bump for the additional products and flexibility – it may price small businesses out of a small PBCS-only build.
Obviously there are plenty of other EPM products available and coming into the market, so it may well be that Oracle has decided to stop chasing the small end of the market on price and focus only on mid-size and up companies where their branding and image is stronger.
Edit! We now have pricing:
So yeah – my note about pricing out a small Planning-only build is pretty apt here. There is some value in the Standard license, but if you’re not using ARCS \ FCCS you’ve got significantly more expense. We’ll have to see how flexible the pricing can be, but it definitely feels like Oracle have deliberately repriced themselves upwards – with the lowest price of an EPM deal effectively doubling.
Comparatively (particularly against the Standard pricing) the Enterprise licensing is a great deal. There are now real options for bigger clients with multiple on-prem applications to migrate to the cloud.
Bigger than Ben Hur
Probably the biggest take-away for this change for me is that some of these builds could be enormous. An Enterprise build including FCCS \ ARCS \ Tax \ PCMCS \ Data Management \ Enterprise Planning is a cross-platform \ cross-function implementation covering almost all of the Financial functional areas with potentially 10-40 primary critical stakeholders and teams in the hundreds. Cross-skilled technical business consulting skills are going to be vital – it may not be good enough to simply be a Planning FP&A specialist anymore!
That said, the way it is set up, it may be feasible to become more agile and incrementally deliver products quickly, running POCs where needed and leveraging some of the back end and data integration setups.
So that’s it from me with the June changes – continue your educational journey by “hopping” over to the following posts to find out more about specific product updates, as well as gain different partner perspectives on the licensing, naming, and consolidation changes:
1. Opal Alapat gives an overview of the June 2019 Oracle EPM Cloud blog hop, as well as details on the product name changes, product consolidations, and new licensing: here
2. Jun Zhang does a comprehensive summary of the June updates by product: over here
3. Eric Erikson provides his perspective on FCCS: right this way
4. Jay Chapman covers the notable EPRCS (now called Narrative Reporting) June updates: right thatta way
5. Kate Helmer summarizes important EDMCS updates: in this direction
6. And finally Tony Scalese covers changes to the REST API and EPM Automate: right here!
7. And another late addition – not in the original blog hop, but too good not to share. Jazmin Ribeiro put up some great details around the new standalone planning build here
So – I spent 7 hours working through a fancy REST API script, reading optimisation notes on Powershell queries, and re-working and re-writing the code to be simple and clean to finally post up a detailed ‘warts and all’ article.
And then I actually tried to use it! And it, er, didn’t quite go to plan.
So what is the issue?
Let’s look at a fairly basic hierarchy:
I’m using the Entity hierarchy in a fairly standard format. Reasonably ragged – some branches go down to state and some stop at country – but functionally nothing that Essbase doesn’t handle perfectly.
Let me add the level zero flags.
Top section makes perfect sense – the issue comes with the Entity_NA member. Remembering the way our query works, for summary members we iterate to find the parent member of each level zero.
And now you can see the problem (and hopefully not just because I highlighted it in bright red). We have a ragged hierarchy, so the summary levels are ragged as well. While perfectly correct, this is very likely not what we want.
Fortunately – we’re in Powershell Country
Looking at the qualified names gives us a good idea on how to solve the issue though. By adding a “ParentMember” variable to the prompt, we can then filter on the qualifiedName to get only the members that have this upper level member.
{$member.items | Where qualifiedName -like "*``[$ParentMember*" | where childCount -eq "0"} #select only level zero UNDER the parent member
$Filename = "F:\PBCS_REST\Output.csv"
$AppName = "Demo"
$Dim = "Entity"
$Lev = 1
$ParentMember = "Total_Entity"
# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $Appname, $Dim
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
$Finalmembers = foreach($member in $AllMembers) # iterate through all the members
{$member.items | Where qualifiedName -like "*``[$ParentMember*" | where childCount -eq "0"} #select only level zero UNDER the parent member
while ($Lev -gt 0) # for any level higher than 0, recursively query
{
$lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass)
{$member.parentName} #add the parent names to the lookup table
$lookup = $lookup | sort -Unique # grab only the unique parents
$Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
{$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
$lev-- #iterate through the levels
$Finalmembers = $upperMbrs #assign the upper members to the new list
}
$Finalmembers | Select-Object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
Much better! Now I can actually use this function to set up my Hybrid Dynamic Calc members correctly. The full updated Get-LevMbrs function is available below!
Function Get-LevMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to the get the dimensional members referenced by level
.DESCRIPTION
Queries PBCS Data Management Rest API to the get the dimensional members referenced by level - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER App
The Data management application name (refer to the target application screen in setup for the names)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Lev
The level in the dimension you wish to return the members for
.PARAMETER ParentMember
An optional filter to only get the dimensional members under the named member - defaults to the top level dimension name
.PARAMETER Filename
Optional local path to where you would like the file saved - defaults to $Dim_Lev$Lev_Members.csv in the current directory.
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Lev 0
.EXAMPLE
Get-LevMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 0 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username,
[Parameter(Mandatory=$True)][string]$Password,
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App, #note – this is the Appname from Data Management target application
[Parameter(Mandatory=$True)][string]$Dim,
[Parameter(Mandatory=$True)][int]$Lev,
[Parameter(Mandatory=$False)][string]$ParentMember = $Dim,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Lev' + $Lev + '_Members.csv'
)
# Construct Domains
$URLSplit = ([uri]"$URL").Host.split('-').split('.')
if ($URLSplit[1] -eq "test")
{$Domain = $URLSplit[2]}
else
{$Domain=$URLSplit[1]}
#Trim the URLs in case they have trailing /
$URL = $URL.TrimEnd('/')
# Construct URL
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim
# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
# Invoke PBCS REST API to get Latest API version
try
{
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
$Finalmembers = foreach($member in $AllMembers)
{$member.items | Where qualifiedName -like "*``[$ParentMember*" | where childCount -eq "0"} #select only level zero UNDER the parent member
while ($Lev -gt 0) # for any level higher than 0, recursively query
{
$lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass)
{$member.parentName} #add the parent names to the lookup table
$lookup = $lookup | sort -Unique # grab only the unique parents
$Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
{$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
$lev-- #iterate through the levels
$Finalmembers = $upperMbrs #assign the upper members to the new list
}
$Finalmembers | Select-Object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
}
# Return the error if the connection fails
Catch
{throw $_.Exception.Message}
}
Obviously there are a number of manual ways to do it – simple Smartview exports, exporting the dimension and parsing the file, even writing a custom database export out to flat file – but none of them are particularly elegant and\or streamlined and would be very difficult to integrate into a metadata update.
When John mentioned that he had found a way, I was obviously intrigued! Basically in his investigation of the FDMEE on-prem to Cloud integration processes he found an undocumented data management query to return all of the members from within the requested dimension:
GET URL/aif/rest/V1/applications/App/dimension/Dim/
This query is from the data management layer – specifically querying the underlying FDMEE \ Data management tables that are populated by the target application metadata queries.
Running this query manually then returns a detailed listing of every single member including a large volume of metadata.
Unfortunately? It’s not the best metadata. ‘MemberLevel’ is a misnomer – basically you have +1 for the dimension name and -1 for everything else. ‘QualifiedName’ is useful, but because it’s a parent child listing you still need to iterate in order to actually identify where you are. Still – we should be able to construct some queries to give us what we want!
Let’s start nice and simple – setting up our query:
$Username = "Username"
$Password = "password"
$URL = "https://planning-test-pivot2.pbcs.ap1.oraclecloud.com"
$Domain = "pivot2"
$AppName = "Demo" # This is the name of the Planning application
$Dim = "Entity" # This is the dimension you're querying
$Filename = "F:\PBCS_REST\Example.csv"
# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $Appname, $Dim
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
$AllMembers | ConvertTo-Csv > $Filename -NoTypeInformation
This returns everything and utilises powershell’s built in csv converter to render something reasonable.
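If you just want to eyeball the fields before going any further, a quick sketch along these lines does the trick (the property names are as they come back in the response, and -First 5 just keeps the output readable):
$AllMembers.items | Select-Object memberName, parentName, qualifiedName, childCount -First 5 | Format-Table -AutoSize # peek at the first few members and the fields we care about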
Useful but not what we want – let’s build out a query to just return level 0 members.
$Finalmembers= @() # Create an array for the members
try{
#Trigger the dim query
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers)
{$Finalmembers += $AllMembers.items | Where-Object {$_.childCount -eq "0"} | Select-object memberName -Unique }# filter for the level 0 members (where no children)
$Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
}
Catch {throw $_.Exception.Message}
We can’t use a level indicator, but it’s simple enough to just check if the members have children. This now returns us a nice simple list of members.
Great! So Level 0 is nice and easy. Recently however, I’ve been doing some work with EPBCS Hybrid and found myself reminiscing about Celvin’s incredible Essbase Member Operation Utility – specifically its ability to bulk update higher level members to a new storage type. So I got wondering, what if we want greater than level 0?
$Lev = 2 #define the level to return
# Trigger the Dim Function
$Finalmembers= @() # Create an array for the members
#Trigger the dim query
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers) {
$Finalmembers += $AllMembers.items | Where-Object {$_.childCount -eq "0"} | Select-object memberName,parentName -Unique # filter for the level 0 members (where no children)
}
while ($Lev -gt 0) # for any level higher than 0, recursively query
{
$upperMbrs=@() # build an upper level array
foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass)
{
$upperMbrs += $AllMembers.items | Where-Object {$Finalmembers.parentName -eq $_.memberName} #give the details where the previous levels parent member equals the member name
}
$lev -- #iterate through the levels
$Finalmembers = $upperMbrs | Select-object memberName,parentName,alias -Unique # select only the unique members (to make the recursion quicker)
}
$Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
So this is a little more convoluted – but basically we’re finding the Level x members – getting all their parents and then looping and recursively searching up the hierarchy.
One note to make at this junction – this is not speedy code. Recursive loops through the pipeline with a where-object as a filter is….about as bad as you can get for performance! It’s not that awful – pulling big dimensions (10k+ members) takes around 5 mins. Buuuuut….yeah, performance is important. Let’s re-write it.
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers) {
$FinalMembers = $member.items | where childCount -eq "0"
}
Firstly, changing from a Where-Object script block to the simplified where comparison syntax means we can only do a simple comparison, but it massively speeds up the iteration.
while ($Lev -gt 0) # for any level higher than 0, recursively query
{
$lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass)
{$member.parentName} #add the parent names to the lookup table
$lookup = $lookup | sort -Unique # grab only the unique parents
$Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
{$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
$lev-- #iterate through the levels
$Finalmembers = $upperMbrs #assign the upper members to the new list
}
$Finalmembers | Select-Object memberName | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
For the recursion – building the parent member as a lookup list with a single iteration means we can run through it much quicker.
Much much quicker.
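If you want to put numbers on it yourself, wrap each version in Measure-Command, the same pattern I use for the generation query below. A minimal sketch, with whichever version you're timing pasted into the script block:
# Compare the two approaches - drop the relevant version into each script block
$slow = (Measure-Command { <# original recursive Where-Object version #> }).TotalSeconds
$fast = (Measure-Command { <# lookup list version #> }).TotalSeconds
Write-Host ('Where-Object version: {0} secs, lookup version: {1} secs' -f $slow, $fast)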
But wait! You want to query using generations instead of levels? Alright then. We can’t use recursion up the ParentName hierarchy, instead we’ll use a (glorious) hackjob and look at the qualifiedName. By counting the number of members in the qualifiedName reference we can work out which generation it is!
$Gen = 5
$v2 = (measure-command{
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers)
{$Finalmembers = $AllMembers.items | Where-object {(($_.qualifiedName) -ne $null) -and (($_.qualifiedName).split(".").count -eq $Gen)}}
$Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
}).totalseconds
Write-Host ('15k members {0} secs' -f $v2)
Same basic logic – except we’re filtering out the nulls, and then counting the number of "."s to define how deep we are in the hierarchy.
A bit slower – but reasonable given the size of the dimension.
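To make the counting concrete, here's a throwaway example with a made-up qualifiedName (with the obvious caveat that a member containing a "." in its name would throw the count out):
$ExampleQualifiedName = "[Entity].[APAC].[Australia]" # a made-up qualified name, three members deep
$ExampleQualifiedName.split(".").count # returns 3, i.e. a generation 3 member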
Hopefully that helps – I’ve built out the queries above as powershell functions available below – feel free to modify, use etc. as needed, making sure to always test, test and test again before running anything in Production.
Function Get-LevMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to the get the dimensional members referenced by level
.DESCRIPTION
Queries PBCS Data Management Rest API to the get the dimensional members referenced by level - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER App
The Data management application name (refer to the target application screen in setup for the names)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Lev
The level in the dimension you wish to return the members for
.PARAMETER Filename
Optional local path to where you would like the file saved - defaults to $Dim_Lev$Lev_Members.csv in the current directory.
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Lev 0
.EXAMPLE
Get-LevMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 0 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username,
[Parameter(Mandatory=$True)][string]$Password,
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App,
[Parameter(Mandatory=$True)][string]$Dim,
[Parameter(Mandatory=$True)][int]$Lev,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Lev' + $Lev + '_Members.csv'
)
# Construct Domains
$URLSplit = ([uri]"$URL").Host.split('-').split('.')
if ($URLSplit[1] -eq "test")
{$Domain = $URLSplit[2]}
else
{$Domain=$URLSplit[1]}
#Trim the URLs in case they have trailing /
$URL = $URL.TrimEnd('/')
# Construct URL
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim
# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
# Invoke PBCS REST API to get Latest API version
try
{
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
$Finalmembers = foreach($member in $AllMembers)
{$member.items | where childCount -eq "0"}
while ($Lev -gt 0) # for any level higher than 0, recursively query
{
$lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass)
{$member.parentName} #add the parent names to the lookup table
$lookup = $lookup | sort -Unique # grab only the unique parents
$Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
{$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
$lev-- #iterate through the levels
$Finalmembers = $upperMbrs #assign the upper members to the new list
}
$Finalmembers | Select-Object memberName | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
}
# Return the error if the connection fails
Catch
{throw $_.Exception.Message}
}
Function Get-GenMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to the get the dimensional members referenced by generation
.DESCRIPTION
Queries PBCS Data Management Rest API to the get the dimensional members referenced by generation - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER App
The Data management application name (refer to the target application screen in setup for the names)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Gen
The Generation in the dimension you wish to return the members for
.PARAMETER Filename
Optional local path to where you would like the file saved - defaults to $Dim_Gen$Gen_Members.csv in the current directory.
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-GenMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Gen 3
.EXAMPLE
Get-GenMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 4 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username,
[Parameter(Mandatory=$True)][string]$Password,
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App,
[Parameter(Mandatory=$True)][string]$Dim,
[Parameter(Mandatory=$True)][int]$Gen,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Gen' + $Gen + '_Members.csv'
)
# Construct Domains
$URLSplit = ([uri]"$URL").Host.split('-').split('.')
if ($URLSplit[1] -eq "test")
{$Domain = $URLSplit[2]}
else
{$Domain=$URLSplit[1]}
#Trim the URLs in case they have trailing /
$URL = $URL.TrimEnd('/')
# Construct URL
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim
# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
# Invoke PBCS REST API to get Latest API version
try
{
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers)
{$Finalmembers = $AllMembers.items | Where-object {(($_.qualifiedName) -ne $null) -and (($_.qualifiedName).split(".").count -eq $Gen)}}
$Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation # select the uniques and convert to csv
}
# Return the error if the connection fails
Catch
{throw $_.Exception.Message}
}
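To use them, dot-source the file and call whichever function you need, something like this (the file name here is made up, and the connection details are obviously yours to change):
# Load the functions into the current session and call them
. .\Get-PBCSMembers.ps1
Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Lev 0
Get-GenMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Gen 3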
Apologies – haven’t written in forever, real life got in the way as it sometimes does!
But I decided to put something together on a new feature dropping in the Sept-18 PBCS release: Summary Level Drill.
Summary what now?
So FDMEE\Data Management has had the ability to drill back to the staging table for a while – and for specific sources (read: Oracle) back to the transaction detail in the GL. A great feature sure, but not without its limitations. Specifically, you were limited to drilling only from the intersection where the data was loaded – notionally almost always level-0.
Depending on the structure of the chart of accounts, that can often mean users need to go to the bottom of a number of reasonably superfluous dimensions in order to get to a point where they can drill from. Making it even less functional for the users, often you want to see comparisons of transactions across all dimensions, i.e. all the transactions for a certain account, no matter which entity \ cost centre \ product \ customer \ project it was posted to. This problem only gets worse with ASO cubes with additional levels of detail and additional dimensions for reporting.
This led to a number of interesting alternatives.
Essbase Studio \ Essbase Integration Services
Old school – but it works. Cubes built through Essbase Studio natively inherited the ability to drill from upper levels – mostly because the query could be handed off to Essbase Studio, which already knew the level-0 members.
Downsides are plenty, but fundamentally
– it doesn’t work for planning
– it’s basically deprecated
– an equivalent doesn’t exist in the cloud
Posting to an upper level in FDMEE
The talented Ahmed Hafez (aka: HyperionJedi) worked out this gloriously hacky method.
Basically because FDMEE allows drill from anywhere you load data – you load data to the summary nodes.
The downsides?
– You need to create an additional load per upper level dimension
– You need to always load the data multiple times
– it has (significant) scale issues past 2-3 additional dimensions
Drillbridge
The brainchild of essbase nerd Jason Jones – this is a full featured drill through reporting tool.
Basically he’s doing it ‘properly’. When you hit an upper level, Drillbridge fires off a query to the API to get the level 0 members and builds the query that way. This is intentional – delivering an effective replacement for Essbase Studio (so people didn’t implement Studio JUST for drillthrough) was one of the main reasons behind the build.
So what’s annoying?
– It’s another app in the stack
– You need to store the data ‘somewhere’ – that will likely be easy to do on-premise, but possibly a little harder if you’re going 100% cloud.
So let’s look at Oracle’s crack at it
For this example I’m going to assume you’ve already got drillthrough turned on.
It’s fair to say they’ve made summary drill through as simple as possible – simply go into the Application Options in target application settings, and Enable Drill from Summary
Then trigger a quick reload:
And Bob’s your mother’s brother!
Or…err…Not. We still can’t drill from the upper levels.
So what went wrong?
Let’s go and look at the drill through definition that has been defined by FDMEE:
So the level-0 flag is now disabled, but the @LIST of drill intersections is a bit weird.
Basically we’ve ended up with mostly the level zero members – specifically the members where data is loaded.
Let’s change it to something else for the account dimension.
And let’s check our form?
Right click on a parent member
Success! Boil in a bag etc etc.
So did it work? Oh Yes..
It’s gone and brought back all of the accounts from the parent account node that we selected.
As a bonus they’ve built a target data grid that shows all the level zero intersections from Essbase.
AND a bonus drillthrough log!
with the important details – showing all of the accounts that it has returned.
So what about that hack job earlier?
Oh yes – let’s go and fix that.
So in our earlier efforts only the target accounts were being marked as valid in the drill definition. In order to make the upper levels available to drill on we need to turn off this marking so all the parent members are available to drill from.
Returning to our Application Options – untick the Create Drill Region check box on the affected dimension – in this example COA.
Trigger a reload.
And now all the accounts are missing from the list.
And all the summary accounts are available to drill from!
So I just turn off all the drill definitions? Easy!
Er… Not quite.
Let’s see what happens if we turn off the view dimension definition?
Yeah – that’s not what we want. None of those nodes are drillable…it would be awfully nice if we could customise it.
So…you know where this is going
Okay – we want to use the FDMEE drill through tables and views, and the ability to drill from summary detail, but we DON’T want to use the default definitions. Welcome to hack country.
Firstly, create a normal definition as per the process above and open it in Calc Manager. Take a copy of the XML Definition.
Once you’ve got a copy of the definition, delete the default one.
Create a NEW drill through definition with a new URL name, copy in the XML definition from the default, and create your own drill through region with the summary levels defined.
Return to Data Management and turn off the Drill region creation.
And now, whenever you load the data into PBCS it doesn’t recreate the definition. Instead users drill into the custom drill through definition that you’ve defined. Very useful in the circumstances where you have non-consolidating members in a hierarchy – or alternatively want only specific members to be enabled.
Wrapping it up
Okay – so it’s very possible that the default functionality will work for you without needing to turn to hacks. Nonetheless, it’s a really good way to get the absolute most out of the new drillthrough logic!
Now just hoping that Oracle adds some more functionality to the drillthrough report itself – even simple items like giving us the ability to easily organise, hide and reformat columns within the report would really enhance its end user appeal. Still, not bad for a 30 minute changeover!
So after a week of terrible, terrible REST puns from yours truly and the usual suspects, I figured I’d write up something non-REST related….you might even say unREST rela…okay, I’m done.
So What Are We Breaking Today?
For a recent project, we have been building a standard workforce \ position management cube. Fairly basic – uses some of the time factor stuff I wrote about here – but fundamentally nothing too fancy.
During the first feedback session with the client, we were asked – specifically while showing the ‘Move Position to a New Org Unit’ section – could we set a flag on the position for the Source Business Unit and Target Business Unit so both Business owners knew where the employee had come from and gone to?
This is actually reasonably nuanced. Security is set up per Business Unit, so if you move an employee to a Business Unit you don’t have access to, you actually lose access to the employee, and vice versa for the receiving Business Unit. Hence having the Source and Target BU flags at least provides visibility as to whom to contact.
Anyway, since it sounded like a really good idea, and I’m a cocky little prick confident young man I immediately said yes.
Okay, so the fundamental problem is that we need to be able to ‘Prompt’ for a Target Business Unit to move data into – therefore it really needs to return a member name. But we also want to grab both the Source and Target Business Units and write them to Smartlist flags.
This is interestingly difficult. Both Essbase and Planning scripts don’t ‘really’ like handling strings. Specifically, it’s possible to transition strings into member names, but it’s difficult to use strings as variables and very difficult to create text information on the fly. Needless to say, what I thought was going to be a 10 min job turned into a significantly longer process.
Now, as Celvin pointed out (possibly while sighing) all of this would be very simple in Groovy – but I’m old school. And stubborn. And we have got a few new functions to play with in PBCS, so let’s see what we can do in a stock PBCS instance.
FIX([[PlanningFunctions.getUserVarValue("UV_Scenario")]],[[PlanningFunctions.getUserVarValue("UV_Version")]],&Year_Assumption,"Local","Month")
FIX({Var_MovePosition})
FIX({Var_BUSource},{Var_EntitySource},{Var_ProjSource})
&Period_Assumption(
IF(@COUNT(SKIPNONE,@MERGE({Var_BUSource},@MEMBER(@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU}))))) == 1)
@RETURN("Source and Target Business Units cannot match",ERROR);
ENDIF
);
ENDFIX
FIX({Var_BUSource},{Var_EntitySource},{Var_ProjSource})
&Period_Assumption(
/*Push the data from Source to Target Business Unit and update the end date and transfer flag */
@XWRITE({Var_EntitySource},@LOOPBACK,@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU})),{Var_EntityTarget},{Var_ProjTarget});
);
"Emp_End_Date_Override" = @CalcMgrRollDay({Var_StartDate}, @_false);
"Transfer" = {Var_SL_Target_BU};
ENDFIX
FIX(@MEMBER(@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU}))),{Var_EntityTarget},{Var_ProjTarget},&Period_Assumption)
/* Update the Target Start date and Transfer Flag */
"Emp_Start_Date_Override" = {Var_StartDate};
"Transfer" = @CalcMgrDoubleFromString(@CalcMgrFindFirst(@ALIAS(@NAME({Var_BUSource}),"SLAliases"),"[0-9]+", @_true));
ENDFIX
ENDFIX
ENDFIX
What?
Okay, I realise I’m using a few new functions here. Let’s start from the top and work down.
Okay – so the documentation on this has gotten significantly better. Basically this first section is calling User Variables as variables in a Business rule.
This is a massive time saving versus the old hack job that we’ve been using for years.
These are really simple, just standard member variables. The only thing to bear in mind here is that you need to set up your action menu so it brings all the variables into the rule, otherwise it will show the Source members as prompts. The way action menus are best understood (in my mind at least) is that they will only natively bring in everything ‘left and up’ of where you’ve selected.
As such, I’ve elected to make the required parameter Position, which means that I can only bring up the context menu by selecting to the right of where Position is shown on the form. Note: Business Unit and Entity are in the Point of View.
The left hand panel shows a selection on the first column (Project), the right hand panel shows the selection from the second column onwards.
Essbase Error in Your Favour – Collect $200
&Period_Assumption(
IF(@COUNT(SKIPNONE,@MERGE({Var_BUSource},@MEMBER(@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU}))))) == 1)
@RETURN("Source and Target Business Units cannot match",ERROR);
ENDIF
);
Okay, the first of the fun lines. Functionally what this is designed to do is give an error message to the user if they pick the same Target Business Unit as the Source.
Starting from the right hand side, we’re first converting the Smartlist Target Business Unit {Var_SL_Target_BU} into a member:
when you automatically create a smartlist in PBCS it also creates a hidden alias table
This alias table (SLAliases) is added to the members you used as a Source of your smartlist
It contains every member with HSP_ID_ as a prefix and the assigned integer as a suffix
By wrapping @MEMBER around the @CONCATENATE we return a full valid member name using the SLAliases table. The @HspNumToString function converts the Smartlist ID into a numeric string that can be concatenated.
We then wrap the @COUNT(SKIPNONE, around the function, passing through the {Var_BUSource} member and the Target member from the Smartlist.
The @MERGE function wrapped around both members will merge all duplicate members.
The end state for this is: if the Source Business Unit is the SAME as the Target Business Unit, the count will be 1 and the @RETURN will fire, stopping the code.
So for the second section of code, we push the data into the Target Business Unit.
FIX({Var_BUSource},{Var_EntitySource},{Var_ProjSource})
&Period_Assumption(
/*Push the data from Source to Target Business Unit and update the end date and transfer flag */
@XWRITE({Var_EntitySource},@LOOPBACK,@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU})),{Var_EntityTarget},{Var_ProjTarget});
);
"Emp_End_Date_Override" = @CalcMgrRollDay({Var_StartDate}, @_false);
"Transfer" = {Var_SL_Target_BU};
Okay, still a few interesting functions here. @XWRITE is pretty common – but the optional @LOOPBACK function is useful in its ability to write back to the same database. Using @XWRITE here has two advantages:
We don’t have to worry about block creation (XWrite takes care of that)
Natively @XWRITE accepts ‘strings’ and converts them into member names – which makes the code significantly easier to manage
You can see here we’re using strings, so we don’t need to wrap the smartlist lookup @CONCATENATE(“HSP_ID_”,@HspNumToString({Var_SL_Target_BU})) with @MEMBER – the Alias string works just as well.
We’re also updating the Emp_End_Date_Override member with a new date using a fancy CDF. Specifically @CalcMgrRollDay wrapped with the @_false boolean flag to roll BACK one day (bonus points if you worked out @_true would roll forward one day).
Finally we assign the Transfer flag with the selected Smartlist. This is just an integer at this point, so can be directly assigned.
Last step – assign the flag and start dates on the Target Business Unit.
FIX(@MEMBER(@CONCATENATE("HSP_ID_",@HspNumToString({Var_SL_Target_BU}))),{Var_EntityTarget},{Var_ProjTarget},&Period_Assumption)
/* Update the Target Start date and Transfer Flag */
"Emp_Start_Date_Override" = {Var_StartDate};
"Transfer" = @CalcMgrDoubleFromString(@CalcMgrFindFirst(@ALIAS(@NAME({Var_BUSource}),"SLAliases"),"[0-9]+", @_true));
ENDFIX
Okay, the FIX is basically the same function again – this time wrapping it in @MEMBER so we can put it in a fix statement, but everything else is simple. Secondly Emp_Start_Date_Override just equals the start date (since we rolled the end date one day back in the previous step, there is no double up of costs on that day).
The last bit is fun though. We’re basically attempting to work out the Smartlist ID value that is associated with a Source Business Unit member variable.
To do that, we’re actually calling the @ALIAS function. This function, in PBCS at least, has a semi-undocumented feature of allowing you to select which alias table to look at. The documentation above is actually for Oracle Analytics Cloud – alternatively, have a look at the function itself within calc manager:
So we can convert the member name into the alias table, using the SLAliases alias list, which will give us our HSP_ID_xxxxxx.
At that point, we need to parse the function to get the number – and here regular expressions come into play.
We’re using the new @CalcMgrFindFirst function to provide the first match of the defined regular expression.
In order to match just the numbers in the string, I’m using the very simple match:
"[0-9]+"
Basically one or more of any number.
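If regex isn't your thing, you can sanity check the match from a PowerShell prompt (the ID here is made up):
[regex]::Match("HSP_ID_12345","[0-9]+").Value # returns 12345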
For anyone just starting in regular expressions (and I’m a neophyte myself), I cannot recommend Regex101 enough. I mean, look at it.
Once we’ve got a returned number from the regular expression match, we then convert it from a string into a number (strictly speaking, a double) using the @CalcMgrDoubleFromString.
So basically we’re converting a member to its alias so we can parse the ID from that string, then converting that to a number which just happens to be the Smartlist member that we actually want! Serendipitous.
And that’s that
Definitely didn’t expect to have to crack open regular expressions to get there, but I had some fun with lots of the new features and functions!
So, this is really a continuation of the previous one. Just as a recap:
the REST API is awesome
It lets us do all of the same functions as EPMAutomate, but with significantly better error handling and logging abilities
It also has a few ‘new’ tricks up its sleeve
I make terrible REST puns.
So – RESTating the Obvious
Where I’d left off, we’d written a powershell script to delete the existing load file, upload a new load file, run a data load and, IF it errors, download the log.
Great for one file, but what if we have a bit of data…what if we have a folder?
And this is what that log file looks like – remember we’re only getting the logs for the failed loads.
SquaREST peg in a Round Hole
So let’s talk about the FDMEE\Data Management log for a moment.
In many ways, this is literally the best log of…well, anything…that you can currently get out of PBCS. No access to the Essbase calc log, no access to the FR logs, no access to the MDXQuery logs.
So this is literally a replica of the FDMEE log, with options to set the log level and capture all the detail.
But, it’s still a bit all over the place. The actual reporting of errors is pretty good, but we’re definitely going to need some parsing to get anything useful out of it.
So looking through it, the errors we’re looking for are here:
This is the standard error code for an essbase (ASO) data load with missing metadata.
A pipe and comma delimited error handle broken down by line with the following fields:
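To give you a feel for it, each rejected record ends up looking something along these lines (member and data values made up for illustration): the error code, the offending member, then the source data row:
Error: 3303|P_NewWidget|E_1000,P_NewWidget,Jan-18,125.00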
Note: If anyone can think of a method of pulling that order out dynamically from anywhere else (specifically the UD1+ order), please let me know!
The FaiREST Parser
So if we grab the second field, we’ve got the specific missing members that failed. That’s fairly easy to parse in Powershell
# Okay - this is where we get fun...
$AllMissingMembers = New-Object System.Collections.Generic.List[System.Object] #create a list for the missing members
Get-Content $Logfilename | # open the file
Select-string "Error: 3303" | # find any Error 3303 strings (Missing members)
foreach{ #for each error row
$data = $_.line.split("|",[System.StringSplitOptions]::RemoveEmptyEntries).Trim(); # split them into members and data rows
$AllMissingMembers.add($data[1]) #build an list of missing members
}
$UniqueMembers = $AllMissingMembers | Select -unique # select the unique items
Without wanting to write a discourse on powershell, the functions are as follows:
Create a generic list object
Open the file (get-content is a little slow with bigger files, but these are pretty small)
Grab all of the rows containing "Error: 3303"
For each of those rows, split up the rows using the “|” delimiter
Grab the second item in the row (arrays start at zero!) and add it to the list
Once you’ve got all the members, create a unique list
Wham Bam – we have a unique list of members in a powershell list.
Now, to be honest we could stop here. It’s pretty easy in Powershell to just write that member list into an email – attach the data lines to give it context – and send it to the administrator.
But, 8 lines of code does still not a good blog post make. We’ve got the REST API – let’s try to get fancy.
So what about Planning’s dynamic member creation (dynamic children)? Conceptually it’s great. Useful for new projects, new employees, new assets etc. Gives the users the ability to dynamically add metadata and can be fully scripted in with @RETURN controls and validation.
Its biggest downside? It doesn’t work in Smartview – and in reality, that’s such a gamebreaker that I’ve gone off it completely in PBCS, or alternatively on premise ‘created’ my own with the Groovy CDF.
But – this process isn’t user facing – so let’s see what we can do with it!
Let’s create a couple of “Unmapped” nodes in our dimensions
Obviously you’d want to select only as many members as you’re likely to need per load file – and remember these get created in the essbase cube, so don’t add 300 additional hidden members in your dense dimension!
Once that’s done and refreshed into essbase – we can call the REST command to add members to those nodes.
For this we need to connect to the Planning API:
REST Resource - Add Member
POST /HyperionPlanning/rest/{api_version}/applications/{application}/dimensions/{dimname}/members
Similar to the dataload process, we’ll be creating a payload to post through the body of the REST POST query.
Also similar to the dataload process, the documentation is slightly incorrect.
Instead of Name, the member must be listed as memberName.
foreach($UniqueMember in $UniqueMembers) #Now we have a unique list of members, lets iterate and add them!
{
$CreateMbrURL = $URL + $RESTPlanning + "/dimensions/Product/members" # Create the Rest Add Member URL
# Generate the Add Member payload
$AddMemberPayload = "{
""memberName"": ""$UniqueMember"",
""parentName"": ""Unmapped Product""
}"
#Create the member
$restReponse = Invoke-RestMethod -Uri $CreateMbrURL -Method Post -Body $AddMemberPayload -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "Member Creation Unsuccessful - " + $restReponse.Details}
Else
{write-host $UniqueMember "- Member created Sucessfully"}
}
Obviously I’m being lazy and not generating the ParentName dynamically – up to you as to how you achieve this, but assuming your dimension members are all prefixed with the same letter it should be reasonably easy. You could also create multiple dynamic child enabled members within a single hierarchy – might be useful to map unmapped Balance Sheet members into one node and unmapped P&L members to another.
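As a rough sketch of what that could look like, assuming (and it is purely an assumption) that your account members are prefixed with "A" and your products with "P", a simple switch on the prefix would do the trick, with the result dropped into the payload in place of the hard-coded "Unmapped Product":
# Pick an "Unmapped" parent based on the member prefix - the prefixes and parent names here are assumptions
$ParentName = switch -Wildcard ($UniqueMember)
{
"A*" {"Unmapped Account"; break}
"P*" {"Unmapped Product"; break}
default {"Unmapped Other"}
}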
Importantly, the new member seems to inherit the data storage operator from its parent member. Which is useful, because every possible permutation of the actual documentation that I tried didn’t work.
"dataStorage": "STOREDATA" #what the documentation says
"dataStorage": "Store Data" #what the response body is
"DataStorage": "StoreData" #removing spaces
"DataStorage(FINRPT)": "Store Data" #Similar to the planning header
This was made even more frustrating because every failed attempt doesn’t return an error message, but instead a full 500: HTML Error. So, once again, if anyone knows the correct syntax, please let me know?
RESTructuring the cube
So now we’ve dynamically added the members that are missing, we need to refresh the essbase layer. In ‘theory’ (though I didn’t test this) – if you were loading through the Planning connection, you wouldn’t even need to do this step! But because we’re loading into the ASO cube, let’s refresh the cube.
This also opens up another REST API command: executing a job.
Nice and simple – first you create a job within PBCS and then you call it. Since Cameron Lackpour has already enthused the world with multiple blog posts on the topic I won’t belabour the point.
Basically create a refresh cube job and give it a name you remember.
Once that’s done, we call it very similarly to any other planning command, setup a quick payload and run it.
#Now all the metadata has been added, trigger a database refresh
$RefreshURL = $URL + $RESTPlanning + "/jobs" # Create the Rest refresh URL
# Generate the refresh cube payload
$RefreshPayload = "{
""jobType"": ""CUBE_REFRESH"",
""jobName"": ""Refresh_with_Kill""
}"
$restReponse = Invoke-RestMethod -Uri $RefreshURL -Method Post -Body $RefreshPayload -Headers $Headers -ContentType "application/json"
$RefreshURLResponse = $RefreshURL + "/" + $restReponse.jobID
$RefreshComplete = $false
while(-not $RefreshComplete)
{
#Call the status of the job
$Checkrunning = Invoke-RestMethod -Uri $RefreshURLResponse -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($Checkrunning.status -eq "-1")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#if the refresh is a sucesss,
if($Checkrunning.status -eq "0")
{
#Job was sucessful
write-host "Refresh Sucessful - " $Checkrunning.details
}
Else
{
#job failed
write-host "Refresh Failed - " $Checkrunning.details
}
$RefreshComplete = $true
}
}
Once again – we have to roll our own process to check when the job is complete. Calling the REST API job by itself will simply run the job, but for this step we need to wait for it to complete. Perhaps surprisingly (or perhaps not), this is different to the loaddata API calls.
From the original POST REST command we get the JobID, and then we run a GET command on that Job ID and we get the status. Why Oracle decided to change the definitions of Status and Details between the Data Management API and the Planning API is a question best left for the Oracles.
RESTuffing the Turkey
(The puns are really getting tortured. Sorry)
Final Step? Reload the data of course!
# invoke Request to Reload the data
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$ReloadComplete = $false
while(-not $ReloadComplete)
{
#Call the status of the job
$ReloadRunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($ReloadRunning.Jobstatus -eq "RUNNING")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#If the load is a sucess, confirm
if($ReloadRunning.Jobstatus -eq "SUCCESS")
{
#job completed
write-host "Data Reload Sucessful - " $ReloadRunning.jobStatus "Executed by " $ReloadRunning.executedBy
}
Else
#It may have failed for another reason (not metadata) - post details
{
#job failed
write-host "Data Reload Failed - " $ReloadRunning.details
}
$ReloadComplete = $true
}
}
This is simply a repeat of the original data load commands. The only item of interest is that I’m using a different variable ($ReloadComplete), as strictly speaking I’m still within the error handler of the original data load process.
So. What does this actually look like over a folder? Specifically this folder?
Well – to be honest, it looks pretty damn amazing.
Dynamically creating metadata, refreshing the cube as it goes and reloading only the impacted data files.
Going TerRESTrial
Once completed, it’s Child’s Play…I mean Childz play…I mean Child’s Play to add the new members to an email to the Admin\Business owner to move them to their correct home.
So there you have it, a fully automated, metadata creating, folder loading, fully logging, back to base emailing, data load process that doesn’t require EPMAutomate to run.
Not too bad for less than 250 lines of code!
As always, I’m deeply indebted to previous works from Jake Turrell and John Goodwin – without whom I would have undoubtedly given up.
Full code listing available below. As before, don’t run code against a production environment that you don’t understand, and make sure it’s doing exactly what you want before you do!
Cheers
Pete
<#
.SYNOPSIS
This script loads in data through the Rest API and downloads the log if there are errors
.DESCRIPTION
This script requires both a target environment, and accepts a switch for a file or folder
.EXAMPLE
./LoadData.ps1 -Env Prod -Rule BS_ASO -Folder c:\Test\BS
.NOTES
Change Control
When Who What
13/15/2017 PN Created
.LINK
#>
Param (
[Parameter(Mandatory=$True)]
[String]$Env = $(throw "Target Environment is mandatory"),
[Parameter(Mandatory=$True)]
[String]$Rule = $(throw "Rule name is mandatory"),
[Parameter(Mandatory=$True)]
[String]$Folder = $(throw "Folder name is mandatory")
)
# Parse the config file
$Config = "C:\Git\PBCS_Overnight\Rebuild\" + $Env + "_Config.xml"
[xml]$ConfigFile = Get-Content $Config
# Import configuration settings
$Username = $ConfigFile.Settings.ClientSettings.Username
$Password = $ConfigFile.Settings.ClientSettings.Password
$Env = $ConfigFile.Settings.ClientSettings.Env
$Domain = $ConfigFile.Settings.ClientSettings.Domain
$URL = $ConfigFile.Settings.ClientSettings.URL
$AppName = $ConfigFile.Settings.ClientSettings.AppName
$LogFolder = $ConfigFile.Settings.ScriptSettings.LogFolder
#Encode the authorisation
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}
#REST API URLS
$RESTLCM = "interop/rest/11.1.2.3.600/applicationsnapshots/"
$RESTData = "aif/rest/V1/jobs"
$RESTPlanning = "HyperionPlanning/rest/v3/applications/$AppName"
# Define the folder items as an array
$files = Get-ChildItem $Folder
# Now for each file in the file array
Foreach ($file in $files){
#Parse the filename
$LoadFile = Split-Path -Path $file.fullname -leaf -Resolve
$LoadPath = Split-Path -Path $file.fullname
$LoadParent = Split-Path (Split-Path $file.fullname -Parent) -Leaf
$inboxfolder = "inbox\" + $LoadParent + "\"
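# Assumes the file name ends with the load period - e.g. something like BS_Actual_Jan-18.csv - so grab the last 6 characters of the base name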
$LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)
# Delete file if it exists
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile)
$restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "File Delete Unsucessful - "$restReponse.Details}
Else
{write-host "File Delete Sucessful -" $LoadFile}
# LCM Rest URL (for uploading files)
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
$UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"
#Upload File
$restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
If ($restReponse.Status -gt "0")
{write-host "File Upload Unsuccessful - "$restReponse.Details}
Else
{write-host "File Upload Successful -" $LoadFile}
#Build a data load payload (JSON)
$Payload = "{
""jobType"": ""DATARULE"",
""jobName"": ""$Rule"",
""startPeriod"": ""$LoadPeriod"",
""endPeriod"": ""$LoadPeriod"",
""importMode"": ""REPLACE"",
""exportMode"": ""STORE_DATA"",
""filename"":""$($inboxfolder + $LoadFile)""
}"
#Data Load Rest URL
$DataLoadURL = $URL + $RESTData
# invoke Request to trigger data load
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$Jobcomplete = $false
#Error Handling for data loads
while(-not $Jobcomplete)
{
#Call the status of the job
$Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($Checkrunning.Jobstatus -eq "RUNNING")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#if the load is a success
if($Checkrunning.Jobstatus -eq "SUCCESS")
{
#job completed successfully
write-host "Job Completed Successfully -" $LoadFile $Checkrunning.jobStatus "- Executed by" $Checkrunning.executedBy
}
Else
{
$Logfilename = $LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt"
$Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) # The definition of insanity....
$DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $Logfilename -Force
write-host "Job Failed - Log Downloaded -" $Logfilename " - Parsing for Missing Members..."
# Okay - this is where we get fun...
$AllMissingMembers = New-Object System.Collections.Generic.List[System.Object] #create a list for the missing members
Get-Content $Logfilename | # open the file
Select-string "Error: 3303" | # find any Error 3303 strings (Missing members)
foreach{ #for each error row
$data = $_.line.split("|",[System.StringSplitOptions]::RemoveEmptyEntries).Trim(); # split them into members and data rows
$AllMissingMembers.add($data[1]) #build a list of missing members
}
$UniqueMembers = $AllMissingMembers | Select -unique # select the unique items
write-host "Creating the following members:" $UniqueMembers
#Now we have a unique list of members, lets iterate and add them!
foreach($UniqueMember in $UniqueMembers)
{
$CreateMbrURL = $URL + $RESTPlanning + "/dimensions/Product/members" # Create the Rest Add Member URL
# Generate the Add Member payload
$AddMemberPayload = "{
""memberName"": ""$UniqueMember"",
""parentName"": ""Unmapped Product""
}"
#Create the member
$restReponse = Invoke-RestMethod -Uri $CreateMbrURL -Method Post -Body $AddMemberPayload -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "Member Creation Unsuccessful - "$restReponse.Details}
Else
{write-host $UniqueMember "- Member Created Successfully"}
}
#Now all the metadata has been added, trigger a database refresh
$RefreshURL = $URL + $RESTPlanning + "/jobs" # Create the Rest refresh URL
# Generate the refresh cube payload
$RefreshPayload = "{
""jobType"": ""CUBE_REFRESH"",
""jobName"": ""Refresh_with_Kill""
}"
$restReponse = Invoke-RestMethod -Uri $RefreshURL -Method Post -Body $RefreshPayload -Headers $Headers -ContentType "application/json"
$RefreshURLResponse = $RefreshURL + "/" + $restReponse.jobID
$RefreshComplete = $false
while(-not $RefreshComplete)
{
#Call the status of the job
$Checkrunning = Invoke-RestMethod -Uri $RefreshURLResponse -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($Checkrunning.status -eq "-1")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#if the refresh is a success
if($Checkrunning.status -eq "0")
{
#Job was successful
write-host "Refresh Successful - " $Checkrunning.details
}
Else
{
#job failed
write-host "Refresh Failed - " $Checkrunning.details
}
$RefreshComplete = $true
}
}
# invoke Request to Reload the data
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$ReloadComplete = $false
while(-not $ReloadComplete)
{
#Call the status of the job
$ReloadRunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($ReloadRunning.Jobstatus -eq "RUNNING")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#If the load is a success, confirm
if($ReloadRunning.Jobstatus -eq "SUCCESS")
{
#job completed
write-host "Data Reload Completed Successfully -" $LoadFile $ReloadRunning.jobStatus "- Executed by" $ReloadRunning.executedBy
}
Else
#It may have failed for another reason (not metadata) - post details
{
#job failed
write-host "Data Reload Failed -" $LoadFile $ReloadRunning.details
}
$ReloadComplete = $true
}
}
}
$Jobcomplete = $true
}
}
}
I’m not one to wax lyrical, but y’all are amazing. I’m honoured to call you my friends.
So – what’s next
So after an amazing week, returning from KScope17 I found myself full of verve and enthusiasm, motivation and endorphins. Actually, Becky is always this enthusiastic…bad example.
So I figured I’d hit the ground running and bang out multiple quick posts about the REST API in PBCS. Also gives me an excuse to make bad REST puns through the entire post.
Needless to say: 4 weeks, 1 SR, 2 OTN posts, hours trolling through the amazing John Goodwin’s blog backlog and some urgent twitter DMs to the guru later, and I’m finally ready to post.
RESTating the background
So, I could write a whole background on what a REST API is and its benefit to developers – but I’ll leave that to the very talented Jake Turrell. This is his recorded session from KScope16, where he discusses the anatomy of a REST API request, talks through JSON and does a live demo (because doing something completely crazy and unsupported in front of a live audience is ALWAYS a great idea!).
It was one of my favourite presentations from KScope16, so I cannot recommend it strongly enough!
Without further RESTraint
So now you know what the REST API can do, let’s throw some code at the wall and see what sticks.
I’m going to walk through a very simple (and oft repeated process!) of uploading a datafile to a PBCS instance and triggering a data load rule.
Firstly though, let’s see what the same process looks like in EPMAutomate.
The code lines are fairly simple – Oracle has exposed the REST functions through very simple batch commands (a rough sketch follows the list of steps below).
Steps are:
Login
Delete File
Upload File
Trigger data load
Logout
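For reference, the EPMAutomate version looks something along these lines. Treat it as a rough sketch only: the install path, file and rule names are placeholders, and the exact rundatarule arguments may differ in your version, so check the EPMAutomate documentation before relying on it.
# Rough sketch of the EPMAutomate equivalent - paths, names and credentials are placeholders
$EpmAutomate = "C:\Oracle\EPM Automate\bin\epmautomate.bat"
& $EpmAutomate login $Username $Password $URL $Domain
& $EpmAutomate deletefile "inbox/BS/BS_Actual_Jan-13.txt"
& $EpmAutomate uploadfile "C:\Test\BS\BS_Actual_Jan-13.txt" "inbox/BS"
& $EpmAutomate rundatarule "BS_ASO" "Jan-13" "Jan-13" "REPLACE" "STORE_DATA" "inbox/BS/BS_Actual_Jan-13.txt"
& $EpmAutomate logout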
And this is what it looks like for a successful run:
And this is what it looks like with a failed load – I’ve added a data record for an entity that doesn’t exist
UncleaREST logging ever…
So we know we ‘can’ do it through EPMAutomate, but that logging is pretty atrocious. For anyone wondering, the EPMAutomate command does return an Errorlevel, but it doesn’t return any real details and there is no way to easily identify and grab the log.
POST /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents?q={"isLast":true,"chunkSize":444,"isFirst":true,"extDirPath":inbox\}
GET /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents
My code language of choice for today is Powershell – I’m still a bit rusty with Groovy, and Powershell has a very useful set of REST functions built in.
So the first tasks are to set variables, pass through some parameters and define the authorisation headers. I’m going to use the very elegant get-content function to pull all my variables from an XML file (read more about that here). In addition, I’m defining the specific REST API URLs for both the LCM and Data load functions.
We also need to create an encoded header for authentication – the format of which is as follows:
Domain.Username:Password
This then needs to be encoded into UTF8 and THEN converted into a base64string and added to a JSON header.
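In Powershell that boils down to three lines; this is the same block that appears in the full code at the bottom:
#Encode the authorisation in the Domain.Username:Password format
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $Password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$Headers = @{"Authorization"="Basic $($EncodedPassword)"}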
We can then split up the folder and file names to generate all of the required parameters. This is where using a ‘real’ code language is incredibly helpful – giving you a set of string and file system parsing parameters rather than needing to roll your own.
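As an example, this is how the file name, local path and load period get carved up in the full script at the bottom:
#Parse the folder path into the pieces we need
$LoadFile = Split-Path -Path $FiletoLoad -leaf -Resolve # file name only
$LoadPath = Split-Path -Path $FiletoLoad # local folder
$LoadParent = Split-Path (Split-Path $FiletoLoad -Parent) -Leaf # parent folder name, reused as the inbox subfolder
$inboxfolder = "inbox\" + $LoadParent + "\"
$LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6) # last 6 characters of the name, e.g. Jan-13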
So we finally get to our first REST command – the delete function.
For this command we need to URL encode the file path of the target to pass it through
Basically that converts the following path:
inbox/BS/BS_Actual_Jan-13.txt
into
inbox%5cBS%5cBS_Actual_Jan-13.txt
We then send the REST Delete command with the authentication header and the URL encoded filename. We also set the content type:
-contenttype "application/json"
in order to get the query return information in a JSON format.
This is where we start to see the value of using the REST API for queries. If a REST query fails, we will get a set of return information – in the case of the Delete function, we get the following return dataset in a JSON format.
Which we can easily parse into an actual error handler. Note that I got this output simply by running the delete function twice.
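Pulling that together, the delete is a one-liner plus a status check (lifted straight from the full script below):
# Delete the file if it already exists in the inbox
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile)
$restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "File Delete Unsuccessful - "$restReponse.Details}
Else
{write-host "File Delete Successful"}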
Much better than EPMAutomate and a simple ErrorLevel!
PRESTaging Data
# LCM Rest URL (for uploading files)
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
$UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"
#Upload File
$restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
If ($restReponse.Status -gt "0")
{write-host "File Upload Unsuccessful - "$restReponse.Details}
Else
{write-host "File Upload Successful"}
Now we deal with the upload, and this is where the ‘/’ delimiters get a little funky.
My URL is basically the REST API plus a filename, but I have to now define an extDirPath to load a file into the inbox directories for data management. Unfortunately this needs to be parsed as a Unix file path, therefore the slashes must be reversed
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
In another ‘learn from my mistakes and frustrations’ – the documentation indicates that the folder structure is “extDirPath”:”/inbox”
However this is incorrect – you don’t need a leading ‘/’ before inbox. This is the correct final syntax:
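As used in the upload command above, the query string fragment ends up looking like this (where $inboxfolder resolves to something like inbox/BS/):
"extDirPath":"$inboxfolder" # i.e. "extDirPath":"inbox/BS/" - note no leading slash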
Now onto the data load – the first step is defining a REST body payload – this defines the load periods, job name, import and export modes as well as the (optional) filename.
Once again, this is where the documentation differs from the reality.
Note how only a single period parameter is listed? Yeah. Anyways, through trial and error – mostly by looking at what the EPMAutomate commands were using – I tested startPeriod and endPeriod and it worked.
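For reference, the payload ends up looking like this, startPeriod and endPeriod being the part the documentation glosses over (it is the same block that appears in the full code at the bottom):
#Build a data load payload (JSON)
$Payload = "{
""jobType"": ""DATARULE"",
""jobName"": ""$Rule"",
""startPeriod"": ""$LoadPeriod"",
""endPeriod"": ""$LoadPeriod"",
""importMode"": ""REPLACE"",
""exportMode"": ""STORE_DATA"",
""filename"":""$($inboxfolder + $LoadFile)""
}"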
So here we get to the final load commands. The data load trigger is a POST command, passing through the Payload in the body, and the standard headers.
Remember that the API URLs are different between the LCM (data import\export) and the Data Management URLs. Specifically, the data management URL is
aif/rest/V1/jobs
The second command
$responseURL = $restReponse.links[0].href
is defining the rest response URL for this specific load – once again, very useful because we can poll this using a REST GET to check the status of the load.
$Jobcomplete = $false
while(-not $Jobcomplete)
{
#Call the status of the job
$Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($Checkrunning.Jobstatus -eq "RUNNING")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#if the load is a failure, download the log
if($Checkrunning.Jobstatus -eq "FAILED")
{
$Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) <# The definition of insanity.... #>
$DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt") -Force
write-host "Job Failed - Log Downloaded -" $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt")
}
Else
{
#job completed successfully
write-host "Job Completed Successfully - " $Checkrunning.jobStatus "Executed by " $Checkrunning.executedBy
}
$Jobcomplete = $true
}
}
The final component of the script – and this is where using the REST API shines. By polling the job status we can check for a failed run – and then, because one of the return codes is the logfile name (see $restReponse.LogfileName) we can then selectively download that specific log file.
Note once again we have to deal with some slight insanity – the filename of the log is returned in a Unix format (‘/’ slashes), whereas to download the file via the LCM endpoint we need to convert it BACK to a Windows format (‘\’). We then use a fairly standard GET command and write the output to a text file.
ARESTted Development
So there you have it, a full walkthrough of uploading a datafile and triggering a load rule using the REST API.
I’ll put the full code at the bottom – mostly to help anyone deal with the insanity of needing to chop and change the folder paths. Standard caveats apply, don’t run code you don’t understand in your production instances (or any instances!) and no warranties implied or otherwise.
I’ll be noting some of the updates\issues in the documentation and sending something through to Oracle – would make a lot of this development significantly easier!
Next I’ll be extending this process a bit further – actually parsing the downloaded log into actionable errors!
Any questions\issues, leave a comment. Happy RESTing!
Cheers
Pete
<#
.SYNOPSIS
This script loads in data through the Rest API and downloads the log if there are errors
.DESCRIPTION
This script requires both a target environment, and accepts a switch for a file or folder
.EXAMPLE
./LoadData.ps1 -Env Prod -Rule BS_ASO -Folder c:\Test\BS
.NOTES
Change Control
When Who What
13/15/2017 PN Created
.LINK
#>
<#
Param (
[Parameter(Mandatory=$True)]
[String]$Env = $(throw "Target Environment is mandatory"),
[Parameter(Mandatory=$True)]
[String]$Rule = $(throw "Rule name is mandatory"),
[Parameter(Mandatory=$True)]
[String]$FiletoLoad = $(throw "File name is mandatory")
)
#>
#Params
$Env = "Test"
$FiletoLoad = "C:\Git\PBCS_Overnight\Rebuild\BS\BS_Actual_Feb-13.txt"
$Rule = "BS_ASO"
# Parse the config file
$Config = "C:\Git\PBCS_Overnight\Rebuild\" + $Env + "_Config.xml"
[xml]$ConfigFile = Get-Content $Config
# Import configuration settings
$Username = $ConfigFile.Settings.ClientSettings.Username
$Password = $ConfigFile.Settings.ClientSettings.Password
$Env = $ConfigFile.Settings.ClientSettings.Env
$Domain = $ConfigFile.Settings.ClientSettings.Domain
$URL = $ConfigFile.Settings.ClientSettings.URL
$LogFolder = $ConfigFile.Settings.ScriptSettings.LogFolder
#Encode the authorisation
Add-Type -AssemblyName System.Web # load System.Web for [System.Web.HttpUtility]::UrlEncode
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}
#Parse the Folder Path
$LoadFile = Split-Path -Path $FiletoLoad -leaf -Resolve
$LoadPath = Split-Path -Path $FiletoLoad
$LoadParent = Split-Path (Split-Path $FiletoLoad -Parent) -Leaf
#Parse the filename
$inboxfolder = "inbox\" + $LoadParent + "\"
$LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)
#REST API URLS
$RESTLCM = "interop/rest/11.1.2.3.600/applicationsnapshots/"
$RESTData = "aif/rest/V1/jobs"
# Delete file if it exists
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile)
write-host $DeleteURL
$restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "File Delete Unsuccessful - "$restReponse.Details}
Else
{write-host "File Delete Successful"}
# LCM Rest URL (for uploading files)
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
$UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"
#Upload File
$restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
If ($restReponse.Status -gt "0")
{write-host "File Upload Unsuccessful - "$restReponse.Details}
Else
{write-host "File Upload Successful"}
#Build a data load payload (JSON)
$Payload = "{
""jobType"": ""DATARULE"",
""jobName"": ""$Rule"",
""startPeriod"": ""$LoadPeriod"",
""endPeriod"": ""$LoadPeriod"",
""importMode"": ""REPLACE"",
""exportMode"": ""STORE_DATA"",
""filename"":""$($inboxfolder + $LoadFile)""
}"
#Data Load Rest URL
$DataLoadURL = $URL + $RESTData
# invoke Request to trigger data load
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$Jobcomplete = $false
while(-not $Jobcomplete)
{
#Call the status of the job
$Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"
#check if the status is still running
if ($Checkrunning.Jobstatus -eq "RUNNING")
{
#wait for 5 secs
Start-sleep -seconds 5
}
Else
{
#if the load is a failure, download the log
if($Checkrunning.Jobstatus -eq "FAILED")
{
$Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) # The definition of insanity....
$DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt") -Force
write-host "Job Failed - Log Downloaded -" $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt")
}
Else
{
#job completed successfully
write-host "Job Completed Successfully - " $Checkrunning.jobStatus "Executed by " $Checkrunning.executedBy
}
$Jobcomplete = $true
}
}
It walks through a simple yet elegant code set to perform Automated Daily Timestamped Backups on PBCS. Given the consolidation of Oracle Cloud products, this process could easily be extended to any Oracle Cloud application that utilises LCM.
Absolutely brilliant, and well worth the read. It might just be me, but reading any cleanly documented, clearly written, optimised code is always a soothing experience. Errr.
Anyways, because we are all tragic and in dire need of better hobbies technical enthusiasts – I got in touch with Chris to discuss it further.
Quickly to go through his process:
Login to PBCS using EPMAutomate
Download the latest Artifact Snapshot
Rename the snapshot on the local machine with a datestamp
Upload the renamed snapshot to PBCS
My only problem with the process is literally the bandwidth cost. Even for a single client, wanting to back up a Prod and a Test environment means you could be downloading and uploading gigabytes \ day. If I then wanted to extend this functionality to our large (and growing quickly!) PBCS client base at my day job, we could be looking at 50gb+ of movements a day – which is problematic.
Even looking through the REST API documentation it doesn’t seem possible:
Available options are limited to: Possible values: self, import, export, upload, download, or delete depending on the operation permitted on an application snapshot
Okay, so we can’t rename an LCM Export. And downloading and uploading them costs too much time/bandwidth. But maybe we can trick it.
Let’s examine some of the other EPMAutomate functions – specifically exportsnapshot.
exportsnapshot Repeats a previously performed export operation to create a snapshot of Migration content. You can download the exported snapshot from the default location.
Usage: epmautomate exportsnapshot SNAPSHOT_NAME where SNAPSHOT_NAME is the name of an existing snapshot in Migration. This snapshot is replaced by the new snapshot.
Example: epmautomate exportsnapshot October16FullApp
Maybe we could use this function to create a ‘new’ snapshot with our datestamp – all we have to do is convince PBCS that the snapshot has already occurred.
Easiest way to do that? Upload something that looks like an LCM export.
Interestingly, PBCS fundamentally uses the same filesystem and file structure as on-premises Hyperion. Specifically, this is the PBCS ‘get everything’ export.xml format – I’ve highlighted the few fields that are specific to an application\environment.
This is available from the Artifact Snapshot in the top level of the zip file.
Having done a bit of work with LCM backups in the past, I figured it would be a similar process within PBCS. The Export.xml file defines the export process – so all we should need to do is zip up a default Export.xml, upload it to the server, and Robert is your mother’s brother.
for /f %%g in ('powershell get-date -format yyyyMMdd') do set datestamp=%%g
Set Snapshotname=%datestamp%_Backup
7za.exe a %SnapshotName%.zip C:\Test\Backup\Export.xml
Call %workingdir%\epmautomate login %user% %pass% %url% %domain% >> %Log%
Call %workingdir%\epmautomate uploadfile C:\Test\Backup\%SnapshotName%.zip >> %Log%
Call %workingdir%\epmautomate exportsnapshot %SnapshotName% >> %Log%
Call %workingdir%\epmautomate downloadfile %SnapshotName% >> %Log%
Call %workingdir%\epmautomate logout >> %Log%
Breaking it Down
for /f %%g in ('powershell get-date -format yyyyMMdd') do set datestamp=%%g
Firstly, a quick way to build a fully formatted datestamp in dos batch\powershell that deals with international locales.
Set Snapshotname=%datestamp%_Backup
7za.exe a %SnapshotName%.zip C:\Test\Backup\Export.xml
Setting the snapshot name, and creating a zip file with just the export.xml in it. The eagle eyed amongst you will note that there is no username and password in the export.xml file.
Create a datestamp(ed? – sic) zip file containing only a generic Export.xml
Login to PBCS using EPMAutomate
Upload your empty Snapshot
Trigger the uploaded artifact snapshot, adding to the empty snapshot file
(Optional) Download the updated snapshot to the local machine
Key advantage? Because you’re only uploading a shell file, your bandwidth costs drop to 2kb per backup!
What’s next?
So the only item really left to cover off is how to remove the old files from the server. One of the key notes that Chris made in his post was that the EPMAutomate Listfiles does not show the LCM files.
In my trials and tribulations (ahem, seamless process) of writing this all out, I realised that the LCM files are accessible through the REST API – so it should be easy (cough) to write a process to parse and delete them.
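As a starting point, something like the sketch below should get most of the way there. Fair warning: I'm assuming the LCM list endpoint (a GET against interop/rest/{api_version}/applicationsnapshots) and the shape of its return (an items collection with a name property), so treat the property names as placeholders and check them against the REST API documentation before trusting it. It also reuses the $URL, $Headers and $RESTLCM variables from the earlier scripts.
# Hedged sketch - list the application snapshots and delete datestamped backups older than 30 days
$Snapshots = Invoke-RestMethod -Uri ($URL + $RESTLCM) -Method Get -Headers $Headers -ContentType "application/json"
$Cutoff = (Get-Date).AddDays(-30)
Foreach ($Snapshot in $Snapshots.items)
{
# Only touch our own datestamped backups, e.g. 20170713_Backup
If ($Snapshot.name -match "^(\d{8})_Backup$")
{
$BackupDate = [datetime]::ParseExact($Matches[1], "yyyyMMdd", $null)
If ($BackupDate -lt $Cutoff)
{
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($Snapshot.name)
Invoke-RestMethod -Uri $DeleteURL -Method Delete -Headers $Headers -ContentType "application/json"
write-host "Deleted old backup -" $Snapshot.name
}
}
}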
And with that segue to a Part 2 – Peter tries to work out Groovy, Scripting and the REST APIs, I’ll leave it here.
I’ve been a bit lax with regard to posts – apologies – but Cameron harassed me to post this up after he’d not heard of it. To be honest, this is one of those ‘been in my back-pocket for so long I just assumed everyone knew’ kinda features (is there an acronym for that?)
Basically I was asked if anyone knew a way of showing only certain periods on a dataform…without having to update the form every month. This is one of those things that feels like it should be possible (nay easy) but…well…isn’t. Hence this hack.
Where did this all come about…
One of the common problems with Planning dataforms is a fairly hefty limitation in how they can be structured. This has generally led to a number of obscure and imaginative hacks to show the data that you want. Significant examples of limitations include:
No UDAs available to dynamically filter member selections
For those of you already heavily using PBCS you’ll note that a number of these functions have been significantly improved. Including:
Member Exclusions
Significantly More Member Selections
This gives you the flexibility to build dataforms however you like. In our case, it allows you to selectively show Right or Left members of a member.
So setting a subvar to &Period_Plan = Nov and defining a column header as IRSIBLINGS(&Period_Plan) will show just November and December!
Terrestrial Based Development?
On-premises you’re more limited. The functionality to allow for member selections is basically only Level-0 references and children – difficult if you wish to selectively show certain periods based upon a sub variable.
So what are our options? Interestingly, what you would ‘expect’ to work doesn’t. Defining “Nov,Dec” as a sub variable will not allow it to be used – see both ActPeriods and Period_Act_Range for examples of what I tried (although they are obviously for the Actual periods!). Using these, saving the dataform and attempting to open it will just throw a default ‘dataform is invalid’ error message.
What’s this about Merging SubVars then?
One of the interesting default behaviours of Planning dataforms is to ‘merge’ the same member into a single column. If you have a column or a row selection with the same member listed sequentially (ie: Actual, Actual, Budget, Forecast) it will merge the Actual columns together rather than duplicating the column.
Therefore we need to work out a way that we can use and abuse that behaviour to suppress the columns that we don’t want to show. In order to do this, we create 12 sub variables.
The variables are then updated with the ‘same’ value for the first month to be shown. In this example I’m showing just Nov and Dec, but with 12 variables it could obviously be extended to a full year. Obviously in a case like this you’re going to want to automate the setting of sub variables… but I’ll leave that as an exercise for the reader.
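That said, if you want a head start, here's a minimal sketch of one way to do it: generate the twelve values in Powershell and push them down as Essbase substitution variables via a MaxL script. The application, database and variable names (App.Plan1, Per01 to Per12) are assumptions for illustration, and it presumes the form subvars live at the Essbase level, already exist, and that you have the MaxL shell available.
# Hedged sketch - build 12 period subvars where the closed months all collapse to the first open month
$Months = "Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"
$FirstOpenMonth = "Nov" # first month you actually want to show on the form
$OpenIndex = [array]::IndexOf($Months, $FirstOpenMonth)
$MaxL = @()
For ($i = 0; $i -lt 12; $i++)
{
# closed months all get set to the first open month so the dataform merges them into a single column
$Value = if ($i -lt $OpenIndex) { $FirstOpenMonth } else { $Months[$i] }
$MaxL += "alter database 'App'.'Plan1' set variable 'Per$("{0:D2}" -f ($i + 1))' '$Value';"
}
$MaxL += "logout;"
$MaxL | Out-File -FilePath "C:\Temp\SetPeriodVars.msh" -Encoding ASCII
# Then run C:\Temp\SetPeriodVars.msh through the MaxL shell (essmsh) with your usual login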
Building the Dataform
Then our dataform becomes easy – the period dimension is defined as all 12 variables.
The closed variables then merge together on the dataform and we are left with only the plan periods. Simple!
Obviously we’re all hoping for the next round of 11.1.2.4 patches to bring some of this PBCS functionality back to on-premise (my kingdom for on-premise smartpush!) – however for the moment it’s a very effective, if a touch inglorious, hack!
So Kscope16 is over, done and dusted, and I’m back home. A quick thankyou to everyone who I met – was amazing to be able to put a face and a voice to the names/blogs/twitter accounts I’ve followed over the years!
For everyone else – you’ll be glad to know the likes of Jason Jones, Cameron Lackpour, Tim German and Celvin Kattookaran (and others I’ve naught mentioned for brevity, but you know who you are) are as tragically nerdy amazingly bright, excited and enthused about this space as their blogs suggest. I was really taken by their absolute state of gregarious comradeship – a spirit of learning and challenging yourself to become better – which led to both a humbling and inspiring couple of days.
A really quick story before I get started on the actual post. I was standing in the main foyer with a few people watching the boats go by when Celvin (a Chi-City native) commented that: ‘It’s amazing you know – tour guides on the boats volunteer all their time! They don’t get paid at all, they just do it because they love the city of Chicago’. Silence reigned for a few moments – until the irony of Celvin (of all people) commenting in shocked admiration that ‘people that volunteer their time, just for the love of it’ sunk in!
So this will be the first in a series – perhaps not as compleat as others – looking at some of the new features in PBCS that will hopefully wend their way back to On-Premises at some point!
This is a 16.07 PBCS Test environment – released early July.
First thing I’ll be looking at is the automated Smartlist generation.
Nice and simple, we create a few new members. You’ll see here I’ve created a list of members under a parent Super Rates.
The keen eyed amongst you might notice one of the other changes – duplicate aliases are now in place and supported completely. Aliases can have the same name as the member (and other members!) as long as the member names are unique. To quickly reference the release notes:
“When referencing an alias that resolves to multiple members in free-form mode, an error is returned noting that the alias cannot be resolved to a single member. You can do this either by using the Member Selector to select the correct alias name or by hand-typing the qualified name. Using the parent member name as a qualifier should be sufficient in most cases, given that only duplicate alias names are supported, not duplicate member names.”
At the bottom you’ll see the “Create From Members” checkbox with a member selection. I’m obviously picking the descendants of Super Rates rather than hard coding the list (which becomes very cool shortly).
After saving the smartlist, returning to the entries page shows the members!
You can see that the Name is set to the Member Name, while the Label has been updated to the Alias. In an earlier test I used the Aliases as the member names (because smartlist names have some limitations) – the spaces and special characters (%) were replaced by underscores, but the label was still correct.
So that’s awesome and should help massively in defining smartlists going forward! Stay tuned for some more new features (as I get a chance to find them and test them!)