Extracting Metadata from PBCS using the REST API – Updated!

G’day All,

So – I spent 7 hours working through a fancy REST API script – reading optimisation notes on PowerShell queries, re-working and re-writing the code to be simple and clean – all to finally post up a detailed ‘warts and all’ article.

And then I actually tried to use it! And it, er, didn’t quite go to plan.

So what is the issue?

Let’s look at a fairly basic hierarchy:

I’m using the Entity hierarchy in a fairly standard format. It’s reasonably ragged – some branches go down to state and some stop at country – but functionally it’s nothing that Essbase doesn’t handle perfectly.

Let me add the level zero flags.

The top section makes perfect sense – the issue comes with the Entity_NA member. Remembering the way our query works: for summary members, we iterate to find the parent member of each level zero.

And now you can see the problem (and hopefully not just because I highlighted it in bright red). We have a ragged hierarchy, so the summary levels are ragged as well. While perfectly correct, this is very likely not what we want.

Fortunately – we’re in PowerShell country

Looking at the qualified names gives us a good idea of how to solve the issue, though. By adding a “ParentMember” variable to the parameters, we can then filter on the qualifiedName to get only the members that sit under this upper-level member.

$Filename = "F:\PBCS_REST\Output.csv"
$AppName = "Demo"
$Dim = "Entity"
$Lev = 1
$ParentMember = "Total_Entity"

# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $Appname, $Dim

$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"   

$Finalmembers = foreach($member in $AllMembers) # iterate through all the members
    {$member.items |  Where qualifiedName -like "*``[$ParentMember*" | where childCount -eq "0"} #select only level zero UNDER the parent member
while ($Lev -gt 0) # for any level higher than 0, recursively query 
    {
    $lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass) 
        {$member.parentName}  #add the parent names to the lookup table
        $lookup = $lookup | sort -Unique # grab only the unique parents
    $Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
        {$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
    $lev -- #iterate through the levels 
    $Finalmembers = $upperMbrs #assign the upper members to the new list
    }
$Finalmembers | Select-Object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 

Much better! Now I can actually use this function to set up my Hybrid Dynamic Calc members correctly. The full updated Get-LevMbrs function is available below!
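
For example, pulling the level 1 members under Total_Entity with the updated function looks something like this (using the same placeholder credentials and URL as the examples in the help block):

Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com -App Demo -Dim Entity -Lev 1 -ParentMember Total_Entity -Filename "F:\PBCS_REST\Output.csv"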

Function Get-LevMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to get the dimensional members referenced by level 
.DESCRIPTION
Queries PBCS Data Management Rest API to get the dimensional members referenced by level - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER App
The Data Management application name (refer to the Target Application screen in setup for the name)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Lev
The level in the dimension you wish to return the members for
.PARAMETER ParentMember
An optional filter to only get the dimensional members under the named member - defaults to the top level dimension name
.PARAMETER Filename
Optional local path to where you would like the file saved - defaults to $Dim_Lev$Lev_Members.csv in the script directory. 
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Lev 0
.EXAMPLE
Get-LevMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 0 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username, 
[Parameter(Mandatory=$True)][string]$Password, 
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App, #note – this is the Appname from Data Management target application
[Parameter(Mandatory=$True)][string]$Dim, 
[Parameter(Mandatory=$True)][int]$Lev,
[Parameter(Mandatory=$False)][string]$ParentMember = $Dim,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Lev' + $Lev + '_Members.csv'
)

    # Construct Domains   
    $URLSplit = ([uri]"$URL").Host.split('-').split('.')
    if ($URLSplit[1] -eq "test")
    {$Domain = $URLSplit[2]}
    else
    {$Domain=$URLSplit[1]}
    
    #Trim the URLs in case they have trailing /
    $URL = $URL.TrimEnd('/')

    # Construct URL
    $PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim

    # Convert creds to auth headers
    $Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
    $Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)

    # Invoke PBCS REST API to get the dimension members
    try
    {
    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"   
        $Finalmembers = foreach($member in $AllMembers)
            {$member.items |  Where qualifiedName -like "*``[$ParentMember*" | where childCount -eq "0"} #select only level zero UNDER the parent member
        while ($Lev -gt 0) # for any level higher than 0, recursively query 
        {
            $lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass) 
                {$member.parentName}  #add the parent names to the lookup table
            $lookup = $lookup | sort -Unique # grab only the unique parents
            $Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
                {$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
            $lev-- #iterate through the levels
        $Finalmembers = $upperMbrs #assign the upper members to the new list
        }
        $Finalmembers | Select-Object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 
    }
    # Return the error if the connection fails
    Catch
    {throw $_.Exception.Message}
}

Extracting Metadata from PBCS using the REST API

G’day All,

As is seemingly often the case – most of my REST API work is simply riding on the coat-tails of greatness, aka: John Goodwin.

Specifically, a question came up on the OTN Planning forum: Identifying Level 0 in metadata export.

Obviously there are a number of manual ways to do it – simple Smartview exports, exporting the dimension and parsing the file, even writing a custom database export out to a flat file – but none of them are particularly elegant and/or streamlined, and they would be very difficult to integrate into a metadata update.

When John mentioned that he had found a way, I was obviously intrigued! Basically, in his investigation of the FDMEE on-prem to Cloud integration processes, he had found an undocumented Data Management query to return all of the members from within the requested dimension:

GET URL/aif/rest/V1/applications/App/dimension/Dim/ 

This query is from the Data Management layer – specifically, it queries the underlying FDMEE/Data Management tables that are populated by the target application metadata queries.

Running this query manually then returns a detailed listing of every single member including a large volume of metadata.

Unfortunately? It’s not the best metadata. ‘MemberLevel’ is a misnomer – basically you have +1 for the dimension name and -1 for everything else. ‘QualifiedName’ is useful, but because it’s a parent child listing you still need to iterate in order to actually identify where you are. Still – we should be able to construct some queries to give us what we want!
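
To make that concrete, a single item in the response looks something like the following (the values are invented for illustration; the field names are the ones used through the rest of this post):

memberName    : NSW
parentName    : AUS
qualifiedName : [Entity].[Total_Entity].[AUS].[NSW]
memberLevel   : -1
childCount    : 0
alias         : New South Wales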

Let’s start nice and simple – setting up our query:

$Username = "Username"
$Password = "password"
$URL = "https://planning-test-pivot2.pbcs.ap1.oraclecloud.com"
$Domain = "pivot2"
$AppName = "Demo" # This is the name of the Planning application
$Dim = "Entity" # This is the dimension you're querying
$Filename = "F:\PBCS_REST\Example.csv"

# Convert creds to auth headers
$Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
$Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)
$PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $Appname, $Dim

$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
$AllMembers | ConvertTo-Csv > $Filename -NoTypeInformation

This returns everything, and utilises PowerShell’s built-in CSV converter to render something reasonable.

Useful but not what we want – let’s build out a query to just return level 0 members.

$Finalmembers= @() # Create an array for the members
try{
    #Trigger the dim query
    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
    foreach ($member in $AllMembers)
        {$Finalmembers += $AllMembers.items | Where-Object {$_.childCount -eq "0"} | Select-object memberName -Unique }# filter for the level 0 members (where no children)
    $Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 
}
Catch {throw $_.Exception.Message}

We can’t use a level indicator, but it’s simple enough to just check if the members have children. This now returns us a nice simple list of members.

Great! So Level 0 is nice and easy. Recently, however, I’ve been doing some work with EPBCS Hybrid and found myself reminiscing about Celvin’s incredible Essbase Member Operation Utility – specifically its ability to bulk update higher-level members to a new storage type. So I got wondering: what if we want greater than level 0?

$Lev = 2 #define the level to return
# Trigger the Dim Function
$Finalmembers= @() # Create an array for the members
    #Trigger the dim query
    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
    foreach ($member in $AllMembers) {
        $Finalmembers += $AllMembers.items | Where-Object {$_.childCount -eq "0"} | Select-object memberName,parentName -Unique # filter for the level 0 members (where no children)
        }
    while ($Lev -gt 0) # for any level higher than 0, recursively query 
    {
    $upperMbrs=@() # build an upper level array 
    foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass) 
        {
        $upperMbrs += $AllMembers.items | Where-Object {$Finalmembers.parentName -eq $_.memberName} #give the details where the previous levels parent member equals the member name
        }
    $lev -- #iterate through the levels 
    $Finalmembers = $upperMbrs | Select-object memberName,parentName,alias -Unique # select only the unique members (to make the recursion quicker)
    }
$Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 

So this is a little more convoluted – but basically we’re finding the Level x members – getting all their parents and then looping and recursively searching up the hierarchy.

One note to make at this juncture – this is not speedy code. Recursive loops through the pipeline with a Where-Object filter are…about as bad as you can get for performance! It’s not that awful – pulling big dimensions (10k+ members) takes around 5 mins. Buuuuut…yeah, performance is important. Let’s re-write it.

    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
    foreach ($member in $AllMembers) {
    $FinalMembers = $member.items | where childCount -eq "0"
    }

Firstly, changing from a Where-Object clause to a simplified where clause removes our script block – meaning we can only do a simple comparison – but this massively speeds up the iteration.
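
If you want to see the difference on your own dimension, a quick Measure-Command comparison is an easy check – a minimal sketch, assuming $AllMembers is already populated from the call above:

$slow = (Measure-Command {$AllMembers.items | Where-Object {$_.childCount -eq "0"}}).TotalSeconds
$fast = (Measure-Command {$AllMembers.items | where childCount -eq "0"}).TotalSeconds
Write-Host ("Where-Object: {0}s  simplified where: {1}s" -f $slow, $fast)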

while ($Lev -gt 0) # for any level higher than 0, recursively query 
{
$lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass) 
    {$member.parentName}  #add the parent names to the lookup table
    $lookup = $lookup | sort -Unique # grab only the unique parents
$Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
    {$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
$lev -- #iterate through the levels 
$Finalmembers = $upperMbrs #assign the upper members to the new list
}
$Finalmembers | Select-Object memberName | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv

For the recursion – building the parent members into a lookup list with a single iteration means we can run through it much more quickly.

Much much quicker.
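
If you need to squeeze it even further, one optional tweak (my own addition, not part of the script above) is to load the lookup into a HashSet, making each membership check a constant-time Contains() call instead of an -in scan of the list:

$lookupSet = [System.Collections.Generic.HashSet[string]]::new([string[]]$lookup)
$Uppermbrs = foreach ($member in $AllMembers)
    {$member.items | where {$lookupSet.Contains($_.memberName)}}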

But wait! You want to query using generations instead of levels? Alright then. We can’t use recursion up the ParentName hierarchy, instead we’ll use a (glorious) hackjob and look at the qualifiedName. By counting the number of members in the qualifiedName reference we can work out which generation it is!

$Gen = 5
$v2 = (measure-command{
$AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
foreach ($member in $AllMembers)
    {$Finalmembers = $AllMembers.items | Where-object {(($_.qualifiedName) -ne $null) -and (($_.qualifiedName).split(".").count -eq $Gen)}}
    $Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 
}).totalseconds

Write-Host ('15k members {0} secs' -f  $v2)

Same basic logic – except we’re filtering out the nulls, and then counting the number of “.” separators to define how deep we are in the hierarchy.
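
As a worked example of the counting (the member names are invented):

$qualifiedName = "[Entity].[Total_Entity].[AUS].[NSW]"
$qualifiedName.split(".").count  # returns 4, so this member sits at generation 4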

A bit slower – but reasonable given the size of the dimension.

Hopefully that helps – I’ve built out the queries above as PowerShell functions, available below. Feel free to modify and use them as needed, making sure to always test, test and test again before running anything in Production.

Function Get-LevMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to get the dimensional members referenced by level 
.DESCRIPTION
Queries PBCS Data Management Rest API to get the dimensional members referenced by level - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER App
The Data Management application name (refer to the Target Application screen in setup for the name)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Lev
The level in the dimension you wish to return the members for
.PARAMETER Filename
Optional local path to where you would like the file saved - defaults to $Dim_Lev$Lev_Members.csv in the script directory. 
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-LevMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Lev 0
.EXAMPLE
Get-LevMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 0 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username, 
[Parameter(Mandatory=$True)][string]$Password, 
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App, 
[Parameter(Mandatory=$True)][string]$Dim, 
[Parameter(Mandatory=$True)][int]$Lev,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Lev' + $Lev + '_Members.csv'
)

    # Construct Domains   
    $URLSplit = ([uri]"$URL").Host.split('-').split('.')
    if ($URLSplit[1] -eq "test")
    {$Domain = $URLSplit[2]}
    else
    {$Domain=$URLSplit[1]}
    
    #Trim the URLs in case they have trailing /
    $URL = $URL.TrimEnd('/')

    # Construct URL
    $PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim

    # Convert creds to auth headers
    $Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
    $Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)

    # Invoke PBCS REST API to get the dimension members
    try
    {
    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"   
        $Finalmembers = foreach($member in $AllMembers)
            {$member.items | where childCount -eq "0"}
        while ($Lev -gt 0) # for any level higher than 0, recursively query 
        {
            $lookup = foreach ($member in $Finalmembers) # for every member in the previous list (or level 0 for the first pass) 
                {$member.parentName}  #add the parent names to the lookup table
            $lookup = $lookup | sort -Unique # grab only the unique parents
            $Uppermbrs = foreach ($member in $allmembers) #iterate through the entire list again
                {$member.items | where {$_.memberName -in $lookup}} #filter now on the members in the lookup
            $lev-- #iterate through the levels 
            $Finalmembers = $upperMbrs #assign the upper members to the new list
        }
        $Finalmembers | Select-Object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 
    }
    # Return the error if the connection fails
    Catch
    {throw $_.Exception.Message}
}

Function Get-GenMbrs
{
<#
.SYNOPSIS
Queries PBCS Data Management Rest API to get the dimensional members referenced by generation 
.DESCRIPTION
Queries PBCS Data Management Rest API to get the dimensional members referenced by generation - will write to a file
.PARAMETER Username
The PBCS username for the instance
.PARAMETER password
The PBCS password for the instance
.PARAMETER URL
The PBCS URL including the HTTPS prefix
.PARAMETER AppName
The Data management application name (refer to the target application screen in setup for the names)
.PARAMETER Dim
The dimension you wish to return the members for
.PARAMETER Gen
The Generation in the dimension you wish to return the members for
.PARAMETER filname
Optional local path to where you would like the file saved - defaults to $Dim_Gen$Gen_Members.csv in the current directory. 
.OUTPUTS
Returns a csv file with the member names
.LINK
http://essbasedownunder.com/2019/04/extracting-metadata-from-pbcs-using-the-rest-api/
.EXAMPLE
Get-GenMbrs -Username admin -Password password -URL https://planning-test-domain.pbcs.ap1.oraclecloud.com/ -App Demo -Dim Entity -Gen 3
.EXAMPLE
Get-GenMbrs admin password https://planning-test-domain.pbcs.ap1.oraclecloud.com/ Demo Entity 4 "F:\PBCS\Members.csv"
.NOTES
Author: Peter Nitschke
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory=$True)][string]$Username, 
[Parameter(Mandatory=$True)][string]$Password, 
[Parameter(Mandatory=$True)][string]$URL,
[Parameter(Mandatory=$True)][string]$App, 
[Parameter(Mandatory=$True)][string]$Dim, 
[Parameter(Mandatory=$True)][int]$Gen,
[Parameter(Mandatory=$False)][string]$Filename = $PSScriptRoot + '\' + $Dim + '_Gen' + $Gen + '_Members.csv'
)
    # Construct Domains   
    $URLSplit = ([uri]"$URL").Host.split('-').split('.')
    if ($URLSplit[1] -eq "test")
    {$Domain = $URLSplit[2]}
    else
    {$Domain=$URLSplit[1]}
    
    #Trim the URLs in case they have trailing /
    $URL = $URL.TrimEnd('/')

    # Construct URL
    $PBCS_REST_URL = "{0}/aif/rest/V1/applications/{1}/dimension/{2}/" -f $URL, $App, $Dim

    # Convert creds to auth headers
    $Enc_Auth = [System.Text.Encoding]::UTF8.GetBytes(("{0}.{1}:{2}" -f $Domain, $Username, $Password))
    $Enc_PW = [System.Convert]::ToBase64String($Enc_Auth)

    # Invoke PBCS REST API to get the dimension members
    try
    {
    $AllMembers = Invoke-RestMethod -Uri $PBCS_REST_URL -Method GET -Headers @{"Authorization"="Basic $($Enc_PW)"} -ContentType "application/json"
    foreach ($member in $AllMembers)
        {$Finalmembers = $AllMembers.items | Where-object {(($_.qualifiedName) -ne $null) -and (($_.qualifiedName).split(".").count -eq $Gen)}}
    $Finalmembers | Select-object memberName -Unique | ConvertTo-Csv > $Filename -NoTypeInformation  # select the uniques and convert to csv 
    }
    # Return the error if the connection fails
    Catch
    {throw $_.Exception.Message}
}

RESTing around with PBCS 2 – Still RESTless

G’day All,

Two posts in two weeks!

So, this is really a continuation of the previous one. Just as a recap:

  • the REST API is awesome
  • It lets us do all of the same functions as EPMAutomate, but with significantly better error handling and logging abilities
  • It also has a few ‘new’ tricks up its sleeve
  • I make terrible REST puns.

So – RESTating the Obvious

Where I’d left off, we’d written a PowerShell script to delete the existing load file, upload a new load file, run a data load and, IF it errors, download the log.

Great for one file, but what if we have a bit of data…what if we have a folder?

Well – we’re in PowerShell country now!

# Define the folder items as an array
$files = Get-ChildItem $Folder 
# Now for each file in the file array
Foreach ($file in $files){

    #Parse the filename
    $LoadFile = Split-Path -Path $file.fullname -leaf -Resolve 
    $LoadPath = Split-Path -Path $file.fullname
    $LoadParent = Split-Path (Split-Path $file.fullname -Parent) -Leaf
    $inboxfolder = "inbox\" + $LoadParent + "\"
    $LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)

<# insert rest of code #> 
}

We can just define a set of files (as defined by a folder parameter) and then iterate through them one by one.
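
As a quick illustration, for a file such as C:\Test\BS\BS_Actual_Jan-13.txt (an invented path), the parsing above produces:

$LoadFile    -> BS_Actual_Jan-13.txt
$LoadPath    -> C:\Test\BS
$LoadParent  -> BS
$inboxfolder -> inbox\BS\
$LoadPeriod  -> Jan-13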

Express your DisinteREST

That’s great, but…two additional lines of code does not a blog post make.

How about we actually go and ‘do’ something with the log files that we’ve carefully curated.

How about we try and work out ‘what’ the errors are and attempt to do something about them.

Okay – so this is how we got the files last time:

$Logfilename = $LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt"
$Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) # The definition of insanity....
$DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $Logfilename -Force
write-host "Job Failed - Log Downloaded -" $Logfilename

And this is what that log file looks like – remember we’re only getting the logs for the failed loads.

SquaREST peg in a Round Hole

So let’s talk about the FDMEE\Data Management log for a moment.

In many ways, this is literally the best log of…well, anything…that you can currently get out of PBCS. No access to the Essbase calc log, no access to the FR logs, no access to the MDXQuery logs.

The internal Calc Manager log is…not great, although it is getting better.

So this is literally a replica of the FDMEE log, with options to set the log level and capture all the detail.

But, it’s still a bit all over the place. The actual reporting of errors is pretty good, but we’re definitely going to need some parsing to get anything useful out of it.

So looking through it, the errors we’re looking for are here:

This is the standard error code for an Essbase (ASO) data load with missing metadata.

| Error: 3303 | P9621 | "Non_Driver_Value","BNZG","Final","A26911","GS_BU001","Local","NA_Project","C900","D515","P9621","Month","Actual","FY13","Jan",-16330.55 | 
| Error: 3303 | P9620 | "Non_Driver_Value","BNZG","Final","A26911","GS_BU001","Local","NA_Project","C900","D515","P9620","Month","Actual","FY13","Jan",-32661.1 | 

A pipe- and comma-delimited error handle broken down by line – the second field holds the offending member, and the third holds the full data record.

Note: If anyone can think of a method of pulling that order out dynamically from anywhere else (specifically the UD1+ order), please let me know!

The FaiREST Parser

So if we grab the second field, we’ve got the specific missing members that failed. That’s fairly easy to parse in PowerShell.

# Okay - this is where we get fun... 
$AllMissingMembers = New-Object System.Collections.Generic.List[System.Object] #create a list for the missing members
Get-Content $Logfilename | # open the file
Select-string "Error: 3303" | # find any Error 3303 strings (Missing members)
foreach{ #for each error row
	$data = $_.line.split("|",[System.StringSplitOptions]::RemoveEmptyEntries).Trim(); # split them into members and data rows
	$AllMissingMembers.add($data[1])  #build a list of missing members 
	}
$UniqueMembers = $AllMissingMembers | Select -unique # select the unique items

Without wanting to write a discourse on PowerShell, the functions are as follows:

  • Create a generic list object
  • Open the file (get-content is a little slow with bigger files, but these are pretty small)
  • Grab all of the rows that contain “Error: 3303”
  • For each of those rows, split up the rows using the “|” delimiter
  • Grab the second item in the row (arrays start at zero!) and add it to the list
  • Once you’ve got all the members, create a unique list

Wham Bam – we have a unique list of members in a PowerShell list.

Now, to be honest, we could stop here. It’s pretty easy in PowerShell to just write that member list into an email – attach the data lines to give it context – and send it to the administrator.
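
As a minimal sketch of that simpler route (the SMTP server, addresses and subject are all placeholders, not part of the original process):

$Body = "The following members are missing:`r`n" + ($UniqueMembers -join "`r`n")
Send-MailMessage -From "pbcs@example.com" -To "admin@example.com" `
    -Subject "PBCS Load - Missing Members" -Body $Body `
    -Attachments $Logfilename -SmtpServer "smtp.example.com"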

But, 8 lines of code does still not a good blog post make. We’ve got the REST API – let’s try to get fancy.

RESTitute the Members

So one of the recent new features that I have a love and hate relationship with is dynamic member creation aka: members on the fly.

Conceptually it’s great. Useful for new projects, new employees, new assets etc. Gives the users the ability to dynamically add metadata and can be fully scripted in with @RETURN controls and validation.

Its biggest downside? It doesn’t work in Smartview – and in reality, that’s such a gamebreaker that I’ve gone off it completely in PBCS, or alternatively on-premise ‘created’ my own with the Groovy CDF.

But – this process isn’t user facing – so let’s see what we can do with it!

Let’s create a couple of “Unmapped” nodes in our dimensions

Obviously you’d want to select only as many members as you’re likely to need per load file – and remember these get created in the essbase cube, so don’t add 300 additional hidden members in your dense dimension!

Once that’s done and refreshed into Essbase – we can call the REST command to add members to those nodes.

For this we need to connect to the Planning API:

REST Resource - Add Member 
POST /HyperionPlanning/rest/{api_version}/applications/{application}/dimensions/{dimname}/members

Similar to the data load process, we’ll be creating a payload to post through the body of the REST POST query.

Similar, also, to the data load process, the documentation is slightly incorrect.

Instead of Name, the member must be listed as memberName.

foreach($UniqueMember in $UniqueMembers) #Now we have a unique list of members, lets iterate and add them!
{                                              
	$CreateMbrURL = $URL + $RESTPlanning + "/dimensions/Product/members" # Create the Rest Add Member URL
	# Generate the Add Member payload
	$AddMemberPayload = "{
		""memberName"": ""$UniqueMember"",
		""parentName"": ""Unmapped Product""
	}"
	#Create the member
	$restReponse = Invoke-RestMethod -Uri $CreateMbrURL -Method Post -Body $AddMemberPayload -Headers $Headers -ContentType "application/json"
	If ($restReponse.Status -gt "0")
	{write-host "Member Creation Unsuccessful - " + $restReponse.Details}
	Else
	{write-host $UniqueMember "- Member created Successfully"}
}

Obviously I’m being lazy and not generating the ParentName dynamically – it’s up to you how you achieve this, but assuming your dimension members are all prefixed with the same letter it should be reasonably easy (see the sketch below). You could also create multiple dynamic-child-enabled members within a single hierarchy – it might be useful to map unmapped Balance Sheet members into one node and unmapped P&L members to another.
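
A minimal sketch of the prefix approach (the prefixes and parent node names are assumptions):

# Assumption: Balance Sheet members are prefixed "B" and P&L members "P"
$ParentName = if ($UniqueMember -like "B*") {"Unmapped BS"} else {"Unmapped PL"}
$AddMemberPayload = "{
    ""memberName"": ""$UniqueMember"",
    ""parentName"": ""$ParentName""
}"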

Importantly, the new member seems to inherit the data storage operator from its parent member. Which is useful, because every possible permutation of the actual documentation that I tried didn’t work.


 "dataStorage": "STOREDATA" #what the documentation says
"dataStorage": "Store Data" #what the response body is
"DataStorage": "StoreData" #removing spaces
"DataStorage(FINRPT)": "Store Data" #Similar to the planning header


This was made even more frustrating because every failed attempt returns not an error message but a full 500 HTML error. So, once again, if anyone knows the correct syntax, please let me know?

RESTructuring the cube

So now we’ve dynamically added the members that were missing, we need to refresh the Essbase layer. In ‘theory’ (though I didn’t test this) – if you were loading through the Planning connection, you wouldn’t even need to do this step! But because we’re loading into the ASO cube, let’s refresh the cube.

This also opens up another REST API command: executing a job.

Nice and simple – first you create a job within PBCS and then you call it. Since Cameron Lackpour has already enthused the world with multiple blog posts on the topic I won’t belabour the point.

Basically create a refresh cube job and give it a name you remember.

Once that’s done, we call it very similarly to any other planning command, setup a quick payload and run it.

#Now all the metadata has been added, trigger a database refresh
$RefreshURL = $URL + $RESTPlanning + "/jobs" # Create the Rest refresh URL
# Generate the refresh cube payload
$RefreshPayload = "{
	""jobType"": ""CUBE_REFRESH"",
	""jobName"": ""Refresh_with_Kill""
}"
$restReponse = Invoke-RestMethod -Uri $RefreshURL -Method Post -Body $RefreshPayload -Headers $Headers -ContentType "application/json"

$RefreshURLResponse = $RefreshURL + "/" + $restReponse.jobID
$RefreshComplete = $false
while(-not $RefreshComplete)
{
	#Call the status of the job 
	$Checkrunning = Invoke-RestMethod -Uri $RefreshURLResponse -Method Get -Headers $Headers -ContentType "application/json"
	#check if the status is still running
	if ($Checkrunning.status -eq "-1")
		{
		#wait for 5 secs
		Start-sleep -seconds 5
		}
	Else
		{
		#if the refresh is a sucesss,
		if($Checkrunning.status -eq "0")
			{
				#Job was sucessful 
				write-host "Refresh Sucessful - " $Checkrunning.details
			}
		Else
			{
				#job failed
				write-host "Refresh Failed - " $Checkrunning.details
			}
		$RefreshComplete = $true
		}
} 

Once again – we have to roll our own process to check when the job is complete. Calling the REST API job by itself will simply run the job, but for this step we need to wait for it to complete. Perhaps surprisingly (or perhaps not), this is different to the load data API calls.

$RefreshURLResponse = $RefreshURL + "/" + $restReponse.jobID
$Checkrunning = Invoke-RestMethod -Uri $RefreshURLResponse -Method Get -Headers $Headers -ContentType "application/json"

From the original POST REST command we get the JobID, and then we run a GET command on that Job ID to get the status. Why Oracle decided to change the definitions of Status and Details between the Data Management API and the Planning API is a question best left for the Oracles.

RESTuffing the Turkey

(The puns are really getting tortured. Sorry)

Final Step? Reload the data of course!

# invoke Request to Reload the data
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$ReloadComplete = $false

while(-not $ReloadComplete)
{
#Call the status of the job 
$ReloadRunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"

#check if the status is still running
if ($ReloadRunning.Jobstatus -eq "RUNNING")
	{
	#wait for 5 secs
	Start-sleep -seconds 5
	}
Else
	{
	#If the load is a sucess, confirm 
	if($ReloadRunning.Jobstatus -eq "SUCCESS")
		{
			#job completed 
			write-host "Data Reload Sucessful - " $ReloadRunning.jobStatus "Executed by " $ReloadRunning.executedBy
		}
	Else
	#It may have failed for another reason (not metadata) - post details
		{
			#job failed 
			write-host "Data Reload Failed - " $ReloadRunning.details

		}
	$ReloadComplete = $true
	}
}

This is simply a repeat of the original data load commands. The only item of interest is that I’m using a different variable ($ReloadComplete), as strictly speaking I’m still within the error handler of the original data load process.

So. What does this actually look like over a folder? Specifically this folder?

Well – to be honest, it looks pretty damn amazing.

Dynamically creating metadata, refreshing the cube as it goes and reloading only the impacted data files.

Going TerRESTrial

Once completed, it’s Child’s Play…I mean Childz play…I mean Child’s Play to add the new members to an email to the Admin\Business owner to move them to their correct home.

So there you have it, a fully automated, metadata creating, folder loading, fully logging, back to base emailing, data load process that doesn’t require EPMAutomate to run.

Not too bad for less than 250 lines of code!

As always, I’m deeply indebted to previous works from Jake Turrell and John Goodwin – without whom I would have undoubtedly given up.

Full code line available below. As before don’t run code against a production environment that you don’t understand, and make sure it’s doing exactly what you want before you do!

Cheers
Pete

<#
.SYNOPSIS
This script loads in data through the Rest API and downloads the log if there are errors

.DESCRIPTION
This script requires both a target environment, and accepts a switch for a file or folder

.EXAMPLE
./LoadData.ps1 -Env Prod -Rule BS_ASO -Folder c:\Test\BS

.NOTES
Change Control
When       Who    What
13/15/2017  PN   Created 

.LINK
#>
<#Param (
	
    [Parameter(Mandatory=$True)]
    [String]$Env = $(throw "Target Environment is mandatory"), 

    [Parameter(Mandatory=$True)]
	[String]$Rule = $(throw "Rule name is mandatory"),

    [Parameter(Mandatory=$True)]
	[String]$Folder = $(throw "Folder name is mandatory")
)#>

# Parse the config file
$Config = "C:\Git\PBCS_Overnight\Rebuild\" + $Env + "_Config.xml"
[xml]$ConfigFile = Get-Content $Config

# Import configuration settings
$Username = $ConfigFile.Settings.ClientSettings.Username
$Password = $ConfigFile.Settings.ClientSettings.Password
$Env = $ConfigFile.Settings.ClientSettings.Env
$Domain = $ConfigFile.Settings.ClientSettings.Domain
$URL = $ConfigFile.Settings.ClientSettings.URL
$AppName = $ConfigFile.Settings.ClientSettings.AppName
$LogFolder = $ConfigFile.Settings.ScriptSettings.LogFolder

#Encode the authorisation
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}

#REST API URLS
$RESTLCM = "interop/rest/11.1.2.3.600/applicationsnapshots/"
$RESTData = "aif/rest/V1/jobs"
$RESTPlanning = "HyperionPlanning/rest/v3/applications/$AppName"

# Define the folder items as an array
$files = Get-ChildItem $Folder 

# Now for each file in the file array
Foreach ($file in $files){

    #Parse the filename
    $LoadFile = Split-Path -Path $file.fullname -leaf -Resolve 
    $LoadPath = Split-Path -Path $file.fullname
    $LoadParent = Split-Path (Split-Path $file.fullname -Parent) -Leaf
    $inboxfolder = "inbox\" + $LoadParent + "\"
    $LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)

    # Delete file if it exists
    $DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile)
    $restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
    If ($restReponse.Status -gt "0")
    {write-host "File Delete Unsucessful - "$restReponse.Details}
    Else
    {write-host "File Delete Sucessful -" $LoadFile}


    # LCM Rest URL (for uploading files)
    $inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
    $UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"

    #Upload File
    $restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
    If ($restReponse.Status -gt "0")
    {write-host "File Upload Unsucessful - "$restReponse.Details}
    Else
    {write-host "File Upload Sucessful -" $LoadFile}

    #Build a data load payload (JSON)
    $Payload = "{
            ""jobType"": ""DATARULE"",
            ""jobName"": ""$Rule"",
            ""startPeriod"": ""$LoadPeriod"",
            ""endPeriod"": ""$LoadPeriod"",
            ""importMode"": ""REPLACE"",
            ""exportMode"": ""STORE_DATA"",
            ""filename"":""$($inboxfolder + $LoadFile)""
            }"
    #Data Load Rest URL
    $DataLoadURL = $URL + $RESTData
    # invoke Request to trigger data load
    $restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
    $responseURL = $restReponse.links[0].href
    $Jobcomplete = $false
    #Error Handling for data loads
    while(-not $Jobcomplete)
    {
        #Call the status of the job 
        $Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"

        #check if the status is still running
        if ($Checkrunning.Jobstatus -eq "RUNNING")
            {
            #wait for 5 secs
            Start-sleep -seconds 5
            }
        Else
            {
                #if the load is a sucess
                if($Checkrunning.Jobstatus -eq "SUCCESS")
                    {
                        #job completed sucessfully
                        write-host "Job Completed Successfully - " $LoadFile $Checkrunning.jobStatus "- Executed by" $Checkrunning.executedBy
                    }
                Else
                    {
                        $Logfilename = $LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt"
                        $Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) # The definition of insanity....
                        $DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
                        Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $Logfilename -Force
                        write-host "Job Failed - Log Downloaded -" $Logfilename " - Parsing for Missing Members..."

                        # Okay - this is where we get fun... 
                        $AllMissingMembers = New-Object System.Collections.Generic.List[System.Object] #create a list for the missing members
                        Get-Content $Logfilename | # open the file
                        Select-string "Error: 3303" | # find any Error 3303 strings (Missing members)
                        foreach{ #for each error row
                            $data = $_.line.split("|",[System.StringSplitOptions]::RemoveEmptyEntries).Trim(); # split them into members and data rows
                            $AllMissingMembers.add($data[1])  #build an list of missing members 
                            }
                        $UniqueMembers = $AllMissingMembers | Select -unique # select the unique items
                        write-host "Creating the following members:" $UniqueMembers
                        
                        #Now we have a unique list of members, lets iterate and add them!
                        foreach($UniqueMember in $UniqueMembers)
                        {                                              
                            $CreateMbrURL = $URL + $RESTPlanning + "/dimensions/Product/members" # Create the Rest Add Member URL
                            # Generate the Add Member payload
                            $AddMemberPayload = "{
                                ""memberName"": ""$UniqueMember"",
                                ""parentName"": ""Unmapped Product""
                            }"
                            #Create the member
                            $restReponse = Invoke-RestMethod -Uri $CreateMbrURL -Method Post -Body $AddMemberPayload -Headers $Headers -ContentType "application/json"
                            If ($restReponse.Status -gt "0")
                            {write-host "Member Creation Unsuccessful - " + $restReponse.Details}
                            Else
                            {write-host $UniqueMember "- Member created Sucessfully"}
                            
                        }
                        #Now all the metadata has been added, trigger a database refresh
                        $RefreshURL = $URL + $RESTPlanning + "/jobs" # Create the Rest refresh URL
                        # Generate the refresh cube payload
                        $RefreshPayload = "{
                            ""jobType"": ""CUBE_REFRESH"",
                            ""jobName"": ""Refresh_with_Kill""
                        }"
                        $restReponse = Invoke-RestMethod -Uri $RefreshURL -Method Post -Body $RefreshPayload -Headers $Headers -ContentType "application/json"
                        $RefreshURLResponse = $RefreshURL + "/" + $restReponse.jobID
                        $RefreshComplete = $false
                        while(-not $RefreshComplete)
                        {
                            #Call the status of the job 
                            $Checkrunning = Invoke-RestMethod -Uri $RefreshURLResponse -Method Get -Headers $Headers -ContentType "application/json"
                            #check if the status is still running
                            if ($Checkrunning.status -eq "-1")
                                {
                                #wait for 5 secs
                                Start-sleep -seconds 5
                                }
                            Else
                                {
		                        #if the refresh is a sucesss,
		                        if($Checkrunning.status -eq "0")
			                        {
				                        #Job was sucessful 
				                        write-host "Refresh Sucessful - " $Checkrunning.details
			                        }
		                        Else
			                        {
				                        #job failed
				                        write-host "Refresh Failed - " $Checkrunning.details
			                        }
                                $RefreshComplete = $true
		                        }
                        } 
	                    # invoke Request to Reload the data
                        $restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
                        $responseURL = $restReponse.links[0].href
                        $ReloadComplete = $false

                        while(-not $ReloadComplete)
                        {
	                    #Call the status of the job 
	                    $ReloadRunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"

	                    #check if the status is still running
	                    if ($ReloadRunning.Jobstatus -eq "RUNNING")
		                    {
		                    #wait for 5 secs
		                    Start-sleep -seconds 5
		                    }
	                    Else
		                    {
		                    #If the load is a sucess, confirm 
		                    if($ReloadRunning.Jobstatus -eq "SUCCESS")
			                    {
				                    #job completed 
				                    write-host "Data Reload Completed Successfully - " $LoadFile $ReloadRunning.jobStatus "- Executed by" $ReloadRunning.executedBy
			                    }
		                    Else
		                    #It may have failed for another reason (not metadata) - post details
			                    {
				                    #job failed 
				                    write-host "Data Reload Failed -" $LoadFile $ReloadRunning.details
			                    }
		                    $ReloadComplete = $true
		                    }
	                    } 
                    
                    }
                $Jobcomplete = $true
        }
    } 

}

RESTing Around with PBCS

KScope Recap

G’day All,

So a lightning quick recap of KScope17.

Cameron celebrated VE day
Celvin met a deer
Opal had a sugar high
Gary drove a train
Ludovic ate some tacos
Natalie experienced Puppy Love
Jason went to the danger zone
Rodrigo grew a moustache
Ricardo held an axe
Jake Turrell ransacked the house of Pacman
Edward Roske’s Hat had a crisis of confidence
Tim G was English
And Newbie of the Year Neviana literally did everything

I’m not one to wax lyrical, but y’all are amazing. I’m honoured to call you my friends.

So – what’s next

So after an amazing week, returning from KScope17 I found myself full of verve and enthusiasm, motivation and endorphins

Actually, Becky is always this enthusiastic…bad example.

So I figured I’d hit the ground running and bang out multiple quick posts about the REST API in PBCS. Also gives me an excuse to make bad REST puns through the entire post.

Needless to say, 4 weeks, 1 SR, 2 OTN posts, hours trawling through the amazing John Goodwin’s blog backlog and some urgent Twitter DMs to the guru later, and I’m finally ready to post.

RESTating the background

So, I could write a whole background on what a REST API is and its benefit to developers – but I’ll leave that to the very talented Jake Turrell. This is his recorded session from KScope16, where he discusses the anatomy of a REST API request, talks through JSON and does a live demo (because doing something completely crazy and unsupported in front of a live audience is ALWAYS a great idea!).

It was one of my favourite presentations from KScope16 so I cannot strongly recommend it enough!

Without further RESTraint

So now you know what the REST API can do, let’s throw some code at the wall and see what sticks.

I’m going to walk through a very simple (and oft repeated process!) of uploading a datafile to a PBCS instance and triggering a data load rule.

Firstly though, let’s see what the same process looks like in EPMAutomate.

The code lines are fairly simple – Oracle has exposed the REST functions through very simple batch parameters.
The steps (sketched after the list) are:

  • Login
  • Delete File
  • Upload File
  • Trigger data load
  • Logout
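
As a hedged sketch, those calls look something like the following from a PowerShell prompt (the file names, rule name and periods are placeholders, and the exact argument order should be checked against the EPMAutomate documentation):

epmautomate login $Username $Password $URL $Domain
epmautomate deletefile "inbox/BS/BS_Actual_Jan-13.txt"
epmautomate uploadfile "C:\Test\BS\BS_Actual_Jan-13.txt" "inbox/BS"
epmautomate rundatarule BS_ASO Jan-13 Jan-13 REPLACE STORE_DATA "inbox/BS/BS_Actual_Jan-13.txt"
epmautomate logout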


And this is what it looks like for a successful run:

And this is what it looks like with a failed load – I’ve added a data record for an entity that doesn’t exist

UncleaREST logging ever…

So we know we ‘can’ do it through EPMAutomate, but that logging is pretty atrocious. For anyone wondering, the EPMAutomate command does return an Errorlevel, but it doesn’t return any real details and there is no way to easily identify and grab the log.
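
Checking that errorlevel from PowerShell is about as good as it gets – a minimal sketch:

epmautomate rundatarule BS_ASO Jan-13 Jan-13 REPLACE STORE_DATA "inbox/BS/BS_Actual_Jan-13.txt"
if ($LASTEXITCODE -ne 0) { Write-Host "Load failed with exit code $LASTEXITCODE - no log, no details" }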

So let’s go to the REST API.

The four commands we are going to use are:

Delete Files

DELETE /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}

Upload Files

POST /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents?q={"isLast":true,"chunkSize":444,"isFirst":true,"extDirPath":inbox\}

Running Data Rules

POST aif/rest/V1/jobs {Payload}

Download Files

GET /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents

My code language of choice for today is PowerShell – I’m still a bit rusty with Groovy, and PowerShell has a very useful set of REST functions built in.

Define the headRESTS

# Parse the config file
$Config = "C:\Git\PBCS_Overnight\Rebuild\" + $Env + "_Config.xml"
[xml]$ConfigFile = Get-Content $Config

# Import configuration settings
$Username = $ConfigFile.Settings.ClientSettings.Username
$Password = $ConfigFile.Settings.ClientSettings.Password
$Env = $ConfigFile.Settings.ClientSettings.Env
$Domain = $ConfigFile.Settings.ClientSettings.Domain
$URL = $ConfigFile.Settings.ClientSettings.URL
$LogFolder = $ConfigFile.Settings.ScriptSettings.LogFolder

#Params 
$Env = "Test"
$FiletoLoad = "C:\Git\PBCS_Overnight\Rebuild\BS\BS_Actual_Jan-13.txt"
$Rule = "BS_ASO"

#REST API URLS
$RESTLCM = "interop/rest/11.1.2.3.600/applicationsnapshots/"
$RESTData = "aif/rest/V1/jobs"

#Encode the authorisation
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}

So the first tasks are to set variables, pass through some parameters and define the authorisation headers. I’m going to use the very elegant Get-Content function to pull all my variables from an XML file (read more about that here). In addition, I’m defining the specific REST API URLs for both the LCM and data load functions.
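
For reference, a minimal sketch of the XML layout implied by the properties the script reads ($ConfigFile.Settings.ClientSettings.* and $ConfigFile.Settings.ScriptSettings.LogFolder) – the element values are placeholders:

@'
<Settings>
  <ClientSettings>
    <Username>admin</Username>
    <Password>password</Password>
    <Env>Test</Env>
    <Domain>pivot2</Domain>
    <URL>https://planning-test-pivot2.pbcs.ap1.oraclecloud.com</URL>
  </ClientSettings>
  <ScriptSettings>
    <LogFolder>C:\Git\PBCS_Overnight\Logs</LogFolder>
  </ScriptSettings>
</Settings>
'@ | Set-Content "Test_Config.xml"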

We also need to create an encoded header for authentication – the format of which is as follows:

Domain.Username:Password

This then needs to be encoded into UTF8 and THEN converted into a base64string and added to a JSON header.

#Encode the authorisation
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}

Write-host $EncodedPassword
YTQ4NTk5Ny5wZXRlci5uaXRzY2hrZUBtcG93ZXJzb2x1dGlvbnMuY29tLmF1OkFTRDMzd2Fz

Basically leaving you with a header record as follows that gets passed through to every REST query.

{
"Authorization"="Basic (YTQ4NTk5Ny5wZXRlci5uaXRzY2hrZUBtcG93ZXJzb2x1dGlvbnMuY29tLmF1OkFTRDMzd2Fz)"
}

RESTringing

#Parse the Folder Path
$LoadFile = Split-Path -Path $FiletoLoad -leaf -Resolve 
$LoadPath = Split-Path -Path $FiletoLoad
$LoadParent = Split-Path (Split-Path $FiletoLoad -Parent) -Leaf

#Parse the filename
$inboxfolder = "inbox\" + $LoadParent + "\"
$LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)

We can then split up the folder and file names to generate all of the required parameters. This is where using a ‘real’ code language is incredibly helpful – giving you a set of string and file-system parsing functions rather than needing to roll your own.

Below is my nice final set of variables.

 Write-host $LoadFile
write-host $loadpath
write-host $loadparent
write-host $inboxfolder
write-host $loadperiod
BS_Actual_Jan-13.txt
C:\Git\PBCS_Overnight\Rebuild\BS
BS
inbox\BS\
Jan-13

One item that will become important later: I’m using a Windows box, and therefore all of the current file pathing has ‘\’ slashes for delimiters.

RESTage the data

# Delete file if it exists
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile) 
$restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt "0")
{write-host "File Delete Unsuccessful - "$restReponse.Details}
Else
{write-host "File Delete Successful"}

So we finally get to our first REST command – the delete function.

For this command we need to URL encode the file path of the target to pass it through

Basically that converts the following path:

 inbox/BS/BS_Actual_Jan-13.txt

into

 inbox%5cBS%5cBS_Actual_Jan-13.txt

We then send the REST Delete command with the authentication header and the URL encoded filename. We also set the content type:

 -contenttype "application/json"

in order to get the query return information in a JSON format.

This is where we start to see the value of using the REST API for queries. If a REST query fails, we will get a set of return information – in the case of the Delete function, we get the following return dataset in a JSON format.

Which we can easily parse into an actual error handle – Note: I got this simply by running the delete function twice

Much better than EPMAutomate and a simple ErrorLevel!

PRESTaging Data

 # LCM Rest URL (for uploading files)
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
$UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"

#Upload File
$restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
If ($restReponse.Status -gt "0")
{write-host "File Upload Unsuccessful - "$restReponse.Details}
Else
{write-host "File Upload Successful"}

Now we deal with the upload, and this is where the ‘/’ delimiters get a little funky.

My URL is basically the REST API plus a filename, but I now have to define an extDirPath to load the file into the inbox directories for Data Management. Unfortunately this needs to be passed as a Unix file path, therefore the slashes must be reversed.

$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....

In another ‘learn from my mistakes and frustrations’ – the documentation indicates that the folder structure is “extDirPath”:”/inbox”

However this is incorrect – you don’t need a leading ‘/’ before inbox. This is the correct final syntax:

/interop/rest/11.1.2.3.600/applicationsnapshots/BS_Actual_Jan-13.txt/contents?q={"isLast":true,"chunkSize":500,"isFirst":true,"extDirPath":"inbox/BS/"}

Finally, the -Infile command (ie: the source file) accepts the WINDOWS pathing. So the string is the full path using ‘\’ slashes.

-Infile C:\Git\PBCS_Overnight\Rebuild\BS\BS_Actual_Jan-13.txt

RESTituting Data

#Build a data load payload (JSON)
$Payload = "{
        ""jobType"": ""DATARULE"",
        ""jobName"": ""$Rule"",
        ""startPeriod"": ""$LoadPeriod"",
        ""endPeriod"": ""$LoadPeriod"",
        ""importMode"": ""REPLACE"",
        ""exportMode"": ""STORE_DATA"",
        ""filename"":""$($inboxfolder + $LoadFile)""
        }"

Now onto the data load – the first step is defining a REST body payload – this defines the load periods, job name, import and export modes as well as the (optional) filename.

Once again, this is where the documentation differs from the reality.

Note only a single Period is listed? Yeah. Anyways, through trial and error – mostly by looking at what the EPMAutomate commands were using – I tested startperiod and endperiod and it worked.

Obviously the moment I worked it out I found John Goodwin’s post on exactly the same topic…sigh.

Also note that the Inbox folder path needs to be the Unix format so:

"filename": "inbox/BS/BS_Actual_Jan-13.txt"

Moving on.

 #Data Load Rest URL
$DataLoadURL = $URL + $RESTData
# invoke Request to trigger data load
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href

So here we get to the final load commands. The data load trigger is a POST command, passing through the Payload in the body, and the standard headers.

Remember that the API base URLs differ between LCM (data import\export) and Data Management. Specifically, the Data Management URL is

 aif/rest/V1/jobs
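Putting the two side by side (both strings appear in the full script below):

# The two REST bases used in this post - completely different endpoints
$RESTLCM  = "interop/rest/11.1.2.3.600/applicationsnapshots/"   # LCM: file upload/download/delete
$RESTData = "aif/rest/V1/jobs"                                  # Data Management: trigger and poll jobs

$UploadURL   = $URL + $RESTLCM + $LoadFile + "/contents?q=..."  # (query string truncated here)
$DataLoadURL = $URL + $RESTData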

The second command

$responseURL = $restReponse.links[0].href

is capturing the REST response URL for this specific load – once again very useful, because we can poll it with a REST GET to check the status of the load.

$Jobcomplete = $false

while(-not $Jobcomplete)
{
    #Call the status of the job 
    $Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"

    #check if the status is still running
    if ($Checkrunning.Jobstatus -eq "RUNNING")
        {
        #wait for 5 secs
        Start-sleep -seconds 5
        }
    Else
        {
            #if the load is a failure, download the log
            if($Checkrunning.Jobstatus -eq "FAILED")
                {
                    $Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) <# The definition of insanity.... #>
                    $DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
                    Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt") -Force
                    write-host "Job Failed - Log Downloaded -" $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt")
                }
            Else
                {
                    #job completed successfully
                    write-host "Job completed Successfully - " $Checkrunning.jobStatus "Executed by " $Checkrunning.executedBy
                }
            $Jobcomplete = $true
    }
} 

The final component of the script – and this is where using the REST API really shines. By polling the job status we can check for a failed run, and because one of the returned fields is the logfile name (see $restReponse.LogfileName), we can then selectively download that specific log file.

Note once again we have to deal with some slight insanity – the logfile name comes back in Unix format (‘/’ slashes), whereas to download the file through the LCM endpoint we need to convert it BACK to Windows format (‘\’ slashes) before URL-encoding it. We then use a fairly standard GET command and write the output to a text file.
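Since this slash-flipping comes up more than once, you could wrap it in a small helper – a sketch only, and the function name (ConvertTo-LCMPath) is my own invention:

Function ConvertTo-LCMPath
{
    # Hypothetical helper: flips a Unix-style path returned by the REST API
    # into the URL-encoded, Windows-slashed form the LCM URL expects
    Param([Parameter(Mandatory=$true)][String]$UnixPath)
    Add-Type -AssemblyName System.Web   # ensures HttpUtility is loaded
    return [System.Web.HttpUtility]::UrlEncode($UnixPath.Replace("/","\"))
}

# e.g. $Logname = ConvertTo-LCMPath $restReponse.LogfileName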


ARESTted Development

So there you have it, a full walkthrough of uploading a datafile and triggering a load rule using the REST API.

I’ll put the full code at the bottom – mostly to help anyone deal with the insanity of needing to chop and change the folder paths. Standard caveats apply, don’t run code you don’t understand in your production instances (or any instances!) and no warranties implied or otherwise.

I’ll be noting some of the updates\issues in the documentation and sending something through to Oracle – it would make a lot of this development significantly easier!

Next I’ll be extending this process a bit further – actually parsing the downloaded log into actionable errors!

Any questions\issues, leave a comment. Happy RESTing!

Cheers
Pete

<#
.SYNOPSIS
This script loads in data through the Rest API and downloads the log if there are errors

.DESCRIPTION
This script requires a target environment, a rule name, and a file to load

 
.EXAMPLE
./LoadData.ps1 -Env Prod -Rule BS_ASO -FiletoLoad c:\Test\BS\BS_Actual_Jan-13.txt

.NOTES
Change Control
When       Who    What
13/15/2017  PN   Created 

.LINK


#>

<#
Param (
	
    [Parameter(Mandatory=$True)]
    [String]$Env = $(throw "Target Environment is mandatory"), 

    [Parameter(Mandatory=$True)]
	[String]$Rule = $(throw "Rule name is mandatory"),

    [Parameter(Mandatory=$True)]
	[String]$FiletoLoad = $(throw "File name is mandatory")
)
#>

#Params 
$Env = "Test"
$FiletoLoad = "C:\Git\PBCS_Overnight\Rebuild\BS\BS_Actual_Feb-13.txt"
$Rule = "BS_ASO"


# Parse the config file
$Config = "C:\Git\PBCS_Overnight\Rebuild\" + $Env + "_Config.xml"
[xml]$ConfigFile = Get-Content $Config

# Import configuration settings
$Username = $ConfigFile.Settings.ClientSettings.Username
$Password = $ConfigFile.Settings.ClientSettings.Password
$Env = $ConfigFile.Settings.ClientSettings.Env
$Domain = $ConfigFile.Settings.ClientSettings.Domain
$URL = $ConfigFile.Settings.ClientSettings.URL
$LogFolder = $ConfigFile.Settings.ScriptSettings.LogFolder

#Encode the authorisation
$EncodedAuthorisation = [System.Text.Encoding]::UTF8.GetBytes($Domain + "." + $Username + ':' + $password)
$EncodedPassword = [System.Convert]::ToBase64String($EncodedAuthorisation)
$headers = @{"Authorization"="Basic $($EncodedPassword)"}

#Parse the Folder Path
$LoadFile = Split-Path -Path $FiletoLoad -leaf -Resolve 
$LoadPath = Split-Path -Path $FiletoLoad
$LoadParent = Split-Path (Split-Path $FiletoLoad -Parent) -Leaf

#Parse the filename
$inboxfolder = "inbox\" + $LoadParent + "\"
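# Grab the last six characters of the basename to get the period, e.g. "Jan-13"
# from "BS_Actual_Jan-13.txt" (assumes the Mmm-YY filename suffix used in this post)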
$LoadPeriod = ([io.fileinfo]"$LoadFile").basename.Substring(([io.fileinfo]"$LoadFile").basename.get_Length()-6)


#REST API URLS
$RESTLCM = "interop/rest/11.1.2.3.600/applicationsnapshots/"
$RESTData = "aif/rest/V1/jobs"

# Load System.Web so [System.Web.HttpUtility]::UrlEncode is available
Add-Type -AssemblyName System.Web

# Delete file if it exists
$DeleteURL = $URL + $RESTLCM + [System.Web.HttpUtility]::UrlEncode($inboxfolder + $LoadFile)
write-host $DeleteURL
$restReponse = Invoke-RestMethod -Uri $DeleteURL -Method DELETE -Headers $Headers -ContentType "application/json"
If ($restReponse.Status -gt 0)
{write-host "File Delete Unsuccessful - "$restReponse.Details}
Else
{write-host "File Delete Successful"}

# LCM Rest URL (for uploading files)
$inboxfolder = $inboxfolder.Replace("\","/") # The insanity as to why I have to do this....
$UploadURL = $URL + $RESTLCM + $LoadFile + "/contents?q={""isLast"":true,""chunkSize"":500,""isFirst"":true,""extDirPath"":""$inboxfolder""}"

#Upload File
$restReponse = Invoke-RestMethod -Uri $UploadURL -Method Post -Headers $Headers -Infile ($LoadPath + "\" + $LoadFile) -ContentType "application/octet-stream"
If ($restReponse.Status -gt 0)
{write-host "File Upload Unsuccessful - "$restReponse.Details}
Else
{write-host "File Upload Successful"}


#Build a data load payload (JSON)
$Payload = "{
        ""jobType"": ""DATARULE"",
        ""jobName"": ""$Rule"",
        ""startPeriod"": ""$LoadPeriod"",
        ""endPeriod"": ""$LoadPeriod"",
        ""importMode"": ""REPLACE"",
        ""exportMode"": ""STORE_DATA"",
        ""filename"":""$($inboxfolder + $LoadFile)""
        }"


#Data Load Rest URL
$DataLoadURL = $URL + $RESTData

# invoke Request to trigger data load
$restReponse = Invoke-RestMethod -Uri $DataLoadURL -Method Post -Body $Payload -Headers $Headers -ContentType "application/json"
$responseURL = $restReponse.links[0].href
$Jobcomplete = $false

while(-not $Jobcomplete)
{
    #Call the status of the job 
    $Checkrunning = Invoke-RestMethod -Uri $responseURL -Method Get -Headers $Headers -ContentType "application/json"

    #check if the status is still running
    if ($Checkrunning.Jobstatus -eq "RUNNING")
        {
        #wait for 5 secs
        Start-sleep -seconds 5
        }
    Else
        {
            #if the load is a failure, download the log
            if($Checkrunning.Jobstatus -eq "FAILED")
                {
                    $Logname = [System.Web.HttpUtility]::UrlEncode($restReponse.LogfileName.Replace("/","\")) # The definition of insanity....
                    $DownloadURL = $URL + $RESTLCM + $Logname + "/contents"
                    Invoke-RestMethod -Uri $DownloadURL -Method Get -Headers $Headers -ContentType "application/json" | Out-File -FilePath $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt") -Force
                    write-host "Job Failed - Log Downloaded -" $($LogFolder + "\" + $Rule + "_" + $LoadPeriod + ".txt")
                }
            Else
                {
                    #job completed successfully
                    write-host "Job completed Successfully - " $Checkrunning.jobStatus "Executed by " $Checkrunning.executedBy
                }
            $Jobcomplete = $true
    }
} 

Error Handling the Max Power(shell) way

Let’s start this off with a bang

If, like most of us, part of your day-to-day role is support, one of the key advantages of using ODI in your overnight process is having the full gamut of ODI error handling available to you. Errors are nicely caught, and email warnings for failures in batch processes are basically the default state.

However, what happens if ODI isn’t part of your implementation? What happens if the back end is basically a bunch of batch/maxl scripts hanging together (no matter how elegantly)? If so, at best you’re unlikely to have an easy method to trap errors and send notifications to the support team, and at worst you may not have any automated notification at all!

So your options are:

  1. Hope against hope that you never ever have any errors in your overnight processes
  2. Write your own error handling piece and include it in the processing

Rolling Your Own

So you’ve decided to ‘steal cable’ and write your own. While this process would be possible in batch, you’re likely to be better off selecting a more advanced language with slightly better handling of strings (see also: Java/Perl/Powershell/Cygwin/VBA).

I’ve gone with Powershell: slightly because I’m generally on Windows environments, somewhat because it’s fairly decent at string manipulation and mostly because it’s one of the few languages I know.

Code Overview

So – to start with, basic documentation and variable structure. I’ve written this to poll through a folder containing log files (rules file errors, maxl return errors and an archive copy of all of the Essbase Log files from the previous day). These have been moved to this folder by earlier maxl/batch steps, mostly by using STDERR outputs.
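As an illustration of that hand-off (the script and file names here are purely placeholders), an earlier overnight step might redirect its MaxL errors into the polled folder like so:

# Hypothetical earlier overnight step: run a MaxL script and push STDERR
# into the error folder this script polls (names are placeholders)
essmsh .\Overnight_Load.msh 2> ..\Errors\OvernightErrors\Overnight_Load.err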

[Screenshot: documentation header and variable setup – see the full code below]

Loop-de-Loop

For each of the files in the log folder, the script scans for any lines containing errors and pushes the entire line to a central log file.

I’ve added suppression to deal with the majority of ‘false positive’ errors – primarily Smartview user-side errors in the Essbase log, like ‘you’ve-got-suppression-on-by-default-and-your-top-level-retrieve-is-blank’ (man, if I had a nickel for that one…).
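The filter itself is just two chained Select-String calls – match “Error”, then drop the suppressed codes. A minimal sketch (the sample log name is a placeholder; the codes come from the full script below):

# Keep lines containing "Error", then drop any matching a suppressed code
$SUPPRESSEDERROR = "1020010","1020011","1020012"
Get-Content ".\Essbase_Sample.log" |
    Select-String "Error" |
    Select-String $SUPPRESSEDERROR -NotMatch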

[Screenshot: the file loop and error scan – see the full code below]

Letting you Know

Finally it checks if the log file isn’t blank and emails it through to the email addresses listed. The primary reason I’ve got the error suppression above is to make sure you’re only receiving automated emails when something has gone wrong. I find too many false positives in your inbox mean that the actual errors occasionally get lost in the underflow.
[Screenshot: the email notification step – see the full code below]

Code
Finally – the full code in all its glory. 20 lines of working code all up!

# Name: Hyperion Mail Error script
# Purpose: Sends an email to the support team if there are any errors in the overnight
# Author: Peter Nitschke - M-Power Solutions

# Change Control
# When Who What
# 19/07/2014 PN Created

# INSTRUCTIONS
# Script is built as a powershell command to be called at the end of the overnight
# Script will poll through the available files in the log locations folder searching for errors
# Some errors are suppressed (var SUPPRESSEDERROR), additional error codes to be suppressed can be added to this variable
# If any errors are found, it will automatically send an email through to the support team

Set-Variable ERRORFOLDER "..\Errors\OvernightErrors"
Set-Variable ERRORLOG "..\Logs\OvernightErrors.log"

#ERRORS TO BE SUPPRESSED
#1020010 No data was generated: Suppress Missing = [%s], Zeros = [%s]. Sheet not overwritten.
#1020011 Maximum number of rows [%s] exceeded [%s].
#1020012 Maximum number of columns [%s] exceeded [%s].
Set-Variable SUPPRESSEDERROR "1020010","1020011","1020012"

# Create or clear the status file
new-item $ERRORLOG -type file -force

#For all of the error log files
(Get-ChildItem $ERRORFOLDER |

#Create array of objects
Foreach-object{
    #Define the name and content
    $name = $_.name
    $content = Get-Content $_.FullName
    #look for errors in the file
    switch(($content | Select-String "Error" | Select-String $SUPPRESSEDERROR -NotMatch | measure | select -exp count) -gt 0) {
        #if errors found, print the filename and the error
        True {Echo $name >> $ERRORLOG
              $content | Select-String "Error" | Select-String $SUPPRESSEDERROR -NotMatch >> $ERRORLOG
              Echo "" >> $ERRORLOG}
    }
}
)
#Once the loop is finished, check if the status file has any content and email if true
If ((Get-Content $ERRORLOG) -ne $Null) {
    #Calling send email function
    $smtp = "mail.gmail"
    $emailto = "HyperionSupport@EssbaseDownUnder.com"
    $emailfrom = "HyperionSupport@EssbaseDownUnder.com"
    $subject = "Hyperion Overnight – Errors " + (get-date -f "dd/MM/yyyy")
    $body = (get-content $ERRORLOG | Out-String)

    send-mailmessage -to $emailto -from $emailfrom -SmtpServer $smtp -subject $subject -body $body
}

As an aside, I’m aiming to provide full code snippets as often as possible. Most of my learning over the years has been picking up code from everywhere and jerry-rigging it all together. (Subnote: greatest wikipedia ‘Not to be confused with’ ever!). Please feel free to use, send to colleagues, share with friends, or claim as your own to help you pick up ladies.

However, standard rules of never putting something into your environment that you haven’t first tested and understood apply. I’m not to blame if it takes these lemons and burns your house down.

Conclusions

Hopefully this has given you some ideas as to adding basic error handling to your own environments, allowing you to both rest easier at night and feel like you’re always one step ahead of any problems as they occur!

Cheers
Pete