Defining the Problem
Hello all and welcome to another episode of “this used to be a problem”. Before starting, let’s state the obvious: I really like PBCS, and in my opinion its benefits far outweigh those of on-premise Hyperion. I am sure some would disagree, but that is exactly why this is my opinion and you are welcome to yours (maybe a topic for a future blog post). This leaves me with an interesting task: finding areas where PBCS has limitations that are not issues in on-premise Hyperion, and then finding solutions/workarounds to address them.
Today we look at on-demand data loads from source to PBCS. In on-premise Hyperion, if the requirement is to trigger on-demand data loads from within Hyperion, there are a few options, such as Calc Manager command-line triggers or Workspace Generic Jobs, both of which can run batch scripts on the underlying servers to trigger the load from source systems to Hyperion. In PBCS this is a lot more difficult for a few reasons:
- Firstly, PBCS is a cloud solution. This means that if a request is to initiate from PBCS to an on-premise system, IT would have to allow communication initiating from the internet (PBCS) to on-premise systems. This is usually considered a security risk and frowned upon by IT (I don’t agree with that, but these are the hurdles that will be thrown your way. You will feel like IT has asked you to run the 100m hurdles, and you will be required to jump through many technical hoops to get to acceptance. I simply believe it is a complex requirement that needs a robust, secure technical solution, but it is not a showstopper).
- Secondly, PBCS is not able to initiate requests to external systems, with the exception of those allowed by Oracle (SSO and other Oracle Cloud services/sources).
You could argue: why bother? You can simply schedule source loads to run more often, or have the users trigger the loads via on-premise EPM Automate batch scripts, custom Groovy consoles, process management systems, the source system itself, and many other ways. However, the reality is this: when a client/customer makes the investment in PBCS, they are investing in a cloud solution and that is exactly what they want. They want to log in to PBCS and do everything they need in one place. Asking them to do things outside of PBCS for PBCS-specific activities doesn’t carry the same glamour as a new-age cloud system (it feels like it’s not quite there). (Paused for a few minutes to think of a good example… here we go.) It is kind of like going to see your favorite sports team play, but sitting in the worst seat in the venue. Sure, it’s better than not going, but you’re there and it could be better. In the case of the game, you have two options: wait for the game to start, assess the free seats and try to sneak into better ones, or buy better tickets. Likewise, with PBCS you can look for a workaround or wait for Oracle to improve the software over time. You are probably wondering how many times I have snuck into better seats to see my favorite team play; the sad answer is none, as I have yet to see my favorite team play (Hala Madrid), so I will tell you after I do go (#bucketlist).
I am sure that in the future PBCS will be able to communicate with more systems, and more easily, but for now we are still tasked with a mission: “find a way from within PBCS to trigger an on-demand data load to extract data from source systems and load it to PBCS”.
Some Assumptions Before We Start
- To be absolutely clear: for this particular requirement we have assumed the source system is not Oracle Financials Cloud, Oracle NetSuite or any other Oracle Cloud product that PBCS can integrate with directly and pull data from (through a request initiated from PBCS).
- We assume we are using two main systems: an on-premise source system (ERP or other system) and PBCS.
- We assume that the source system extract is defined at the source or calling-script layer (for example, P&L and BS data for the current month); we are not passing back specific period requests or specific data requests. Whatever files are extracted from source, we will load. Further enhancements to the script/process to allow specific data requests and period parameters are possible, but have not been included in this example.
- We assume clearing of the data is actioned by the administrator in PBCS prior to submitting the source data load request in this example. Technically this can be done within the process, but for this example I have not included it.
- We have assumed the administrator triggering this script will be notified via the email distribution group as part of the execution of the process. Again, this is another area that could be enhanced to send only to the requester of the load based on a parameter or a different approach, but for this example I have adopted an email distribution approach for administrators.
- The load file names drive the load process; for example, BS_Actual_Jan-18.txt will be used to run the data load rule BS and load to period Jan-18 (see the short sketch after this list).
- We have used EPM Automate to create this process (the process could also be built using direct REST API calls from Groovy or another tool/software).
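To make the file-naming convention concrete, here is a minimal, standalone sketch of how a name such as BS_Actual_Jan-18.txt can be split into the load rule and the period using cmd substring syntax. It mirrors the technique used in the full script further down; the hard-coded file name is purely an illustration and assumes the <LoadRule>_Actual_<Mon-YY>.txt pattern.
@echo off
setlocal enabledelayedexpansion
REM Example file name following the <LoadRule>_Actual_<Mon-YY>.txt convention
set loadfile=BS_Actual_Jan-18.txt
REM Drop the trailing "_Actual_Jan-18.txt" (18 characters) to get the load rule name
set load=!loadfile:~0,-18!
REM Take the last 10 characters and drop ".txt" to get the period
set period=!loadfile:~-10,-4!
echo Load rule: !load!
echo Period: !period!
endlocal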
Solution
Before starting: if you are expecting a very elegant and clean solution, you won’t get one (although I like to think the solution is as elegant as possible given the limitations). If there were such a thing, there would be no point in writing this post. We are working within the limitations of the software, so this is, and must be viewed as, a workaround (a clever workaround) to an issue.
So what is the clever solution? Thought you would never ask….
We simply do the following:
- Create a substitution variable in PBCS called “Source_Load”. The variable will be set to “False” by default, but will serve as the request status. If the administrator wants the data loaded from source, he/she simply updates this variable to “True”.
- We will have EPM Automate probe PBCS for the status of this variable. If the Source_Load variable is set to “True”, the source data will be extracted and loaded to PBCS. On completion, an email with the status of the execution will be sent to the administrator(s) (via the distribution email).
Simple, of course it is, we like it this way!!!!
So here goes the demo.
First The Code
:: Author Name: Ahmed Hafez (Hyperion Jedi)
:: Creation Date: 02/02/2018
:: File/CMD Name: On-Demand_PBCS_Source_Load.cmd
:: Description: Checks for the value of substitution variable in PBCS and if true loads data from source
@echo off
REM SET VARIABLES
set user=<enter user here>
set pass=<enter password here>
set url="https://<PBCS URL>.oraclecloud.com"
set domain=<enter domain here>
set coredir=<directory>\On-Demand_PBCS_Source_Load\core
set loaddatadir=<directory>\On-Demand_PBCS_Source_Load\files
set logfile="<directory>\On-Demand_PBCS_Source_Load.log"
set workingdir="<EPM Automate Bin Directory>"
set smtpserver='<enter smtp server here>'
set emaildistgroup='<enter email distribution email>'
set starttime="08:00:00,00"
set endtime="20:00:00,00"
set timeouttime=60
:STARTLOOP
REM SYSTEM CHECK TO START
set "now=%time: =0%"
if "%now%" lss %starttime%( goto :ENDLOOP )
if "%now%" geq %endtime% ( goto :ENDLOOP )
REM LOGIN
call %workingdir%\epmautomate login %user% %pass% %url% %domain% > %logfile%
REM CHECK IF REQUEST TO LOAD DATA HAS BEEN MADE (IF REQUEST IS TRUE LOAD DATA)
setlocal enabledelayedexpansion
FOR /F "tokens=2 delims== skip=1" %%g IN ('"call epmautomate getsubstvar ALL name=Source_Load"') do (
SET Request=%%g
IF "!Request!"=="False" goto :LOGOUT
echo Request has been made to load data from source >> %logfile%
)
REM RUN SOURCE DATA EXTRACT
call %coredir%\Run_Source_System_Extract.bat
REM SET VARIABLES AND CLEAR DATA
setlocal enabledelayedexpansion
For /R %loaddatadir% %%A IN (*.txt) do (
Set loadfile=%%~nxA
Set load=!loadfile:~0,-18!
Set period="!loadfile:~-10,-4!"
echo Delete existing file prior upload inbox/!load!/!loadfile! >> %logfile%
call %workingdir%\epmautomate deletefile inbox/!load!/!loadfile! >> %logfile%
echo Uploadfile to PBCS "%%A" and place in inbox/!load! >> %logfile%
call %workingdir%\epmautomate uploadfile "%%A" inbox/!load! >> %logfile%
echo Load data using load rule !load! for period !period! >> %logfile%
call %workingdir%\epmautomate rundatarule !load! !period! !period! REPLACE STORE_DATA >> %logfile%
)
REM RESET LOAD REQUEST VARIABLE
call %workingdir%\epmautomate setsubstvars ALL Source_Load=False >> %logfile%
:LOGOUT
REM LOGOUT
call %workingdir%\epmautomate logout >> %logfile%
REM SEND EMAIL (SKIP EMAIL IF REQUEST TO LOAD DATA IS FALSE)
IF "!Request!"=="False" ( goto :ENDLOOP )
POWERSHELL Send-MailMessage -To %emaildistgroup% -From %emaildistgroup% -SMTPServer %smtpserver% -Subject 'On-Demand PBCS Source Load' -Body 'The PBCS On-Demand Source Load has completed. Attached is the log created as part of the On-Demand PBCS Source Load process. Please review the attached log for more details.' -Attachments %logfile%
REM LOOP AGAIN AFTER WAIT
:ENDLOOP
TIMEOUT /T %timeouttime% /NOBREAK
goto :STARTLOOP
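A quick note on running the script: once started it loops on its own, checking the variable on the defined interval and simply idling outside the allowed window, so it only needs to be launched once on the machine that hosts EPM Automate. As a usage example (an assumption on my part, not part of the original process), a scheduled task could start it at boot; the task name and script path below are placeholders.
REM Hypothetical one-off setup: start the polling script when the server boots
REM (the task name and script path are placeholders)
schtasks /Create /TN "On-Demand_PBCS_Source_Load" /TR "<directory>\On-Demand_PBCS_Source_Load.cmd" /SC ONSTART /RU SYSTEM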
See it in Action (some screenshots)
The variable is updated to “True”, indicating a request has been made to load data from source.
The process checks the value of the variable at a defined interval, between a specific allowable start time and end time. If it is determined that the Source_Load variable is set to “True”, the data is extracted from source and loaded to PBCS.
An email is sent to the distribution email address, including the log (shown below) of the actions taken.
Key files & folders
What happens if the process tries to run outside the allowed time, or if the Source_Load variable is set to “False”? It will wait the defined wait time before trying again.
Email Output Sample
Some Recommendations
Have a clear set of rules about what will be extracted and loaded (I’m sure it is possible to start adding complexity by allowing users/admins to select a period or periods to load, or to specify particular data to extract). I think the key is to start simple and go from there. I like a current-and-prior-period default extract approach; this usually covers most requirements. A sketch of how a period parameter could be bolted on follows below.
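If you did want to let the administrator pick the period, one option would be a second substitution variable read with the same getsubstvar technique and passed to rundatarule in place of the file-name-derived period. The sketch below is a hedged illustration only: the variable name Source_Load_Period is an assumption (it is not part of the process above), and !load!, %workingdir% and %logfile% are carried over from the main script.
REM Hypothetical extension: read the requested period from a second substitution variable
REM (Source_Load_Period is an assumed name, not part of the original process)
FOR /F "skip=1 tokens=2 delims==" %%g IN ('"call %workingdir%\epmautomate getsubstvar ALL name=Source_Load_Period"') do (
SET reqperiod=%%g
)
echo Load data using load rule !load! for requested period !reqperiod! >> %logfile%
call %workingdir%\epmautomate rundatarule !load! !reqperiod! !reqperiod! REPLACE STORE_DATA >> %logfile%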
Summary
This is a simple solution to the issue of not being able to trigger on-demand data loads from within PBCS. I am sure the solution can be further improved and decked out with many options and features, and it could probably be delivered in many different ways. This is an approach that I found fairly simple and easy to implement, and it delivered the required solution.
Message To My Friends
To my followers/friends, I really hope you find this helpful and entertaining. To anyone reading one of my posts for the first time, welcome, and I hope you find success and fun times in your PBCS/Hyperion journeys ahead.
May you always be with the force and may the force always be with you!