Real Time data push from Planning to ASO


This is a follow-up to what was covered here.
A lot of us have asked this question: “How can I move data from a Planning application to ASO in real time?”
Well, this post tries to answer that same old query. You’ll have to be on a Calc Manager version greater than 11.1.2.3.502.
There was an issue with 11.1.2.3.502: when Planning sends a member from the Page/POV, it encloses the member name in double quotes. That is why I didn’t cover this in the previous blog.
Now you’ll ask what is wrong with that. It is fine for a BSO application; however, ASO works differently: ASO MDX uses double quotes only for literal strings (as in the UDA function or a string compare), while member names are enclosed in square brackets. “” to enclose a BSO member, [] for ASO, isn’t it? 🙂
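To make it concrete, here is a minimal sketch (member names are from Vision; the statements themselves are just illustrations):

/* BSO calc script – double quotes enclose the member name */
FIX ("Forecast")
   "4001" = "4001" * 1.1;
ENDFIX

/* ASO MDX – square brackets enclose the member name; double quotes are
   reserved for literal strings, e.g. the string argument of Uda() */
SELECT {[Forecast]} ON COLUMNS FROM ASOPLN.Vision;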
Sree Menon and team worked on a patch, and that issue is now solved.
The example below is on Planning and Calc Manager 11.1.2.4.
I’m using the sample Vision application, and I’ve created an ASO replica of it (the exception being a new Country dimension).
I created a Calc Manager rule as shown below.
As you can see, I’m going to use variables in this rule. It is easier to create the rule by looking at the form definition; that way you know what is in the POV/Page/Row/Column.
Entity, Scenario, and Version can be variables in the Calc Manager rule.
I’m writing this using RUNJAVA. (Yes, I know Cameron is going to object to this method ;))

RUNJAVA com.hyperion.calcmgr.common.cdf.MDXDataCopy 
  "fhe34i5hsdudoi3f%hk$j#hds"  /*key */
  "WtvQ8ypcGEmUnMKSjE3wr0RI03WqepxCrzahya/xS2r021N4sSnibp+7pwDZPPN0" /*user*/
  "4OCPhcRSHg+pTKSETth6aaOVz/XpC8BH/zVtWK+TKZGpy1+4YaEGwfeE/y9aV1vs"  /* password */
  "Vision" /* from application */ 
  "Plan1" /* from database */ 
  "ASOPLN" /* to application */ 
  "Vision" /* to database */ 
  "{[{Scenario}]}" /* MDX that defines the column members */
  "Crossjoin({Descendants([&QRFPer1],[Period].Levels(0)),Descendants([&QRFPer2],[Period].Levels(0)),Descendants([&QRFPer3],[Period].Levels(0)),Descendants([&QRFPer4],[Period].Levels(0))},   Crossjoin({[{Department}]},         Crossjoin({[&QRFYr1],[&QRFYr2],[&QRFYr3],[&QRFYr4]},   Crossjoin({[{Version}]},         Crossjoin(Descendants([P_TP1],[Product].Levels(0)),{[Units],[ASP],[Gross Margin %],[4001],[5000],[GP]}))))) "/* MDX that defines the row members */
  ""/* source member mappings, can be empty */
  ""/* target member mappings, can be empty */
  "No Country" /* Target POV columns, members from dimensions that do not exist on the source*/ 
  "-1" /* rows per page */
  "c:\Temp\VisionCopy.log"; /* log file , can be empty */

As you can see, it gets a bit messy when a variable needs to be treated as a set in MDX. (Note: the extra dimension member is covered in the Target POV argument.)
MDX uses ‘{}’ to denote a set, and Calc Manager also uses ‘{}’ to denote a variable. So, double brackets to the rescue: {[{Scenario}]} is parsed correctly.
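For example, if Forecast is the Scenario member on the page, the substitution goes like this:

{[{Scenario}]}   /* rule text: the outer {} is the MDX set, the inner {Scenario} is the variable */
{[Forecast]}     /* after Calc Manager plugs in the page member – a valid single-member set */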

Note: Do not use NON EMPTY in the MDX. This is to make sure that data cleared in Planning is accurately captured: the #Missing cells must come through so that the corresponding cells in ASO are cleared too.
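Here is why, in sketch form (the sets are cut down from the rule above): NON EMPTY drops the rows that became #Missing in Planning, so those cells are never written (cleared) in ASO.

/* wrong for this use case – suppresses the #Missing rows that carry the clears */
SELECT {[Forecast]} ON COLUMNS,
 NON EMPTY Crossjoin({[Jan]}, {[Units]}) ON ROWS
FROM Vision.Plan1;

/* right – the #Missing rows come through and overwrite the stale ASO values */
SELECT {[Forecast]} ON COLUMNS,
 Crossjoin({[Jan]}, {[Units]}) ON ROWS
FROM Vision.Plan1;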

Now, if you are wondering about the encryption used in the Calc Manager rule, here is how it is done.
Note that some characters cannot be used in the key; I did receive an error about the AES length not being met.
Let’s attach this rule to the corresponding form.
Now let’s change some data and see if it copies to ASO.
This is my ASO retrieval before the save.
Changing webform data.
Retrieval after data submit.
I noticed two bugs while preparing this post.
  1. While creating an application-level variable in Calc Manager using the Dimension Name option, it won’t allow you to type in a custom dimension name (it just behaves like Dimension Type). I can go back to my hack till this one is fixed. 🙂
  2. This one is an issue if you are planning to implement this on 11.1.2.4 (I couldn’t test on a version between 11.1.2.3.502 and 11.1.2.4). The MDX script that extracts the data gets cached somewhere, and it won’t extract the correct data if the Page member is changed.
If I recycle the services and do a data submit, it correctly moves data using the member selected in the Page; however, after that first data movement the member gets cached again. I would hold on for some time until the Calc Manager team fixes this issue.
HTH

Update 03/04/2015

The Calc Manager team updated me about the member-caching issue; they have fixed it internally, so we might see it coming in a patch soon 🙂




11 thoughts on “Real Time data push from Planning to ASO”

  • Peter Nitschke

    G'day Kelvin,

    two quick questions:
    – What is the performance like?
– I see you're using NON EMPTY in the MDX query; how does that handle clearing data? I.e., a user deletes some data on the data form leaving #missing – does it correctly push through and clear on the ASO side?

    Otherwise, looks really good!

    Cheers
    Pete

  • Mike Henderson

    Great post, Celvin.

For another perspective, this topic was presented at Kaleidoscope 2013.

    We basically created hybrid BSO-ASO Planning with a full round trip. (The push to ASO is easy compared to the return to Planning through transparent partitions!)

    The push process is initiated by a RUNJAVA call to the CDF that runs a maxL script. Parameters are passed including App, DB, Scenario, Version, Year, and any POV members. MaxL ran a DATAEXPORT, partial clear, and load.

    Pete is correct, you need to handle erasing values from the target that have been cleared in the source. Sending a bajillion #MIs is a very expensive way to achieve that. A better option is to run a partial clear in the target (use the POV members to narrow the focus of the clear, and we did physical clears).
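Roughly, the clear statement looks like this (application, database, and POV members are placeholders):

alter database ASOPLN.Vision clear data in region
 'Crossjoin({[Forecast]}, Crossjoin({[Working]}, {[FY16]}))' physical;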

    Performance – In our system, users hit the Save button on a form up to 4000 times a day. The process takes an average of 8 seconds and varies with the size of the form.

One gotcha: partial clears insist on sole access to the cube. Our solution was to build a few IFERROR GOTO, pause, and try-again steps into the MaxL script. That eliminated 99.9% of the issues. We ran a nightly "full push" to catch up any stragglers that may have been left behind.
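A sketch of the retry pattern (as far as I know, goto/iferror only branch forward in MaxL, hence the unrolled retries; the shell sleep and all names are placeholders):

/* first attempt at the physical clear */
alter database ASOPLN.Vision clear data in region '{[Forecast]}' physical;
iferror 'retry';
goto 'clear_ok';

define label 'retry';
shell sleep 10;   /* pause, then try once more */
alter database ASOPLN.Vision clear data in region '{[Forecast]}' physical;

define label 'clear_ok';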

    http://www.odtug.com/p/cm/ld/fid=65&tid=33&sid=1746
    (Tip – get the .PPTX slides, the .PDFs came out awful.)

  • Celvin Kattookaran

    Mike,

Thanks for the clear-script pointer. I do know that people have used different ways to achieve this in the past. I do know that someone even made an application which can handle concurrent user data push requests.

DATAEXPORT – does it work when concurrent users are trying to perform a data push, since it cannot accept the file name as a parameter?
Well, one option is to run a calc string in MaxL where you can change the file name and add the POV members to the file name. (I think that is a possibility.)
I do know someone even created an application which uses subvars to check whether simultaneous users are in the system, and so on 🙂
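Something like this, where the calling script builds the calc text and bakes the POV members into the file name (all names made up):

execute calculation
 'SET DATAEXPORTOPTIONS { DataExportLevel "LEVEL0"; };
  FIX ("Forecast", "Working", "FY16")
   DATAEXPORT "File" "," "c:/Temp/push_Forecast_Working_FY16.txt" "#Mi";
  ENDFIX'
 on Vision.Plan1;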

  • Peter Nitschke

Hey – yeah, running the calc string directly from MaxL works best.

You can use the CDF to trigger the MaxL, pass the variables through to a batch command, and use batch to dynamically create both the DATAEXPORT (with a random number or a full timestamp, down to the ms) and the data clear.

You start picking up some overhead in MaxL and batch – but it's not much one way or the other.

    Mike – I've 'heard' about this presentation! Never actually found it. I love the idea of using the subvariable as the full dataexport command!

  • Mike Henderson

    Another comment on performance… pushing to ASO is actually faster than the times I said above.

All calculations are done in BSO just prior to the push. So any depreciation, compensation, etc. math is done first in the local currency. Next, this application does FX translation in three rate sets, so you wind up with Local plus 'at plan rate', 'at forecast rate', and 'at prior forecast rate'. The eight-second average includes everything.

  • Mike Henderson

    >DATAEXPORT, does it work if current users are trying to perform a data push, since it cannot accept filename as a parameter?

I did exactly that via a MacGyver technique. At a Kscope presentation, I had learned that a subvar can contain calc code. Not much, but enough for a few lines like standard SET statements. Hmmmm, what if…? So I set a subvar on the fly in MaxL; it contains the DATAEXPORT statement with a file name that includes a serial number. I then call a calc script that has that subvar in it, followed by a semicolon. Voila, you are essentially passing that serial number to the DATAEXPORT statement and writing to a different text file each time the process runs.
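In sketch form (the wrapper passes the serial number in as $1, the variable was created once beforehand, and the calc script PushExp contains just &ExportCmd; – all names made up):

/* MaxL: put the DATAEXPORT statement, serial number and all, into the subvar */
alter database Vision.Plan1 set variable 'ExportCmd'
 'DATAEXPORT "File" "," "c:/Temp/push_$1.txt" "#Mi"';

/* the calc script then expands &ExportCmd; into that statement and runs it */
execute calculation Vision.Plan1.PushExp;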

  • Peter Nitschke

    G'day Mike,

I did something similar as an internal proof of concept (I'm not going to suggest mine is MacGyver so much as ghetto DIY!).

That said, I ended up using a CDF to trigger DOS batch because I couldn't create a suitable 'random' file name in MaxL. (Mine was 'random enough' – the full timestamp, down to the ms, written into the name.)

How did you create a random serial number in pure MaxL?

    Cheers
    Pete

  • Dheeraj

    Hi Celvin,

Appreciate your blog, and thanks for sharing the knowledge. I have a question about the CDF in the above post: how do you think the source and target mappings would be useful if I have to move data from BSO to ASO (a report cube) where a smart list in BSO is mapped as a dimension in ASO?

    Thanks,
    Dheeraj