SCN : Blog List - SAP Business Warehouse

Conversion approach of Composite Provider from COPR to HCPR


Introduction:


This document explains the approach and the steps to convert a local CompositeProvider (COPR) or a MultiProvider to a CompositeProvider of type HCPR.


Prerequisites:


The backend BW version has to be BW 7.4 SPS9 or higher for the conversion to be available.

If the version is lower than SPS11, a prerequisite is to apply OSS Note 2115198 (HCPR: Corrections for conversion).

SAP recommends using CompositeProviders of type HCPR, since an HCPR can be developed in the Studio while a COPR cannot. All future developments and enhancements are delivered for type HCPR.


Steps Involved:

 

Step 1: Check the system version and carry out the action below if the system is lower than SPS11.


If the system is lower than SPS11, apply OSS Note 2115198, which brings the required corrections into the system.


Download the OSS Note in SNOTE:

1.PNG

Implement the note:


While implementing the note, you are prompted to apply the corrections of Note 2115198; this also installs the new report RS_UDO_NOTE_2115198, which has to be executed after the note has been implemented.

2.PNG

Once the note is implemented, execute the report via SE38:

3.PNG

Execute with the sequence as per the instructions below:

4.PNG

Test run gives the below message:

5.PNG

Proceed to Step 2:

6.PNG

This will prompt for modifications to be stored in the transport. 


Execute the activity as a batch job.

7.PNG

The below message appears for the job.

8.PNG

In transaction SMX, check the log of the job:

9.PNG

10.PNG

Return to SE38 and re-run the program in UPDATE mode.

11.PNG

This results in the following log:

12.PNG

If everything is green, the note corrections have been applied. If there are any errors, perform the execution one more time.


Go to the next step:

13.PNG

This lists the overall log in green.

 

 

Once this is done, confirm in SNOTE that the corrections are complete.

15.PNG

Confirm the manual actions and then mark them as done. With this, the note is completely implemented.


Step 2: Execute the conversion program in BW system to convert the existing COPR/MPRO to HCPR.


SAP has delivered the program RSO_CONVERT_IPRO_TO_HCPR to convert an existing MultiProvider (MPRO) or local CompositeProvider (COPR) to type HCPR.


Execute the program in SE38, which brings up the view below:

17.PNG

The options represent the following:

18.PNG

Note: In BW 7.5, multiple additional options are available that allow taking a backup of the queries or restoring them.
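If you prefer to trigger the conversion from a wrapper program or an ABAP process chain step instead of SE38, a minimal sketch could look like this (the report name is taken from above; all selection values are still entered on the report's own selection screen):

SUBMIT rso_convert_ipro_to_hcpr VIA SELECTION-SCREEN AND RETURN.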


Update the fields with the source and target provider data and run the execution in simulation mode.

20.PNG

Note: In a local CompositeProvider, navigational attributes carry the prefix @3; during conversion this is replaced with the prefix 4 to differentiate the HCPR type from the COPR.


Physical Conversion: Select the mode as "Convert Infoprovider"

21.PNG


A log is displayed with the results and the kind of changes made.


Go to RSA1 and check for the new CompositeProvider:

22.PNG

Double click on the composite:

23.PNG

Check the contents to validate the data further and reconcile the results against the COPR version.


Conversion using the same namespace:

 

34.PNG

A new entry with the same namespace is created and available in RSA1.

25.PNG

Option 3:

 

This option converts the CompositeProvider below together with its associated BEx query.

26.PNG

Execute the program with the below options:

27.PNG

The above selection creates the HCPR with the same name, but as a copy; overwriting is not possible.


After execution, you are prompted to select the list of BEx queries to be copied:

28.PNG

Proceeding further allows us to change the new query name:

29.PNG

The rename can be done, for example, as below, or into the customer namespace.

30.PNG

Once this is done, the program lists the log for all the changes that are being performed.


A new BEx query entry has been created in the database:

31.PNG

Observation: the old CompositeProvider sits in the namespace with the @3 prefix, while the new CompositeProvider of type HCPR is referenced with the same namespace as it appears in RSA1.

You can see the Old and new Composite.

32.PNG

Differences at the BEx level between the old and the new:

32_1.PNG
The HCPR has additional dimensions that list the InfoProviders separately, whereas in the COPR these are present in the characteristics catalog itself. The navigational attribute namespace is changed as below:

32_2.PNG

Validate that the output results are the same.


Limitations:

 

The following limitations apply:

33.PNG

Note: Compare the data output before and after conversion and make sure that the results are the same. Depending on the results, decide whether the new HCPR can be adopted and the old provider made obsolete.


In Web Intelligence (WebI), the old BEx query has to be replaced with the new one if the converted HCPR is adopted going forward and BusinessObjects is used on top of BEx.


Good Luck.

 

References: help.sap.com


Creating and Assigning Authorization in BW


In the past I created a blog post describing the Infoobjects Level authorizations:

 

SAP BW Authorization - InfoObjects level authorization


Now I will focus on creating and assigning authorization to BW:

 

Creating authorization


To create analysis authorization perform the following steps:

1. Use TCode RSECADMIN, go to the Authorizations tab.

2. Press Maint. button and enter a name (e.g., Z_USR_A1) and press Create.

3. Fill required Short Text field.

4. Insert special characteristics: 0TCAACTVT, 0TCAIPROV, and 0TCAVALID by pressing Insert Special Characteristics button.


1.png


5. Insert authorization-relevant characteristics and navigational attributes (Insert Row -> press F4 -> choose item). I described how to set this up in my previous blog SAP BW Authorization - InfoObjects level authorization.


6. Press Details button to restrict values and hierarchy authorization of inserted items.


7. Save the authorization.

 

You must include special characteristics: 0TCAACTVT (activity), 0TCAIPROV (InfoProvider), and 0TCAVALID (validity) in at least one authorization for a user. They are used for:

  • 0TCAACTVT - to restrict the authorization to activities, default value: Display;
  • 0TCAIPROV - to restrict the authorization to InfoProviders, default value: all (*);
  • 0TCAVALID - to restrict the validity of the authorization, default value: always valid (*).



If you want to authorize access to key figures, add 0TCAKYFNM characteristic to the authorization. It is important to know that if this characteristic is authorization-relevant, it will be always checked during query execution.
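As an illustration, a minimal authorization (e.g. Z_USR_A1) for one sales InfoProvider might contain the following values; the InfoProvider and country values are made up for this example:

0TCAACTVT = 03 (Display)
0TCAIPROV = ZSD_C01
0TCAVALID = * (always valid)
0COUNTRY  = US, DE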

 

0BI_ALL authorization

The 0BI_ALL authorization includes all authorization-relevant characteristics. It is automatically updated whenever an InfoObject is made authorization-relevant. Use this authorization if you have users who are allowed to execute all queries.

 

Assigning authorization to a user

You may assign an authorization directly to a user or to a role. To assign it directly, use TCode RSECADMIN, go to the User tab and press Assign. Now enter the user name, press Change and select the authorization. To assign the authorization to a role, use TCode PFCG, enter the role name and press Change. On the Authorizations tab, change the authorization data by adding an S_RS_AUTH entry; this entry includes analysis authorizations in roles. Enter here the authorization that you previously created.

 

Summary

I encourage you to collect all requirements related to BW security, the structure of the organization, and authorization needs before starting authorization preparation. I have learned that this can save a lot of time. The organization's hierarchy can facilitate your work by providing structures and levels of authorization. Indirect authorization assignment can also save time because it is more flexible and easier to maintain.

BW/BI - Vol 1 - Loading a Flat File into Info cubes in BI


Greetings to All,


Hope you all are having a wonderful time. In this blog I would like to explain the step-by-step procedure of loading a flat file into an InfoCube in a BW system. I am in fact thrilled to write my first blog in this space.


Introduction:


Loading a flat file into a BI system is one of the basic requirements that all BI consultants should know. However, because of the long process, we may at times skip a step or two, which will result in a failed data load. In this blog I would like to walk through all the steps in detail along with screenshots; I hope you find it interesting.


The following steps need to be carried out to complete our task:


 

  1. Create an Info Object Catalog for both Characteristics and Key Figures.
  2. Create Info Objects in both of the Info Object Catalogs created in step 1.
  3. Create a Flat File Source System.
  4. Create an Application Component for the created Source System.
  5. Create a Data Source in the Application Component.
  6. Load data into the PSA (Persistent Staging Area).
  7. Create an Info Package.
  8. Create an Info Cube by properly assigning info objects to the fact and dimension tables.
  9. Load data into the Info Cube through a Transformation and DTP (Data Transfer Process).


One needs to follow all the above steps for a proper data load into the InfoCube. Before we get into the steps, let me brief you on the requirement. Consider we have a local file with five columns:

  1. Eid(Employee ID)
  2. Did(Department ID)
  3. Pid(Project ID)
  4. NOH(No Of Hours Worked)
  5. Sal(Salary)


Now we have to load the values in this file into the BI system. For simplicity, let us consider only five entries in the file as shown below. Please note that the file should be in Excel CSV format.


SCN.PNG
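Since the screenshot is not reproduced here, an illustrative file of this shape (values invented for the example) could look as follows:

Eid,Did,Pid,NOH,Sal
E01,D01,P01,40,5000
E02,D01,P02,35,4500
E03,D02,P01,42,5200
E04,D02,P03,38,4800
E05,D03,P02,45,5500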


Out of these five columns, the first three (Eid, Did, Pid) will be categorized as characteristics, since they are not subject to change, whereas the columns NOH and Sal will be grouped as key figures, since they may change in the future.

Now let us see each step in detail.


Step 1 - Create an Info Area:


After logging into the system, execute the T-Code RSA1. You will land on the screen below. Go to the Info Provider tab, right-click on InfoProvider, and create an Info Area as shown below.


SCN.PNG


Once the above step is done you will get a pop-up; give the name and description of the Info Area as shown below and click Continue.


SCN.PNG


Once the InfoArea is created, you can see it in the Info Provider list as marked below.


1.PNG


Step 2- Create an Info Objects Catalog for both Characteristics and Key figures.


Now go to the Info Objects tab and navigate to the Info Area that was just created; if you don't find it, refresh the objects using the refresh button (marked in green below).

Right-click on the Info Area and create an Info Object Catalog for characteristics as shown below.


2.PNG


In the pop-up, fill in all the details as shown below. Please do not forget to select the Characteristics radio button for the object type, since this catalog is created for characteristics. Then click the create icon and activate (icon marked in green).


3.PNG


Now you will find the catalog created as shown below,


4.PNG


Now proceed to create another catalog, for key figures, the only difference being that you have to select the Key Figure object type as marked below.


5.PNG


Once created, activate again, and you can see both catalogs as shown below.


6.PNG


Now Right click on each catalog and create the respective info objects as shown below,


7.PNG


Now you will get a pop-up as shown below; here we are creating the info object for the employee ID. Fill in the required details and click Continue.


8.PNG



In the next screen, provide the details as shown below. In this case, since we are creating Emp ID, we select the character string data type with a length of 3, then click the activate button.


9.PNG



Now you will find the created info object of emp id under the characteristics catalog as shown below.


10.PNG


Similarly, for the fields department ID and project ID we select the same data type and length, since we have similar data. We have chosen such simple data for ease of illustration; in real life you may encounter complex data with complex data types. Once all three are created, we will find all the info objects in the Test_IOC catalog as shown below.


11.PNG


Now proceed to right-click on the Info Object Catalog for key figures and create two info objects, for salary and number of hours, in the same way as above.

Here, since these are key figures, the selectable data types are a different set, as shown below. We select Number for our convenience.


12.PNG


Once both the key figures are created we will get a display as shown below.


13.PNG


So far we have created two info object catalogs, one for characteristics and one for key figures, and we have also created their respective info objects, five in our case.


Step 3- Create a Flat file Source System.


Go to source systems tab, right click on the source systems and click on create as shown below.


14.PNG


In the Next Pop up select the flat file radio button as shown below and click continue.


15.PNG


Give a name for the flat file source system as below and continue. Please note that this step can take some time, so please be patient.


16.PNG

Once the flat file is created you can see it as shown below.


17.PNG


Step 4- Create an Application Component for the created Source System.


Double-click on the flat file source system created in the step above; it will lead you to the Data Sources tab. Now right-click at the top and create an Application Component as shown below.


18.PNG


Now in the next Pop-Up give the details of the application component and click on continue.


19.PNG



Step 5-Create a data source in the application component.


Once the Application Component has been created in the step above, scroll down to the bottom to see it, right-click on it, and create a Data Source as shown below.


20.PNG


In the next pop-up, enter the data source details and select the data source type; in our case we are uploading transaction data, hence we select that and click Continue as shown below.


21.PNG


Step 6- Load data into PSA(Persistent Staging Area).

 

Now we come to loading the data into the PSA. As soon as you click Continue in the step above, you will get the tab called General Details; enter the descriptions and go to the next tab as shown below.


22.PNG


The next tab is named Extraction; this is the most important tab. Fill in all the values as shown in the screenshot below. For reference, all important fields are briefly explained below.


 

Delta Process: Since we are doing a full load we have selected the option accordingly.


 

Adapter: Since the files are loaded from the local system, the option is selected accordingly.


 

File name: Browse for the file from the local system and place it in the field.


 

Header rows to be ignored: Since the file has one row for header we provide as 1.


 

Data Format: Since it is a CSV file we select the option accordingly.


 

Data Separator: We provide “ , “ as the separator.


All the above mentioned details can be seen in the below screen shot.


23.PNG


In the next tab, click on the load sample data button (marked in red below); you will be able to see the sample data with comma separators as shown below.


24.PNG


In the next tab you need not perform any operation; just check that the fields and data types have been loaded correctly as shown below.


25.PNG


In the next tab, click the Read Preview Data button as marked below; you will get an activation pop-up, proceed to activate.




26.PNG


On loading data successfully you will get the below screen with the data.


27.PNG


So far we have created and activated a data source and loaded the data into it via the PSA.


 

Step 7 : Creation Of Info Package.


Now right click on the data source and create an Info Package as shown below.


28.PNG


In the next screen name the info package and continue as shown below and then save.


29.PNG


Now you will get the screen below; proceed to run a check with the option given below (marked in red).


30.PNG


Once the check is carried out, have a look at all the tabs to verify that the values filled in earlier are correct. Go to the final scheduling tab, select the Start Immediately radio button, and click Start as shown below.

On successful execution you will get a confirmation that the data was requested, as below.


31.PNG


If you want to ensure that the data has been loaded into the system properly you can do the below steps:


  1. Double click on the data source.
  2. Select GOTO from menu bar, and select technical attributes.
  3. You will get a pop-up as below; click on the table as marked below and check the entries in the table. You should find the same data as in the file here in this table.

 

32.PNG

 

If all are fine till this step then proceed.



Step 8-Create an Info Cube:


Go to info Provider tab and right click on the info area that was created and create an info cube as shown below.

 

33.PNG


In the next screen name the Info cube and create as shown below.


34.PNG


In the next screen select the Info Object catalog icon as marked below.


35.PNG


Now you will get a pop-up with both Info Object Catalogs created in step 2. Double-click the first catalog, created for characteristics. The pop-up screen will be as below.


36.PNG


In the next screen you will get all the info objects created under that catalog; just drag those three info objects and drop them into the Dimensions folder on the right (marked in red below).

Remember, you just have to drag and drop each info object from the characteristics folder (in green below) to the Dimension node (in red below).


37.PNG



Now similarly click on the info objects icon again as you did before, select the catalog created for key figures, and drag and drop the key figures from the key figures folder (marked in green) to the Key Figures folder on the right (marked in red) as shown below. Then click the activate icon.


38.PNG



Step 9-Creating Transformation:


On activation you will get an info cube icon as shown below. Right-click on the icon and select Create Transformation as shown below.


39.PNG


In the next screen, select the object type Data Source and enter the correct details of the data source we created, as shown below. Then click Continue.


40.PNG


In the next screen, map the fields from the file to the fields in the data source we created. For mapping, start from a field in the left table and drag to the same field in the right table. After proper mapping, activate (ignore warnings, if any) and you will get a screen as below.


41.PNG


Step 10-Create DTP (Data Transfer Process):


On successful completion of the above steps you will get an icon for the DTP as below; right-click on it and select Create Data Transfer Process as shown below.

 

41.1.PNG

 


In the next screen you will get a pop up as shown below, just proceed to continue without making any changes.


42.PNG


Now you will get a screen as below, under extraction tab select the extraction mode as full as shown below.


43.PNG


In the next tab called update, select the error handling method as shown below and proceed to next tab.


44.PNG


When you reach the final Execute tab, first activate the process (ignore warnings, if any), after which the Execute button becomes available (before activation it is greyed out).


45.PNG


Upon clicking on the execute button you will get a below pop up, select Yes and continue.


46.PNG


In the next screen you will get a report page with all statuses in green (if the process is successful) as shown below. If the status is yellow, the process is still running; in that case keep refreshing until you get green (successful) or red (failed).


47.PNG


This completes the successful load of the flat file data into the InfoCube. To check the data in the InfoCube, right-click on it and select Display Data as shown below.


48.PNG


In the next screen select ‘fields for selection button’ near execute button as shown below,


49.PNG


Select the ‘select all’ button near execute button as shown below.


50.PNG


Now click on the Execute button to see the values in the InfoCube as shown below.


51.PNG


Thus we have successfully loaded data from a flat file into an InfoCube. I hope this blog helps you understand the concept clearly.


Conclusion:


I would like to thank you all for patiently reading such a long blog; I hope it serves you well. Please do share your reviews and feedback, which will be an encouragement for my future blogs.

 

Thanks and Regards,

Satish Kumar Balasubramanian

Force Conversion Activity


Author: Subhash Matta

 

Company: NTT DATA Global Delivery Services Limited


Author Bio:

Subhash Matta is a Senior Consultant at NTT DATA in the SAP Analytics Practice.

 

This post can be useful in the special case where a cube compression activity fails because the index size becomes larger than the table itself. Compression essentially deletes the request IDs and moves the data from the F fact table to the E fact table, which enables improved query performance. If the compression activity fails, it will hinder query performance.


Take a scenario where the cube compression fails.

The first thing you can do is take only a few requests (or a single one) into the compression and try compressing again.

If this attempt fails with an error such as "Failed compression: SQL-ERROR: 942 ORA-00xxx: table or view does not exist", it could be the result of database adjustments or a table index size issue.

This can be resolved as below.

 

Step1:

 

 

Go to transaction SE14 (ABAP Dictionary: Database Utility). Enter the table name of the E fact table of the InfoCube for which the error is showing.

 

pic1.png

 

Press Enter you will get the below screen.

 

Step2.

 

Click on "Activate and adjust database". Please make sure that the option is set to "Save data", otherwise you can lose the data in the cube. Please be careful: as this will generally be done in production, we cannot afford to lose data.

 

pic2.png

 

pic3.png

 

 

 

Step3.

 

After the completion of the "Activate and adjust" step, go to Extras -> Force Conversion.

 

pic4.png

 

pic5.png

After this activity completes successfully, the message shown above pops up. Then try repeating the compression.

If it fails again, please check whether the table is active; if not, activate the table and try the compression again.

This should be successful.

 

NOTE: The suggested solution is applicable to BW version 7.23 and above. If you don't have the mentioned version, please raise a request with SAP. Please perform this activity in the background so the job can be monitored.

Please note that this activity will take longer the more data there is in the table.

SAP documentation search and component assignment


Today I'm writing this blog post to reiterate the importance of choosing the right component when opening an incident, and also to show how to make the best search before opening it.

SAP is working hard to provide notes, KBAs and other documentation about known errors, bugs and frequent customer questions, in order to improve customer satisfaction and response time.

 

To identify the documentation relevant to your inquiry, it is very important to know how to search effectively. The note below can help you with that:

2081285 - How to get best results from an SAP search?

 

Also, as you might know, the component chosen when opening your incident determines the expert team that will process your inquiry, as well as the results of your document search. A wrongly chosen component can cause unnecessary delays until your incident reaches the correct team, and the relevant notes and documentation will not be identified.

 

As a source of information on how SAP components are defined, I'd like to recommend the following SAP Wiki, in which you'll find relevant information on how to choose the component of your message and, therefore, obtain a faster response from our experts.

 

 

 

      http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/22886

 

 

This information is valid for all SAP Support component areas, and I suggest you review it to make sure that you are opening your incidents under the correct expertise area, avoiding delays in the resolution of your future issues.

 

 

Janaina

Browse infoprovider content via RSRT using a 'non existing' query


Imagine you've created an (acquisition) DSO whose content you'd like to browse. Imagine this DSO contains general ledger data and is called MC_O005.

 

 

Of course "Display data" or SE16 can come in handy, but slicing and dicing on this data isn't easy. Wouldn't it be nice if there was a (hidden gem) feature which enables you to browse the content of this infoprovider via transaction RSRT (the query monitor transaction), WITHOUT an actual query being created on top of this infoprovider?

 

 

Execute transaction RSRT and enter the following query name: <infoprovider name>/!!A<infoprovider name>. In our example, this leads to the query name MC_O005/!!AMC_O005 (as seen in the screengrab above). Pressing Enter generates a success message at the bottom left of the screen.

 

 

When the success message is shown, a temporary query has been generated which can be executed by pressing the execute button. The result of this temporary query is similar to the execution of a regular query.

 

Isn't this neat ;-)

 

(This blog has also been cross posted on http://www.thesventor.com)

Tables for Technical Design Documents Creation


Every SAP project has phases like Blueprint, Development, Testing, Migration, Maintenance, etc. In each phase we need to spend a considerable amount of time documenting the activities we have planned and are doing. On a daily basis I spend a good amount of time writing SAP technical design documents. To capture the technical details quickly, easily and correctly, I use the following tables.

 

  1. Info Source Object (i-Src) : Fields in I-Src and their details       -RSKSFIELDNEW
  2. Data store Object (DSO) : Fields in DSO and their details        -RSDODSOIOBJ
  3. Info Cube Object (i-cube) : Fields in i-Cube and their details    -RSDCUBEIOBJ
  4. Multi Provider Object (M-Pro) : Fields in MPR and their details -RSDICMULTIIOBJ
  5. STMS (Transportation): How to identify the Object in the Project Package / $TMP Package

 

 

 

 

  1. Info Source Object (i-Src) : Fields in I-Src and their details  Table - RSKSFIELDNEW

2.JPG

 

 

2_2.JPG

 

 

 

2. Data store Object (DSO) : Fields in DSO and their details Table - RSDODSOIOBJ

dso 1.JPG

dso 1_1.JPG

 

 

3. InfoCube Object (i-cube) : Fields in i-Cube and their details Table - RSDCUBEIOBJ

 

Ic_1.JPG

 

Ic_2.JPG

 

 

 

4. Multi Provider Object (M-Pro) : Fields in MPRO and their details  Table- RSDICMULTIIOBJ

3.JPG

 

 

3_3.JPG

 

 

 

 

5. STMS (Transportation): Object in the Project Package / $TMP Package  Table - TADIR

 

A_1.JPG

 

 

 

 

How to find current objects in $TMP (For example)

  Selection used DEVCLASS = $TMP

a_2.JPG

 

 

 

How to find current objects in $TMP (For example)

Selection used DEVCLASS = $TMP : OBJECT = RSPC (Process Chain)

a_3.JPG
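If you prefer to pull the same information with a small ABAP snippet instead of browsing the table, a minimal sketch against TADIR could look like this (the selection values mirror the example above):

REPORT ztmp_object_list.

DATA: lt_tadir TYPE STANDARD TABLE OF tadir,
      ls_tadir TYPE tadir.

* List all process chains (object type RSPC) in the local package $TMP
SELECT * FROM tadir INTO TABLE lt_tadir
  WHERE devclass = '$TMP'
    AND object   = 'RSPC'.

LOOP AT lt_tadir INTO ls_tadir.
  WRITE: / ls_tadir-pgmid, ls_tadir-object, ls_tadir-obj_name.
ENDLOOP.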

Are you facing deadlock issue while uploading master data attributes?


Sometimes you face issues in SAP BW which may drive you crazy, and this deadlock issue is one of them. I have recently resolved this infamous dump, so I decided to share my experience with you all. Without further delay, let me give you the system and database details of my system.

 

Component/System     Values
SAP_BW               740
Database System      MSSQL
Kernel Release       741
Sup.Pkg lvl.         230

 

Let me first explain what a deadlock is.

A database deadlock occurs when two processes lock each other's resources and are therefore unable to proceed. This problem can only be solved by terminating one of the two transactions; the database more or less at random terminates one of them.

Example:

Process 1 locks resource A.

Process 2 locks resource B.

Process 1 then requests resource B exclusively (-> lock) and waits for process 2 to end its transaction.

Process 2 then requests resource A exclusively (-> lock) and waits for process 1 to end its transaction.

Resources are, for example, table records which are locked by a modification or a select-for-update operation.

The following dump can be expected when you upload master data attributes.

Dump1.jpg

Sometimes you might encounter this dump too.

Dump2.jpg

 

Solution:

To avoid this issue, please make sure that your DTP does not have semantic grouping switched on and that its processing mode is "Serially in the Background Process". To be on the safe side, I would recommend creating a new DTP with these settings.

 

 

Please let me know if you find this blog helpful or not.

 

P.S. This was related to time-dependent master data.


BW transformation: defining custom methods


Sometimes we need to create a huge ABAP expert routine to shave a mammoth into an elephant, driven by business requirements.

Usually we use the FORM statement to split and structure the ABAP code, but I've found a little hack that makes it possible to define and implement custom methods inside the transformation class.

 

Process consist of only two steps:

 


1. Methods definition in 1st global part.

expert_routine_1.PNG



2. Method implementation after an ENDMETHOD statement used to artificially close the expert routine's body.

expert_routine_2.PNG
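For readers who cannot see the screenshots, here is a minimal sketch of the two steps; the method name and its logic are invented for illustration. In the global part, define the method:

METHODS strip_leading_zeros
  CHANGING cv_value TYPE string.

Then, at the end of the expert routine body, close the routine yourself with ENDMETHOD and start the method implementation; the ENDMETHOD generated by the framework then closes your method instead of the routine:

    " ... last statements of the expert routine ...
  ENDMETHOD.                  " artificially closes the expert routine body

  METHOD strip_leading_zeros.
    SHIFT cv_value LEFT DELETING LEADING '0'.
*   no closing ENDMETHOD here - the generated one closes this method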


 

That's it.

 

And I don't want to discuss possible future migration issues on "BW 15.x"; just enjoy.

Change released task to unreleased/ modifiable


Dear All,

 

Recently I faced an issue where I needed to change a transport request's status from released back to unreleased.

I had assumed that once a TR (task) is released it cannot be reverted to modifiable, but that was wrong.

SAP provides a program with which we can do this.

 

Initially TR was in Released status.

Capture.PNG

 

Step 1: Go to SE38 -> enter program name RDDIT076 -> execute.

 

 

Step 2: Enter the TR/task whose status you want to convert from released to modifiable -> execute the program.

 

Step 3: You will find the TR with status R (Released).

 

Double-click on the R status (encircled in the screen above); you will see the screen below, where you need to edit the R status to D (D = modifiable).

 

 

click on Save. -> exit from program

 

Step 4: Go to SE01 -> enter the TR name; you will find the TR in unreleased (modifiable) status.
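To verify the result without opening SE01, you can also check the status field directly in the transport header table E070; the request number below is a made-up example:

DATA lv_status TYPE trstatus.

SELECT SINGLE trstatus FROM e070 INTO lv_status
  WHERE trkorr = 'DEVK900123'.
" 'R' = released, 'D' = modifiable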

 

 

 

Regards,

Hitesh

Calculating MTD, QTD and YTD in BW Transformation


If you are a BW developer and you just read the title, you are probably thinking "...but shouldn't that be done in the query?". You are right: the "to date" variants for aggregating data should be done at the query level whenever possible, which is almost all the time. But situations do arise when you want it done in the back-end transformations.

 

I came across one such scenario recently, where we had to do the MTD, QTD and YTD calculations in our transformation layer. We had to calculate the "to-date" aggregates for our key figures and store the data in the back-end. I couldn't find much information on how people usually do this in the back-end.

 

The first thought that comes to mind is to "loop the loop": loop through the end-routine data and calculate the aggregation in an inner loop for every record. Say I have one year's worth of data for a given account number; I would take each record, run it in a loop, and decrement the date each time until the first day of the year, aggregating my key figures each time. For example, for a record for 20151231 I would run the loop 365 times, adding up my key figures back to 20150101 and calculating MTD, QTD and YTD inside the loop. But we all know a loop inside a loop is a strict no-no.

 

 

I was experimenting with different ways to do this without too much looping around. One such method is to use the AT NEW control-level statement (https://help.sap.com/saphelp_470/helpdata/en/fc/eb381a358411d1829f0000e829fbfe/content.htm). Control-level statements have been around for a long time, but I am sure not many of us BW developers have considered using them inside transformations.

 

My source metadata looked something like this,

SourceDataset.png

I was calculating the ‘to-date’ values at the lowest level of granularity, involving all the key fields.

 

In the end routine, do a look-back on the source with a SELECT ... FOR ALL ENTRIES in the result package at FISCYEAR level. This gives you a full year's data even if the result package contains a record for only a single date in that year.

 

SELECT CO_AREA COMP_CODE PROFIT_CTR COSTCENTER CURTYPE FISCVARNT
       CHRT_ACCTS ACCOUNT FISCYEAR CALQUARTER FISCPER DATE0
       AMOUNT CURRENCY
  FROM /BIC/AZTDSO00100
  INTO CORRESPONDING FIELDS OF TABLE lt_ZTDSO001
  FOR ALL ENTRIES IN RESULT_PACKAGE
  WHERE CO_AREA     = RESULT_PACKAGE-CO_AREA
  AND   COMP_CODE   = RESULT_PACKAGE-COMP_CODE
  AND   PROFIT_CTR  = RESULT_PACKAGE-PROFIT_CTR
  AND   COSTCENTER  = RESULT_PACKAGE-COSTCENTER
  AND   CURTYPE     = RESULT_PACKAGE-CURTYPE
  AND   FISCVARNT   = RESULT_PACKAGE-FISCVARNT
  AND   CHRT_ACCTS  = RESULT_PACKAGE-CHRT_ACCTS
  AND   ACCOUNT     = RESULT_PACKAGE-ACCOUNT
  AND   FISCYEAR    = RESULT_PACKAGE-FISCYEAR.

SORT lt_ZTDSO001 ASCENDING BY
  CO_AREA COMP_CODE PROFIT_CTR COSTCENTER CURTYPE FISCVARNT
  CHRT_ACCTS ACCOUNT FISCYEAR CALQUARTER FISCPER DATE0.


The order of fields in the internal table is the key here, as a change of value in the field we check with AT NEW, or in any field to its left, triggers the event. I am doing it at the ACCOUNT level, so any change to ACCOUNT or to a field on its left registers as a new group. The ascending sort ensures the loop only has to run once.

In the loop below, the AMOUNT value is aggregated over every iteration, and for every true AT NEW the corresponding "to-date" key figure value is reset.

 

LOOP AT lt_ZTDSO001 ASSIGNING <fs_ZTDSO001>.
  AT NEW FISCPER.
    lv_kf_MTD = 0.
  ENDAT.
  AT NEW CALQUARTER.
    lv_kf_QTD = 0.
  ENDAT.
  AT NEW FISCYEAR.
    lv_kf_MTD = lv_kf_QTD = lv_kf_YTD = 0.
  ENDAT.
  AT NEW ACCOUNT.
    lv_kf_MTD = lv_kf_QTD = lv_kf_YTD = 0.
  ENDAT.

  lv_kf_MTD = lv_kf_MTD + <fs_ZTDSO001>-AMOUNT.
  <fs_ZTDSO001>-/BIC/ZTKFMTD = lv_kf_MTD.
  lv_kf_QTD = lv_kf_QTD + <fs_ZTDSO001>-AMOUNT.
  <fs_ZTDSO001>-/BIC/ZTKFQTD = lv_kf_QTD.
  lv_kf_YTD = lv_kf_YTD + <fs_ZTDSO001>-AMOUNT.
  <fs_ZTDSO001>-/BIC/ZTKFYTD = lv_kf_YTD.
ENDLOOP.

 

Once the MTD, QTD and YTD values are calculated in the temporary internal table a second loop over the result package is necessary to copy over the calculated values.

LOOP AT RESULT_PACKAGE ASSIGNING <fs_PACKAGE>.
  READ TABLE lt_ZTDSO001 INTO wa_ZTDSO001
    WITH KEY "table key
    BINARY SEARCH.
  IF SY-SUBRC = 0.
    "populate calculated fields
  ENDIF.
ENDLOOP.

 

NOTE: We can even do this in a single loop if we know for sure we have an entire year’s data in one package.

 

The second-best option to AT-NEW would be a parallel cursor, as described in this document: http://scn.sap.com/docs/DOC-69322. I ran a few tests between these two methods to check the number of times the loop executes, and you can see how even a minimal loop using the cursor method compares to using control-level statements.

 

For AT-NEW Code

ATNEWloopcount.png

Loop count is for one package of 50000 records

 

ATNEWload.png

 

 

Parallel Cursor Code

LOOP AT RESULT_PACKAGE ASSIGNING <fs_PACKAGE>.
  READ TABLE lt_ZTDSO001 ASSIGNING <fs_ZTDSO001>
    WITH KEY "table key
  IF SY-SUBRC = 0.
    lv_SYTABIX = SY-TABIX.
    LOOP AT lt_ZTDSO001 FROM lv_SYTABIX ASSIGNING <fs_ZTDSO001>
        WHERE "table key
      IF <fs_ZTDSO001>-FISCYEAR = <fs_PACKAGE>-FISCYEAR.
        lv_kf_YTD = lv_kf_YTD + <fs_ZTDSO001>-AMOUNT.
      ENDIF.
      IF <fs_ZTDSO001>-CALQUARTER = <fs_PACKAGE>-CALQUARTER.
        lv_kf_QTD = lv_kf_QTD + <fs_ZTDSO001>-AMOUNT.
      ENDIF.
      IF <fs_ZTDSO001>-FISCPER = <fs_PACKAGE>-FISCPER.
        lv_kf_MTD = lv_kf_MTD + <fs_ZTDSO001>-AMOUNT.
      ENDIF.
    ENDLOOP.
    "populate calculated fields
    lv_kf_MTD = lv_kf_QTD = lv_kf_YTD = 0.
  ENDIF.
ENDLOOP.

 

CursorLoop.png

Loop count is for one package of 50000 records

 

CursorLoad.png

 

 

The use of AT-NEW might not work for all scenarios but you can take it into consideration when you have to do some sort of aggregation inside your transformations.

 

And if you have a better way of doing this, please do write about it and share the link in the comments below for the benefit of the community.

Introducing an Add-on for Sending DTP Monitor Log to E-mail


Monitoring data loads can be considered a recurring daily activity for any BW support organization. In some cases it is also required to take a deep dive into the DTP Monitor Log, e.g. for observing any "data related issues". Often this task is assigned to a functional application manager or business user who might not have access to the BW back-end. It would be convenient to automate the process by sending DTP Monitor Log entries by e-mail to one or several recipients.

In this blog I would like to introduce a comprehensive ABAP Add-on which I developed for facilitating this process. Please refer to my document Implementing an Add-on for Sending DTP Monitor Log to E-mail for detailed implementation instructions.

DTP Monitor Log Entries

The screenshot below shows an example of a DTP Monitor Log.

 

Figure_1_DTP_Monitor_Log.jpg

Figure 1: DTP Monitor Log

 

As you can see the Expert Routine sent some warning messages to the DTP Monitor Log. The message log can be displayed by double-clicking on the log icon.

 

Figure_2_DTP_Monitor_Log_Messages.jpg

Figure 2: DTP Monitor Log messages

 

For each message you can show more information by clicking on the question mark icon. Next to an optional long text, you can find here the Message Class (i.e. the first part of Message No. - in this example ZILL) and Message Number (i.e. the last three digits of the Message No. - in this example 004).

 

Figure_3_Example_Detailed_Message.jpg

Figure 3: Example of a detailed message

E-mail Add-on

The E-mail Add-on is an ABAP program which is intended to be included in a Process Chain. The program must run after the DTP for which you want the Monitor Log to be sent via e-mail. The next screenshot shows the selection screen of the program.

 

Figure_4_Selection_Screen.jpg

Figure 4: Selection screen

 

The program retrieves the latest DTP request according to the selections made on the selection screen. Subsequently, the monitor messages are filtered based on the Message Class (please see figure 3 for an example) and Message Type (e.g. Error, Warning, etc.) which can be optionally specified on the selection screen. If you don’t specify any Message Class and/or Message Type, all messages will be collected. It is mandatory to enter at least one E-mail Address and an E-mail Subject.

 

Furthermore, a word about the Abort flag: it influences how the program behaves in case of an e-mail send failure. I suggest activating the Abort flag if sending the e-mail is crucial for the business users. In that case the program terminates, the process chain turns red and manual intervention is required.

 

Note: be aware that, as a prerequisite, the SAPconnect configuration must be OK. This configuration is not described here; please refer to t/codes SICF and SCOT.
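Purely as an illustration of the sending part, a stripped-down sketch using the standard BCS classes might look like the following; the subject, body line and address are placeholders, and the real add-on is available via the linked document:

DATA: lt_body    TYPE bcsy_text,
      ls_line    TYPE soli,
      lo_request TYPE REF TO cl_bcs,
      lv_sent    TYPE os_boolean,
      lx_bcs     TYPE REF TO cx_bcs.

ls_line-line = 'DTP monitor log: 2 warnings (message class ZILL)'.
APPEND ls_line TO lt_body.

TRY.
    lo_request = cl_bcs=>create_persistent( ).
    " wrap the collected monitor messages into a plain-text document
    lo_request->set_document( cl_document_bcs=>create_document(
      i_type    = 'RAW'
      i_text    = lt_body
      i_subject = 'DTP Monitor Log' ) ).
    lo_request->add_recipient(
      cl_cam_address_bcs=>create_internet_address( 'user@example.com' ) ).
    lv_sent = lo_request->send( ).
    IF lv_sent = abap_false.
      MESSAGE 'E-mail could not be sent' TYPE 'E'. " behaviour with Abort flag
    ENDIF.
    COMMIT WORK.
  CATCH cx_bcs INTO lx_bcs.
    MESSAGE lx_bcs->get_text( ) TYPE 'E'.
ENDTRY.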

 

As the last step the program displays a log of what has been executed.

 

Figure_5_Program_Log.jpg

Figure 5: Log at the end of the program

 

The program log will be stored as a spool request. This log can be viewed retrospectively by the BW Support Organization if required.

 

Last but not least, you can check out the outbound e-mail messages using SAPconnect Administration (t/code SCOT).

 

Figure_6_SAPconnect_Administration.jpg

Figure 6: SAPconnect administration

Conclusion

In this blog I introduced the E-mail Add-on. It is an ABAP program meant to automate the process of sending DTP Monitor Log entries by e-mail to one or several recipients. Please refer to my document Implementing an Add-on for Sending DTP Monitor Log to E-mail for more information regarding implementing this E-mail Add-on.

Addition of new records to START or END routine


Dear All,

 

There are many SCN threads about issues encountered while adding records to SOURCE_PACKAGE or RESULT_PACKAGE.

 

The issue arises because, while adding new records to the source or result package, we need to populate the "record number" field in addition to the other fields we want to add.


There are two ways to resolve this issue:

 

Way 1) Add the record number manually to the SOURCE or RESULT package.

Way 2) Call the standard method provided by SAP.

 

Way 1) Update the record number manually while inserting new records:

In this way, we need to populate the record number manually and then increment it for each record added to the source/result package.

Example: Suppose that, based on some condition in internal table IT_TAB1, we need to add records to the result package.

 

Step 1) First declare an integer-type variable, e.g.:

DATA: lv_record_count TYPE i.

 

Step 2) Count the existing records of RESULT_PACKAGE; this is the start value for our counter:

DESCRIBE TABLE RESULT_PACKAGE LINES lv_record_count.

This statement returns the total number of records in RESULT_PACKAGE. For example, if the result package has 100 records, then after executing the statement the variable lv_record_count holds the integer value 100.

      

Step 3) Just increment this counter for each new record inserted into the SOURCE/RESULT package.

LOOP AT IT_TAB1 ASSIGNING <FS_1> WHERE <condition>.

  <---------- ABAP Syntax --------------------->

  lv_record_count = lv_record_count + 1.
  <FS_1>-record = lv_record_count.
  APPEND <FS_1> TO RESULT_PACKAGE.

ENDLOOP.

 

Way 2) Update the record number using the SAP method while inserting new records (same scenario as above).

This way is more feasible and preferable, as the method is provided by SAP and populates the record number automatically.

LOOP AT IT_TAB1 ASSIGNING <FS_1> WHERE <condition>.

  <---------- ABAP Syntax --------------------->

  CALL METHOD me->new_record__end_routine
    EXPORTING
      source_segid  = 1
      source_record = 0
    IMPORTING
      record_new    = <result_fields>-record.
  " This method populates the record number automatically; we just
  " import the new record number it returns.

ENDLOOP.

 

 

References:

http://scn.sap.com

SAP Note 1223532 - Design rule: Addition of records to end routine

 

**************************** Thanks for your Time   ************************************** 

Automation of RDA Scheduling and Monitoring


Real-Time Data Acquisition (RDA) can be controlled and monitored using trx. RSRDA. The same tasks can be accomplished programmatically. I created a set of programs that simplifies RDA scheduling and monitoring. It is not a big deal to debug trx. RSRDA and create such programs. What makes these programs unique is the added intelligence that helps:

  • identify scheduling problems (start, stop and monitor programs)
  • prevent locking conflicts (stop program)
  • identify failures even though the daemon status is green (monitor program)

 

Z_RDA_DAEMON_START Program

Similar to starting RDA in trx. RSRDA, it can be started using the Z_RDA_DAEMON_START program.

RDA1.jpg

RDA2.jpg

If you run the program and the daemon is already started, the program issues an error message notifying you about the scheduling problem.

RDA3.jpg

 

Z_RDA_DAEMON_STOP Program

Similar to stopping RDA in trx. RSRDA, it can be stopped using the Z_RDA_DAEMON_STOP program.

RDA4.jpg

RDA5.jpg

The program is intelligent enough to wait until the daemon has completely stopped, thereby preventing a possible locking conflict (for example, when the next step in the day-end processing deletes the content of the RDA DSO); see the sketch below.
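Conceptually the wait logic can be sketched like this; zcl_rda_util=>is_running is a hypothetical helper standing in for the actual status check, which is part of the attached programs:

DATA lv_stopped TYPE abap_bool VALUE abap_false.

DO 60 TIMES.
  " hypothetical check whether the daemon still has running status
  IF zcl_rda_util=>is_running( ) = abap_false.
    lv_stopped = abap_true.
    EXIT.   " daemon fully stopped - safe for the next process chain step
  ENDIF.
  WAIT UP TO 5 SECONDS.
ENDDO.

IF lv_stopped = abap_false.
  MESSAGE 'Daemon did not stop in time' TYPE 'E'.
ENDIF.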

 

If you run the program and the daemon is already stopped, the program issues an error message notifying you about the scheduling problem.

 

RDA6.jpg

 

 

Z_RDA_DAEMON_MONITOR Program

Similar to monitoring RDA in trx. RSRDA, it can be monitored using the Z_RDA_DAEMON_MONITOR program.

RDA7.jpg

RDA8.jpg

RDA9.jpg

The monitor program also handles the situation when the daemon is running but an underlying DTP or InfoPackage has failed.

RDA10.jpg

RDA11.jpg

If you run the program and the daemon is already stopped, the program issues an error message notifying you about the scheduling problem.

RDA12.jpg

 

Start, Stop and Monitor Process Chains

The programs are used in process chain steps of type ABAP Program.

RDA13.jpg

RDA14.jpg

RDA15.jpg

Here is an example of Monitor Process Chain in case of DTP failure

RDA16.jpg

RDA17.jpg

The described programs can be installed from the attached SAPLink archive (change the extension to .nugg before installation and use the one for your respective BW release).

SAP BW - Security Customer Exit for use in Analysis Authorizations


Below is an example of the methodology for building a security model for SAP BW using a customer exit and analysis authorizations. Edit where necessary for your particular project.

 

Create a Z-table in the BW systems: ZCOUNTRY_USER

1.png

 

Use SM30 to add the appropriate mappings to this table.  This should only be completed by the security team as this mapping will allow users entered into the table to see the corresponding countries they are assigned to.

2.png

 

 

Create a BEx Variable of Processing By “Customer Exit”.  Note: You will not add this variable into the query.

3.png

When a BW query containing the ZSLDTO_EX_REG authorization variable is executed, it will pull the values from the exit variable.

 

To input the CMOD code for the exit, access the include for CMOD project #### (choose the project for your project):

4.png

 

and Component EXIT_SAPLRRS0_001

5.png

 

and inside INCLUDE: ZXRSRU01

 

Insert the following code:

*** Declarations for Security Customer Exit ZSLDTO_EX_REG ***
DATA: it_zcountry_user TYPE STANDARD TABLE OF zcountry_user,
      wa_zcountry_user TYPE zcountry_user.
DATA: low_country LIKE loc_var_range-low.
*** End of Declaration for Security Customer Exit ***

 


NOTE: Using i_step 0 was found to be a better fit in this particular case but i_step 1 can also be used in customer exits used to fill authorization values. Test both out to find the best fit for the requirements.

* This code will perform the security lookup for Country (0COUNTRY) based
* upon the user -> country mapping in the table ZCOUNTRY_USER.
WHEN 'ZSLDTO_EX_REG'.
  DATA: l_uname TYPE xubname.

  IF i_step EQ '0'.
    CALL FUNCTION 'RSEC_GET_USERNAME'
      IMPORTING
        e_username = l_uname.
    REFRESH it_zcountry_user.
    SELECT * FROM zcountry_user INTO TABLE it_zcountry_user
      WHERE uname = l_uname.
    IF sy-subrc = 0.
      LOOP AT it_zcountry_user INTO wa_zcountry_user.
        CLEAR l_s_range.
        l_s_range-low  = wa_zcountry_user-country.
        l_s_range-sign = 'I'.
        l_s_range-opt  = 'EQ'.
        APPEND l_s_range TO e_t_range.
        CLEAR wa_zcountry_user.
      ENDLOOP.
    ENDIF.
  ENDIF.




Note: This step is optional and should only be used if you want to display the variable on the variable screen (ready for input in the variable definition).


*** Validation on BW Security - Variable Screen
IF i_step EQ '3'.
  LOOP AT i_t_var_range INTO loc_var_range WHERE vnam = 'ZSLDTO_EX_REG'.
    CLEAR: l_s_range.
    low_country = loc_var_range-low.

*** Get values if stored in custom mapping table ***
    SELECT SINGLE * FROM zcountry_user INTO wa_zcountry_user
      WHERE country EQ low_country AND uname EQ sy-uname.

    IF sy-subrc NE 0.
      CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
        EXPORTING
          i_class  = 'RSBBS'
          i_type   = 'E'
          i_number = '000'
          i_msgv1  = 'No authorization for Country - '
          i_msgv2  = loc_var_range-low
          i_msgv3  = ' , Enter different Country or request access.'
          i_msgv4  = sy-uname
        EXCEPTIONS
          OTHERS   = 2.
* raise the exception
      RAISE again.
    ENDIF.
  ENDLOOP.
ENDIF.

 

 

The code is based on OSS Note 1561635; refer to it for details.

 

 

Assign the variable ZSLDTO_EX_REG to the Analysis Authorization Z_GD_COUNTRY by clicking the Variable button or by putting a $ sign in front of the variable technical name.

6.png

 

When the Z_GD_COUNTRY analysis authorization is assigned to the user (or to a role for broader access), it will pull the data from the Z-table mentioned above even though the variable is not in the query: the analysis authorization sees the exit variable and executes it prior to i_step 1, 2 or 3.

 

Next step is to create the second variable for the authorization.

7.png

 

This is the variable that needs to be assigned to the query for Sold-To Country.  This variable can be made “ready for input” or not, depending upon the requirements needed.


How to Optimise the long running loads by DTP settings


Hi All,

 

Today I'm going to share one of the most efficient ways to fix long-running loads.

 

In one of our applications there is a delta load which usually completes in seconds, but for the past two days it had been running much longer than usual. We started our investigation by checking the background job, which was hanging; we also tried full loads, without success.

 

1.png

 

Actually, there is no error message, and the load that usually runs in a matter of seconds was now taking more than 5 hours without processing any data. The source had 56,000 records to be processed; however, the DTP ran for almost 4.5 hours and did not process any records.

Because of this, we had to cancel the load and delete the request, since it hampered the rest of the processes.

 

We then tried reducing the data package size from 50,000 to 1,000, set semantic grouping on sales org, division, customer sales and distribution channel, and also enabled the setting "Get All New Data Request by Request" (since there was more than one delta request from the source).

 

As per the logic between source A and target B, the load picks up all the records from target DSO B based on sales org, division, customer sales and distribution channel.

 

Now the DTP load was trying to process 120,405 records with record mode "N", and you can imagine how many records the lookup on DSO B would bring back (definitely more than half a million), so the ABAP heap memory limit is hit once it reaches the maximum. Therefore we continued running with a small data package size and semantic grouping, so that the lookup and processing cost is also optimized.

 

2.png

 

The same loads now complete successfully within 15 minutes after making the proposed changes to the DTP settings below.

 

From the screen print below, you can see the explosion of the source package (DSO A): compare the LINES READ and LINES TRANSFERRED columns.

The total delta created in DSO A is 0.12 million records, which exploded to 3.4 million, 27 times bigger in terms of volume. It is not practically feasible to accommodate all 3.4 million records in a single data package.

 

3.png

 

Analysis:

 

With the initial DTP settings, i.e. data package size = 50,000 records, no semantic grouping and parallel processing = 3, a total of 3 packages would be created (with the records split 50000+50000+24901). Without semantic grouping, the same set of records with the key combination sales org, division, customer sales and distribution channel that was processed in package 1 might also be processed in the other data packages. So, logically, with the number of records exploding, the amount of ABAP heap memory reserved per background job will not suffice to accomplish this data load.

 

 

This is certainly the reason why we find ABAP out-of-memory dumps in ST22. And when a process runs in PRIV memory mode and still does not find enough memory to continue, you can find the job not progressing, because it waits for other jobs to release memory.

 

Coming to the other aspect, load type delta or full: be it delta or full, the logic treats the loads the same way, because we do not consider record mode "X" (of the change log); thus it makes no difference (delta or full) even if you run the loads with the same selection as in the change log.

 

 

Furthermore,

It was the same case in the past, where 6K records exploded to 0.6 million (about 100 times bigger) and the total runtime of the job was 38 minutes.

 

77.png

 

 

Solution:

• Ensure that semantic grouping is always maintained, in both full and delta loads.

• Prefer to reduce the data package size from 50,000 to 1,000; if you still see performance issues, consider an even smaller package size (e.g. 500 records per package).

• To boost load performance, you can use up to 6 parallel work processes.

 

Thanks,

Siva

How to check Multiprovider for unused Infoprovider


Introduction

After running a BW system for several years, there will be many InfoProviders where you are not sure whether they are really being used by BEx queries. But every time you have to modify or would like to replace such an InfoProvider, you will have to edit the MultiProvider in which it is included.

A good option would be to remove the (unused) InfoProvider from the MultiProvider (so that it won't be called at query runtime to read data).

It would take a lot of time to check every query based on the MultiProviders in which the InfoProvider is included. Therefore, it is better to check the BW metadata tables via an ABAP program.


Checking Multiprovider and their queries

In order to determine the usage of a part provider in a query, we will have to

  • get the mapped key figures of the Infoprovider within the Multiprovider
  • check if these keyfigures are used in BEx queries/key figures (and no other PartProvider is filtered there)
  • find out if the filter at 0INFOPROV (in restricted key figure) is set to the PartProvider (“Fitting IP-Sel.?”=Y)


Instructions for use:

  • You can use single values, intervals or ranges (e.g. T*) on the selection screen; cubes and DSOs can also be used simultaneously.

HowTo_UnusedPartProv_201602_SelScreen.jpg

Selection-Screen: Example with three InfoProvider (to check for usage)

 

 

The result is split into the

  • Usage of every Infoprovider within a  Multiprovider
    • “Keyf.incl.?” (=Key figure included?): set to X if one key figure of the PartProvider is used in the shown query/structure/key figure (“Mapname”)
    • “IP-Sel.?”(=Infoprovider selection existing?): X will be shown in this column when the global BEx element (“Mapname”) includes a selection of “0INFOPROV”
    • “Fitting IP-Sel.?” (=Fitting 0INFOPROV Selection?): In addition to “IP-Sel.?”, this field will be marked if the 0INFOPROV selection includes the shown Part cube/Provider (e.g. 0INFOPROV=TCHC001)
  • When there are no queries or other global BEx elements based on the resp. Multiprovider, there will be no entries at the details (e.g. Multiprovider: TCGMZ001)
  • At the chapter “==> LIST OF RESULTS:”, the aggregation of the detailed results from before will be displayed. The PartProvider which are not used in the respective Multiprovider (Used?=N) can be  removed from the Multiprovider (from a technical perspective).


  • List of Providers without any Multiprovider inclusion
    • Are these Providers still relevant or can they be deleted? 

 

HowTo_UnusedPartProv_201602_Results.jpg

Results: Example with three InfoProviders (to check for usage)


Summary

With the two lists ("LIST OF RESULTS:" / "CUBES WITHOUT MULTIPROVIDER INCLUSION") you know which Infoprovider is not used by BEx elements within the respective Multiprovider, or whether an Infoprovider is not used by any Multiprovider at all.

 

Please note that the program is only suitable for InfoCubes/DSOs within a Multiprovider; it cannot be used for aggregation levels, planning functions or Composite Providers.
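
Once the report below is installed (as ZBI_BEX_PARTCUBES_UNUSED), it can also be called from another program. A minimal usage sketch, assuming the report exists in your system:

" Run the usage check for all providers whose names start with 'T':
DATA: lt_cube_sel TYPE RANGE OF rsinfocube,
      ls_cube_sel LIKE LINE OF lt_cube_sel.
ls_cube_sel-sign   = 'I'.
ls_cube_sel-option = 'CP'.   " pattern selection, as on the selection screen
ls_cube_sel-low    = 'T*'.
APPEND ls_cube_sel TO lt_cube_sel.
SUBMIT zbi_bex_partcubes_unused
  WITH s_cube IN lt_cube_sel
  AND RETURN.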




CODING

*&---------------------------------------------------------------------*
*& Report  ZBI_BEX_PARTCUBES_UNUSED
*&---------------------------------------------------------------------*
*& Created by/on: C. Heinrich - (11.02.2016)
*& Targets:
*& A) check usage of Cubes/DSO, included in Multiprovider, in queries
*&    and global BEx-elements (e.g. restricted key-figures)
*& B) also show Providers which are not included in any Multiprovider
*& Additions:
*&  -> beside (all types of) Cubes, also DSOs are supported
*&---------------------------------------------------------------------*
REPORT ZBI_BEX_PARTCUBES_UNUSED LINE-SIZE 160 NO STANDARD PAGE HEADING.

TABLES: RSDCUBET.

**** Variables, internal tables
DATA: BEGIN OF ls_mpro,  " List of MultiProvider
        INFOCUBE TYPE RSINFOPROV,
        PARTCUBE TYPE RSINFOPROV,
        USED(1) TYPE C,
       END OF ls_mpro,
       lt_mpro LIKE STANDARD TABLE OF ls_mpro.
DATA: BEGIN OF ls_keyf, " List of found key-figures
        INFOCUBE TYPE RSINFOPROV,
        PARTCUBE TYPE RSINFOPROV,
        IOBJNM  TYPE RSIOBJNM,
       END OF ls_keyf,
       lt_keyf LIKE HASHED TABLE OF ls_keyf
         WITH UNIQUE KEY INFOCUBE IOBJNM PARTCUBE.
DATA: BEGIN OF ls_compic, " List of queries/elements to MultiProvider
         COMPUID TYPE SYSUUID_25,
         MAPNAME TYPE RSZCOMPID,
         INFOCUBE TYPE RSINFOPROV,
         PARTCUBE TYPE RSINFOPROV,
         FLAG_KEYF(1) TYPE C,     " key-f. of part-cube included(=X)?
         FLAG_SEL_IC(1) TYPE C,   " selection at 0INFOPROV exists? (Y/N)
         FLAG_PARTCUBE(1) TYPE C, " selection at correct part-cube (=Y)?
       END OF ls_compic,
       lt_compic LIKE STANDARD TABLE OF ls_compic.
DATA: lv_used(1) TYPE c.
*** Field-symbols:
FIELD-SYMBOLS: <fs_mpro> LIKE ls_mpro,
                <fs_compic> LIKE ls_compic,
                <fs_keyf> LIKE ls_keyf.

**** A) Selection-screen
SELECTION-SCREEN: BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
   SELECT-OPTIONS: S_CUBE FOR RSDCUBET-INFOCUBE OBLIGATORY.
SELECTION-SCREEN: END OF BLOCK b1.


*** B) Checking usage of Partprovider(s):
START-OF-SELECTION.

    " --> getting list of Multiprovider, including the Cube(s):
    SELECT RSDCUBEMULTI~INFOCUBE RSDCUBEMULTI~PARTCUBE
      FROM RSDCUBEMULTI
      INTO CORRESPONDING FIELDS OF TABLE lt_mpro
      WHERE RSDCUBEMULTI~OBJVERS = 'A'
        AND RSDCUBEMULTI~PARTCUBE IN S_CUBE.

    SKIP 1. FORMAT INTENSIFIED ON.
    WRITE: / '==> CHECK PER MULTI-PROVIDER AND GLOBAL BEX-ELEMENT:'.
    FORMAT INTENSIFIED OFF.
    LOOP AT lt_mpro ASSIGNING <fs_mpro>.
      " Header: Multiprovider / Part-Cube
      WRITE: / 'Multiprovider: ', <fs_mpro>-INFOCUBE,
               50 'Partprov.: ', <fs_mpro>-PARTCUBE.
      " --> get key-figures of part-cube which are included at the Multiprovider(s):
      SELECT RSDICMULTIIOBJ~INFOCUBE RSDICMULTIIOBJ~IOBJNM
             RSDICMULTIIOBJ~PARTCUBE
        FROM RSDICMULTIIOBJ INNER JOIN RSDIOBJ
          ON RSDICMULTIIOBJ~PARTIOBJ = RSDIOBJ~IOBJNM
         AND RSDICMULTIIOBJ~OBJVERS = RSDIOBJ~OBJVERS
        INTO CORRESPONDING FIELDS OF TABLE lt_keyf
        WHERE RSDICMULTIIOBJ~INFOCUBE = <fs_mpro>-INFOCUBE
          AND RSDICMULTIIOBJ~PARTCUBE = <fs_mpro>-PARTCUBE
          AND RSDICMULTIIOBJ~OBJVERS = 'A'
          AND RSDIOBJ~IOBJTP = 'KYF'.

***   --> now get queries and other BEx-elements similar to restricted key-figures
***       and structures, containing the key-figure(s) of the Partprovider(s):
       SELECT COMPUID MAPNAME INFOCUBE
         FROM RSZCOMPIC INNER JOIN RSZELTDIR
           ON RSZCOMPIC~COMPUID = RSZELTDIR~ELTUID
          AND RSZCOMPIC~OBJVERS = RSZELTDIR~OBJVERS
         INTO CORRESPONDING FIELDS OF ls_compic
         WHERE RSZCOMPIC~INFOCUBE = <fs_mpro>-INFOCUBE
           AND RSZCOMPIC~OBJVERS = 'A'.
         ls_compic-PARTCUBE = <fs_mpro>-PARTCUBE.
         APPEND ls_compic TO lt_compic.
       ENDSELECT.

       " every BEx-component should be checked downwards (recursively),
       " if one of the relevant key-figures is used and the Infoprovider (0INFOPROV)-
       " selection could cover also the part-cube(s):
       WRITE: / 'MultiProvider', 25 'Part-Cube', 40 'Mapname',
                85 'Keyf.incl?', 100 'IP-Sel.?', 110 'Fitting IP-Sel.?'.
       WRITE: / '------------------------------------------------------------' &
                '-------------------------------------------------------------' &
                '-----'.

       lv_used = 'N'. " Start with "is used"=no
       " check every query/key-figure for usage
       LOOP AT lt_compic ASSIGNING <fs_compic>.

           PERFORM CHECK_BEX_ELEMENT_RECURSIVE USING <fs_compic>-COMPUID
                                               CHANGING <fs_compic>.
           IF <fs_compic>-FLAG_KEYF = 'X' AND ( <fs_compic>-FLAG_PARTCUBE = 'Y'
                                              OR <fs_compic>-FLAG_SEL_IC <> 'Y' ).
              lv_used = 'Y'. " is used=yes
           ENDIF.
           WRITE: / <fs_compic>-INFOCUBE,     25 <fs_compic>-PARTCUBE,
                 40 <fs_compic>-MAPNAME,      85 <fs_compic>-FLAG_KEYF,
                 100 <fs_compic>-FLAG_SEL_IC, 110 <fs_compic>-FLAG_PARTCUBE.
       ENDLOOP.
       <fs_mpro>-USED = lv_used.
       SKIP.
    ENDLOOP.


*** C) Log at end of run:
     SKIP 3.
     ULINE.
     " C1. List of partprovider(s) usage
     FORMAT INTENSIFIED ON. WRITE: / '==> LIST OF RESULTS: '. FORMAT INTENSIFIED OFF.
     WRITE: / 'Multiprovider', 30 'Infocube', 50 'Used?'.
     WRITE: / '---------------------------------------------------------'.
     LOOP AT lt_mpro ASSIGNING <fs_mpro>.
       WRITE: / <fs_mpro>-INFOCUBE, 30 <fs_mpro>-PARTCUBE,
               50 <fs_mpro>-USED.
     ENDLOOP.
     SKIP 3.
     " C2. List of not-anywhere used part-cubes
     WRITE: / '--------------------------------------------------------------------------------'.
     FORMAT INTENSIFIED ON. WRITE: / '==> CUBES WITHOUT MULTIPROVIDER INCLUSION: '.
     FORMAT INTENSIFIED OFF.
     PERFORM MISSING_CUBES.






*************************************************************************************
* Form CHECK_BEX_ELEMENT_RECURSIVE
* Key-figure usage and the selection on 0INFOPROV are checked for every
* BEx-element downwards (form is called recursively)
*************************************************************************************
FORM CHECK_BEX_ELEMENT_RECURSIVE USING I_ELTUID TYPE SYSUUID_25
                                  CHANGING C_COMPIC LIKE LS_COMPIC.
      DATA: BEGIN OF ls_range,  " entry at RSZRANGE
              IOBJNM TYPE RSIOBJNM,
              SIGN   TYPE RALDB_SIGN,
              OPT    TYPE RSZ_OPERATOR,
              LOW    TYPE RSCHAVL,
              HIGH   TYPE RSCHAVL,
            END OF ls_range.
      DATA: lv_eltuid TYPE SYSUUID_25.

      " Selection of Partprovider (at 0INFOPROV) or relevant key-figure present?
      " e.g. RSZRANGE-IOBJNM = '1KYFNM' plus RSZRANGE-LOW = 'MyKeyfigure'
      " or   RSZRANGE-IOBJNM = '0INFOPROV' plus RSZANGE-LOW = 'MyCube'
      SELECT IOBJNM SIGN OPT LOW HIGH FROM RSZRANGE
        INTO CORRESPONDING FIELDS OF ls_range
        WHERE ELTUID = I_ELTUID
          AND OBJVERS = 'A'
          AND IOBJNM IN ('1KYFNM', '0INFOPROV').
        IF ls_range-IOBJNM = '1KYFNM'.
           READ TABLE lt_keyf ASSIGNING <fs_keyf>
             WITH KEY INFOCUBE = C_COMPIC-INFOCUBE
                      IOBJNM = ls_range-LOW
                      PARTCUBE = C_COMPIC-PARTCUBE.
           IF sy-subrc = 0. " Key-figure of Part-Provider used at query!
              C_COMPIC-FLAG_KEYF = 'X'.
           ENDIF.
        ELSEIF ls_range-IOBJNM = '0INFOPROV'. " Selection at Infoprovider exists
           C_COMPIC-FLAG_SEL_IC = 'Y'.
           IF ls_range-SIGN = 'I' AND ls_range-OPT = 'EQ'. " Single value
             IF ls_range-LOW = C_COMPIC-PARTCUBE. " Selection to fitting part-cube exists
                 C_COMPIC-FLAG_PARTCUBE = 'Y'.
             ENDIF.
           ELSEIF ls_range-SIGN = 'I' AND ls_range-OPT = 'BT'. " Interval
             IF ls_range-LOW <> ''
               AND C_COMPIC-PARTCUBE BETWEEN ls_range-LOW AND ls_range-HIGH.
                 C_COMPIC-FLAG_PARTCUBE = 'Y'.
             ENDIF.
           ENDIF.
        ENDIF.
      ENDSELECT.

     " further check sub-elements of current BEx-element:
     SELECT TELTUID FROM RSZELTXREF INTO lv_eltuid
       WHERE SELTUID = I_ELTUID
         AND OBJVERS = 'A'.

       PERFORM CHECK_BEX_ELEMENT_RECURSIVE USING lv_eltuid
                                           CHANGING C_COMPIC.
     ENDSELECT.

ENDFORM.

*************************************************************************************
* Form MISSING_CUBES
* Some part-cube(s) not found in any Multiprovider? For this case, a list
* should also be generated and shown
*************************************************************************************
FORM MISSING_CUBES.
   DATA: BEGIN OF ls_cube,
           INFOCUBE TYPE RSINFOCUBE, " Cube-Name
           CUBETYPE TYPE RSCUBETYPE, " Cube-/DSO-type
           TXTLG TYPE RSTXTLG, " Infoprovider-Text
         END OF ls_cube,
         lt_cube LIKE STANDARD TABLE OF ls_cube.
   FIELD-SYMBOLS: <fs_cube> LIKE ls_cube.
   " first get all cube(s) to selection:
   SELECT RSDCUBE~INFOCUBE RSDCUBE~CUBETYPE RSDCUBET~TXTLG
     FROM RSDCUBE INNER JOIN RSDCUBET
       ON RSDCUBE~INFOCUBE = RSDCUBET~INFOCUBE
      AND RSDCUBE~OBJVERS = RSDCUBET~OBJVERS
     INTO CORRESPONDING FIELDS OF TABLE lt_cube
     WHERE RSDCUBE~INFOCUBE IN S_CUBE
       AND RSDCUBE~OBJVERS = 'A'
       AND RSDCUBE~CUBETYPE <> 'M' " MPro not relevant
       AND RSDCUBET~LANGU = sy-langu.

   " also, add DSO to the selection:
   SELECT RSDODSO~ODSOBJECT AS INFOCUBE RSDODSO~ODSOTYPE AS CUBETYPE
          RSDODSOT~TXTLG AS TXTLG
     FROM RSDODSO INNER JOIN RSDODSOT
       ON RSDODSO~ODSOBJECT = RSDODSOT~ODSOBJECT
      AND RSDODSO~OBJVERS = RSDODSOT~OBJVERS
     APPENDING CORRESPONDING FIELDS OF TABLE lt_cube
     WHERE RSDODSO~ODSOBJECT IN S_CUBE
       AND RSDODSO~OBJVERS = 'A'
       AND RSDODSOT~LANGU = sy-langu.

   " now check cube(s) at multiprovider assignments:
   LOOP AT LT_CUBE ASSIGNING <fs_cube>.
     READ TABLE lt_mpro TRANSPORTING NO FIELDS
       WITH KEY PARTCUBE = <fs_cube>-INFOCUBE.
     IF sy-subrc = 0. " Cube is included in one multi-provider, no entry necessary
        <fs_cube>-INFOCUBE = 'DELETE'.
     ELSE.
        CASE <fs_cube>-CUBETYPE.
          WHEN '' OR ' ' OR 'T' OR 'W'. "=DSO
            WRITE: / 'DSO  ', <fs_cube>-INFOCUBE, 18 '/', <fs_cube>-TXTLG, 80 ' not included in any Multiprovider!'.
          WHEN OTHERS.
            WRITE: / 'Cube ', <fs_cube>-INFOCUBE, 18 '/', <fs_cube>-TXTLG, 80 ' not included in any Multiprovider!'.
        ENDCASE.
     ENDIF.
   ENDLOOP.
   DELETE LT_CUBE WHERE INFOCUBE = 'DELETE'.


ENDFORM.

Code Customer Exit Variables with Renewed ABAP


Latest BW releases offer better options for coding Customer Exit Variables by means of using:

  • the RSROA_VARIABLES_EXIT_BADI BAdI instead of the RSR00001 exit, to structure code better (since BW 7.31);
  • new ABAP features to write short and robust code (since ABAP 7.40; some features since ABAP 7.02).

 

I created a simple BW Query based on the SFLIGHT data model to demonstrate renewed ABAP in action. The Query displays the number of passengers per airline who traveled within the distance ranges specified on the selection screen.

Customer Exit Variable Renewed ABAP 1.jpg

Customer Exit Variable Renewed ABAP 2.jpg

 

If the Distance Ranges are not in sequence, the Customer Exit issues an error message:

Customer Exit Variable Renewed ABAP 3.jpg

So what the Customer Exit does:

  • Sets Default Values for the Distance Range Characteristic Variables;
  • Sets the Distance Range Text Variables based on user input;
  • Validates the Distance Range Characteristic Variables.

I coded all of the above in an RSROA_VARIABLES_EXIT_BADI BAdI implementation. It is possible to have multiple BAdI implementations and to isolate the coding for a set of variables that logically belong together.

Customer Exit Variable Renewed ABAP 4.jpg

BADI implementations are isolated by means of Filter Values


Customer Exit Variable Renewed ABAP 5.jpg

The BAdI implementation will be executed for the DISTANCE InfoObject Variables (Combination 1), for the Text Variables (Combination 2), and at the time of Variable Validation (Combination 3).

The actual coding is done in the IF_RSROA_VARIABLES_EXIT_BADI~PROCESS method of class ZCL_RSROA_VAR_EXIT_DISTANCE.

Customer Exit Variable Renewed ABAP 6.jpg
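
For orientation, here is a minimal skeleton of such an implementation class; only the class/interface shape is sketched (names as in the example above), the method body follows below:

CLASS zcl_rsroa_var_exit_distance DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    " BAdI implementation class for RSROA_VARIABLES_EXIT_BADI
    INTERFACES if_rsroa_variables_exit_badi.
ENDCLASS.

CLASS zcl_rsroa_var_exit_distance IMPLEMENTATION.
  METHOD if_rsroa_variables_exit_badi~process.
    " variable logic per i_step / i_vnam - see the full listing below
  ENDMETHOD.
ENDCLASS.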

 

Below is IF_RSROA_VARIABLES_EXIT_BADI~PROCESS method code in ABAP 7.40 syntax:

 

METHOD if_rsroa_variables_exit_badi~process.

  CASE i_step.

    WHEN 1. "Before selection screen

      CASE i_vnam.
        WHEN 'DIST_1'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '00600' ) ).
        WHEN 'DIST_2'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '01000' ) ).
        WHEN 'DIST_3'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '05000' ) ).
        WHEN 'DIST_4'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '09000' ) ).
      ENDCASE.

    WHEN 2. "After selection screen

      CASE i_vnam.
        WHEN 'DIST_1H_TXT' OR 'DIST_2H_TXT' OR 'DIST_3H_TXT' OR 'DIST_4H_TXT'.
          c_t_range = VALUE #( ( low = replace( val = |{ i_t_var_range[ vnam = substring( val = |{ i_vnam }| off = 0 len = 6 ) ]-low }| regex = '^0+' with = '' occ = 1 ) ) ).
        WHEN 'DIST_2L_TXT' OR 'DIST_3L_TXT' OR 'DIST_4L_TXT' OR 'DIST_5L_TXT'.
          c_t_range = VALUE #( ( low = replace( val = |{ i_t_var_range[ vnam = i_vnam+0(5) && |{ i_vnam+5(1) - 1 }| ]-low + 1 }| regex = '^0+' with = '' occ = 1 ) ) ).
      ENDCASE.

    WHEN 3. "Validation

      TRY.
          DO 3 TIMES.
            IF i_t_var_range[ vnam = 'DIST_' && |{ sy-index }| ]-low >
               i_t_var_range[ vnam = 'DIST_' && |{ sy-index + 1 }| ]-low.
              DATA(w_message) = |Range | && |{ sy-index }| && | is greater than Range | && |{ sy-index + 1 }|.
              CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
                EXPORTING
                  i_class  = 'OO'
                  i_type   = 'E'
                  i_number = '000'
                  i_msgv1  = w_message.
              RAISE EXCEPTION TYPE cx_rs_error.
            ENDIF.
          ENDDO.
        CATCH cx_sy_itab_line_not_found INTO DATA(itab_line_not_found).
      ENDTRY.

  ENDCASE.

ENDMETHOD.

 

In step 1 (before selection screen) BADI sets initial values for Distance Characteristic Variables.

In step 2 (after selection screen) BADI reads Distance Characteristic Variables and populates Distance Text Variables.

In step 3 (validation) BADI validates Distance Characteristic Variables entered on selection screen.

The code is short, well structured and easy to understand. Imagine dumping the code for all other BW variables in the system into the same BAdI; it would become unreadable and unmanageable.

 

The same logic implemented in RSR00001 customer exit using ABAP 7.0 syntax would look like this:


DATA: wa_range       TYPE rsr_s_rangesid.
DATA: wa_var_range   TYPE rrrangeexit.
DATA: w_vnam         TYPE rszglobv-vnam.
DATA: w_dist_1       TYPE rschavl.
DATA: w_dist_2       TYPE rschavl.
DATA: w_message      TYPE string.
DATA: wa_var_range_1 LIKE rrrangeexit.
DATA: wa_var_range_2 LIKE rrrangeexit.

CASE i_step.

  WHEN 1. "Before selection screen

    CASE i_vnam.
      WHEN 'DIST_1'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '00600'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_2'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '01000'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_3'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '05000'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_4'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '09000'.
        APPEND wa_range TO e_t_range.
    ENDCASE.

  WHEN 2. "After selection screen

    CASE i_vnam.
      WHEN 'DIST_1H_TXT' OR 'DIST_2H_TXT' OR 'DIST_3H_TXT' OR 'DIST_4H_TXT'.
        READ TABLE i_t_var_range INTO wa_var_range WITH KEY vnam = i_vnam+0(6).
        SHIFT wa_var_range-low LEFT DELETING LEADING '0'.
        wa_range-low = wa_var_range-low.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_2L_TXT' OR 'DIST_3L_TXT' OR 'DIST_4L_TXT' OR 'DIST_5L_TXT'.
        w_vnam = i_vnam.
        w_vnam+5(1) = w_vnam+5(1) - 1.
        READ TABLE i_t_var_range INTO wa_var_range WITH KEY vnam = w_vnam+0(6).
        wa_var_range-low+0(5) = wa_var_range-low+0(5) + 1.
        SHIFT wa_var_range-low LEFT DELETING LEADING space.
        wa_range-low = wa_var_range-low.
        APPEND wa_range TO e_t_range.
    ENDCASE.

  WHEN 3. "Validation

    DO 3 TIMES.
      w_vnam = sy-index.
      SHIFT w_vnam LEFT DELETING LEADING space.
      CONCATENATE 'Range' w_vnam INTO w_message SEPARATED BY space.
      CONCATENATE 'DIST_' w_vnam INTO w_vnam.
      READ TABLE i_t_var_range INTO wa_var_range_1 WITH KEY vnam = w_vnam.
      CHECK sy-subrc = 0.
      w_vnam = sy-index + 1.
      SHIFT w_vnam LEFT DELETING LEADING space.
      CONCATENATE w_message 'is greater than' w_vnam INTO w_message SEPARATED BY space.
      CONCATENATE 'DIST_' w_vnam INTO w_vnam.
      READ TABLE i_t_var_range INTO wa_var_range_2 WITH KEY vnam = w_vnam.
      CHECK sy-subrc = 0.
      IF wa_var_range_1-low > wa_var_range_2-low.
        CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
          EXPORTING
            i_class  = 'OO'
            i_type   = 'E'
            i_number = '000'
            i_msgv1  = w_message.
        RAISE no_processing.
      ENDIF.
    ENDDO.

ENDCASE.

 

The ABAP 7.0 version is almost twice the size of the ABAP 7.40 version. The comparison speaks for itself.

 

Now I want to quickly go over the ABAP 7.40 syntax features that are most useful for coding Customer Exit Variables.

 

 

String Templates

They are a powerful option for defining Variable values. String Templates save you the trouble of writing multiple CONCATENATE, WRITE, SHIFT, REPLACE, etc. statements. All of them can be combined into one String Template thanks to:

  • Chaining Operator && (to concatenate strings)
  • Embedded Expressions { expression }
  • Built-in functions

In my example, I build the w_message string by concatenating string literals and embedded expressions (the sy-index system variable and sy-index + 1). Note that an inline declaration of the w_message variable is also used, which saves the trouble of defining it beforehand.

 

DATA(w_message) = |Range | && |{ sy-index }| && | is greater than Range | && |{ sy-index + 1 }|.
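
String Templates also support formatting options inside embedded expressions; a small additional illustration of my own (not part of the Query example above):

" Formatting options inside embedded expressions (illustrative only):
DATA(lv_date)  = |Load date: { sy-datum DATE = USER }|.      " user date format
DATA(lv_count) = |Packets: { 42 WIDTH = 8 ALIGN = RIGHT }|.  " padded, right-aligned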

 

 

VALUE Operator

It can be used to initialize a Variable table parameter in just one statement.

In my example, I set the default Distance Ranges as:

 

c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '00600' ) ).
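
The same operator can fill several range rows at once; a small extension of my own (not used in the Query above):

" Multiple range rows in one VALUE expression (illustrative only):
c_t_range = VALUE #( ( sign = 'I' opt = 'BT' low = '00600' high = '01000' )
                     ( sign = 'I' opt = 'EQ' low = '05000' ) ).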

 

 

Expression to Access Internal Table

In i_step = 2 (after the selection screen), a Variable value can be conveniently read from the internal table using a table expression rather than a READ TABLE statement.

In my example, I populate a Distance Text Variable by reading the corresponding Distance Characteristic Variable. Note that the replace function strips the leading zeros. This is also a good example of nested expressions.

 

c_t_range = VALUE #( ( low  = replace( val = |{ i_t_var_range[ vnam = substring( val = |{ i_vnam }| off = 0 len = 6 ) ]-low }| regex = '^0+' with = '' occ = 1 ) ) ).

 

 

Expression in IF Statement

In i_step = 3 (validation), expressions can be used in an IF statement to validate Variable values.

In my example, the IF statement checks that the preceding Distance Range is not greater than the subsequent Distance Range.

 

IF i_t_var_range[ vnam = 'DIST_' && |{ sy-index }| ]-low > i_t_var_range[ vnam = 'DIST_' && |{ sy-index + 1 }| ]-low.
     ...

ENDIF.

Sending Monitor Log Messages with a Custom Formula


In the context of Transformations you can choose the rule type Formula. SAP delivers a Formula Builder with many ready-to-use Formulas. It is even possible to extend the Formula Builder with your own custom Formulas by implementing Business Add-In (BAdI) RSAR_CONNECTOR.

My requirement was not only to deliver such a custom Formula; I also had to add a message to the Monitor Log in case a certain condition was met. In this blog I would like to share with you how to handle such a requirement and discuss some implementation aspects.

BAdI Implementation

Custom Formulas can be made available by implementing classic BAdI RSAR_CONNECTOR. Next to the online BAdI documentation you can refer to the following documents for detailed implementation instructions:

 

 

The result of my implementation is a new custom group with custom Formulas.

 

Figure_1_Example_Custom_Group.jpg

Figure 1: Example custom group with custom Formulas

 

The last custom Formula is used in the example Transformation rule.

 

Figure_2_Example_Transformation_Rule.jpg

Figure 2: Example Transformation rule with a custom Formula

Monitor Log Messages

Implementing the BAdI was actually the easy part. It was not so easy to implement Monitor Log messages in the custom Formula.

For reference purposes I used the standard Formula LOG_MESSAGE. I copied the relevant source code from the standard Formula and appended it to my own implementation. Please refer to lines 12 to 17 in the next screenshot.
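
The gist of that copied code is to append a message row to the routine's generated monitor table. A minimal sketch, assuming the common transformation routine pattern (the message class, message number and condition are illustrative placeholders, not the exact LOG_MESSAGE source):

" Hedged sketch: sending a Monitor Log message from a Formula/routine context.
DATA ls_monitor TYPE rstmonitor.
IF l_condition_met = abap_true.        " l_condition_met is hypothetical
  ls_monitor-msgid = 'ZBW'.            " custom message class (assumption)
  ls_monitor-msgty = 'W'.
  ls_monitor-msgno = '001'.
  ls_monitor-msgv1 = 'Custom formula condition was met'.
  APPEND ls_monitor TO monitor.        " MONITOR: generated log table of the routine
ENDIF.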

 

Figure_3_Source_Code_BAdI_Implementation.jpg

Figure 3: Source code in BAdI implementation

 

Unfortunately, my custom Formula did not send any messages to the Monitor Log. After some investigation I found that the standard Formula LOG_MESSAGE did not send its messages to the Monitor Log either.

I raised an SAP Support ticket and it turned out to be a program error. After implementing SAP Note 2285154 (730SP15: Standard formula function LOG_MESSAGE doesn't send), the issue was solved for the standard Formula. However, the custom Formula still did not send any messages.

 

Diving a bit deeper, I found an interesting explanation. The generated Transformation program is based on template programs such as RSTRAN_RULE_TMPL_ROUTINE. In its source code you can see how the Formula routine is composed. Pay attention to line 173 in the next screenshot.

 

Figure_4_Template_Program_Transformation_Rule.jpg

Figure 4: Template program for Transformation rule

 

The source code of lines 174 to 180 is only applied if the Formula contains the string LOG_MESSAGE.

 

So the “hidden feature” is that you have to include the string LOG_MESSAGE in the technical name of the custom Formula. Only then is the crucial source code appended to the Transformation rule routine in the generated program.
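
Conceptually, the template's check behaves like the following sketch (my simplification, not SAP's actual source; the variable name is hypothetical):

" Simplified sketch of the template behavior (assumption, not SAP's code):
" the monitor-log block is only generated into the routine when the
" formula source contains the string LOG_MESSAGE.
IF l_formula_source CS 'LOG_MESSAGE'.   " l_formula_source is hypothetical
  " ... append the monitor-log handling code to the generated routine ...
ENDIF.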

 

Figure_5_Generated_Transformation_Program.jpg

Figure 5: Generated Transformation program

Conclusion

In this blog I demonstrated how to send a message to the Monitor Log in the context of a custom Formula. You have to append source code to the BAdI implementation similar to that found in standard Formula LOG_MESSAGE. A small “hidden feature” is that you also have to include the string LOG_MESSAGE in the technical name of the custom Formula. Only then is the crucial source code appended to the Transformation rule routine in the generated program, and the messages are correctly sent to the Monitor Log.

Business Objects / Business Warehouse Trial in Cloud


    If you need a Business Objects / Business Warehouse trial, then this blog is for you. Right now SAP offers BW 7.4 SP08 and BOBJ 4.1 as free Cloud Appliance Library (CAL) trials (you pay only for Amazon Web Services). The fact that these are two separate trials has its pros and cons. The pro is that you have more flexibility to control AWS costs by starting/stopping BW and BOBJ separately. The con is that you have to connect BW to BOBJ yourself. In this blog I will explain how to connect BW to BOBJ and demonstrate a BW/BOBJ end-to-end scenario.

    These are CAL free trials:

BW BOBJ Sandbox 1.jpg

BW BOBJ Sandbox 2.jpg

These are the costs of running BW and BOBJ instances:

BW BOBJ Sandbox 3.jpg

BW BOBJ Sandbox 4.jpg

Note: it is important to check the Public Static IP Address check-box for the BW instance, to save you the trouble of updating the BOBJ OLAP Connection every time BW is started.

    Once the BW and BOBJ instances are created, check and make a note of the BW IP address in the AWS EC2 Management Console (you will need it to connect BW to BOBJ). As you can see, BW also comes with a front end, i.e. a remote desktop with SAP GUI and BW Modeling Tools in Eclipse.

BW BOBJ Sandbox 5.jpg

 

Create a BW OLAP Connection in the BOBJ CMC

BW BOBJ Sandbox 6.jpg

It is important to set the Authentication mode to Pre-defined; otherwise, in Prompt mode, Webi will not see our OLAP Connection.

Note that the server name is the Public Static IP Address of the BW server from the AWS EC2 Management Console.

 

Make the TCP Ports of the BW Server Accessible from Anywhere

 

Without this, BOBJ will not be able to connect to BW. Open the BW server security group in the AWS EC2 Management Console.

BW BOBJ Sandbox 7.jpg

Edit Inbound Rules

BW BOBJ Sandbox 8.jpg

Modify first entry

BW BOBJ Sandbox 9.jpg

And delete second entry

BW BOBJ Sandbox 10.jpg

Save Inbound Rules

BW BOBJ Sandbox 11.jpg

 

 

Install SAP GUI Business Explorer

 

What I noticed is that the Eclipse BW Modeling Tools are not working because the BW Project cannot be expanded (it dumps on the BW side; see transaction ST22). I suggest installing SAP GUI Business Explorer, creating BW Queries there, and doing all the modeling in SAP GUI transaction RSA1. Alternatively, you can use the other BW trials (BW 7.4 SP10 on HANA or BW 7.4 SP01 on HANA), but these have higher AWS costs.

 

 

Create End to End Scenario

 

Create BW Query and allow external access for the Query

BW BOBJ Sandbox 12.jpg

In Web Intelligence, create a new report selecting the D_NW_C01 BW Query as a source from the BW OLAP Connection.

BW BOBJ Sandbox 13.jpg

BW BOBJ Sandbox 15.jpg

BW BOBJ Sandbox 16.jpg
