
How to find Last Executed User of BEx Query


Introduction

 

          In our BW projects, we frequently need to know the last execution details of a BEx query from the BI statistics. In this blog, I am going to show how to handle this common requirement in a simple manner.


Let's suppose we require the Last Executed date, Last Modified date, Last Modified by and, last but not least, the Last Executed user. There is no standard BI Statistics query that delivers the last executed user, so we need to read either the statistics cubes or the statistics tables.

 

The first table that comes to mind is RSDDSTAT_OLAP. It is basically a view over the following three tables:

 

  • RSDDSTATHEADER
  • RSDDSTATINFO
  • RSDDSTATEVDATA

 

We can find the last executed user name in this view (see the image below). However, it only holds data for the retention period maintained in table RSADMIN under object TCT_KEEP_OLAP_DM_DATA_N_DAYS, which is usually 30 days. So we can only determine the last executed user for BEx queries that were run within the last 30 days.

[Image: RSDDSTAT_OLAP.jpg]
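
If you prefer to read the view programmatically, a small ABAP report can do the same lookup. This is a minimal sketch: the field names OBJNAME, CALDAY, STARTTIME and UNAME are assumptions based on the usual content of the RSDDSTAT* tables, and the query name is hypothetical.

  TYPES: BEGIN OF ty_exec,
           calday    TYPE rsddstat_olap-calday,
           starttime TYPE rsddstat_olap-starttime,
           uname     TYPE rsddstat_olap-uname,
         END OF ty_exec.

  DATA: lt_exec TYPE STANDARD TABLE OF ty_exec,
        ls_last TYPE ty_exec.

  SELECT calday starttime uname
    FROM rsddstat_olap
    INTO TABLE lt_exec
    WHERE objname = 'ZSALES_Q001'.        " hypothetical query technical name

  SORT lt_exec BY calday DESCENDING starttime DESCENDING.
  READ TABLE lt_exec INDEX 1 INTO ls_last.
  IF sy-subrc = 0.
    WRITE: / 'Last executed by', ls_last-uname, 'on', ls_last-calday.
  ELSE.
    WRITE: / 'No execution within the retention period.'.
  ENDIF.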

What if our requirement is to find the users who executed a query six months ago?

 

Even the tables RSZCOMPDIR and RSRREPDIR do not store the last executed user. They only store information such as the last executed date, last modified date and last modified by.

 

This is the situation where we need to read the BI statistics cubes. If your BI statistics process chains run regularly, the historical statistics data will have been loaded into the cubes.

 

Create a simple BEx query on the cube 0TCT_C02 (Front-End and OLAP Statistics (Details)) as shown below:

 

[Image: Query 1.jpg]

[Image: Query 2.jpg]

The reason I have used the highlighted InfoObjects in the rows pane is that they are compounded to the Query Runtime Object.

If we do not use these three InfoObjects in the query, the "Query Runtime Object" field shows its values separated by slashes ( / ).

 

To avoid this and show just the query name in the column, we include them as in the image above and choose "No Display" as shown below. All of this is purely for better readability of the report.

 

[Image: Query 3.jpg]

 

Drag all three key figures from the cube on the left side into the key figures pane; this is just to complete the query definition. Then create a new formula containing a local formula variable for the latest date. This should be a replacement path variable on 0CALDAY.

 

Since our objective is to show the last executed user, we simply create a condition on the Latest Date key figure with Top N = 1. This shows the most recent record within the last six months (as we have set a filter in the query).

[Image: query 4.jpg]

 

To run this query, paste the query technical names into the right side of the selection screen as shown below. Direct input will not find them, because the Query Runtime Object field expects values in compounded form, separated by slashes ( / ).

 

[Image: Query 5.jpg]

Upon executing the query, you get a report like the one below, showing the query name, calendar day and last executed user (up to six months back).

 

[Image: Query 6.jpg]

 

Thanks for reading!!


Identifying New Master Data Records


I recently came across a typical requirement: identifying new records in G/L account master data. Every day a full load of G/L account master data is brought into the analysis system. After the update, the mapping department downloads all data from the G/L account P table and cross-checks it against the mapping done for each G/L account.

 

The mapping department (for confidentiality reasons I cannot discuss what the mapping is) downloads the data, executes some manual steps and SQL statements, and comes back with the list of G/L accounts that are not mapped.

 

Downloading and comparing this huge amount of data every day consumes a lot of time, so they asked me about the possibilities of identifying only the new G/L records.

 

After doing some research, I decided to create a generic DataSource on the SID table of the G/L account InfoObject.

 

As everybody knows, a SID value is generated in the SID table only when a new value arrives. Taking advantage of this, I created a generic DataSource on the SID table with a numeric pointer delta on the SID field.

 

Here are a few screenshots for reference and a clearer understanding of the delta on the SID field.

 

 

I created the generic DataSource on the SID table of the G/L account InfoObject.

[Image: 1.png]

 

 

I selected the delta capability "numeric pointer" on the SID field. Since a SID value is created whenever a new G/L account is created, we can extract only the new G/L accounts.

 

 

[Image: 2.png]
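
Conceptually, the numeric-pointer delta works like the selection sketched below: the extractor stores the highest SID transferred and, on the next run, reads only beyond it. The SID table name /BIC/SZGLACCT and the pointer variable are illustrative assumptions, not names from an actual system.

  DATA: lv_last_sid TYPE rssid.                      " pointer saved after the previous delta run
  DATA: lt_new TYPE STANDARD TABLE OF /bic/szglacct. " hypothetical SID table of the G/L InfoObject

  * Only rows whose SID lies beyond the stored pointer are extracted,
  * i.e. only G/L accounts created since the last load.
  SELECT * FROM /bic/szglacct INTO TABLE lt_new
    WHERE sid > lv_last_sid.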

 

Note: From BW 7.4 onwards, a generic delta is no longer required, because we have the option to see 0RECORDMODE values for the records, as we do in a DSO.

 

In SAP BW on HANA 7.4 onwards, 0RECORDMODE is included by default if you check the option below.

[Image: 3.png]

 

Thanks for reading this blog. Any suggestions are welcome.

 

Regards,

Nanda

Simplified: SAP BW 7.4 Release Notes


Hello Fans of BW,

 

It has always been a bit of a challenge to find out what has changed with a new SAP BW release, or which features were introduced with a certain support package. In general, the most essential changes are described in the solution and product roadmap and corresponding webinar sessions at http://service.sap.com/saproadmaps (SAP BW is under "Database & Technology").

 

On a more detailed level, you will always find the changes completely documented on the "What's New - Release Notes" page of our online documentation at http://help.sap.com/nw74. This area describes all new features, enhancements to existing features, changes to existing features, and - in some cases - features that have been deleted. When we publish a new support package, the online documentation is updated accordingly. Naturally, the higher the support package, the fewer changes will be included.

 

If you have ever tried to get a complete overview - for example for SAP BW 7.4 from SP 0 to the current SP 9 - you will quickly have noticed that it is quite cumbersome to browse through all the pages: click, click, click.

 

We often get the question which features are only available with SAP BW powered by SAP HANA and which also work on other platforms. Did you know, for example, that 61% of the new features and enhancements are also relevant for non-HANA systems (86 of 142)? So there are plenty of reasons for all customers to upgrade to 7.4!

 

Or you would like to know what exactly was changed for a particular component in SAP BW like Business Planning. Or you need to know the differences between two support package levels...

 

And this is where I have come up with a simplified solution: Drum roll... A consolidated spreadsheet!

 

[Image: rn01.png]

 

The spreadsheet contains all release notes for 7.4:

 

  • By support package
  • By component
  • With links to the documentation
  • A short description (so you don’t even have to click the links)
  • An indicator whether a feature is new, changed, deleted, or an enhancement
  • A flag for HANA
  • Of course you can filter, sort, and search quickly (no problem for you Excel Wizards, right?)

 

The magic of spreadsheets makes it possible to answer all those questions in a matter of seconds. For example, if you want to know what is new for OLAP but does not require HANA, just filter the component on "BW-BEX-OT-OLAP" and HANA-only on "No". Or you can quickly find the differences between, for example, SP 6 and SP 8. It's simple and easy and should be a great help for you, customers, partners, and of course our own field organization.

 

The first version of the spreadsheet has been published as an attachment to the "SAPBWNews" for BW 7.40 SP 9, which you can find in SAP Note 2030800. Going forward, an updated spreadsheet will be included with the corresponding SAP Note for each support package.

 

I hope you like the new format, but nothing is perfect. Please let me know any feedback or ideas on how to make it better.

 

Enjoy,

Marc

Product Management SAP EDW (BW/HANA)

Homogeneous System Copy on SAP Hana


Hi guys,



For a couple of days I have observed several customers trying to do a system copy of SAP HANA systems.


It is relevant to highlight that a system copy of an SAP HANA system IS DIFFERENT from a normal system copy.



>> For normal system copies you should follow the instructions from note:


886102 - System Landscape Copy for SAP NetWeaver BW



>> But for a system copy of an SAP HANA system, you need to read and follow the steps from note:


1844468 - Homogeneous system copy on SAP HANA



There is also the HANA System Copy Guide:


HANA System Copy Guide



Best Regards,

Janaina

BW-WHM* Released SAP Notes (February)


Hello Guys,

 

 

I would just like to share with you the BW-WHM* notes released in February:

 

 

Component | Note Number | Description | Released on
BW-WHM-AWB | 1600929 | SAP BW powered by SAP HANA DB: Information | 25.02.2015
BW-WHM-AWB | 2121217 | Remodeling tool for SPO is not working | 17.02.2015
BW-WHM-AWB | 2083701 | Entering an Integer value in a key figure Integer type field in transaction RSI | 13.02.2015
BW-WHM-AWB | 2117741 | Inclusion of ADSOs and HCPRs in data flow model | 12.02.2015
BW-WHM-DBA | 2111541 | Update 1 to Security Note 1965819 | 26.02.2015
BW-WHM-DBA-COPR | 2122411 | HCPR: CompositeProvider corrections for Release 7.40, part 12 | 27.02.2015
BW-WHM-DBA-COPR | 2050557 | No aggregation with 0REQUID in PartProvider before join in CompositeProvider | 25.02.2015
BW-WHM-DBA-COPR | 2111944 | RSOHCPR/RSOADSO include adjustment | 19.02.2015
BW-WHM-DBA-ICUB | 2009574 | Duplicate number range object in different InfoProvider | 19.02.2015
BW-WHM-DBA-ICUB | 1999013 | RSRV - Initial Key Figure Units in Fact Tables test results in exception | 12.02.2015
BW-WHM-DBA-ICUB | 2125857 | Cube writer: Termination when loading extraction from cube | 12.02.2015
BW-WHM-DBA-ICUB | 2062714 | The figure for the year is more than 2500 | 10.02.2015
BW-WHM-DBA-ICUB | 1946893 | ORA-00060 Deadlock in Cube load | 05.02.2015
BW-WHM-DBA-ICUB | 2124904 | An "Invalid BW namespace definition" message is displayed in application log du | 03.02.2015
BW-WHM-DBA-IOBJ | 2103263 | Enhancement of structure BAPI InfoObject BAPI6108 fields with fields UOMCONV an | 20.02.2015
BW-WHM-DBA-IOBJ | 1984625 | ABAP Dictionary errors during InfoObject activation with change to compounding | 18.02.2015
BW-WHM-DBA-IOBJ | 2069619 | Function module RSD_IOBJ_MULTI_GET returns incorrect results if parameter I_REA | 12.02.2015
BW-WHM-DBA-IOBJ | 2111658 | A system dump occurs when viewing the database table status of a unit type char | 12.02.2015
BW-WHM-DBA-IOBJ | 1827295 | InfoObject with an attribute that has statistics created | 06.02.2015
BW-WHM-DBA-IOBJ | 2111737 | Enhancement of InfoObject impact analysis with regard to local characteristics | 06.02.2015
BW-WHM-DBA-IOBJ | 2099865 | Using RSA1 on InfoProvider is taking a long time | 04.02.2015
BW-WHM-DBA-ISET | 2050817 | Using RSA3 may cause a system dump while trying to reference a missing navigati | 12.02.2015
BW-WHM-DBA-MD | 1918525 | Performance optimization during text loading | 13.02.2015
BW-WHM-DBA-MPRO | 2082301 | Executing RSUPGRCHECK may display inconsistent MultiProvider | 24.02.2015
BW-WHM-DBA-ODS | 2070577 | (advanced) DataStore Object - availability in BW7.4 SP08 and SP09 | 19.02.2015
BW-WHM-DBA-ODS | 2118329 | LISTCUBE: ADSOs not displayed in input help | 19.02.2015
BW-WHM-DBA-ODSV | 2123371 | Open ODS view: Error when opening using BW Modeling Tools if source object does | 17.02.2015
BW-WHM-DBA-ODSV | 2122710 | Open ODS view: Impact deletes active version | 06.02.2015
BW-WHM-DBA-ODSV | 2123034 | Open ODS view: Suboptimal query runtime after initial activation | 06.02.2015
BW-WHM-DBA-ODSV | 2123726 | Open ODS view: SQL error with field of type "Client" | 06.02.2015
BW-WHM-DBA-ODSV | 2121708 | Open ODS view: SQL errors for difficult data types | 03.02.2015
BW-WHM-DBA-OHS | 2046551 | Open Hub with query as source | 27.02.2015
BW-WHM-DBA-OHS | 2065393 | The old data is not deleted from DB with Deleting Data from Table when DB Conne | 18.02.2015
BW-WHM-DBA-OHS | 1980716 | Description for Open Hub Destination is truncated from 60 to 30 characters. | 09.02.2015
BW-WHM-DBA-OHS | 2122189 | SAP BW 7.40(SP11) Open Hub Destination can't be activated | 06.02.2015
BW-WHM-DBA-RMT | 1944429 | Remodeling - InfoObject remodeling is not supported | 18.02.2015
BW-WHM-DBA-RMT | 1966654 | RSMRT: Corrections for SAP BW 7.40 SP5 | 02.02.2015
BW-WHM-DBA-SPO | 2063607 | DTP filter generation from SPO/reading data from RemoteCube doesn't work | 25.02.2015
BW-WHM-DBA-SPO | 2079081 | 730SP13: Error Handler in the DTP template with source as SPO or MPRO is switch | 24.02.2015
BW-WHM-DOC | 2017437 | Preliminary Version SAPBWNews BW 7.02 ABAP SP 17 | 06.02.2015
BW-WHM-DOC | 1332017 | Preliminary Version SAPBWNews BW 7.02 ABAP SP 01 | 04.02.2015
BW-WHM-DOC | 1367863 | SAPBWNews BW 7.02 ABAP SP 02 | 04.02.2015
BW-WHM-DOC | 1800952 | SAPBWNews BW 7.02 ABAP SP 14 | 04.02.2015
BW-WHM-DOC | 1940530 | SAPBWNews BW 7.02 ABAP SP 16 | 04.02.2015
BW-WHM-DST | 769414 | Support Package 23: Lock Manager log written in batch process | 24.02.2015
BW-WHM-DST | 2080574 | Mass activation programs unintentionally load generated programs into main memo | 18.02.2015
BW-WHM-DST | 2116836 | P34; WO DSO: Dump in FM RSM1_DELETE_WO_DSO_REQUESTS | 06.02.2015
BW-WHM-DST | 2123548 | P35; RSSM_GET_TIME; APO: OPEN CURSOR is destroyed | 06.02.2015
BW-WHM-DST-ARC | 1858550 | Downport NLS IQ to BW 7.0X | 11.02.2015
BW-WHM-DST-DBC | 1888353 | DBC SAPLRSDS_ACCESS_FRONTEND termination due to neg. length | 16.02.2015
BW-WHM-DST-DBC | 2119576 | DB connect - error in RSDL after EHP6 update | 06.02.2015
BW-WHM-DST-DFG | 2118610 | P13; DFG: Data flow dumps or hangs | 06.02.2015
BW-WHM-DST-DS | 2124619 | 730SP13: Activation of multi-segmented datasource inactivates transformation/DT | 13.02.2015
BW-WHM-DST-DS | 1809892 | Enable extraction from old versions of a DataSource | 12.02.2015
BW-WHM-DST-DTP | 2119480 | Error RSTRAN 840, consulting note 1851875 obsolete | 26.02.2015
BW-WHM-DST-DTP | 2080701 | P13: DTP: Access to active DTAs during import of DTPs | 25.02.2015
BW-WHM-DST-DTP | 2116840 | P35: WO DSO: Reset ACTIVE field in RSBODSLOGSTATE | 25.02.2015
BW-WHM-DST-DTP | 2125520 | P14: MPRO: DTP: REDUCE: Performance with many MPRO requests | 24.02.2015
BW-WHM-DST-DTP | 1915498 | P32; DTP: BDLS does not convert DTPH LOGSYS entries | 23.02.2015
BW-WHM-DST-DTP | 2123709 | P13: REDUCE: MPRO: LPOA: Requests that are too old are extracted | 23.02.2015
BW-WHM-DST-DTP | 1943907 | P33: DTP: Dump when executing DTP with expert mode simulation; data package | 17.02.2015
BW-WHM-DST-DTP | 2118187 | P13: MPRO: DTP: Delete DTA and overflow of MPRO requests | 13.02.2015
BW-WHM-DST-DTP | 2121282 | P34; DTP; serial extraction with semantic group never ends | 10.02.2015
BW-WHM-DST-DTP | 2124611 | P35: DTP: WO DSO: REQUDEL: Performance when determining the delta request | 02.02.2015
BW-WHM-DST-HAP | 2067912 | SAP HANA transformations and analysis processes: SAP Notes for SAP NetWeaver 74 | 25.02.2015
BW-WHM-DST-HAP | 2033679 | SAP HANA transformations and analysis processes: SAP Notes for SAP NetWeaver 74 | 24.02.2015
BW-WHM-DST-PC | 2021473 | Addition of process log to mail if the process is successful. | 13.02.2015
BW-WHM-DST-PC | 1692199 | Short Dump DBIF_RSQL_SQL_ERROR / WRITE_ICFACT | 11.02.2015
BW-WHM-DST-RDA | 2125240 | RDA_RESET: Request is closed without execution of data transfer process (DTP) | 19.02.2015
BW-WHM-DST-SDL | 2117128 | P13: IPAK: 3rd party selection fields do not disappear | 11.02.2015
BW-WHM-DST-TRF | 2118057 | BW 7.40 SP8/SP9/SP10: HANA Analysis Processes and HANA Transformations (Part 9) | 27.02.2015
BW-WHM-DST-TRF | 1816350 | 731SP8: Syntax errors in routines or Assertion failed during activation of trans | 26.02.2015
BW-WHM-DST-TRF | 2109129 | SAP HANA Execution: Inserted value too large for column | 25.02.2015
BW-WHM-DST-TRF | 2117312 | Collection note HAPs & HANA-Transformations | 24.02.2015
BW-WHM-DST-TRF | 2104509 | SAP HANA Execution: SAP HANA analysis process does not exist | 12.02.2015
BW-WHM-DST-TRF | 2125734 | Program RSDG_TRFN_ACTIVATE sets incorrect transformations to inactive without r | 06.02.2015
BW-WHM-DST-TRF | 2118892 | SAP BW 7.40(SP11) Script changes not transported | 05.02.2015
BW-WHM-DST-TRF | 2123327 | SAP BW 7.40(SP11) Changes of InfoObject not taken into InfoSource of SPO | 04.02.2015
BW-WHM-DST-TRF | 1995901 | 730SP12: Performance problem in opening Transformation in change mode | 03.02.2015
BW-WHM-DST-TRF | 2100247 | 730SP13: Performance problems in aggregation check in Transformations | 03.02.2015
BW-WHM-MTD-CTS | 2042927 | Incorrect AIM execution: No repeated activation of successfully processed objec | 17.02.2015
BW-WHM-MTD-CTS | 2120719 | RC = 12 during script-based Inhouse transport, error RSO 781 | 17.02.2015
BW-WHM-MTD-CTS | 1978136 | Error when deleting BPC objects | 05.02.2015
BW-WHM-MTD-HMOD | 2032830 | External SAP HANA view: Inventory key figures (non-cumulative key figures) | 24.02.2015
BW-WHM-MTD-HMOD | 2121712 | Improved troubleshooting during termination of authorization replication with e | 18.02.2015
BW-WHM-TC-SCA | 2111723 | Password lost in task CL_RSO_UPDATE_BWMANDTRFC_HOST | 23.02.2015
BW-WHM-TC-SCA | 2124850 | Class CL_RSO_DTP_ERROR_LOG_DELETE is missing from Housekeeping task list | 09.02.2015

 

 

Best Regards,

Janaina

DTP error message RSKB257


Symptom

 

You have an InfoCube based on a standard extractor.
But when the DTP is executed, you get the status 'Processed with Errors' with error code RSKB257 and no further details.
But you get error message when DTP's executed: Status 'Processed with Errors' error code RSKB257 without any more details.

 

 

A possible scenario:

 

 

 

1. Go to transaction RSDTP and enter the DTP's technical ID.

Note the DTP's request ID (in this example, 557.165).

 

 

2. Go to SM37 and in the Job Name field enter: BIDTPR_557165*

 

> The job name starts with BIDTPR_, followed by the request ID (without the dot separator), with an * (asterisk) wildcard character appended.


Then in the User Name field enter: *

 

> The user name is simply an * (asterisk) wildcard character.

 

Select only the 'Canceled' job status below.

 

 

Delete the From and To dates from Job Start Condition area.

 

Click Execute.

 

 

3. Review job log.

 

You may find one or more lines like the following; double-click on a line to see the details:

 

 

 

22.01.2015 07:51:18 Enter rate TRY / EUR5 rate type M for 08.04.2011 in the system settings                             SG           105          E

 

 

 

 

Long description of the error message:

 

Enter rate TRY / EUR5 rate type M for 08.04.2011 in the system settings
Message no. SG105

 

Diagnosis


For the conversion of an amount into another currency, an entry is missing in the currency conversion table.

Procedure


Add the missing entry in the currency conversion table.

 

Execute function

You can then continue to process the commercial transaction.

 

 

 

 

Solution for this kind of issue:

 

 

By executing report RCURTEST in transaction SE38, you can check for missing exchange rate entries.
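
To spot-check a single combination programmatically, the standard function module READ_EXCHANGE_RATE can be called with the values from the log message. This is a minimal sketch; verify the exact parameter list in SE37 before use.

  DATA: lv_rate  TYPE ukurs_curr,
        lv_ffact TYPE ffact_curr,
        lv_lfact TYPE lfact_curr.

  CALL FUNCTION 'READ_EXCHANGE_RATE'
    EXPORTING
      date             = '20110408'    " date from the log message
      foreign_currency = 'TRY'
      local_currency   = 'EUR5'
      type_of_rate     = 'M'
    IMPORTING
      exchange_rate    = lv_rate
      foreign_factor   = lv_ffact
      local_factor     = lv_lfact
    EXCEPTIONS
      no_rate_found    = 1
      OTHERS           = 2.

  IF sy-subrc <> 0.
    WRITE: / 'Rate TRY/EUR5, type M, for 08.04.2011 is missing - maintain it (e.g. in OB08).'.
  ENDIF.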

 

This issue may have occurred after you transferred the exchange rates to BW. Check whether you have also transferred the global settings.

 

Please also check whether a missing currency of the underlying company code caused this error.

Business Warehouse Performance Tuning at Source System


BW Performance at the Source

 

 

1) Maintain the control parameters for data transfer in SBIW -> General Settings -> Maintain Control Parameters for Data Transfer.

 

Source system table ROIDOCPRMS: It contains the control parameters for data transfer from the source system to BW.

STATFRQU - Number of packets that are transferred before statistical info is sent

MAXPROCS - Maximum number of dialog work processes per upload request used to send the data to the BW system

MAXLINES - Maximum number of records sent in one IDoc packet.

MAXSIZE - Maximum size of an IDoc packet in KB.

 

 

Important Points to be considered.

 

A) Package size = MAXSIZE * 1000 / size of the transfer structure

The package size must not exceed MAXLINES.

The transfer structure size can be determined in SE11 (ABAP Dictionary) -> Extras -> Table Width -> Length of data division under ABAP.
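
A worked example with illustrative numbers: with MAXSIZE = 20,000 KB and a transfer structure of 400 bytes, the package size is 20,000 * 1,000 / 400 = 50,000 records. If MAXLINES were set to 40,000, the packet would be capped at 40,000 records instead.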



B) If table ROIDOCPRMS is empty, the system uses default values at runtime. You should not allow these default values to be used.

 

SAP Note 1597364 - FAQ: BW-BCT: Extraction performance in source system

SAP Note 417307 Extractor package size: Collective note for applications

 

 

2) Values for Max. Conn. and Max. Runtime in SMQS (configure the number of IDocs to be processed in parallel, depending on the number of dialog work processes available in BW)

 

Cause

 

tRFC processing is very slow and entries remain in status "Transaction recorded" in SM58, or IDoc or workflow processing is delayed.

 

Resolution

  1. Call transaction SMQS
  2. Choose the destination
  3. Click Registration
  4. Increase Max. Conn. (the number of connections). This is directly proportional to the available dialog processes in the BW system; for example, if BW has 30 dialog processes you can try a Max. Conn. of 20.
  5. Increase Max. Runtime (for example to 1800)

 

[Image: SMQS.png]

 

SAP Notes

 

1887368 - tRFC process slow in SM58 and Max.Conn is not fully used in SMQS

 

 

3) IDoc processing and Performance

The "Trigger Immediately" \ "Transfer IDoc Immed." options should be always used.

 

 

To change the processing mode for the IDocs in question:

For inbound:
-> Go to transaction WE20, select the partner, select the inbound message type and change the processing method to "Trigger Immediately".

For outbound:
-> Go to transaction WE20, select the partner, select the outbound message type and change the processing method to "Transfer IDoc Immed.".

 


What happens if IDocs are scheduled instead? Reports RBDAPP01 / RSEOUT00 then process them in batch mode via scheduled runs. This leaves the majority of dialog work processes free for users and mission-critical processes, so you will no longer encounter the resource problems you are currently experiencing.

 

4) Performance problems in collective job RMBWV3nn

Try to maintain the control parameters for the extraction of the MCEX(xx) queue as follows:
Transaction LBWR > Queue name (MCEX queue) > Customizing > No. of Documents = 1000 to 3000. Check that this is reflected in table TMCEXUPD, field UPD_HEAD_COUNT.

The adjustment of TMCEXUPD-UPD_HEAD_COUNT needs to be tested for each application, as setting too large a value can result in a memory dump.

 

 

 

Part 2 - Performance Improvements in BW System

Getting reacquainted with the BW Administration Cockpit.


Purpose

 

This document is meant to reintroduce the importance of SAP BW's Technical Content, and specifically the BW Administration Cockpit. In recent years, SAP HANA has stolen much of the spotlight from everything else that is equally important to our existing customers who are not ready to move onto the SAP HANA platform.

 

While everyone has been busy acquiring knowledge and getting acquainted with the latest SAP HANA capabilities and functions, we hardly noticed that SAP has quietly introduced significant changes to the technical content we have been so familiar with. For example, an Xcelsius dashboard has been included to provide management-style reporting, and installation has become much more straightforward. This article is not meant to discuss the importance of using technical content in a BW environment, but to raise awareness of how easy it is to implement, what the new functionality can address, and the lessons we gathered while enabling this feature.

 

This document should be used as a guide to enabling the BW Administration Cockpit in an environment where it has not been set up yet. The effort is relatively minimal, with no significant impact on existing objects, but please address the warning messages in the installation log. The estimated end-to-end effort for this installation should not be more than 10 hours for a single resource.

 

Benefits

 

You may skip this section if you have prior experience with SAP BW's Technical Content. It aims to provide a high-level understanding of the importance of gaining visibility of the system's health through the statistical logs generated within the BW application.

 

Aside from the obvious benefits of being able to contextualise errors and perform analysis, enabling the BW Administration Cockpit is surprisingly simple. There are no additional licensing costs associated with it, and the feature comes as part of the NetWeaver platform. So in essence you have a free, powerful and insightful tool that, if not leveraged, would be such a waste.

 

Empowering your clients to monitor the health of the system gives them the knowledge to take proactive measures to keep everything at its optimal level. Having tangible numbers showing who the active users are can be a useful communication tool to drive the adoption of BW in the wider community within an organisation. For example, an organisation that has invested heavily in an enterprise warehouse solution wants to see it used productively; what better way than to feed back to the management team the number of active reporting users, the types of reports that are frequently used, and how they are being used. It can also serve as an impact assessment mechanism: when an underlying BW object needs to be modified, knowing what and, importantly, who will be affected can save the team a lot of Monday morning hate mail.

 

In my opinion, the biggest benefit of enabling the BW Administration Cockpit is that the information is provided in an Xcelsius dashboard: it is easy to understand and the information is not overly sensitive. For these reasons, I see no valid justification for not sharing it with the larger community. If an organisation uses SAP Portal, the dashboard can be included as part of the corporate view, where it can help create a culture in which information drives decision-making; an open and honest view of how the reporting system is performing is a feature everyone can learn to appreciate. Some of the newly provided content, such as data consumption by InfoArea, is not included in the Xcelsius dashboard but is part of the delivered content. This report allows the business to make informed decisions on cost, e.g. to allocate usage cost across departments; the sample data below indicates that the Finance department is the largest memory consumer, so cross-departmental charges can be applied to the appropriate groups. Another sample report, BW DB Usage, gives insight into the trend of data growth, which can help with hardware sizing, avoiding preventable upgrades and channelling funds to other areas of improvement.

 

[Screenshots: sample reports]

 

Supporting documentation

 

The BI Administration Cockpit is a recommended reporting feature provided by SAP through the Technical Content. This document covers the installation and useful features within the BI Administration Cockpit and the technical BI Content layer.

[Screenshot]


Browse through the details in the standard documentation; it is an excellent way to familiarise yourself with the installation procedure, and the instructions provided are clear and concise.

 

Standard SAP documentation from the SAP Help Portal detailing the prerequisites, installation procedure and usage instructions:

http://help.sap.com/saphelp_nw73/helpdata/en/4e/1c145b0bf01a24e10000000a42189e/content.htm?frameset=/en/88/a2d8cac6ad4097ba02a877106ebc84/frameset.htm&current_toc=/en/89/71b01ce1f44e95a860a6c3f7dda911/plain.htm&node_id=80&show_children=false

How to efficiently use the SAP NetWeaver BW integration in SAP BusinessObjects Xcelsius

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a03ecbcc-cee7-2c10-93b1-886dbb4e9778?QuickLink=index&overridelayout=true&48232482785060

SAP NetWeaver BW Administration Cockpit Technical Content BI Statistics (SAP Feb 2011)

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0e5ca3b-95ce-2b10-4d94-864ab29a8b63?overridelayout=true

The Architecture of BI Statistics

http://help.sap.com/saphelp_nw70/helpdata/en/eb/3c8c3bc6b84239e10000000a114084/content.htm?frameset=/en/44/3521c7bae848a1e10000000a114a6b/frameset.htm&current_toc=/en/e3/e60138fede083de10000009b38f8cf/plain.htm&node_id=718&show_children=true#jump718

 

Installation procedure

 

We discovered that an active SAP Portal is a crucial component for a working cockpit that allows reporting through Xcelsius. Others might argue that an established BICS connection is sufficient to execute any dashboard reporting from BW; however, this was not the case in this exercise.

 

This installation procedure is meant to act as a guide for BW version 7.4 SP09 (SAPKW74009). Some installation steps might have changed over time due to product improvements, so take the necessary precautions to successfully implement the Administration Cockpit in a landscape on a different release.

 

While the installation of the BW Administration Cockpit is simple and straightforward, the documented installation procedure can help to clarify any doubts or questions that might arise in your effort to provide this solution to your client.

 

Step 1

At a minimum level, ensure that SAP portal is present in the landscape and it is configured together with BW.

 

To check this on the BW server, display table RSPOR_T_PORTAL in SM30; you should see some basic settings maintained.

 

Alternatively, contact the system administrator to have this setup.

[Screenshot]
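
For a quick programmatic check of the same thing, a one-off snippet like the following can be run in a test report. This is a minimal sketch: it only verifies that any configuration entry exists at all.

  SELECT COUNT(*) FROM rspor_t_portal INTO @DATA(lv_entries).
  IF lv_entries = 0.
    WRITE / 'No portal settings maintained - contact the system administrator.'.
  ENDIF.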

Step 2

Ensure that the BI Administrator role (SAP_BW_BI_ADMINISTRATOR) is added to your login.

[Screenshot]

Step 3

To activate the Technical Content, you have the choice of doing it via SPRO, executing program RSTCT_ACTIVATEADMINCOCKPIT_NEW (SE38), or going directly to transaction RSTCT_INST_BIAC.

[Screenshot]

Step 4

SAP has made it really easy to perform the Technical Content installation; it is no longer done under the Data Warehousing Workbench -> BI Content section.

 

After you have made your selection criteria and are ready to proceed with the installation, click the Execute button and wait for it to complete.

 

The options provided are self-explanatory and you would probably want to create a transport request to move these newly created objects across the landscape.

 

Five process chains will be added in RSPC under the Unassigned Nodes (NODESNOTCONNECTED) section, and you can set their scheduled execution time prior to starting the installation. The default is 04:00:00.

 

  1. 0TCT_C0_INIT_DELTA_P01
  2. 0TCT_C2_INIT_DELTA_P01
  3. 0TCT_C3_INIT_DELTA_P01
  4. 0TCT_C0_FULL_P01
  5. 0TCT_C25_FULL_P01

[Screenshot]

Step 5

Upon completion of the installation, it is advisable to check the installation log for any errors or warnings. Please address these messages according to the nature of the system environment.

 

We did not encounter any errors or warnings at this stage of the installation process in our internal environment.

 

If you encounter an error during the Technical Content installation, please refer to the Supporting Information section of this document.

[Screenshot]

Step 6

The extensive list of activated objects can be found under the 0BWTCT InfoArea; importantly, ensure that these process chains have been added in RSPC.

 

  1. 0TCT_C0_INIT_DELTA_P01
  2. 0TCT_C2_INIT_DELTA_P01
  3. 0TCT_C3_INIT_DELTA_P01
  4. 0TCT_C0_FULL_P01
  5. 0TCT_C25_FULL_P01

 

Note that additional process chains will appear in the list, for example chains to monitor BIA statistics; if you are in an environment without BIA, these can be ignored.

[Screenshot]

 

When the installation is complete and you have verified that all the necessary process chains are in place, begin by loading the master data using process chain 0TCT_MD_C_FULL_P01, followed by the 0TCT_C* process chains.

 

Xcelsius Dashboard

 

To use the Xcelsius dashboard, enter transaction RSTC_XCLS in the BW system. This launches a web browser session pointing to a preconfigured portal address, and you should see a dashboard similar to the one below, provided you have set up SAP Portal and successfully activated the BW Administration Cockpit.


The dashboard gives you an overview of three basic monitoring areas: Alerts, Performance and Usage of the system.

 


 

Alert

The Alerts tab notifies the BW administrator of data load failures for a given problematic InfoProvider or DataSource.

 

It highlights error messages, lists the impacted objects, and offers a Detail button to display the corresponding backend log.

[Screenshot]

Performance

The Performance tab highlights high-runtime objects, for both process chains and queries, using a BEx query condition to select the top 20 objects.

 

The Analyse Details button provides the option of a graphical analysis at a granular level.

[Screenshot]

Usage

The Usage tab gives you information on the trend of data growth over a 12-month period, the most frequently used queries and the most active users.

 

If BWA is present in the landscape, it also shows the percentage of query data fetched from BWA.

[Screenshot]

Supporting Information

 

This section collects additional information that has been useful in strengthening the understanding, concepts and troubleshooting around the usage of the Technical Content. Please make full use of the attached links and the SAP Service Marketplace to find updated information on technical areas which might have changed over the course of multiple system improvements.

 

     1. Errors discovered after the Technical Content installation.

 

      To avoid having to reinstall the entire Technical Content, use transaction RSTCO_ADMIN to restart the failed installation. A yellow status can also be an indicator that a newer version has been released and attention is required to handle the warning message. RSTCO_ADMIN can also be used to fix an installation that was executed by a user without the proper authorisation for Business Content installation. For supporting information, please refer to OSS Note 1069134 - Improved monitoring RSTCO_ADMIN.

[Screenshot]

     2. The background (SM37) job name is BI_TCO_ACTIVATION.

 

       Use it to understand the installation procedure and the potential warning or error messages that might occur as a result of your installation.

[Screenshot]


     3. Assigning an importance criterion to SAP’s Technical Content.

 

     This feature allows you to sort or filter BW technical objects. It is maintained by assigning an importance value to the object you wish to prioritise; e.g., by assigning an importance value to a Technical Content process chain or InfoCube, you can sort that information to prioritise it among the other monitored objects. The default importance value for all BW technical objects is 50; to change it, use transaction RSTCIMP to assign any value between 0 and 100. The underlying table that stores this information is RSTCIMPDIR.

[Screenshot]

     With the customising complete, transfer the value to InfoObject 0TCTBWOBJCT via DataSource 0TCTBWOBJCT_ATTR and verify attribute 0TCTIMPRTNC.

[Screenshot]

     4. Collection of statistical information.

 

          All newly created BW queries, InfoProviders, web templates and workbooks collect statistical information by default. This setting can be changed to disable collection, turn it back on, or determine the level of aggregation to report on. It is maintained in transaction RSDDSTAT; as a rule of thumb, it is advisable to leave all objects turned on while keeping an eye on the volume of aggregation data that is required. Once you have evidence that performance monitoring is not required somewhere, e.g. on InfoProviders with low data volume, the setting can be turned off.

          If an InfoProvider has this setting disabled, e.g. InfoProvider ZKUST01, all newly created queries inherit this property and no statistical information is collected for them. However, you can override this in the Query tab to explicitly collect information only for a desired query.

[Screenshot]

The amount of data, or level of detail, to be collected can also be adjusted using the settings 1, 2, 9 and 0. Below is an extract from the SAP documentation; further detail can be found at http://help.sap.com/saphelp_nw70/helpdata/en/43/e37f8a6df402d3e10000000a1553f7/content.htm


Statistics Detail Level for the Query Object Type

For queries, you also have the option of selecting a detail level for the statistics data. You can choose from the following:

 

  • 0 – Aggregated Data: The system writes only one OLAP event (event 99999) for the query. This contains the cumulative times within the OLAP processing of the query. The system does not record data from the aggregation layer of the analytic engine or aggregation information.
  • 1 – Only Front End/Calculation Layer Data: The system records all OLAP events, but not separate data from the aggregation layer of the analytic engine.  The system writes only the general data manager event 9000 in the OLAP context as well as the aggregation information.
  • 2 – All: The system records all data from the area for the front end and calculation layer as well as data from the area for the aggregation layer and aggregation information.
  • 9 – No Data: The system does not record any data from the front end and calculation layer or from the aggregated event 99999. However, it does record data for the BEx Web templates and workbooks, depending on the setting.

[Screenshot]

     5. Deleting statistical data.

 

          Statistical data can grow at an exponential rate depending on factors such as the number of users in the system, the frequency of query activity, and the aggregation settings enabled in transaction RSDDSTAT. SAP's default retention period for the RSDDSTAT_* tables is 14 days, but you can override it by maintaining a numeric value in the RSADMIN table for entry TCT_KEEP_OLAP_DM_DATA_N_DAYS.

          To do this, use the SAP_RSADMIN_MAINTAIN program to add or modify the entry. The example below holds a value of 7 days.

[Screenshot]
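
As a sketch, the same maintenance can be scripted. Note that the selection-screen parameter names (OBJECT, VALUE, INSERT) are assumptions and should be verified on the actual selection screen of SAP_RSADMIN_MAINTAIN before use.

  SUBMIT sap_rsadmin_maintain
    WITH object = 'TCT_KEEP_OLAP_DM_DATA_N_DAYS'
    WITH value  = '7'
    WITH insert = 'X'   " assumption: the insert/update option of the report
    AND RETURN.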

           Alternatively, to delete the statistical data manually, use the standard Delete Statistical Data function in transaction RSDDSTAT or execute program (SE38) RSDDSTAT_DATA_DELETE.

[Screenshot]

Conclusion

 

It will come as no surprise that the BW Administration Cockpit needs to be owned and managed by the IT department to ensure continuous improvement is performed productively. By turning these statistical data into readable information, it becomes easier to keep track of what is going on in the IT landscape, regardless of the size of your enterprise.

 

There is no need to reiterate that the function of IT is to support the core business, but put on your green hat and find a business use case for it: e.g. in an environment where the SLA for BW is an important KPI, such as when the BW server is hosted by an application provider, this information can prove to be useful.

 

It is not enough to just activate the Technical Content and start running the process chains to collect the information generated by the system; a good understanding of the data and the standard reports is crucial to perform actionable tasks that safeguard the health of the BW server. Use the standard reports as building blocks to further enhance and drive specific monitoring and runtime statistics requirements once your team has a better understanding of the other areas to improve.

 

In terms of the new features provided by SAP, it is worth recognising that new content might be available, and be mindful that continuous improvement is certain with every release and upgrade.


SAP - BW Optimization Approach


Business Scenario

 

An SAP BW system may need optimization and performance improvements to keep the whole BW system intact and efficient enough to fulfil the business needs.

 

The following high-level summary points may serve as a starting point: a general approach to assessing the system and moving forward with optimization techniques. A number of performance tuning techniques are available in the BW area; we can incorporate all, any or some of them after getting familiar with these key points.

This may help a newcomer on-boarding to a project as well as an existing consultant. These key points can help define an optimization project for an existing system, independent of the BW version.


SAP - BW Optimization Approach at a High Level


  1. Understand the system landscape.
  2. Get familiar with the BW architecture and the reporting environment.
  3. Understand thoroughly the business areas where the analytics are performed.
  4. Identify the KPIs and key reports (management & analytical).
  5. Collect and gather information to understand the roadblocks and business pain points.
  6. Analyze the optimization / performance tuning steps taken so far and understand the changes made in the system landscape.
  7. Gather information on the highly impacted areas where the business needs the system to be optimized.
  8. Prioritize the areas where the optimization process should start.
  9. Segregate the improvement areas into data extraction (including the source side), transformation/staging, and analytical/reporting.
  10. Take an inventory of the objects pertinent to each of the areas defined above.
  11. Identify the key areas where the system can be improved or optimized by removing or enhancing the changes done so far.
  12. Analyze the impact of changing the existing optimization / performance tuning steps.
  13. Identify the best performance tuning techniques that can be implemented in those areas (point 9).
  14. Come up with a solution proposal in which the existing changes are removed or enhanced for the areas defined in point 9.
  15. Analyze and identify how to keep the system live, up and running while performing the optimization, and make sure the business is not impacted at any point in time.
  16. Draw up an optimization/project plan based on the prioritized business areas, and derive the object list from the inventory, segregated by improvement area.
  17. Come up with a plan identifying the deliverables and timelines (project management work is involved at this point).
  18. Collaborate and explain the plan internally, then go for approval from the business stakeholders.
  19. Create a unit test plan for Dev and a regression test plan for QA to accommodate the finalized plan.
  20. Start implementing the optimization as planned, with extra care; test repeatedly from both the BW and the business perspective and make sure nothing breaks.
  21. Continue testing in Dev, meet with the internal team, and demonstrate the performance gains against the earlier optimization.
  22. Finalize and confirm that the optimization works as expected, then move on to the next level in the system landscape.

 

I hope this gives beginners and entry-level consultants an idea of how to approach optimization/performance tuning projects.

Some Additional Tips for Collecting & Validating Transports


Objective

In this blog I want to share a few tips related to BW transports that might help you collect and validate them efficiently.

 

 

Settings

Though it is a personal choice, some settings are recommended while others are chosen based on personal comfort.

The following are the settings I prefer for a clearer one-shot view:

 

Grouping

[Image: b1.jpg]

It is the most obvious setting, but recommending it implies that we should collect objects by type only. Preferably, instead of dragging in InfoProviders with "Data Flow Before" to collect a transformation, we should go and collect the specific transformation from the corresponding object type. For example:

 

[Image: b2.jpg]

 

Display

The following setting can be chosen once the required objects have been dragged in for collection:

[Image: b3.jpg]


Using this setting in conjunction with the "Necessary Objects" setting makes it very clear what should be selected and what not, even for BEx queries or transformations. For example:

[Image: b4.jpg]

Here we can right-click on the required object type and click "Transport All Below". Similarly, the following is a sample for BEx query collection:

[Image: b5.jpg]

 

 

Grouping of BW Objects in Transports

I think there is no fixed rule for this, but the objective is a complete transport, without errors, that imports in a reasonable amount of time.

Following can be two strategies for grouping:

1) If we are sending our development for the first time and we have a large number of data models and reports, then this strategy is recommended:

Separate transport requests based on the following groups:

a) Infoobjects, Infoobjects Catalogs & Infoareas

b) Infoproviders

c) Datasources& Infopackages

d) Master Data Transformations & DTP

e) Transaction data [first level] transformations & DTPs (transformations between DataSource and InfoProvider)

f) Transaction data [second level and upwards] transformations & DTPs (transformations between InfoProvider and InfoProvider)

g) Process Chains

h) BEx Queries

i) Customer Exit Codes


If the number of objects in any group is very high, that group can be divided into parts; importing a transport with too many objects can become a nightmare.

This is a very generic sequence, but the important thing is to take care of dependencies, i.e. dependent objects should go in a second step once the main objects have been moved.

While releasing transports, the system itself checks dependent objects and gives warnings or errors accordingly.

 

A possible question in this section is: "Why did we collect two different transports for the different levels of transaction data transformations?"

This is required only if we have multiple ECC QA clients for testing but a single BW client. In this case BW will have two source systems connected, hence we will need to transport all TRs containing a) first-level transformations (between DataSource and InfoProvider) and b) DataSources twice, each time with the correct destination client in "Conversion of Logical System Names":

[Image: b1_1.jpg]

 

 

2) This strategy can be used when we are making ad-hoc transports. For example, if we want to transport only one simple data model and one query, then all objects can go together in the same transport request. This approach is not recommended for transporting a complex data model where the total number of objects to be moved is very high.

 

 

Some Tips for Quicker Collection

This tip is mainly for collecting a large number of transformations. Suppose we have a list of 36 [random number] master data models (36 InfoObjects, some with ATTR, some with TEXT and some with all three of HIER, ATTR and TEXT) to be collected, and we decide to collect them in 3 separate transport requests of 12 InfoObject data flows each, to avoid long import times. (Whether to move all 36 together in one TR or break them into multiple TRs is a call based on the complexity and total number of objects per transport request.) For the sake of simplicity, suppose all master data transformations are between a DataSource and an InfoObject.

In Excel, we can use a CONCATENATE formula to generate a list with the following pattern:

RSDS*<INFOOBJECT TECHNICAL NAME>*
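
For instance, with the InfoObject technical names in column A of a worksheet (the cell reference is illustrative), the following formula, copied down the column, generates one pattern per InfoObject:

  =CONCATENATE("RSDS*", A2, "*")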

 

This list can then be used as shown in the screenshot below:

[Image: b6.jpg]

This trick may seem overkill for collecting 2 or 3 transformations, but it is handy when collecting a large number (15-20 or more) of transformations from an even larger group (100 or more).

It reduces the number of objects shown in the "Select Objects" pop-up to only the most relevant ones. The required transformations can then be quickly selected and the selection transferred:

 

[Image: b7.jpg]

 

Note that by changing the CONCATENATE formula slightly we can achieve the following results as well:

a) Restricting the result set to a specific source system: "RSDS*<SOURCE SYSTEM NAME>*<INFO...*"

b) Restricting to transformations between BW objects: "TRCS*<INFOBJECT>*"


This technique of applying filters based on wildcard characters can be used for collecting almost all object types (except BEx queries).



Another point worth noting: if we are collecting DTPs and transformations in the same transport request, we can use the same wildcard technique for DTPs. We just need to drag in all required DTPs, and in one shot ["Necessary Objects"] we can collect both the DTPs and the corresponding transformations (plus dependent routines and formulas).

 

This trick is easier to apply if we maintain a development tracker listing developed/changed objects by object type: InfoObject, InfoProvider, transformations, BEx, chains, etc.

 

 

Validation of Transports

When working with multiple people in a team, it is a good idea to validate transports before releasing them.

The following two tables are the starting point:

1) E070 - lists the subtasks belonging to a transport request

2) E071 - lists all objects captured in the subtasks, by object type
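
As a starting point for such a validation, a minimal ABAP sketch like the one below lists every object collected under a request, including the objects sitting in its subtasks (E070-STRKORR links a subtask to its parent request):

  PARAMETERS p_trkorr TYPE e070-trkorr OBLIGATORY.

  DATA lt_objects TYPE STANDARD TABLE OF e071.

  * Objects attached directly to the request, plus those in its subtasks.
  SELECT * FROM e071 INTO TABLE lt_objects
    WHERE trkorr = p_trkorr
       OR trkorr IN ( SELECT trkorr FROM e070 WHERE strkorr = p_trkorr ).

  LOOP AT lt_objects INTO DATA(ls_object).
    WRITE: / ls_object-pgmid, ls_object-object, ls_object-obj_name.
  ENDLOOP.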

 

We can use the following link to look up the relevant system tables by object type:

http://wiki.scn.sap.com/wiki/display/BI/Important+Tables+in+SAP+BW+7.x

 

This link might not list all the system tables, but by making use of the wildcard character "*" we can find many more.

Some tables for reference-

1) RSZCOMPDIR for verifying BEx query technical names

2) RSZELTDIR for checking the different query elements

3) RSTRANROUTMAP and RSTRANSTEPROUT can help in identifying the routines of a transformation, based on table RSTRAN

 

Using the tables listed above, we can perform quick, basic validations (using Excel and VLOOKUP), for example:

a) whether all relevant routines are captured for the transformations collected in a transport request

b) whether the different objects (by object type) are captured in the transport request, based on the development tracker

 

References

References for transport-related blogs:

How to manage Error Free Transports

 

 

Note to SCN Members: Please feel free to add more references related to the topic.

Performance Optimization Tips - SAP BW


Introduction

 

Hello everyone. I have been working on multiple BW landscapes and operations support for quite some time, and from my experience, batch processing has high visibility with business leadership. It is always challenging to refine existing batch processes and bring down overall runtimes as part of continuous process improvement.

 

I have been fortunate to successfully optimize batch processing in multiple instances, and in this blog I want to offer a handful of easy tips for optimizing batch processing.


1. DTP - Data Transfer Processes

 

I have often seen that when people create DTPs they never consider the optimization aspects, yet there are simple techniques you can use to reduce data loading runtimes. A combination of parallel processing and data packet optimization can bring a dramatic reduction in runtimes.

 

Increase parallel processing: there is a provision in the DTP to increase the number of parallel processes. If you have work processes available, feel free to increase this number and change the job priority. By default it is set to 3 and the job class is set to "C".

[Image: Pic 2.jpg]

[Image: Pic 1.jpg]

Another way to parallelize is to split the data from the source into smaller chunks (in the case of a full load from source to target) and run the loads in parallel with filters applied.

 

Example: if you have to load business partner master data from a CRM/SRM system, you can split it into chunks based on the value range of Source/Territory/Type and run the DTPs in parallel.

 

Data packet size: in a DTP you can vary the data packet size, which directly affects the loading runtime. The smaller the data packet, the shorter the loading time per packet, and vice versa. The default is 50,000 records, but it can be changed in edit mode.

 

Note: at times, even after changing the data packet size, the number of records in a packet won't change; in such cases you have to change the package size of the source.

 

[Image: Pic 3.jpg]


2. Info Package


For full InfoPackages, too, we can use parallel processing, splitting the data from the source into smaller chunks and running them in parallel with filters applied; there is also a provision to change the data packet size.

[Image: Pic 4.jpg]

 

In the scheduler there are other options ("Timeout Time" and "Treat Warnings") which are not for runtime optimization but are helpful if you encounter timeout errors or if warnings are to be ignored.


3. DSO

 

DSO activation can be slow if the batch tables are large, as these are scanned during object activation. You can ask the BASIS team to clean such tables with report RSBTCDEL2 (transaction SM65).

 

The BASIS/SQL team should also consider updating the statistics for the DSO and reorganising/defragmenting the tables if required. This can be a routine activity based on your requirements and needs.

 

There is a provision to create secondary indexes on DSO tables to optimize runtimes; this can be done either by the SQL DBA team or in the BW system via transaction SE11.

 

If you are not reporting on the DSO, the activation of SIDs is not required (it takes up considerable time during activation). Often the logs show that the activation job spends almost all of its time scheduling RSBATCH_EXECUTE_PROZESS as job BIBCTL_*; this process schedules and executes the SID generation. If you don't need the DSO for reporting and have no queries on it, you can remove the reporting flag in the DSO maintenance, which speeds this process up significantly. Check under 'Settings' in the DSO maintenance whether you have flagged the option "SID Generation upon Activation".


Helpful SAP Notes & Documents

SAP Note 1392715: DSO request activation - collective performance problem note

SAP Note 1118205: RSODSO_SETTINGS - maintain runtime parameters of a DSO

SDN Document: http://scn.sap.com/docs/DOC-45290

 

Thanks

Abhishek Shanbhogue

Virtual Analysis Authorizations - Part 1: Introduction


In SAP NetWeaver BW release 7.3 a new Analysis Authorizations BAdI was introduced: BAdI RSEC_VIRTUAL_AUTH_BADI, part of Enhancement Spot RSEC_VIRTUAL_AUTH. The authorized values or hierarchy nodes can be determined dynamically at query runtime. It does not require any Analysis Authorization objects or PFCG roles. Virtual authorizations can be used to enhance any existing "classic" authorization model; i.e. you do not have to make an exclusive choice for one or the other: both classic and virtual can be used simultaneously and complementarily.

I would like to share my implementation experience with virtual Profit Center and Cost Center authorizations. This introductory blog will discuss the rationale, a comparison between classic and virtual authorizations, and the different call scenarios for which the BAdI is processed.

Shortly I will publish a second blog with the solution details and a document with implementation details.

Rationale

The main problem with a classic authorization concept is that it is less flexible in situations with a big user population, many authorization objects/roles and frequent changes, e.g. organizational changes affecting large parts of the organization, or ongoing roll-outs with big increments in the user population.

Classic use cases for a more flexible and dynamic approach are Profit Center and Cost Center authorizations. Often we have to deal with hierarchy authorizations as well as value authorizations. There might be multiple hierarchies which have to be authorized on many hierarchy nodes, so the number of required authorization objects and roles is likely to become high.

As a consequence, TCD (Total Cost of Development) as well as TCO (Total Cost of Ownership) is likely to become too high.

Classic versus Virtual Authorizations

Before diving into Virtual Authorizations, let's compare the classic model with the virtual model.

 

Figure_01_Evaluation_Matrix.jpg

Figure 1: Evaluation Matrix

 

The biggest drawback of the classic model shows up in efficiency with a big user population in combination with many authorization objects and roles. Here the virtual model shows its added value.

On the other hand, the virtual model is less transparent and clear compared to the classic model. Also in the area of compliance we do not have the out-of-the-box functionality compared to the classic model.

Different Call Scenarios

During query runtime the BAdI is called multiple times, which can be a bit confusing when you start working with it. There are 3 call scenarios:

 

  • Call scenario 1: InfoProvider-independent or cross-InfoProvider authorizations;
  • Call scenario 2: InfoProvider specific authorizations ;
  • Call scenario 3: Documents protected with authorizations.

 

Call scenario 1: InfoProvider-independent or cross-InfoProvider authorizations

Scenario 1 can be called multiple times. Importing Parameter I_IOBJNM is not initial and Importing Parameter I_INFOPROV is initial. Importing Parameter I_T_ATR might be filled with authorization-relevant Attributes of the respective Characteristic, if any.

In this call scenario the following authorization is processed:

 

  • Authorization-relevant InfoObjects; e.g. I_IOBJNM = '0PROFIT_CTR';
  • Authorization-relevant Attributes; e.g. I_IOBJNM = '0WBS_ELEMT' and I_T_ATR with ATTRINM = '0PROFIT_CTR' *);
  • Authorization-relevant Navigational Attributes; e.g. I_IOBJNM = '0WBS_ELEMT__0PROFIT_CTR'.

 

*) Display Attributes need full authorization; see also SAP Note 1951019 - Navigation Attribute and Display Attribute for BW Analysis Authorization.

 

Call scenario 2: InfoProvider-specific authorizations

Scenario 2 will be called once only. Importing Parameter I_IOBJNM is initial and Importing Parameter I_INFOPROV is not initial. You can determine the authorization-relevant InfoObjects using Function Module RSEC_GET_AUTHREL_INFOOBJECTS.

In this call scenario the following authorization is processed:

 

  • Authorization-relevant InfoObjects; e.g. I_IOBJNM = '0PROFIT_CTR';
  • Authorization-relevant Navigational Attributes; e.g. I_IOBJNM = '0WBS_ELEMT__0PROFIT_CTR'.

 

Call scenario 3: Documents protected with authorizations

I did not experiment with scenario 3 yet. It can be called in the context of documents which are protected with authorizations. In this case, both Importing Parameter I_IOBJNM and Importing Parameter I_INFOPROV are initial.
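To make the dispatching tangible, here is a minimal skeleton of how an implementation can branch on the importing parameters. Note that the method name and the exact signature below are my assumptions for illustration; please verify the BAdI interface in transaction SE18 before reusing anything.

  METHOD if_rsec_virtual_auth_badi~get_virtual_authorizations. "method name assumed
    IF i_iobjnm IS NOT INITIAL AND i_infoprov IS INITIAL.
*     Call scenario 1: InfoProvider-independent authorization for I_IOBJNM;
*     I_T_ATR may contain authorization-relevant attributes
    ELSEIF i_iobjnm IS INITIAL AND i_infoprov IS NOT INITIAL.
*     Call scenario 2: InfoProvider-specific authorizations; the relevant
*     InfoObjects can be determined with FM RSEC_GET_AUTHREL_INFOOBJECTS
    ELSE.
*     Call scenario 3: documents protected with authorizations
    ENDIF.
  ENDMETHOD.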

Conclusion

In this introductory blog we discussed the rationale of virtual authorizations, a comparison between classic and virtual authorizations, and the different call scenarios for which the BAdI is processed.

Shortly, I will publish a second blog with the solution details and a document with implementation details.

Virtual Analysis Authorizations - Part 2: Solution Details


In SAP NetWeaver BW release 7.3 a new Analysis Authorizations BAdI was introduced: BAdI RSEC_VIRTUAL_AUTH_BADI, part of Enhancement Spot RSEC_VIRTUAL_AUTH. The authorized values or hierarchy nodes can be determined dynamically during query runtime, without requiring any Analysis Authorization objects or PFCG roles. Virtual authorizations can be used to enhance any existing "classic" authorization model, i.e. you do not have to make an exclusive choice for one or the other; both classic and virtual authorizations can be used simultaneously and in a complementary way.

I would like to share my implementation experience with virtual Profit Center and Cost Center authorizations. For an introduction please read my blog Virtual Analysis Authorizations - Part 1: Introduction. In this blog we will discuss the use case and chosen approach, the solution overview, the control tables and default hierarchies.

Shortly, I will publish a document with implementation details.

Approach

As already mentioned in my previous blog, our use case was Profit Center and Cost Center authorizations. We had to deal with hierarchy authorizations as well as value authorizations. There existed multiple hierarchies which had to be authorized on many hierarchy nodes. We urgently needed a more dynamic and flexible approach.

We implemented virtual authorizations for Profit Center and Cost Center next to the classic model for all other Analysis Authorizations. We tried to mitigate the "compliance issue" by introducing a Profit Center Basic and a Cost Center Basic authorization object with only : (aggregation) and # (unassigned) authorization. These objects are checked by the BAdI, and the Profit Center or Cost Center authorization is only processed if the respective "basic" object is assigned to the user; in our case that was a role-based assignment. This way we enhanced the virtual model:

 

  • An additional access key is required to get authorized;
  • It will improve traceability and auditability;
  • It will increase the compliance with security standards.

Solution Overview

Virtual authorizations can be realized by implementing BAdI RSEC_VIRTUAL_AUTH_BADI as part of Enhancement Spot RSEC_VIRTUAL_AUTH. The Analysis Authorizations are determined dynamically, i.e. during query runtime. Both value and hierarchy authorizations are supported.

Authorizations per user have to be maintained using two central control tables:

 

  • Value authorizations;
  • Hierarchy authorizations.

 

Both control tables can be maintained using their own table maintenance dialog. It is recommended to maintain the control tables in every system separately (i.e. no transports) to remain as flexible as possible. An initial mass upload could be facilitated by LSMW (Legacy System Migration Workbench).

These control tables only have to be maintained once for the respective basis characteristic, i.e. Profit Center and Cost Center. The authorization for display attributes and navigational attributes is automatically derived and processed by the BAdI.

Control Tables

The hierarchy authorizations are maintained in control table ZBW_VIRTAUTH_HIE, which looks almost identical to table RSECHIE. Here we can enter a Profit Center or Cost Center hierarchy authorization for a particular user.

 

Figure_01_Control_Table_Hierarchy_Authorization.jpg

Figure 1: Control Table - Hierarchy Authorization

 

The value authorizations are maintained in control table ZBW_VIRTAUTH_VAL, which looks almost identical to table RSECVAL. Here we can enter a Profit Center or Cost Center value authorization for a particular user.

 

Figure_02_Control_Table_Value_Authorization.jpg

Figure 2: Control Table - Value Authorization
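As an illustration, reading the current user's value authorizations inside the BAdI could look like the snippet below. The field names UNAME and IOBJNM are assumptions derived from the similarity to RSECVAL, so please check them against your own table definition.

  DATA lt_val TYPE STANDARD TABLE OF zbw_virtauth_val.
* Read the value authorizations maintained for the current user and
* the requested characteristic (field names assumed)
  SELECT * FROM zbw_virtauth_val INTO TABLE lt_val
    WHERE uname  = sy-uname
    AND   iobjnm = i_iobjnm.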

Default Hierarchies

Another requirement was the ability to generate hierarchy authorizations based on value authorizations. The rationale is that the majority of reports are based on "default hierarchies". Particular roles, like the Cost Center responsible, do not get any hierarchy authorization and as a consequence were not able to run those reports. At the same time, we wanted to prevent double maintenance.

The solution was to define a third control table for default hierarchies: ZBW_VIRTAUTH_DEF. Here you can enter one or more default hierarchies per characteristic. The BAdI then generates the hierarchy authorization for the default hierarchy, restricted to the authorized values as leaves in the hierarchy.

 

Figure_03_Control_Table_Default_Hierarchy.jpg

Figure 3: Control Table - Default Hierarchy

 

In the example above we have defined the (standard) hierarchy 1000KP1000 as default hierarchy for Cost Center.

Conclusion

In this blog we discussed the use case and chosen approach, the solution overview, the control tables and default hierarchies.

Shortly, I will publish a document with implementation details.

Update DataSource in source system


Hi,

 

Suppose you need to update a DataSource in the source system, for example to change its Extract Structure or to change the Extractor from a View to a Function Module. You do not have authorization for transaction RSA2, and you cannot wait for the next SP release or upgrade the SP level.

 

In standard BI, you can only change the Extractor fields of a DataSource, in transaction RSA6.

 

In this case, you can write a Z report to achieve the desired results.

 

Below is a sample report which changes the Extract Structure of the DataSource without accessing transaction RSA2.

 

 

REPORT  Z_0GT_HKPSTP_TEXT.

  TABLES: ROOSOURCE,
          ROOSFIELD.

  DATA: lv_oltpsource TYPE ROOSOURCER VALUE '0GT_HKPSTP_TEXT',
        lv_objvers_d  TYPE ROOBJVERS  VALUE 'D',
        lv_objvers_a  TYPE ROOBJVERS  VALUE 'A'.
  DATA: ls_roosource_old   TYPE ROOSOURCE,
        ls_roosource_new   TYPE ROOSOURCE,
        ls_roosource_new_d TYPE ROOSOURCE.       "workarea for version 'D'
  DATA: ls_roosfield_old TYPE ROOSFIELD,
        ls_roosfield_new TYPE ROOSFIELD.
  DATA: txt(24) TYPE C.

  SELECTION-SCREEN COMMENT /1(80) TEXT1.
  SELECTION-SCREEN COMMENT /1(80) TEXT2.
  SELECTION-SCREEN COMMENT /1(80) TEXT3.
  SELECTION-SCREEN COMMENT /1(80) TEXT4.
  SELECTION-SCREEN ULINE.

  PARAMETERS: test AS CHECKBOX DEFAULT 'X'.

*                                                                     *
* INITIALIZATION
*                                                                     *
  INITIALIZATION.

  TEXT1 = 'Dear Customer.'.
  TEXT2 = 'You are just running a report, which will change the'
        & ' structure of DataSource'.
  TEXT3 = '0GT_HKPSTP_TEXT on your database.'.
  TEXT4 = 'In case of doubt please contact SAP.'.

*                                                                     *
* START-OF-SELECTION
*                                                                     *
  START-OF-SELECTION.

  IF test IS INITIAL.
    WRITE: / 'Mode...: Update-Run'.
    txt = '  <-successfully updated'.
  ELSE.
    WRITE: / 'Mode...: Test-Run'.
    txt = '                      '.
  ENDIF.

  SKIP TO LINE 4.
  ULINE.

* Step 1: ROOSOURCE
* 1.1 Read version 'D' and build the workarea for the update
  SELECT SINGLE * FROM ROOSOURCE INTO ls_roosource_new_d
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_d.
  IF sy-subrc IS INITIAL.
    ls_roosource_new_d-EXSTRUCT = 'WB2_TEXTSTR1'.
  ELSE.
    WRITE: / 'DataSource "0GT_HKPSTP_TEXT" not found in version "D".',
           / 'Nothing to do ... bye.'.
    EXIT.
  ENDIF.

* 1.2 Read version 'A' (also kept for the protocol) and build the workarea
  SELECT SINGLE * FROM ROOSOURCE INTO ls_roosource_old
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_a.
  IF sy-subrc IS INITIAL.
    ls_roosource_new = ls_roosource_old.
    ls_roosource_new-EXSTRUCT = 'WB2_TEXTSTR1'.
  ELSE.
    WRITE: / 'DataSource "0GT_HKPSTP_TEXT" not found in version "A".',
           / 'Nothing to do ... bye.'.
    EXIT.
  ENDIF.

* Step 2: Update tables ROOSOURCE and ROOSFIELD
  IF test IS INITIAL.
* ..2.1 Update ROOSOURCE in version 'A' and version 'D'
    UPDATE ROOSOURCE FROM ls_roosource_new.
    IF sy-subrc = 0.
      UPDATE ROOSOURCE FROM ls_roosource_new_d.
    ENDIF.
    IF sy-subrc <> 0.
      ROLLBACK WORK.
      WRITE: / 'Error on update of table ROOSOURCE.'.
      EXIT.
    ENDIF.

* ..2.2 Flag field SPRAS as selection field in version 'A'
    SELECT SINGLE * FROM ROOSFIELD INTO ls_roosfield_old
      WHERE oltpsource = lv_oltpsource
      AND   objvers    = lv_objvers_a
      AND   field      = 'SPRAS'.
    IF sy-subrc IS INITIAL.
      ls_roosfield_new = ls_roosfield_old.
      ls_roosfield_new-selection = 'X'.
      UPDATE ROOSFIELD FROM ls_roosfield_new.
    ELSE.
      WRITE: / 'Field "SPRAS" not found in version "A".',
             / 'Nothing to do ... bye.'.
      EXIT.
    ENDIF.

* ..2.3 Flag field SPRAS as selection field in version 'D'
    SELECT SINGLE * FROM ROOSFIELD INTO ls_roosfield_old
      WHERE oltpsource = lv_oltpsource
      AND   objvers    = lv_objvers_d
      AND   field      = 'SPRAS'.
    IF sy-subrc IS INITIAL.
      ls_roosfield_new = ls_roosfield_old.
      ls_roosfield_new-selection = 'X'.
      UPDATE ROOSFIELD FROM ls_roosfield_new.
    ELSE.
      WRITE: / 'Field "SPRAS" not found in version "D".',
             / 'Nothing to do ... bye.'.
      EXIT.
    ENDIF.
  ENDIF.

* Step 3: Protocol
* 3.1 Header of the ROOSOURCE protocol
  SKIP TO LINE 6.
  FORMAT COLOR COL_TOTAL.
  WRITE: / 'Table ROOSOURCE:'.
  SKIP TO LINE 7.
  FORMAT COLOR COL_BACKGROUND.
  WRITE: '1 Field to update: EXTRACTOR STRUCTURE'.

  FORMAT COLOR COL_KEY.
  WRITE: / 'OLTPSOURCE',
     AT 50 'EXTRACTOR STRUCTURE OLD',
     AT 80 'EXTRACTOR STRUCTURE NEW'.

* 3.2 Protocol line for ROOSOURCE
  FORMAT COLOR COL_NORMAL.
  WRITE: / ls_roosource_new-oltpsource,
     AT 50 ls_roosource_old-EXSTRUCT,
     AT 80 ls_roosource_new-EXSTRUCT,
           txt.

*                                                                     *
* AT SELECTION-SCREEN OUTPUT
*                                                                     *
  AT SELECTION-SCREEN OUTPUT.

* Always default to test mode on the initial screen
  test = 'X'.

* End of report  Z_0GT_HKPSTP_TEXT.

Attributes on Search Help


Introduction

 

Hello everyone, I recently completed a BEx upgrade project, and the information below should be helpful for folks who work on similar projects.

Business Scenario

 

Consider a master data InfoObject with a large number of attributes, where the business wants to display only a selected subset of the attributes, or to set the sequence in which the attributes appear in the F4 input help in the report output.

 

Solution

 

In the example below, I have set up the Employee master data so that the attributes Home Sub-LoS 5 to 4 appear in the F4 help window in that sequence, instead of all the other attributes that exist. When I execute a report, the sequence shown in the F4 help/filter is the same as maintained on the Attribute tab.

 

TCODE RSD1 -> Select the appropriate InfoObject -> Attribute tab

 

F4 for Help.jpg

 

F4 for Help 2.jpg

 

SAP Notes for Reference

 

1080863 - FAQ: Input helps in Netweaver BI

 

 

Thanks

Abhishek Shanbhogue


Error "User does not have authorization for InfoProvider XXX"


Hi all,

 

Despite the user having authorizations for this InfoProvider, an error is produced on a DTP (XXX -> YYY) while executing a process chain. The message displayed is:

 

chain.gif

 

log.gif

 

But the user does have authorization for InfoProvider XXX (and for YYY):

 

role0.gif

 

If we generate a trace in transaction ST01, we can see that the error (RC=4) is related to the authorization object S_BTCH_ADM:

Trace1.gif

 

Trace2.gif

 

Trace3.gif

 

To avoid this error the following authorization (authorization object S_BTCH_ADM) should be granted to the user:

 

role.gif

 

This happens because the DTP is running a serial extraction; in this case the user needs authorization to manage background processing. If you do not want to grant this authorization to the user, you can select the "Parallel Extraction" mode instead and the authorization problem is solved:

 

dtp.gif

 

Best Regards,

Carlos

How to edit a Transport Request


This guide is sort of a trick..:D


 

This will teach you how to edit a transport request that you have already transported to QAS or PROD. For example, you already got approval for the transport request that you created. After transporting all the InfoProviders and InfoObjects, you notice that you missed a single InfoObject in your request. Normally you would have to seek approval for another transport request for that one (1) InfoObject, and re-transport all the InfoProviders and InfoObjects that you already transported.


 

Using this trick, you do not need to seek approval for another transport request, since you will be using the same Transport Request Number. You just edit your first transport and add the single InfoObject that you missed.


 

Here is the trick guyz!

 

1. Go to SE10 or STMS and copy the transport request number that you want to edit.

 

1.PNG

 

2. Go to SE38 and run the program RDDIT076. Click the EXECUTE button (near the ACTIVATE button) or press F8 to RUN.

 

2.PNG

 

3. Paste the Transport Request Number (from STEP 1) and click the EXECUTE button.

 

3.PNG

 

4. Double click the first row and edit its STATUS; change it from R to D. Click the SAVE button afterwards.

 

4.png

 

5.png

 

5. Do STEP 4 on the second row. You will have the screen below.

 

6.PNG

 

6. Go to SE10 and double click the top row of your transport request number.

 

7.png

 

7. DELETE all ROWS.

 

8.png

 

8. Go to the PROPERTIES tab. Delete all rows in the ATTRIBUTE column by clicking each row and then clicking the DELETE ATTRIBUTE button.

 

9.png

 

9. Click SAVE.

 

10.PNG

 

10. Now include the objects that you want to add to your existing request by clicking the OWN REQUESTS button and locating your edited transport request.

 

11.png

 

By the way, you can also use this trick if you have already released your transport request and forgot to include something in it.

 

Hope this will help you guyz!

Business Warehouse Performance Tuning at Source System


BW Performance At source

 

 

1) Maintain the control parameters for data transfer in SBIW -> General settings -> Maintain control parameters for data transfer.

 

Source system table ROIDOCPRMS: It contains the control parameters for data transfer from the source system to BW.

STATFRQU - Number of packets that are transferred before statistical info is sent

MAXPROCS - Maximum number of dialog work processes per upload request used to send the data to the BW system

MAXLINES - Maximum number of records sent in one IDoc packet.

MAXSIZE - Maximum size of an IDoc packet in KB.

 

 

Important points to consider:

 

A) Package size = MAXSIZE * 1000 / size of the transfer structure 

 

The package size cannot exceed MAXLINES.

The transfer structure size is determined in SE11 (ABAP Dictionary) -> Extras -> Table Width -> "Length of data division under ABAP".
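A quick worked example with purely illustrative numbers: with MAXSIZE = 20,000 (KB) and a transfer structure width of 500 bytes, the package size is 20,000 * 1,000 / 500 = 40,000 records; if MAXLINES is set to 30,000, the packet is capped at 30,000 records.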



B) If table ROIDOCPRMS is empty, the system uses default values at runtime. You should not allow these default values to be used.
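A minimal sketch of how you might verify in the source system that an entry is maintained; the selection via the BW target's logical system name (field SLOGSYS) and the system name itself are assumptions on my side, so double-check the table key in SE11.

  DATA ls_prms TYPE roidocprms.
* Check whether control parameters are maintained for the BW target system
  SELECT SINGLE * FROM roidocprms INTO ls_prms
    WHERE slogsys = 'BWCLNT100'.        "logical system of BW (hypothetical)
  IF sy-subrc <> 0.
*   No entry found: runtime defaults would be used - maintain them in SBIW
  ENDIF.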

 

SAP Note 1597364 - FAQ: BW-BCT: Extraction performance in source system

SAP Note 417307 Extractor package size: Collective note for applications

 

 

2) Values for Max.Conn and Max. Runtime in SMQS (configure the number of IDocs to be processed in parallel, depending on the number of dialog processes available in BW)

 

Cause

 

tRFC processing is very slow and entries show the status "Transaction recorded" in SM58, or IDoc or workflow processing is delayed.

 

Resolution

  1. Call transaction SMQS
  2. Choose the destination
  3. Click Registration
  4. Increase Max.Conn (enter the number of connections); this should be proportional to the available dialog processes in the BW system. For example, if BW has 30 dialog processes, you can try Max.Conn = 20.
  5. Increase Max. Runtime (for example, to 1800)

 

SMQS.png

 

Sap Notes

 

1887368 - tRFC process slow in SM58 and Max.Conn is not fully used in SMQS

 

 

3) IDoc processing and Performance

The "Trigger Immediately" / "Transfer IDoc Immed." options should always be used.

 

 

How to change the processing mode for the IDocs in question:

For inbound:
-> Go to transaction WE20 -> select the partner -> select the inbound message type and change the processing method to "Trigger Immediately".

For Outbound:
-> Go to transaction WE20 -> select the partner -> select the outbound message type and change the processing method to "Transfer IDoc Immedi.".

 


If IDocs are collected and scheduled instead, reports RBDAPP01 / RSEOUT00 process them in batch mode via scheduled runs. This leaves the majority of dialog work processes free for users and mission-critical processes, so you will no longer encounter the resource problems described above.

 

4) Performance problems in collective job RMBWV3nn

Try to maintain the control parameters for the extraction of the MCEX(xx) queue as follows:
Transaction LBWR > Queue name (MCEX queue) > Customizing > No. of Documents = 1000 to 3000. Check whether this is reflected in table TMCEXUPD, field UPD_HEAD_COUNT.

The adjustment of TMCEXUPD-UPD_HEAD_COUNT needs to be tested for each application, as setting too large a value can result in a memory dump.

 

 

 

Part 2 - Performance Improvements in BW System

Step by Step Guide to track manual changes via Integrated Planning ( IP)


Scenario :

 

In our project we use a statistical method to calculate the number of products left at a customer location, based on their past (cumulative) sales and their natural retirement over time. To predict the retirement we use a statistical density function; to get the current product base we subtract the predicted retirement from the total sales over time.

 

Now, as this prediction might not give 100% correct values (in fact, it never will), the business wants to update the "Current Product Base" whenever that information is available via field intelligence, i.e. from a sales representative.

 

                        PIC1.png

 

For example, in row 1 our model predicts the "Current Product Base" for customer C1, product P1, as of April 2015 to be 50. However, the sales representative knows it is exactly 60, so he/she updates this value to 60 manually. We used the Integrated Planning functionality in BW to achieve that. Now we want to capture who changed the values and when the changes were made.

 


Step By Step Procedure :


1. Create a Direct Update DSO to log the changes:


We log the changes in a Direct Update DSO, so first we need to create some characteristics relevant for logging and then create the DSO itself.

We used 0DATE, 0TIME, ZUSERNM (to hold the user information) and ZSAVEID to log the changes, and created a DSO with 0DATE, 0TIME, ZUSERNM and ZSAVEID as the key fields, together with the other characteristics relevant for the business.

 

        InfoObjects Settings :

pic 2.png

Now we will create a DSO and change the Type of DataStore Object to "Direct Update" in the settings. We shall use all our business keys plus the four characteristics mentioned above as the key of the DSO.

 

    pic 3.PNG

In the data fields of the DSO you can include all the key figures that are supposed to be manually updated; in our scenario this is the actual value of the product base.

 

 

2. Create an Enhancement Spot implementation to log the changes in the DSO:


Now we shall implement an enhancement spot which does the job of logging the manual updates. Every time a user updates a value in the real-time cube, the system generates a Save ID and pushes it to our DSO along with the user name, date and time.

 

Go to transaction SE18 and choose Enhancement Spot RSPLS_LOGGING_ON_SAVE. Choose the tab Enhancement Implementation and click on Implement Enhancement Spot (highlighted).

                              PIC 4.png

Enter the name and description of your implementing class and choose OK. Select a suitable package, then fill the screen below with the BAdI name and class name and choose the BAdI definition.


                      pic 5.png

 

                        pic 6.png

 

Now we have to work on two things: 1) the implementation class and 2) the filter.

 

Let us work on the implementation class first. The class has methods which do the actual work for us; we have to put our code into those methods.

 

Double-click on the implementation class of the BAdI definition.

                                          pic 7.png

This brings up the screen below, where you can see the methods of the implementation class. We have to put our code inside these methods. Please check the attachment for the code with comments; you need only minimal adjustments to adapt it to your scenario.

                              Pic 8.png

a) IF_RSPLS_LOGGING_ON_SAVE~LOG_DEFINED :

Here we define for which real-time cube logging is activated, using the i_infocube_name parameter. Additionally, I check for my own user name, so that for now only changes made by my user ID are logged; later on we shall comment out that second statement.

 

      PIC 9.png
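For readers who cannot view the screenshot, here is a minimal sketch of what this method can look like. The cube name, the user ID and the name of the exporting flag are my assumptions; please verify the interface of IF_RSPLS_LOGGING_ON_SAVE in your system before reusing it.

  METHOD if_rspls_logging_on_save~log_defined.
*   Activate logging only for our real-time cube (name hypothetical)...
    IF i_infocube_name = 'ZRTCUBE'
*   ...and, while testing, only for changes made by my own user ID
       AND sy-uname = 'MYUSER'.
      e_log_defined = abap_true.        "flag name assumed
    ENDIF.
  ENDMETHOD.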

 

b) IF_RSPLS_LOGGING_ON_SAVE~LOG_STRUCTURE :

This method provides the structure of the data to be logged; in our case it returns the structure of the DSO where we store the log. Please check the appendix for the code adjustments, with all relevant comments for understanding.

 

 

c) IF_RSPLS_LOGGING_ON_SAVE~LOG_WRITE :

This method actually writes the data, in the structure defined in method b), to the Direct Update DSO.

Here we specify for which real-time cube we want to log the changes and where to write them (in our case the Direct Update DSO; it could also be a DB table).
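As an illustration of this write step, a sketch under assumptions: the DSO name ZLOGDSO (and hence its active table /BIC/AZLOGDSO00) is hypothetical, and the filling of the log records from the changed plan data is omitted.

* Append the changed records, enriched with date, time, user and save ID,
* to the active table of the Direct Update DSO (all names hypothetical)
  DATA lt_log TYPE STANDARD TABLE OF /bic/azlogdso00.
* ... fill lt_log from the changed plan records, setting sy-datum,
*     sy-uzeit, sy-uname and the generated save ID in the key fields ...
  MODIFY /bic/azlogdso00 FROM TABLE lt_log.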

 

d) IF_RSPLS_LOGGING_ON_SAVE~LOG_DEFINED_DB :

You can use this method to write the log to a database table if you are using HANA as the database.

 

e) IF_RSPLS_LOGGING_ON_SAVE~LOG_WRITE_DB:

You can use this method to write the log to a database table if you are using HANA as the database.

In our case we track the changes in the DSO, so we did not use methods d) and e). Still, we activated these two methods, since otherwise the BAdI activation threw an error.

 

**** Please check the attached document for the complete code

Once we have put all our code into the respective methods, we need to fill the filter for this BAdI implementation. Double-click on the filter area and enter your real-time cube name.

 

                    Pic 10.PNG

 

3. Log in to the planning workbook and update values:

Now we log in to our planning workbook, manually adjust the product base numbers and save them in the real-time cube.

                        

PIC11.PNG

Note that we changed the Actual Product Base for the first 4 rows and saved them in the planning cube.

 

We check our Direct Update DSO to see whether our BAdI has logged all those changes along with the user ID that made them.

 

    PIC12.PNG

 

As we can see, it logged my user ID together with the date, time and Save ID of the change I made. If you want to pass only the last change time and the last changed-by user on to some other target, you can read just the latest record by sorting on time.

 

Please find the complete code in the (Dropbox) link below; you only need to adjust the highlighted portions.

Dropbox - Class Methods.pdf

 

 

Debug tips: If you face any problems, set external breakpoints inside the methods one by one and debug.

 

 

For some more detail, please check How to... Log Changes in Plan Data when using the SAP BW Planning Applications Kit

 

 

 

Cheers

Anindya

Analyzing data access (of SAP BW data)


Business data is often viewed as the critical resource of the 21st century. The more current the business data is, the more valuable it is considered; however, historic data is not utterly worthless either. To offer the best possible, meaning the most performant, consistent and correct, access to data given a fixed budget, we need to know: who consumes which slice of our business data at what point in time? This blog is about how to find valid answers to this question from the perspective of a BW administrator.

Access to the data is granted via SAP BW's Analytic Engine: SAP BW users access the data via a BEx Query, and the Analytic Engine in turn requests the data from the persistency services. BW (on HANA) offers a multi-temperature data lifecycle concept: data can be stored in-memory in columnar format, managed with the non-active data concept, kept in the HANA Extended Storage (aka Dynamic Tiering), moved to a Nearline Storage option, archived or, of course, deleted.

Now given our fixed budget, how should we find out how to distribute the data across the different storage layers?

SAP BW on HANA SP 8 comes equipped with the "Selection Statistics", a tool designed to track data access and then assist in finding a proper data distribution. With the selection statistics you can record all data access requests of the Analytic Engine on your business data. The selection statistics can be enabled per InfoProvider. If enabled, then for each data access request the minimal and maximal selection date on the time dimension, the name of the InfoProvider, the name of the accessing user and the access time are stored.

One of the major use cases for the Selection Statistics is the "Data Aging" functionality in the Administrator Workbench (Administrative Tools -> Housekeeping -> Data Aging), which proposes time slices for shifting data to the Nearline Store. Technically, the Data Aging tool assists in creating:

  • Data Archiving Processes
  • Parametrizations (variants) of Data Archiving Processes, containing the proposed time slices
  • Process Chains that schedule the Data Archiving Processes

The recording of selection statistics is currently limited to time slices only. This limitation was introduced to:

a) keep the amount of recorded data under control;

b) minimize the impact on the query runtime due to the calculation of the data slices;

c) emphasize time filters, which are usually provided in all queries and are the most important criteria when it comes to data retention and lifecycle considerations.

If you agree with this, fine; otherwise feel free to post a comment and share your view.

 

Here are some screenshots that demonstrate the use of the tools:

1.)    Customizing the selection statistics (transaction SPRO)

Pic1.pngpic2.png

 

2.)    Analyzing the selection statistics

 

pic3.png

3.)    Using selection statistics for Data Aging

pic4.png
