SCN Blog List - SAP Business Warehouse

Performance Issues in SAP BI/BW


Hi

Performance tuning can be grouped into three areas:

1. Reporting performance

2. Load performance

3. General performance

For improving report performance:

1. Create aggregates on InfoCubes.

2. Use the OLAP cache to buffer query results and reduce the load on the database.

3. Pre-calculated web templates help shift the workload of running reports to off-peak hours and keep the report result set ready for very fast access.

4. Start queries with a small result set and drill down from there.

5. Avoid reporting directly on ODS/DSO objects.

6. Prefer inclusions over exclusions in selections.

7. Use the read mode "Read when navigating and expanding hierarchies".

8. Compress InfoCubes, since the E fact table is optimized for queries.

9. Create additional indexes via the Performance tab in the data target's Manage screen.

10. Run DB statistics regularly.

For improving load performance:

1. Review the ABAP code in transformations; inefficient routines slow the load down.

2. Keep enough dialog processes available and balance the load across servers.

3. Create indexes on the source tables.

4. Use fixed-length files when loading from flat files, and place the files on the application server.

5. Prefer SAP-delivered standard extractors.

6. Use the "PSA and data target in parallel" option in the InfoPackage load settings.

7. Start several InfoPackages in parallel with different selections.

8. Load master data before loading transaction data.

9. Use write-optimized DSOs to improve load performance for full loads.

10. Minimize work process usage.

11. Delete old application log entries and IDoc table entries.


For improving general performance:

1. Archive and delete old data.

2. Use line-item dimensions for very large dimensions.

3. Use the BW statistics cubes to monitor performance.

4. If a DSO is not used for reporting, disable its BEx reporting flag.

 

 

Regards,

Sudhakar.


BW Delta Process (Record Mode)


Hi All,

 

RODMUPDMOD - BW Delta Process: Record Mode

 

Definition
This attribute describes how a record is updated in the delta process. The various delta processes support different combinations of the seven possible characteristic values. If a DataSource implements a delta process that uses several characteristic values, the record mode must be a part of the extract structure and the name of the corresponding field has to be entered in the DataSource (ROOSOURCE-INVFIELD).

The seven characteristic values are as follows:


'': The record delivers an after image.


The status is transferred after something is changed or added. You can update the record into an InfoCube only if the corresponding before image exists in the request.

'X': The record delivers a before image


The status is transferred before data is changed or deleted.
All record attributes that can be aggregated have to be transferred with a reversed +/- sign. The sign reversal is carried out either by the extractor (default) or by the Service API; in the latter case, the indicator 'Field is inverted in case of cancellation' must be set in the DataSource for the relevant extraction structure field.
These records are ignored if the update is a non-additive update of a DataStore object.
The before image is complementary to the after image.

'A': The record delivers an additive image.


For attributes that can be aggregated, only the change is transferred. For attributes that cannot be aggregated, the status after a record has been changed or created is transferred. This record can replace an after image and a before image if there are no non-aggregation attributes or if these cannot be changed. You can update the record into an InfoCube without restriction, but this requires an additive update into a DataStore object.

'D': The record has to be deleted.


Only the key is transferred. This record (and its DataSource) can only be updated into a DataStore object.

'R': The record delivers a reverse image.


The content of this record is the same as the content of a before image. The only difference is with a DataStore object update: Existing records with the same key are deleted.

'N': The record delivers a new image.


The content of this record is the same as for an after image without a before image. When a record is created, a new image is transferred instead of an after image.
The new image is complementary to the reverse image.

'Y': The record delivers an update image.


This kind of record is used in the change log of a DataStore object in order to save the value from the update, for a possible rollback and roll-forward of key figures with minimum or maximum aggregation. The record also carries the update value for characteristics (in this case it is the same as the after image).
Null values are stored for key figures with totals aggregation. An update image is only required when the value from the update is smaller or larger than the before image for at least one key figure with minimum or maximum aggregation.

Table RODELTAM determines which characteristic values a delta process uses (columns UPDM_NIM, UPDM_BIM, UPDM_UIM, UPDM_AIM, UPDM_ADD, UPDM_DEL and UPDM_RIM). The table has to ensure that only meaningful combinations of the above values are used within a delta process.
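
As a quick illustration, the record modes supported by a given delta process can be inspected directly in table RODELTAM via the columns listed above. The following ABAP sketch is only a hedged example: it assumes the key field of RODELTAM holding the delta process name is DELTA and uses the delta process 'ABR' purely for illustration; verify the field names in SE11 on your release.

DATA ls_deltam TYPE rodeltam.

* Read the definition of one delta process (example: ABR)
SELECT SINGLE * FROM rodeltam
  INTO ls_deltam
  WHERE delta = 'ABR'.

IF sy-subrc = 0.
  WRITE: / 'After image ( ):    ', ls_deltam-updm_aim,
         / 'Before image (X):   ', ls_deltam-updm_bim,
         / 'Additive image (A): ', ls_deltam-updm_add,
         / 'Delete (D):         ', ls_deltam-updm_del,
         / 'Reverse image (R):  ', ls_deltam-updm_rim,
         / 'New image (N):      ', ls_deltam-updm_nim,
         / 'Update image (Y):   ', ls_deltam-updm_uim.
ENDIF.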

When extracting in delta update mode, a DataSource that uses a delta process can deliver in the record mode field only those characteristic values that are specified for that delta process.

 


Regards,

Durga Prasad.

How to create PSA between further data targets in BW 7.x Data Flow?


I noticed many people in SCN were asking questions regarding PSA creation between data targets. Keeping this in view I am writing this step by step blog on how to create a PSA in between further data targets in BW 7.x and higher versions.

 

This approach is not the best practice and might not be applicable in most of the scenarios.

 

We all know that PSA is not mandatory anymore while loading data to further data targets.

 

But if you want you can create a PSA in between data targets based on following assumptions:

  • Your scenario demands a PSA between further data targets.
  • You want an intermediate PSA and the data flow needs to be in the 7.x version.
  • You want to use the flexibility of a DTP on top of the PSA between further data targets.
  • You want to perform some manual correction of the data before loading it into the cube.
  • You would like to have an additional layer of transformation.

 

Following is the Step by Step document to achieve this:

 

The steps are based on Load from ODS to Cube. The same can be applied in any further data targets.

Example: Assume that your scenario already has a DSO and a Cube.

 

DSO Name: ZDSO_TES, Cube Name : ZCUB_TES

 

Step 1: We know that whenever a DSO is created, a data mart interface is generated automatically with the prefix '8' followed by the DSO technical name. Find the data mart InfoSource; in this case the InfoSource name would be 8ZDSO_TES.

 

1.jpg

Step 2: Once you find the 8ZDSO_TES Infosource, you can expand the flow to view the associated transfer rules and Datasource.

2.jpg

Step 3: Right click on the transfer rule and select the option migrate.

3.jpg

     Select copy Infosource 3.x to New Infosource

4.jpg

 

     Provide a technical name for the New Infosource, which will be 7.x Infosource.

 

5.jpg

    

     You should get the following message.

51.jpg

Step 4: Use the Find option to view the generated Transformation.

     You have to use the Find under Modeling tab to view the generated transformation.

6.jpg

 

     It will look something like this:

7.jpg

 

Step 5: Double click on the transformation and in the change mode activate the transformation.

8.jpg

 

At this point the transformation is active and a 7.x generated Infosource gets created.


Step 6: Create a Transformation between Cube ZCUB_TES and the generated Infosource ZINT_PSA.

     Right click on the cube and select create transformation:

 

9.jpg

Select Source object type “Infosource”.  When you click on the Name you should be able to find the 7.x generated Infosource in the list.

10.jpg

     Select the generated Infosource, in this case it is ZINT_PSA.

     Activate the transformation.

 

Step 7:  Now the flow will look like below.

11.jpg

 

 

 

Step 8: Create DTP

Select the Source of DTP as DataSource and select your Datasource (8ZDSO_TES) from the list.

12.jpg

13.jpg

 

 

Step 9: Create Infopackage.

 

Infopackage will be created under the Datasource “8ZDSO_TES”

14.jpg

Step 10: Schedule the Infopackage and then Schedule the DTP.

 

Now if you schedule the InfoPackage, the data is loaded from the ODS into the PSA; the scheduled DTP then loads the data into the Cube.

Request loaded into PSA.

 

15.jpg

 

     DTP schedule loaded data into Cube successfully.

16.jpg

 

SAP has already provided different ways to control your data load between data targets ( ODS to ODS, ODS to CUBE, CUBE to CUBE), but if you want to have a PSA in between data targets in 7.x version of BW then you can use the above approach.


Filtering query results based on several combinations of Infoobject values


Requirement

 

  • A Query contains the characteristics ZBASEA, ZBASEB, ZBASEC in the rows and several key figures.
  • The requirement was to allow filtering by exclusion of combinations of values for ZBASEA, ZBASEB, ZBASEC - that is, AND(NOT(ZA1,ZB1,ZC1), NOT(ZA2,ZB2,ZC2), ...).
  • The filter should be selectable ad-hoc - that is, the user should be able to filter out only combinations that are actually in the query results.
  • It should be possible to save the filter.

Solution overview

 

  • An infoobject ZCOMP containing all possible combinations of ZBASEA, ZBASEB, and ZBASEC as master data was incorporated into the query and the underlying infoprovider. Relevant texts were created in order to make it clear what was being filtered.
  • The BEx setting "Only Posted Values for Navigation" allows filtering only combinations that are in the query results.
  • The WAD command SET_VARIABLES_STATE allows passing filter values into the variable screen, which allows saving the variable values as a variant.

 

Detailed Solution

zcomp.jpg

 

  1. The infoprovider on which the query is based is ZIC. The data in ZIC is loaded from the DSO ZDSO (there is no practical difference if the data is loaded from a PSA for the purposes of this solution). ZDSO and ZIC have the same infoobjects. For the purposes of the solution ZCOMP was added to ZIC.
  2. A new DSO ZCDSO containing ZBASEA, ZBASEB and ZBASEC was created. ZCDSO contains all combinations of the characteristic values from ZDSO. ZCDSO is loaded using a transformation from ZDSO. The transformation uses a start routine for faster aggregation of the data. See code below.
  3. ZCOMP has ZBASEA, ZBASEB and ZBASEC as attributes. The attributes infoprovider is loaded from ZCDSO. ZCOMP's value is calculated as a unique record number. This is done using start and end routine - see code below.
  4. ZCOMP's text infoprovider is loaded from the attributes infoprovider using an expert routine that fetches texts from relevant text tables. See code below.
  5. ZCOMP's value in ZIC is calculated from the values of ZBASEA, ZBASEB and ZBASEC in the end routine. see code below.
  6. ZCOMP is included in the rows of the query and hidden using the columnwidth module used in the analysis item showing the query. A filter web item is shown above the analysis item.
  7. In ZCOMP's properties in the query, under "Filter Value Selection during Query Execution", "Only Posted Values for Navigation" is chosen.
  8. The WAD command SET_VARIABLES_STATE allows passing filter values into the variable screen. See Code Below.

Notes

I've replaced the original infoobjects with placeholder names to emphasize that this can be used with ANY module rather than a specific business scenario. Obviously the solution is also extensible to any type or number of infoobjects within reason. Therefore the code samples are meant to be more of a guideline and not as copy-paste ready code.

 

In theory, ZCOMP can be loaded directly from ZDSO, as ZCDSO does not contain any data not present in ZDSO. I thought adding ZCDSO would make for a good separation of loading tasks.

 

Use of ZCOMP as a compounded infoobject could save the need for the unique key and the ZIC end routine. Might have an effect on performance though.


 

2. ZDSO->ZCDSO start routine - aggregation of data

 

*$*$ begin of routine - insert your code only below this line        *-*

 

*aggregation - speeds up loading

SORT SOURCE_PACKAGE BY ZBASEA ZBASEB ZBASEC.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
  COMPARING ZBASEA ZBASEB ZBASEC.

 

*$*$ end of routine - insert your code only before this line         *-*

3A. ZCDSO->ZCOMP start routine - don't load existing combinations

 

*$*$ begin of routine - insert your code only below this line        *-*

 

* delete rows that already exist in master data from source package

DATA wa_zcomp TYPE /BIC/MZCOMP.

SELECT * FROM /BIC/MZCOMP
  INTO CORRESPONDING FIELDS OF wa_zcomp
  FOR ALL ENTRIES IN SOURCE_PACKAGE
  WHERE ZBASEA  = SOURCE_PACKAGE-ZBASEA AND
        ZBASEB  = SOURCE_PACKAGE-ZBASEB AND
        ZBASEC  = SOURCE_PACKAGE-ZBASEC AND
        OBJVERS = 'A'.

  DELETE SOURCE_PACKAGE
    WHERE ZBASEA = wa_zcomp-ZBASEA AND
          ZBASEB = wa_zcomp-ZBASEB AND
          ZBASEC = wa_zcomp-ZBASEC.

ENDSELECT.

 

*$*$ end of routine - insert your code only before this line         *-*

 

3B. ZCDSO->ZCOMP end routine - calculate ZCOMP values

 

*$*$ begin of 2nd part global - insert your code only below this line  *

DATA g_index TYPE /BIC/OIZCOMP.

SELECT MAX( /BIC/ZCOMP ) FROM /BIC/MZCOMP INTO g_index.

 

*$*$ end of 2nd part global - insert your code only before this line   *

 

 

 

*$*$ begin of routine - insert your code only below this line        *-*

* create unique key

LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.

  g_index = g_index + 1.

  <RESULT_FIELDS>-/BIC/ZCOMP = g_index.

ENDLOOP.

 

*$*$ end of routine - insert your code only before this line         *-*

 

4. ZCOMP-> ZCOMP expert routine - creating texts

* for example purposes only, obviously heavily dependent on scenario

 

 

 

 

 

* Select relevant texts from different text tables and combine those into
* one text
    DATA: it_zbasea TYPE TABLE OF /BI0/tzbasea,
          wa_zbasea TYPE /BI0/tzbasea,
          it_zbaseb TYPE TABLE OF /BI0/tzbaseb,
          wa_zbaseb TYPE /BI0/tzbaseb.

    SELECT * FROM /BI0/tzbasea
      INTO CORRESPONDING FIELDS OF TABLE it_zbasea
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE zbasea = SOURCE_PACKAGE-zbasea.

    SELECT * FROM /BI0/tzbaseb
      INTO CORRESPONDING FIELDS OF TABLE it_zbaseb
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE zbaseb = SOURCE_PACKAGE-zbaseb.

* create the text for each source record
    LOOP AT SOURCE_PACKAGE ASSIGNING <SOURCE_FIELDS>.
      RESULT_FIELDS-/BIC/ZCOMP = <SOURCE_FIELDS>-/BIC/ZCOMP.

      READ TABLE it_zbasea
        WITH KEY zbasea = <SOURCE_FIELDS>-zbasea langu = 'E'
        INTO wa_zbasea.
      IF sy-subrc <> 0.
        CLEAR wa_zbasea.
      ENDIF.

      READ TABLE it_zbaseb
        WITH KEY zbaseb = <SOURCE_FIELDS>-zbaseb langu = 'E'
        INTO wa_zbaseb.
      IF sy-subrc <> 0.
        CLEAR wa_zbaseb.
      ENDIF.

      CONCATENATE wa_zbasea-txtmd(26) '\' <SOURCE_FIELDS>-zbasec(4) '\'
                  wa_zbaseb-txtmd(26) INTO RESULT_FIELDS-txtlg.
      APPEND RESULT_FIELDS TO RESULT_PACKAGE.
    ENDLOOP.

 

 

5. ZDSO->ZIC end routine - calculate ZCOMP's value

 

 

 

 

*$*$ begin of global - insert your declaration only below this line  *-*
... "insert your code here
TYPES: BEGIN OF t_zcomp,
         ZCOMP  TYPE /BIC/OIZCOMP,
         ZBASEA TYPE /BI0/OIZBASEA,
         ZBASEB TYPE /BI0/OIZBASEB,
         ZBASEC TYPE /BI0/OIZBASEC,
       END OF t_zcomp.
*$*$ end of global - insert your declaration only before this line   *-*

*$*$ begin of routine - insert your code only below this line        *-*
* compute zcomp value from existing fields
DATA: it_zcomp TYPE TABLE OF t_zcomp,
      wa_zcomp TYPE t_zcomp.

SELECT * FROM /BIC/MZCOMP
  INTO CORRESPONDING FIELDS OF TABLE it_zcomp
  FOR ALL ENTRIES IN RESULT_PACKAGE
  WHERE ZBASEA = RESULT_PACKAGE-ZBASEA AND
        ZBASEB = RESULT_PACKAGE-ZBASEB AND
        ZBASEC = RESULT_PACKAGE-ZBASEC.

SORT it_zcomp BY ZBASEA ASCENDING
                 ZBASEB ASCENDING
                 ZBASEC ASCENDING.

LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
  READ TABLE it_zcomp
    WITH KEY ZBASEA = <RESULT_FIELDS>-ZBASEA
             ZBASEB = <RESULT_FIELDS>-ZBASEB
             ZBASEC = <RESULT_FIELDS>-ZBASEC
    INTO wa_zcomp.
  <RESULT_FIELDS>-/BIC/ZCOMP = wa_zcomp-ZCOMP.
ENDLOOP.

 

*$*$ end of routine - insert your code only before this line         *-*

 

8. WAD command for passing filter values from analysis item into variable screen

 

 

<bi:ACTION type="CHOICE" value="INSTRUCTION" >

      <bi:INSTRUCTION >

          <bi:SET_VARIABLES_STATE >

              <bi:VARIABLE_SCREEN value="X" />

              <bi:VARIABLE_VALUES type="ORDEREDLIST" >

                  <bi:VARIABLE_VALUE type="COMPOSITE" index="1" >

                      <bi:VARIABLE value="ZCMP_MUL" text="ZCMP_MUL" />

                      <bi:VARIABLE_TYPE type="CHOICE" value="SELECTION_BINDING_TYPE" >

                          <bi:SELECTION_BINDING_TYPE type="CHOICE" value="DATA_PROVIDER_CHARACTERISTIC" >

                              <bi:DATA_PROVIDER_CHARACTERISTIC type="COMPOSITE" >

                                  <bi:DATA_PROVIDER_REF value="DP_4" />

                                  <bi:CHARACTERISTIC value="ZCOMP" text="ZCOMP" />

                              </bi:DATA_PROVIDER_CHARACTERISTIC>

                          </bi:SELECTION_BINDING_TYPE>

                      </bi:VARIABLE_TYPE>

                  </bi:VARIABLE_VALUE>

              </bi:VARIABLE_VALUES>

          </bi:SET_VARIABLES_STATE>

      </bi:INSTRUCTION>

  </bi:ACTION>

 

Table to get the list of workbooks in a specific Role


This blog explains how to get the list of workbooks under a specific role.

 

We can get the list of workbooks in a role using the AGR_HIER table.

 

1. Go to transaction SE16.

 

2. Provide the table name as "AGR_HIER".

 

3. Provide the role name in the "AGR_NAME" field and "RRMX" in the "REPORT" field as below, and execute.

 

1.jpg

 

We then get the list of workbook IDs, which are in the SAP_GUID field, for the specified role.
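
If you prefer to read the table programmatically rather than through SE16, a small ABAP sketch along the following lines lists the workbook GUIDs for a role. It only uses the AGR_HIER fields mentioned above (AGR_NAME, REPORT, SAP_GUID); the role name is a placeholder.

DATA: lt_wb TYPE TABLE OF agr_hier,
      ls_wb TYPE agr_hier.

* Workbook nodes in a role are stored with REPORT = 'RRMX';
* SAP_GUID then holds the workbook ID.
SELECT * FROM agr_hier
  INTO TABLE lt_wb
  WHERE agr_name = 'Z_MY_ROLE'      "placeholder role name
    AND report   = 'RRMX'.

LOOP AT lt_wb INTO ls_wb.
  WRITE: / ls_wb-agr_name, ls_wb-sap_guid.
ENDLOOP.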

Have an external application like cognos execute a process chain


So here is a simple way to communicate between an external system and SAP BW.

 

The business scenario where we used this:

Our planning application from Cognos is running on SQL Server. Once the publish and the planning process is completed, we need to load budget data to BW.

Currently this is done at a fixed time, which can either delay the update or, if the publish process is not yet finished, cause the load to fail.

The goal is that once Cognos has finished publishing the data and it is ready to be uploaded to BW, a process chain in BW is triggered automatically.

 

Here is the approach that we followed to communicate between Cognos and BW (or any other external application you might have)

 

On BW side:

 

- First we need an event (ZBU_LOAD) in SAP BW. It can be created with transaction SM64.

A.png

- Use this Event now to trigger your process chain that will load the data from your external system.

 

On your external Application:

 

- Next we need to trigger this event. For this we use PsExec, which can be downloaded from http://technet.microsoft.com/de-de/sysinternals/bb897553.aspx.

We added this command line to our batch file right after the step that finishes publishing the data. You can put it anywhere in the processing on your external system.

 

\\BWSERVER\sapmnt\sapevt\sapevt.exe ZBU_LOAD =BWSERVER pf="Default Profile Path in BW".

 

The full syntax line then looks like this:

psexec -u DOMAIN\USER -p PASSWORD \\BWSERVER\\BWSERVER\sapmnt\SID\DVEBMGS00\exe\sapevt.exe ZBU_LOAD=BWHOST pf=\\BWSERVER\sapmnt\SID\sys\profile\default.pfl

 

Most likely your SAP Basis team will need to help you find the exact path of sapevt.exe and the default profile.

Make sure the user connecting to your BW system has access rights to the folder and is able to execute sapevt.exe!

This command, called from the external system (you can easily add it to a batch file), raises the event and thereby triggers your process chain in the BW system.
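
If you also want to raise the same event from inside the SAP system (for example, to test the process chain trigger without going through PsExec), a minimal ABAP sketch could use the standard function module BP_EVENT_RAISE. This is only an assumption-labelled example: it presumes the event ZBU_LOAD already exists in SM64 and only handles errors generically.

DATA lv_event TYPE btceventid VALUE 'ZBU_LOAD'.

* Raise the background event; the process chain waiting on it will start.
CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid = lv_event
  EXCEPTIONS
    OTHERS  = 1.
IF sy-subrc <> 0.
  WRITE: / 'Event could not be raised, sy-subrc =', sy-subrc.
ENDIF.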

0 records problem in Master Data load


On a rainy Friday noon I faced a strange problem while loading master data texts in an SAP APO system (EHP2 for SAP SCM 7.0).

Problem: There is one generic DataSource for master data coming from the BW side. We have to load its data into several different InfoObjects such as plant, division, sales organisation etc. by putting a selection on the particular InfoObject in the InfoPackage, e.g. Division.

 

Info+Pack+for+Division.PNG

 

After scheduling the InfoPackage we could see 76 records for Division at PSA level. We then put the same selection at DTP level.

 

DTP+0+recorde.PNG

                     

After executing the DTP we were getting 0 records in the manage screen, instead of the 76 records for Division that were visible at PSA level.

 

Manage+0+recorde.PNG

 

 

Solution:


1. Check how the data looks at PSA level.

 

Data in PSA.JPG

                         

2. At DTP level, enter the filter exactly as the data appears in the PSA, not as in the InfoPackage selection (which was in all upper-case letters).

 

DTP+for+Division.PNG

                 

3. After executing the DTP we got the 76 records for Division in the InfoObject's manage screen.

 

Manage+recordes+added.PNG

 

The solution is very simple, but sometimes simple things make a big difference.

Note: While loading master data, check "Handle Duplicate Record Keys" in the Update tab of the DTP, and activate the master data after loading.

 

setting for master.PNG

 

Thanks for reading..

Extracting Data from R/3 & Using DB Connect


Extraction: Extracting data from R/3 & using DB Connect.

Flat file Extraction: Once we create Info source, it automatically creates Data Source.

 

Extract Structure: Grouping of fields indicating what is to be extracted from the source.

Transfer Structure: Out of the extracted fields, those that are actually transferred from the source system to BW.

Types of Extractors from R/3:

There are many application areas: SD, MM, FI, etc.

Business Content extractors: content delivered by SAP (delivered version), which we must install into the active version.

Customer-generated extractors: we do not create the DataSource directly; it is generated based on another object.

Ex: LIS, CO-PA, FI-SL.

Generic extraction: we explicitly create our own DS using tables, function modules, views, domains or InfoSets.

LIS is outdated.

InfoSet query: we can build a query on multiple objects - tables, views, function module structures, etc.

  1. T-code: SQ02.

DB query: lets us join multiple tables.

SQ01: Run InfoSet query

SQ02: Create InfoSet query

SQ03: Assign user group

Generic extraction: there are different ways to generate a generic DS - using a table, view, FM, domain or InfoSet.

How to create a DS on a database table/view and extract the data:

VBAK (MANDT is never extracted to BW); VBELN, ERDAT, NETWR and WAERK have to be sent to BW.

First step: go to RSO2. (There is no option to create a hierarchy DS here.)

  1. How do I choose between master data attributes, texts and transaction data? It depends on the target: for a cube choose transaction data, for attributes choose attributes, for texts choose texts.
  2. Whenever we create a DS it is assigned to an application component.
  3. The domain option is only enabled when creating a text DS; otherwise the DS is created from a view, query or FM.

When you save the DS, it is defined together with its extract structure (the MANDT field is never extracted from the source).

When you save the DS it automatically goes to the field selection screen.

A) Selection: the field becomes available on the BI side in the data selection of the InfoPackage.

B) Hide field: the field is removed from the transfer structure (it is not available in the R/3 or BI transfer structure).

 

  1. Next step -> replicate the DS in BW.

The extract structure exists in R/3; after replicating the DS, the transfer structure exists in BW. When we activate the transfer rules, the transfer structure is also created in R/3.

When the DS is based on a function module, we explicitly define the extract structure. If the FM uses a plain SELECT it reads all records at once; if we want to read the records packet by packet (e.g. 1000 records as 10 packets of 100, 100*10 = 1000), we use OPEN CURSOR / FETCH: the first call reads the first 100 records (packet 1), the next call the next packet, and so on.

LO Extraction (Mostly we go for LO)

To extract logistics data from R/3 to BW we have: 1) LO, 2) CO-PA, 3) Generic.

Logistics: any product-based process - we buy raw material and sell to the customer; all these transactions are called logistics.

  1. Raw material -> inventory -> PP -> QM -> delivered to cash: all of these are covered by LO extractors.

Shop floor: machinery calculation - how much output comes from each machine.

LO: the DataSources are generated by SAP (delivered version).

Generic: we have to create the DS ourselves.

Every LO DataSource comes with an extract structure. Initially there is only the extract structure; when we replicate the DS, the transfer structure is created in BW, and when we activate the transfer rules, the transfer structure is also created in R/3.

                2LIS_11_VAHDR

Every LO starts with 2LIS

11 indicates the application component number (11: sales order). We also have 02: purchasing, 12: delivery, 13: billing, 03: inventory, and so on.

VAHDR: VA is the event of the application, HDR stands for header-level data.

MC11VA0HDR: (extract structure)

The 0 (zero) denotes the root node of the application component hierarchy.

SD flow:

  1. Enquiry -> Quotation -> Sales Order -> Delivery -> PGI (Post Goods Issue, i.e. the goods are literally handed over to the customer)

How do we extract Master data:

We can extract using Generic or Business Content Extractor (DS)

Ex: Which business content extractor we have to use.

All DataSources are stored in table ROOSOURCE, where we can also find the type of each DS.

Transaction DS: used to extract transaction data from R/3 (logistics).

Table ROOSOURCE: execute it, copy the delta process field of the DataSource and look it up in table RODELTAM (delta process details), e.g. ABR, AIE.

How do we decide between a DSO and an InfoCube?

It depends on the delta process of the DataSource: if the DS delivers both after and before images whenever a document changes, we can load to a cube directly; if it delivers only after images, we load to a DSO first. So the delta process of the DS decides whether we use an IC or a DSO.
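
For example, the delta process of a DataSource can be looked up programmatically in ROOSOURCE on the source system; the value found there is then the key for RODELTAM as described above. A minimal sketch under these assumptions (field names as per the tables above, to be verified in SE11):

DATA lv_delta TYPE roosource-delta.

* Delta process of the sales order header DataSource (active version)
SELECT SINGLE delta FROM roosource
  INTO lv_delta
  WHERE oltpsource = '2LIS_11_VAHDR'
    AND objvers    = 'A'.

IF sy-subrc = 0.
  WRITE: / '2LIS_11_VAHDR uses delta process', lv_delta.
ENDIF.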

Note: a DSO is used for detail-level reporting and detail-level staging.

Depending on the company's standards, a scalable layered architecture is used:

The first level is staging (same data as in R/3).

The second level is transformation.

The third level is reporting.

How to enable or customize the field properties of the extract structure:

  1. RSA6 -> choose the DS -> change -> field selection of the extract structure (Selection / Hide flags).

ROOS…: the extraction settings - which fields are enabled for selection and which are hidden - are maintained here; the hide and selection settings can be changed.

 

  1. How to generate the LO DS -> if the DS is already delivered, why do we need to generate it?

            2LIS_11_VAHDR (sales order header data)

Step 1: Identify the DS. It is in the delivered version; we have to install it into the active version.

To activate:      A) Navigate via SBIW, or go to RSA5 directly, and install the DS.

                        B) If the DS is active, deactivate it before going into maintenance.

                        C) We are practically never 100% satisfied with the extract structure, so we add fields via

                                    -> Maintain Extract Structure, selecting fields from the communication structures.

            It only allows maintenance with respect to the SO header: 2LIS_11_VAHDR can only be maintained from the communication structures related to the sales order header.

Selection: the field will be available in the InfoPackage for selection.

Hide: when we do not need a field we hide it; it will then not be available in the transfer structure in R/3 or BW.

Maintaining the extraction structure means: when we are not satisfied with the fields in the extract structure, we add more fields from the communication structure to the extract structure.

Communication structures: MCVBAK, MCVBAP, MCVBUP -> fields from these can be added to the extraction structure.

Fields in right side are communication structure

Fields in left side are extract structure

3) Inversion: can be enabled for key figures only. Whenever a document is changed, two records are delivered: a before image (1000) and an after image (1500).

Delta: only the changed records come to BW. Without inversion the before image is added instead of subtracted and the value in BW is wrong; to end up with 1500 we have to use inversion.

4) Field only: for enhancement purposes only.

5) Update modes:

            Serialized V3 is outdated.

The update mode specified in LBWE defines how the delta records are processed; it brings the data based on time, date and sorting.

1)      Direct Delta: the update mode is specific to the application, i.e. it is set at application level; we cannot specify an update mode for an individual DS.

 

After this, replicate the DS in BW. Replication reads the complete metadata of the DS and also maintains a time stamp of when it was replicated.

All historical data is migrated to BW with an init load; from then on, loading continues with delta.

 

 

VA01, VA02, VA03 -> a new record is created whenever a user creates a new sales order.

SETUP TABLES: a setup table is implemented as a cluster table. A cluster table exists once at the database layer, but at the application layer it can be viewed as multiple tables related by primary and foreign keys.

In the setup table, the header, item and schedule line levels of a DataSource are stored together in one table in the database; at the application level they appear as different levels, i.e. different DataSources.

When the InfoPackage runs with full or init, the data comes from the setup tables; when it runs with delta, the data comes from RSA7 (the delta queue).

Before we can run delta we must run an init; before running the init we must fill the setup tables via the statistical setup, which fills the data from the database through the communication structures (a kind of interface).

While the statistical setup (SST) is running and filling the setup tables from the database, end users may enter new records; those records would be missed, causing a mismatch between R/3 and BW.

 

What do we do about this?

We coordinate with the end users: the init runs during off-hours (e.g. Saturday/Sunday, when the Basis team can take the system down for postings), and from Monday onwards the delta runs as usual.

Before running the SST we must clean (delete) the setup tables; then we run the SST again.

After the init: whenever an end user creates a new SO, the system also creates one LUW (Logical Unit of Work); if an existing record is modified, the LUW contains 2 records (the old and the new one).

X = before image

Blank ('') = after image

 

V2= Asynchronous Update (Default)

V1= Synchronous Update

           

            DS                                            SETUP TABLE

1) 2LIS_11_VAHDR                            MC11VA0HDR

To confirm whether the setup table has been cleaned or not, display the setup table in SE11.

T-code: OLI*BW (here OLI7BW)

 

1)      We can restrict the data by a document restriction.

2)      Control the setup:

Name of the run:

Termination date:

Termination time:

   o   Redetermine update group

   o   Update documents

   o   Block all orders (blocks order postings during the run)

   o   No. of tolerated faulty documents: we use 100

For sales: queued delta

For inventory: unserialized V3

Direct Delta:

The extracted data is transferred directly to the delta queue with the document posting. Thus the sequence of transfer of the documents agrees with the chronological order of data creation.

1.png

Observation:

LO cockpit setting: Update mode as “Direct Delta” for the application (11) in LBWE transaction.

The exercise starts after the successful completion of nth Delta.

2.png

 

When we run a full load (historical data), there are 2 ways:

 

 

                  Full                  Initialize with data transfer

  1)      Delete the setup tables (LBWG).
  2)      Run OLI*BW to fill the setup tables.
  3)      When delta runs -> the data is loaded into BW from the delta queue (RSA7).

Whenever a delta fails, we run a repeat delta (present records plus the previously delivered ones).

 

 


3.png

Direct delta is suitable when the posting frequency is low.

SMQ1: To delete the data in Queue

 

Unserialized V3( Only for new postings) [No changed Records]

 

5.png


 

V3 is an asynchronous update running in the background.

After running the init, delta is enabled. In delta, two data sets are maintained:

  1)      Delta: the current delta records.
  2)      Delta repeat: current plus previous delta records (the current set stays there until the next delta run).

Whenever a delta load fails, we do a repeat delta.

The repeat data contains the previous delta records plus the present delta records.

Steps: 1) Set the QM status to not OK (even if the request is already red, we still have to set the QM status to not OK).

2) Delete the bad request from the data target; once it is deleted, schedule the delta InfoPackage again.

LIS: limitations of LIS

  1) In LO we have separate DataSources for header, item and schedule line. That is not the case in LIS: in LIS we had a single DS to extract header, item and schedule line together, so a huge volume of data comes with LIS.
  2) In LIS we have information structures (IS); these are like setup tables. They act as transparent tables, and the delta records are stored both in the IS and in the delta queue (RSA7), so the IS stores them unnecessarily.
  3) Performance degrades (the sorting causes many problems).
  4) It cannot handle different languages at the same time.
  5) LUW1 - English - one segment - frequent data failures; LUW2 - German - one segment - frequent data failures (sequence + segment). Initially LO also only had the V3 job.

In inventory there is no need for serialization of the postings (goods in and out).

For this we use unserialized V3 in LO. Whenever we extract with unserialized V3, do not load into a DSO, because its overwrite logic requires the correct sequence.

When the transaction volume is low, we go for direct delta.

When the transaction volume is high and serialization is required, we go for queued delta.

When the transaction volume is high and serialization is not required, we go for unserialized V3.

How to generate the LO DS

Ex: 2LIS_11_VAHDR (sales order header)

Install the Business Content DataSource using RSA5 in R/3.

Before activating, search for the extract structure:

  1. SE11 -> data type search -> structure: it will not be available yet.
  2. LO_Extract_04.3 -> when we activate the transfer rules, the transfer structure is created in R/3.
  3. Flow: R/3 source system -> transfer structure -> PSA -> transfer rules -> InfoSource -> cube. Now we can start the extraction into the cube: create the InfoPackage -> run -> load.

Generating the DS (important)

How do we migrate historical data and enable the delta mechanism using direct delta?

Migration here means extracting the sales orders created by end users from R/3 to BW. R/3 is already live while BW is still being implemented.

All historical transactions are processed through the init, and whatever is new or changed since the last update comes through the delta.

  1. We have the transactions VA01, VA02, VA03: whenever a user creates an SO in VA01, a new record is created in the tables; VA02 changes an existing record.

All SO information is stored in VBAK and VBUK. The concept is: whatever historical data exists as of today must be loaded through the init; after that, delta is enabled.

 

 

 

Regards

 

Naresh


How to Restore the Query from 7.X Version to Older Version


After migrating a 3.x query to the 7.x version, we may decide we do not want the query in the higher version and want to restore it to the older version.

There is a way to restore it to the lower version: we can use the program COMPONENT_RESTORE.

Below are the steps to restore the query:

1. Go to transaction SE38.

2. Provide the program name COMPONENT_RESTORE and execute.

 

                    1.jpg

 

 

 

3. In the next screen provide the InfoProvider name, select component type "REP" and execute.

 

                    2.jpg

 

4. In the next screen select the query that we want to restore and continue.

 

                    3.jpg

 

    

5. In the next screen it will ask whether to restore to the 3.x version.

6. Click "Yes".

                         4.jpg

     

7. The query is now successfully restored to the older version.

 


lo-cockpit steps


LO Steps:

Go to LBWE and maintain the data source.

Make sure all users are locked in the system.

Go to LBWG and delete the setup tables.

Delete the delta queue from RSA7

Go to transaction code SBIW and start the filling of setup table.

Replicate the data source BI side.

After the setup tables have been filled on the R/3 side, you can fetch the data with an InfoPackage.


Update types:

Queued Delta
Direct Delta
Unserialized V3

Queued Delta: With the queued delta update mode, the extraction data (for the relevant application) is written to an extraction queue (instead of to the update data as in V3) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.

After activating this method, up to 10000 document deltas/changes are cumulated into one LUW per DataSource in the BW delta queues.

Direct Delta: With this update mode, each document posting is directly transferred into the BW delta queue. Each document posting with delta extraction leads to exactly one LUW in the respective BW delta queues.

Remember that 'LUW' stands for Logical Unit of Work; it can be considered an inseparable sequence of database operations that ends with a database commit (or a rollback if an error occurs).

Unserialized V3: With this update mode, which we can consider the serializer's brother, the extraction data continues to be written to the update tables using a V3 update module and is then read and processed by a collective update run (through LBWE).

But, as the name of this method suggests, unserialized V3 delta disowns the main characteristic of its brother: the data is read in the update collective run without taking the sequence into account and is then transferred to the BW delta queues.

Setup tables: Direct access to the application tables is not permitted, hence setup tables exist to collect the required data from the application tables. When a load fails, you can re-run the load and pull the data from the setup tables, because the data is still there. Setup tables are used to initialize delta loads and for full loads; they are part of the LO extraction scenario.

 

Regards,

Chary

Errors and Solutions


                                    Errors and Solutions

 

  1) InfoPackage CS_ORDER_ATTR_CREATION_DATE failed due to 797 duplicate records found; 235 records used in table /BI0/PCS_ORDER.

Sol: Activated InfoObject 0CS_ORDER, then pushed the data from the PSA.

  2) CO_OM_WBS_1_COMMITMENT_UCA1 data load failed due to record 7393: value 7107175.10 not contained in the master data table for 0WBS_ELEMT.

Sol: Pulled the required master data from R/3 and pushed the data from the PSA. It completed successfully.

  3) InfoPackage CS_ORDER_ATTR_CREATION_DATE failed due to 37 duplicate records found; 1023 records used in table /BI0/PCS_ORDER.

Sol: After activating the master data for 0CS_ORDER, the data was pushed from the PSA.

4)

Infopackage ZBWVBAKVBAP_ATTR_CREATION_DATE failed due to the attributes for characteristic ZS_DOC are locked by a change run.

 

Solution:

Pushed data from PSA then activated characteristic ZS_DOC.

 

5)

Infopackage:0MATERIAL_ATTR_DELTA failed due to the attributes for characteristic 0MATERIAL are locked by a change run.

 

Solution:

After getting released from the lock data was pushed from PSA

6)

Infopackage:8Z11VAIT2_DELTA_240404 failed due to record 1726 :No SID found for value '0070547593000010 ' of characteristic ZS_DOC.

 

Solution:

After loading the masterdata for ZS_DOC the process was repeated.

7)

Background job:MD_ACT_DLY_4:00_CSORD_PCTR_ZSDOC failed due to Lock NOT set for: Change run for hierarchies and attributes.

 

Solution:

Repeated the job and completed successfully

8)

0DOC_NUMBER failed in activation due to lock NOT set for: Change run for hierarchies and attributes.

 

Solution:

After getting released from the lock the process was repeated.

9)

8FIAR03_DELTA data load failed due to Record 3046 :Value '100027782x ' (hex. ' ') of characteristic 0AC_DOC_NO contains invalid character.

 

Solution:

Corrected the data in the PSA with '100027782X ' and pushed the data from PSA. It was completed successfully.

10)

CO_OM_WBS_1_COMMITMENT_DX9X_UCA1_COPY data load failed due to Record 7622 :Value 7106979.10 not contained in the master data table for 0WBS_ELEMT

 

Solution:

Loaded the required WBS elements from R/3, pushed the data from PSA. It was completed successfully.

11)

FUNCT_LOC_ATTR data load failed due to: Lock NOT set for: Loading master data attributes.

Solution:

After the lock was released, the data was pushed from the PSA; it completed successfully.

12)

ZBWEQUZ_HIERARCHY_CREATION_DATE data load failed due to 1 duplicate record found. 13613 recordings used in table /BI0/XEQUIPMENT

 

Solution:

Activated the 0Equipment master data and pushed the data from PSA.

13)

Infopackage BWSOITMD failed due to the attributes for characteristic ZS_ORDITM are locked by a change run

 

Solution:

Pushed data from PSA and then activated characteristic ZS_ORDITM.

14)

Infopackage FUNCT_LOC_ATTR failed due to the attributes for characteristic 0FUNCT_LOC are locked by a change run

 

Solution:

Pushed data from PSA then activated characteristic 0FUNCT_LOC.

15)

Infopackage:8Z12VCIT1_DELTA failed due to record 710 :No SID found for value '0111099356000010 ' of characteristic ZS_DOC.

 

Solution:

After loading the master data for ZS_DOC the process was repeated.

16)

CO_OM_WBS_1_COMMITMENT_UCA1 data load failed due to Record 8059 :Value 7107382.10 not contained in the master data table for 0WBS_ELEMT

 

Solution:

Loaded the 0WBS_ELEMT value 7107382.10 from R/3 to BW and pushed the data from the PSA. It completed successfully.

17)

0VENDOR_ATTR data load failed due to the attributes for characteristic 0VENDOR are locked by a change run.

 

Solution:

After the lock release pushed the data from PSA. It was completed successfully.

18)

FUNCT_LOC_ATTR data load failed due to the attributes for characteristic 0FUNCT_LOC are locked by a change run

 

Solution:

After the lock was released, the data was pushed from the PSA; it completed successfully.

19)

Infopackage BWSOITMD failed due to the attributes for characteristic ZS_ORDITM are locked by a change run.

 

Solution:

Pushed data from PSA then activated characteristic ZS_ORDITM.

20)

Infopackage FUNCT_LOC_ATTR failed due to the attributes for characteristic 0FUNCT_LOC are locked by a change run.

 

Solution:

Pushed data from PSA and then activated characteristic 0FUNCT_LOC.

21)

BWDX9XCOPS01_UCA1_COPY data load failed due to Record 15134 :Value 0001988050000011 not contained in the master data table for ZS_ORDITM.

 

Solution:

Uploaded the dependent master data from R/3 and pushed the data from the PSA. It completed successfully on 16th Sep '06.

22)

ZZBWCOVP_DX9X_CP_UCA1_COPY data load failed due to Record 3326 :Value 0007828308000020 not contained in the master data table for ZS_ORDITM.

 

Solution:

Uploaded the dependent master data from R/3 and pushed the data from the PSA. It completed successfully on 16th Sep '06.

23)

ZZBWCOVPD_DX9X_UCA1_COPY data load failed.

 

Solution:

Pushed the data from PSA, and it was completed successfully on 16th Sep '06.

24)

ZZBWCOVPD_DX9X_UCA1_COPY data load failed due to Job cancelled in MP1

 

Solution:

The load was re-triggered.

25)

8Z12VCIT1_DELTA data load failed due to Record 1578 :No SID found for value '0001990147000011 ' of characteristic ZS_DOC.

 

Solution:

Maintained the master data for ZS_DOC and pushed the data from the PSA; it completed successfully.

26)

Infopackage FUNCT_LOC_ATTR failed due to the attributes for characteristic 0FUNCT_LOC are locked by a change run.

 

Solution:

Pushed data from PSA and then activated characteristic 0FUNCT_LOC.

27)

Infopackage 8Z11VAIT2_DELTA_240404 failed due to the Record 3038 :No SID found for value '0070553024000010 ' of characteristic ZS_DOC.

 

Solution:

Created masterdata and then pushed transactional data from PSA.

28)

Infopackage 0SALESDEAL_ATTR failed due to the attributes for characteristic 0SALESDEAL are locked by a change run

 

Solution:

Pushed data from PSA and then activated characteristic 0SALESDEAL.

29)

Infopackage Backup InfoCube ZPSC04BU_PLAN_DX9X_UCA1_COPY failed due to the Error occurred in the data selection.

 

Solution:

Retriggered the InfoPackage "Backup InfoCube ZPSC04BU_PLAN_DX9X_UCA1_COPY"; it ran successfully.

 

30)

Infopackage:BWDX9XCOPS01_UCA1_COPY failed due to record 22190 :Value 0007824107000051 not contained in the master data table for ZS_ORDITM.

 

Solution:

After loading the masterdata for ZS_ORDITM,data was pushed from PSA.

31)

Infopackage 0EC_PCA_1_SE90_CVP failed due to job not triggered in MP1.

 

Solution:

The job was retriggered and the load was successful.

32)

Infopackages:ZBWQMEL_ATTR_CHANGE_DATE and ZBWOP22_VP failed due to job terminated in MP1 because of ABAP/4 processor: DBIF_SETG_SQL_ERROR.

 

Solution:

The job was retriggered and the load was successful.

33)

Infopackage ZBWVBAKVBAP_ATTR_CREATION_DATE failed due to the Job termination in source system.

 

Solution:

Retriggered the InfoPackage ZBWVBAKVBAP_ATTR_CREATION_DATE; it then ran successfully.

34)

Infopackage CS_ORDER_ATTR_CHANGE_DATE failed due to the 293 duplicate record found. 2784 recordings used in table /BI0/XCS_ORDER.

 

Solution:

Activated table /BI0/XCS_ORDER then pushed data from psa.

35)

Infopackage 0EQUIPMENT_ATTR_CREATION_TODAY failed due to The attributes for characteristic 0EQUIPMENT are locked by a change run

 

Solution:

Pushed data from PSA then activated characteristic 0EQUIPMENT.

36)


Infopackage ZBWEQUZ_HIERARCHY_CREATION_DATE failed due to the 1 duplicate record found. 17266 recordings used in table /BI0/XEQUIPMENT

 

Solution:

Activated infoobject 0EQUIPMENT then pushed data from PSA.

Data load issue while loading data from PSA to DSO.


Hi,

 

I faced the data load issue while loading data from PSA to DSO.

 

While loading data from the PSA to the DSO, the activation of the DSO showed the status "job cancelled".

 

The data load failed due to invalid characters in the PSA data.

 

The error was shown for invalid characters in the 0MATERIAL values '151X1207CK10SP01 ', '151X1207CK10SP02 ', '151X1207CK10SP03 '

and in the Incoterms2 value KÖSCHING.

 

In the 0MATERIAL values above, a stray space appears next to the last digit 1, and for the InfoObject Incoterms2 the character Ö is the invalid character.

 

To solve this issue we have to delete the request from the DSO and edit the PSA data: find the invalid characters, i.e. for the above scenario remove the space from the 0MATERIAL values so they read '151X1207CK10SP01', '151X1207CK10SP02', '151X1207CK10SP03', change the Incoterms2 value to KOSCHING, and save.

 

Now execute the DTP once again and activate the DSO.

 

 

 

Regards,

Koti.

Setting up BW Near-Line Storage with SAP Sybase IQ


It feels strange for me to write a post in the BW section, as I feel horribly unqualified to do so.

 

Below is a Video that Ethan Jewett and I co-created as part of the DSLayered video podcast series.

 

What we did was use the new and now native NLS functionality in BW 7.3 SP9 to archive data off into SAP Sybase IQ. This is something that I have been talking about for years (see this blog post dated 12 months ago) as I have had exposure to the PBS NLS add-on from my Sybase background.

 

You will be glad to know that the results were simply awesome, and I feel that implementing NLS now as part of your BW strategy could benefit customers in a few ways, namely:

  1. Get the immediate performance benefit of using SAP Sybase IQ (** spoiler alert ** SAP Sybase IQ was roughly 300% faster off the DSO level compared to BW running off an InfoCube).
  2. Shrink the size of the database underlying your BW environment. This has many benefits:
    1. A smaller environment to query on = better performance
    2. A smaller environment to back up
    3. Etc.
  3. Get customers ready for HANA, knowing that you have a seamless and slick archiving solution in place with no performance trade-offs. This also means that you do not have to panic about your HANA environment running away from you.

I would love to hear everyone's thoughts/opinions on this, and I hope I got all the BW terminology correct.

 

Get the popcorn ready and enjoy the video

 

Activating Transfer Rule/Structure against Time stamp Error


Hi Experts,

 

This is well known to everyone who works on production/support projects: we face a time stamp error at InfoPackage level when DataSources change or when transfer rules become internally inactive in the system. There is an ABAP program we can run to overcome this issue: RS_TRANSTRU_ACTIVATE_ALL, which automatically reactivates the transfer rules. When we execute this report on the BW side, the selection screen asks for the source system and the InfoSource/DataSource, along with two radio buttons for transfer rules with lock status or with inactive status; generally we choose the inactive status option. Once it finishes, just repeat the InfoPackage and the issue does not recur.
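
If you want to run the same correction from your own program rather than via SE38, a minimal sketch is simply to submit the report and fill its selection screen manually; the selection-screen parameter names are deliberately not hard-coded here, as they can differ between releases.

* Run the standard activation report and return to the calling program.
SUBMIT rs_transtru_activate_all
  VIA SELECTION-SCREEN
  AND RETURN.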

 

Thanks,

Karan Lokesh.

How to overcome R/3 Extraction Job Failure


Hi Experts,

 

We come across R/3 extraction job failures mainly in process chains that are triggered by events created in R/3. In some scenarios the extraction job completes, but the extract job status is "failed" and the status is not reported back correctly to BW. In this situation we manually change the status of the extract program process in BW using the function module ZRSPC_ABAP_FINISH (a custom FM in our system). We execute this FM with the correct program process variant and with status "Finished" if the extraction job finished on the R/3 side, or with status "Red" if the extraction job failed on R/3 for some reason, and then re-trigger the entire process chain.

 

Thanks,

Karan Lokesh.


Compression in Brief


Compression:

--> Every cube has two fact tables, /F and /E, with the same structure.

When the characteristic values (and therefore the dimension SIDs) are the same - for example, customer C100 with amount 50 in one load and C100 with amount 60 in a later load - the resulting fact rows differ only by their request ID.

If we run a query without compressing, the database returns both records, the OLAP processor aggregates them and returns only one row; the aggregation ratio is 2/1 = 2.

If we compress the cube before running the query, the request ID information is deleted (the request ID is set to zero) and the data moves from the F table to the E table.

The report then reads from both tables.

--> Go to the cube -> Manage -> Collapse tab, select the latest request ID and click Release; in the Requests tab a green mark then appears in the compression column.

After compression we can no longer delete the request.

If you select "with zero elimination", rows whose key figures are all zero are deleted during compression.

--> If there are errors, use "request reverse posting": all key figures of the request are multiplied by (-1) and posted again.

Data in a cube can be deleted in 3 ways:

1. Selective deletion

2. Deletion by request

3. Delete data (complete deletion)

Go to the monitor; there is a small icon with a red symbol for "request reverse posting". Select the update rules, choose Immediately, confirm with Yes and refresh.

Request reverse posting is only possible while the data is still in the PSA.

If there is no data in the PSA, the correction is more complex: selective deletion followed by a full load.

In real-time scenarios the cube is typically compressed only for requests older than 7 days: check the "Calculate request ID" option and set "Only compress those requests that are older than xxx days" to 7.

Loading from a cube (or ODS) into another cube supports delta based on the request number.

If the source cube has been compressed, delta is no longer possible because there is no request number.

Therefore only compress after the data has been loaded to the next-level target.

 

Hope this info is useful.

 

 

Regards,

Prasad.



Compression in Brief


COMPRESSION :

 

- Every cube has two fact tables, /F and /E, with the same structure.

  If we compress the cube before running the query, the request ID information is deleted (set to zero) and the data moves from the F table to the E table. Reporting then reads from both tables.

 

Steps for Compression :

 

Go to the cube --> Manage --> Collapse tab,
select the latest request ID and click Release; in the Requests tab a green mark appears for compression. After compression we can no longer delete the request.


* When we create a cube, 2 fact tables F and E are created; the structure of the E and F tables is the same.
* Whenever we load data into the cube, the data is loaded into the F table.
* Whenever we compress, the request ID information is deleted and the records are aggregated based on the characteristic value combination; the result is written into the E table.
* After compression the data has moved from the F table to the E table.

 

F table :

 

Figure1.jpg

 

 

 

E Table :

 

 

f2.jpg

 

 

* When reporting on the InfoCube, the data comes from both the F table and the E table.
* Once compression is done in the InfoCube, we can no longer delete data based on the request number.
* When we specify a request to compress, the specified request and all requests below it get compressed.
* When you do the compression in the Collapse tab there is a checkbox "zero elimination": if you select it, records whose aggregated key figure values are all zero are deleted (see the example below).
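
A small worked example (values assumed for illustration, not taken from this post): if the F table holds two rows for customer C100 - amount 50 from request 4711 and amount 60 from request 4712 - compression collapses them into one E-table row for C100 with amount 110 and request ID 0. With zero elimination switched on, a row whose aggregated key figures all end up as 0 is removed completely.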

 

 

Regards,

Prasad.

Correct values in ECC, Incorrect decimal place in PSA - Bill Of Material Item extraction


Issue Overview

 

Bill Of Material is part of the master data of production planning, specifying details for a product and its components.

There is no standard datasource for Bill Of Material extraction. Instead, there is a very good document by Dhanya Anthikatepambil that has the code and a
suggestion of a BW model.

 

The following post is regarding an issue encountered when trying to extract Bill Of Material Item information from the ECC, where some of the fields show correct values in the ECC system (RSA3) and incorrect decimal place in the PSA in the BW system.

This issue relates to use of the standard function CSAP_MAT_BOM_ITEM_SELECT under certain conditions, but the solution may be relevant for other similar cases.

 

In short, the standard function populates certain fields using the WRITE function, which caused a problem with the formatting of the data in our system. Instead we've enhanced the return structure of the function with new fields and populated them with direct assignment.

 

The coding was done byYulia Kramskoy.

 

Issue And Solution In Detail

 

For our purposes, we've used the code for the function YCPS_BOM_ITM detailed in the document. CSAP_MAT_BOM_ITEM_SELECT is called within that function.

 

The field "comp_qty" was showing incorrect values in the PSA - some of the results were multiplied by 1000, some were not - while the values in RSA3 were correct. Note that this was after replacing "," with "." in the code from the aforementioned document; removing that replacement, we got results with a comma where the decimal point should be. In either case, the result in the PSA was incorrect in our system.

 

The field "comp_unit" was showing correct values in the PSA, but the values caused a run-time error at the BEx level (and were also displayed incorrectly, as asterisks **, when running CSAP_MAT_BOM_ITEM_SELECT via SE37 in the source system).

 

In an answer from SAP, it was suggested that we examine the decimal notation setting of ALEREMOTE. Creating an additional user for extraction with different settings was considered.

 

Instead we found an ABAP solution for this. Debugging into CSAP_MAT_BOM_ITEM_SELECT, we saw that the values of those fields were transferred using WRITE into the output structure in program LCSAPFC1, form convert_stpo_to_extern_format .

 

The solution was to add new fields in an append for the output structure STPO_API02 , and populate them with assignment in the enhancement in that form, without using the WRITE function. See code below.

 

 

Fix for incorrect quantity values - Enhancement in LCSAPFC1

 

Following is the part in the form "convert_stpo_to_extern_format" in LCSAPFC1 where the original fields are populated:

 



  
write i_stpo-menge      to o_stpo-comp_qty     unit i_stpo-meins.
write i_stpo-meins      to o_stpo-comp_unit.
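
To see why this WRITE-based population is sensitive to user settings, here is a small standalone illustration (the type, values and unit are chosen for the example only, they are not from the standard function): WRITE applies external formatting, including the decimal notation of the executing user (ALEREMOTE during extraction), whereas a plain assignment keeps the internal value untouched.

DATA: lv_qty          TYPE menge_d VALUE '1234.567',
      lv_qty_char(20) TYPE c,
      lv_qty_raw      TYPE menge_d.

* External formatting: the result depends on the user's decimal notation,
* e.g. '1.234,567' for a user with European settings.
WRITE lv_qty TO lv_qty_char UNIT 'ST'.

* Direct assignment: the internal value 1234.567 is kept as-is.
lv_qty_raw = lv_qty.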

 

 

And here is the code for populating the new fields, just before the endform statement:

 

 

 

*$*$-Start: (1)---------------------------------------------------------------------------------$*$*
ENHANCEMENT 1 YBW_PP_BOMITM.    "active version
*** added by Yulia Kramskoy start
o_stpo-yy_menge = i_stpo-menge.
o_stpo-yy_meins = i_stpo-meins.
*** added by Yulia Kramskoy end
ENDENHANCEMENT.
*$*$-End:   (1)---------------------------------------------------------------------------------$*$*
endform.

BEx Report Dump Error - could not be converted CL_RSDM_READ_MASTER_DATA->_SIDVAL_DIRECT_READ2


Hey All,

Have you ever come across this error message and wondered what to do? I certainly did.

SID ' '  for characteristic ' '  could not be converted CL_RSDM_READ_MASTER_DATA->_SIDVAL_DIRECT_READ2

 

Here is a program that checks for inconsistencies (junk or erroneous values) in the master data tables and solves this error.

 

RSDMD_CHECKPRG_ALL

Go to SE38 and run this program.

Enter the InfoObject name, check the Repair option and execute.

You will see all logs turn green, and the error values are highlighted.

Now run the report; it will work fine.
You can also check this in RSRV. Ensure the master data update is activated and that change runs (ACR) and aggregates are maintained.

All your comments are most welcome and considered valuable. Correct me if I'm wrong.
