Wednesday, October 6, 2010

The Tech details of Standard ODS / DSO in SAP DWH

"An Operational Data Store object (ODS object) is used to store consolidated and cleansed data (transaction data or master data for example) on a document level (atomic level)" - Refered from SAP Docs.It describes a consolidated dataset from one or more Info Sources / transformations (7.0) as illustrated below in Fig.1.
In this blog we will look at the Standard DataStore Object. There are other types as well, namely the DataStore Object with Direct Update (Transactional ODS in 3.x) and the Write-Optimized DataStore Object, new with BI 7.x, which contains only an active data table and is used, for instance, to manage huge data loads - here is the link from the Help portal: Write optimised DSO





Architecture of Standard ODS /DSO (7.x)
"ODS Objects consist of three tables as shown in the Architecture graphic below" - Refered from SAP Docs:

Figure 1: ODS Architecture - Extracted from SAP Docs
TIP: The new data status is written to the active data table in parallel with the change log, taking advantage of parallel processes, which can be customized globally or at the object level in the system.
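As a rough mental model (an illustrative Python sketch with made-up names, not SAP code), a standard DSO can be pictured as these three tables:

    # Illustrative sketch only: a standard DSO modelled as its three tables.
    class StandardDSO:
        def __init__(self, key_fields):
            self.key_fields = key_fields    # semantic key, e.g. ("MATERIAL", "PLANT")
            self.activation_queue = []      # new data written by each load request
            self.active_data = {}           # semantic key tuple -> current record
            self.change_log = []            # delta images keyed by request / packet / record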


Let's go through a scenario
In this example we will take the master data object material and plant (0MAT_PLANT compounded with 0PLANT) with a few attributes for demonstration purposes. Now define an ODS / DSO as below, where material and plant form the key and the corresponding attributes are the data fields.

Figure 2: ODS / DSO definition


Let's create a flat file data source (or an InfoSource with 3.x) in this example, to keep the scenario simple, with all the InfoObjects we have defined in the ODS structure.
Figure 3: InfoSource definition



Let's check the flat file records. Remember that the key fields are plant and material, and we have a duplicate record, as shown in Fig. 4 below. The 'Unique Data Records' option is unchecked, which means duplicate records can be expected.
Figure 4: Flat file records


Check the monitor entries and we see that 3 records are transferred to the update rules, while only two records are loaded into the NEWDATA table, as we haven't activated the request yet. This is because we have a duplicate record for the key in the ODS, which gets overwritten (check the first two records in Fig. 4).
Figure 5: Monitor entries


Now check the data in the NEWDATA / activation queue table: we have only two records, because duplicates for the same key are overwritten with the most recent record - the two PSA records sharing the same material and plant key collapse into one.
Figure 6: Activation queue
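Conceptually, the collapse of the duplicate key on the way to the activation queue can be sketched as below (plain Python with illustrative field names and values, not the exact rows from the screenshots):

    # Illustrative: collapse incoming records onto the semantic key - the last record wins.
    def fill_activation_queue(psa_records, key_fields=("MATERIAL", "PLANT")):
        queue = {}
        for rec in psa_records:                      # records are processed in load order
            key = tuple(rec[f] for f in key_fields)
            queue[key] = rec                         # a later record with the same key overwrites the earlier one
        return list(queue.values())

    psa = [
        {"MATERIAL": "1", "PLANT": "1", "PROFIT_CTR": "FIRST",  "PROC_TIME": 1},
        {"MATERIAL": "1", "PLANT": "1", "PROFIT_CTR": "SECOND", "PROC_TIME": 1},  # duplicate key
        {"MATERIAL": "2", "PLANT": "1", "PROFIT_CTR": "SECOND", "PROC_TIME": 1},
    ]
    print(fill_activation_queue(psa))  # only two records survive, as in Fig. 6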



Figure 7: PSA data for comparison
Tip: Key figures have the overwrite option by default; the summation option is also available to suit certain scenarios, while characteristics are always overwritten. The technical name of the new data / activation queue table is /BIC/A<name of ODS>40 for customer objects and /BI0/A<name of ODS>40 for SAP objects.
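The difference between the two key figure update options can be sketched like this (illustrative Python, not SAP code):

    # Illustrative: how a key figure in the ODS is updated depending on the option chosen.
    def update_key_figure(existing_value, incoming_value, option="overwrite"):
        if option == "overwrite":      # default: the incoming value replaces the stored one
            return incoming_value
        if option == "summation":      # additive update for scenarios that need it
            return existing_value + incoming_value
        raise ValueError("unknown update option")

    print(update_key_figure(1, 2, "overwrite"))  # -> 2
    print(update_key_figure(1, 2, "summation"))  # -> 3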


Once we activate the data we will have two records in the ODS active data table. As we see below, the active data table always contains the semantic key (material, plant).

Figure 8: Active data table
TIP: The name of the active table is /BIC/A<ODS name>00 for customer objects and /BI0/A<ODS name>00 for SAP objects.


The change log table has these 2 entries with the new image (N). Remember the record mode - we will look into it later. The technical key (request ID, data packet ID, record number) is part of the change log.
Figure 9: Change log table
TIP: The technical name of the change log table is always /BIC/B<internally generated number>.
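In contrast to the active table, the change log is keyed technically rather than semantically. A sketch of the two key styles (illustrative Python, field names simplified):

    # Illustrative: the active table is keyed by the semantic key,
    # the change log by the technical key of the load request.
    active_data = {
        ("1", "1"): {"PROFIT_CTR": "SECOND", "PROC_TIME": 1},  # key = (MATERIAL, PLANT)
    }
    change_log_entry = {
        "REQUEST":    "REQU_...",  # request ID
        "DATAPAKID":  1,           # data packet number
        "RECORD":     1,           # record number within the packet
        "RECORDMODE": "N",         # new image from the first load
        "MATERIAL":   "1",
        "PLANT":      "1",
        "PROFIT_CTR": "SECOND",
        "PROC_TIME":  1,
    }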


Now we will add two new records (material 75 / plant 1 and material 80 / plant 1) and change the existing record for the key material 1 / plant 1, as below.
Figure 10: Add more records


When we look at the monitor, there are 3 records in the activation queue table, as the duplicate records get filtered out - in this example the first record in Fig. 10.
Figure 11: Monitor


Look at the new data table (activation queue) and we have the 3 records that were updated, as seen in the monitor.
Figure 12: Activation queue


How does the change log work?
We will check the change log table to see how the deltas are handled. The highlighted records are from the first request, which is uniquely identified by the technical key (request number, data packet number, partition value of the PSA and data record number).
Figure 13: Change log table 1
With the second load, i.e. the second request, the change log table records the before and after images for the relevant records (the non-highlighted part of Fig. 13).

In the above example, material (1) and plant (1) has the before image with record mode 'X' (row 3 in the above figure), and all the key figures carry the '-' sign because we have opted for the overwrite option, while the characteristics are always overwritten.
Figure 14: Change log table 2


The after image ' ' reflects the change in the data record (check row 4 in the above figure). We have changed the characteristic Profit Center to SE from SECOND, and the key figure Processing Time has changed from 1 to 2. A new record (last row in the above figure) is added with the status 'N' as it's a new record.
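Putting it together, the activation step can be sketched roughly as below (plain Python with illustrative values, not SAP code): for a key that already exists in the active table it writes a before image 'X' with negated key figures and an after image ' ' with the new values; for an unseen key it writes a new image 'N'.

    # Illustrative sketch of activation: compare the activation queue against the
    # active table and derive change log images (overwrite option for key figures).
    def activate(active_data, activation_queue, key_fields, key_figures):
        change_log = []
        for rec in activation_queue:
            key = tuple(rec[f] for f in key_fields)
            old = active_data.get(key)
            if old is None:
                change_log.append({**rec, "RECORDMODE": "N"})   # new image
            else:
                before = {**old, "RECORDMODE": "X"}             # before image
                for kf in key_figures:
                    before[kf] = -old[kf]                       # key figures get the '-' sign
                after = {**rec, "RECORDMODE": " "}              # after image with the new values
                change_log.extend([before, after])
            active_data[key] = rec                              # active table now holds the new status
        return change_log

    active = {("1", "1"): {"MATERIAL": "1", "PLANT": "1", "PROFIT_CTR": "SECOND", "PROC_TIME": 1}}
    queue = [
        {"MATERIAL": "1",  "PLANT": "1", "PROFIT_CTR": "SE", "PROC_TIME": 2},  # changed record
        {"MATERIAL": "75", "PLANT": "1", "PROFIT_CTR": "SE", "PROC_TIME": 1},  # new record
    ]
    for row in activate(active, queue, ("MATERIAL", "PLANT"), ("PROC_TIME",)):
        print(row)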
Summary
This gives us an overview of the standard ODS object and how the change log works. The various record modes available are shown below:
Figure 15: Record modes



Check SAP Note 399739 for the details of the record mode. The record mode(s) that a particular DataSource uses for its delta mechanism largely depend on the type of the extractor. Check the table RODELTM for the BW delta process methods with the available record modes, as well as our well-known table ROOSOURCE for the extractor-specific delta method.


For instance, LO Cockpit extractors use the 'ABR' delta method, which supplies after images, before images, new images and reverse images. Extractors in HR and Activity-Based Costing use the delta method 'ADD', i.e. record mode 'A', and FI-GL, AR and AP extractors are based on the delta method 'AIE', i.e. record mode space ' ' (after image). The list goes on...
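For quick reference, here is a condensed summary in Python of the common record modes and the delta methods mentioned above (see the note and tables referenced for the authoritative list):

    # Common 0RECORDMODE values (condensed; see SAP Note 399739 for details).
    RECORD_MODES = {
        " ": "after image - the status after the change",
        "X": "before image - the status before the change, key figures negated",
        "N": "new image - a newly created record",
        "A": "additive image - the difference to be added up",
        "D": "delete - the record is to be deleted",
        "R": "reverse image - cancels an existing record",
    }

    # A few delta methods and the images they supply (see RODELTM / ROOSOURCE).
    DELTA_METHODS = {
        "ABR": ["N", "X", " ", "R"],  # e.g. LO Cockpit extractors
        "ADD": ["A"],                 # e.g. HR, Activity-Based Costing
        "AIE": [" "],                 # e.g. FI-GL / AR / AP (after image only)
    }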
