Wednesday, September 29, 2010

BI New Features

Data Modeling
- Involves modeling / re-modeling (*) data warehouse layers

Data Acquisition: Accessing and integrating information from heterogeneous data sources
- Connectivity to virtually all data sources (enhanced DataSource concept) (*)

Transformation
- Flexible transformation capabilities (enhanced *)

Data Distribution
- Open Hub Service

Metadata & Document Management
- Metadata integration from and to other SAP BI data marts
- Generation of documentation
- Sophisticated search functionality in conjunction with KM search technology

Data Flow Control
- Defining the Data Transfer Process (*)
- Setting up process chains (enhanced *)
- Real-time data acquisition (*)

Administration and Monitoring
- Administration Cockpit (*)
- Data Quality

Performance Optimization
- BI Accelerator: speeding up BI query performance by orders of magnitude (*)
- Aggregate definition and maintenance
- Caching
- DB tool support (statistics, indices, etc.)

Information Lifecycle Management
- Near-line storage (*) and archiving

User Management
- Standard authorizations for warehouse modeling and administration
- Analysis authorizations (for end users) (revised *)

* Indicates new features in SAP BI 7.0

BI Introduction

One of the major challenges Business Intelligence (BI) customers face today is the integration of different source systems in one Enterprise Data Warehouse landscape. The IT Scenario Enterprise Data Warehousing (EDW) of SAP NetWeaver 2004s enables the BI customer to create and operate a data warehouse in an enterprise-wide, heterogeneous environment. The scenario is structured into two scenario variants, which cover the design-time (modeling / implementation) and runtime aspects of a highly flexible, reliable, robust and scalable BI solution. EDW covers all the steps a system administrator requires to set up such a BI solution and administer it easily.


SAP NetWeaver is deployed along IT practices, according to specific business objectives. One of those practices is Business Information Management. To manage their business information needs, organizations typically require three things:
1. Enterprise data warehousing
2. Enterprise reporting, query and analysis
3. A framework for business planning

Enterprise Data Warehousing (EDW):
- Enables customers to create and operate a data warehouse in an enterprise-wide environment
- Combines strategic analyses and operational reporting, enabling the business to report in real time
- Integrates heterogeneous systems
- Facilitates design time as well as runtime of BI models and processes
- Covers all the steps an administrator requires to set up a highly flexible, robust and scalable BI solution and administer it easily

Dump Error While Filling the Setup Table

I am trying to fill the setup table. While filling the setup table for the 2LIS_03_BF DataSource, I got a dump error after some time. I checked the data in the setup table and found 9,000 records.

Shall I proceed further by pulling the data from R/3 to BW? I have checked the data in RSA3 and got some data.

Shall I schedule the data from the BW side?

I have checked transaction SM37 and found that no jobs are available. I have checked transaction NPRT and found some logs; they do not show any errors.

I have checked SM21 and found some errors:

Time       Text
17:08:10   Transaction Canceled M2 630 ( )
17:23:47   Run-time error "TIME_OUT" occurred
17:23:47   > Short dump "090302 172347 sand1_W2 2_00 " generated
17:30:08   Status: 80% of IL00.DAT is in use
17:30:17   Status: 90% of IL00.DAT is in use
17:30:18   Overflow of Paging File (032768 Blocks)
17:30:18   Run-time error "MEMORY_NO_MORE_PAGING" occurred
17:30:19   > Short dump "090302 173018 sand1_W2 2_00 " generated

Shall I execute the setup table fill again? Do I need to set the termination time further out (a 2-hour difference)?

Are there any issues with refilling the setup table?


Hi,
Do it like this:
1. Take downtime in ECC and lock all users.
2. Load the setup tables from 01.03.2009 to 02.03.2009, then load to BW with an init.
3. Compress the request.
4. Repeat step 2 for BX and UM as well.
5. Compress the requests.
6. Then delete the setup tables.
7. Then give selections for the historical data, fill the setup tables and load full loads to BW; after completing all the full loads, set up the delta.

Note: Don't forget to compress the requests in the cube. Before starting this process, delete the queues in SMQ1 and RSA7. If it is production, please be careful.
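
If you want to see how far a setup run got before it dumped, a quick record count on the setup table helps. A minimal sketch, assuming the setup table for 2LIS_03_BF in your release is MC03BF0SETUP (check the exact name in SE11 first):

REPORT z_check_setup_count.

* Count the records written to the 2LIS_03_BF setup table so far.
* Assumption: the setup table is MC03BF0SETUP; verify the name in SE11.
DATA lv_count TYPE i.

SELECT COUNT( * ) FROM mc03bf0setup INTO lv_count.

WRITE: / 'Records in MC03BF0SETUP:', lv_count.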

Generic Extraction with Function Module

Generic Datasource creation using function module
P Renjith Kumar, SAP Labs
Posted on Oct. 07, 2009 in ABAP, Business Intelligence (BI)

If the Business Content DataSources do not address the customer's requirement, we need to create a generic DataSource. Complex requirements have to be implemented in ABAP code, so we create a function module to implement that logic. Creating a generic DataSource based on a function module is more complex than the other two methods (view, InfoSet), so this blog explains the concept.

Follow the procedure given below to create a generic DataSource based on a function module.

Create a structure with the necessary fields. Currency and quantity fields must also have reference fields. The structure fields are shown below.

[image]

NETWR is a currency field, so it must have a reference field from the base table. Click Currency/Quantity Fields and enter that field name.

[image]

In our case we can copy the standard function module. Information about it:

Package name: RSAX
Function module name: RSAX_BIW_GET_DATA_SAMPLE

Copy it to your Z function module name and save.

[image]

Open your copied function module in SE37. There is no change on the Import tab, but give the parameter name with the relevant associated type on the Tables tab.

[image]

On the Exceptions tab there is nothing to add. On the Source code tab, write the ABAP code that extracts the data for the generic DataSource.

To learn more about the ABAP code, read the Wiki page where I have provided the complete code of the function module. Check the generic fields and the code logic implemented for our DataSource.
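
For reference, the code in such a function module usually follows the pattern of the copied template: on the initialization call it stores the request parameters, on the first data call it opens a database cursor, and on every further call it returns one data package until the cursor is exhausted and NO_MORE_DATA is raised. Below is a minimal sketch of that pattern; the extract structure ZBW_VBAK_STR, the source table VBAK and the selection field VBELN are only placeholder examples, and the complete template code is in the Wiki page mentioned above.

FUNCTION zbw_get_data_vbak.
* Interface taken over from the copied template (RSAX_BIW_GET_DATA_SAMPLE):
*   IMPORTING  i_requnr, i_dsource, i_maxsize, i_initflag, i_read_only
*   TABLES     i_t_select, i_t_fields, e_t_data STRUCTURE zbw_vbak_str
*   EXCEPTIONS no_more_data, error_passed_to_mess_handler

  TABLES: vbak.

  DATA:    l_s_select TYPE srsc_s_select.

  STATICS: s_s_if              TYPE srsc_s_if_simple,
           s_counter_datapakid LIKE sy-tabix,
           s_cursor            TYPE cursor.

  RANGES:  l_r_vbeln FOR vbak-vbeln.

  IF i_initflag = 'X'.                 "initialization call (SBIWA_C_FLAG_ON)
*   Remember the request parameters for the later data calls.
    s_s_if-requnr  = i_requnr.
    s_s_if-dsource = i_dsource.
    s_s_if-maxsize = i_maxsize.
    APPEND LINES OF i_t_select TO s_s_if-t_select.
    APPEND LINES OF i_t_fields TO s_s_if-t_fields.
  ELSE.
*   First data call: turn the BW selections into ranges and open the cursor.
    IF s_counter_datapakid = 0.
      LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'VBELN'.
        MOVE-CORRESPONDING l_s_select TO l_r_vbeln.
        APPEND l_r_vbeln.
      ENDLOOP.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT vbeln erdat auart netwr waerk
          FROM vbak
          WHERE vbeln IN l_r_vbeln.
    ENDIF.
*   Every data call: return at most I_MAXSIZE rows per package and
*   signal the end of data with NO_MORE_DATA.
    FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE s_s_if-maxsize.
    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter_datapakid = s_counter_datapakid + 1.
  ENDIF.

ENDFUNCTION.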


Save and activate the function module.

Creating Generic Datasource

Using transaction RSO2 (or via SBIW), create a generic DataSource based on the function module. It looks like this:

[image]

Save the DataSource, maintain the selection fields, and remove the check mark in the Hide field in RSO2.

Testing the extraction

In RSA3, enter the name of the DataSource you created and test the extraction.

[image]

You can see that 16,102 records are selected.


Steps to be done on BI side


1. Create the InfoObjects.
2. Create the DataSource.
3. Create the InfoProviders.
4. Create the transformation between the DataSource and the InfoProvider.
5. Create an InfoPackage and load the data up to the PSA.
6. Create a DTP from the DataSource to the InfoProvider and execute it.

Now you can see the extracted data in the infoprovider.

If you want to implement delta extraction functionality, you need to write the code for that separately.
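
One common way to do this (my assumption, not stated in the post) is to define a generic delta in RSO2 on a suitable field of the extract structure, for example a timestamp. BW then passes the delta interval in I_T_SELECT, so the function module only has to honour one more range. Continuing the sketch above with a hypothetical ZTIMESTAMP field:

* Additional range for the delta-relevant field (hypothetical ZTIMESTAMP).
  RANGES: l_r_tstmp FOR zbw_vbak_str-ztimestamp.

  LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'ZTIMESTAMP'.
    MOVE-CORRESPONDING l_s_select TO l_r_tstmp.
    APPEND l_r_tstmp.
  ENDLOOP.

* ...and extend the OPEN CURSOR SELECT with: AND ztimestamp IN l_r_tstmp.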

I hope this helps you to create a DataSource based on a function module.

BI Master Data Partitioning

Monika Birdi, Infosys Technologies Ltd
Posted on May. 28, 2010 in Business Intelligence (BI)
Master data can easily reach 40 million records or more. One of our utility clients has around 6 million master data records, and this master data has 90 attributes, which makes the table really large. We faced an issue while activating the master data: the load failed every time during activation. We tried activating the master data in the background as well, with no success. Partitioning the master data table (the P table in our case) then helped, and we could activate the data. The partitioning was done with the DB6CONV program by the Basis team. DB6CONV is not an ABAP program; it runs outside the application server.

For more details on the DB6CONV program, please refer to SAP Note 362325 - DB6: Table conversion using DB6CONV.

A few days later we received a request to add three more attributes. The changes were made; however, the transport to the Quality system failed with the error below. Here ZTDB is the InfoObject with the transport issue.

[image]

Since we had partitioned the P table of the ZTDB InfoObject, the new attributes needed a place in the partitioned table, and the transport failed because this is not handled automatically.
Approach:

The issue was finally handled by creating a new data class and assigning it to the InfoObject. Remove the partitioning before assigning this data class.

To create the data class, go to DB02 -> Configuration -> Data Classes and click the Add button on the right side of the screen. Give the new class a technical name (say ZMDPT) and a description. The "Data Tablespace" will be FACTD prefixed by your system name, e.g. D#FACTD in the example below. The "Index Tablespace" is populated automatically.

[image]

Assign the data class to the InfoObject (Extras -> Maintain DB Parameters).

[image]

[image]

After assigning the data class, partition the P table.

Now the InfoObject has its own data class, which is linked to the tablespace provided during the conversion, which in turn is attached to the partition nodes. The assignment of new attributes is then taken care of automatically during transport.
We have been using this approach for the last 6 months with no issues so far, with multiple changes to the object in terms of adding attributes, turning on navigational attributes, etc. I will keep you posted if I find anything.
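
As a quick sanity check after such a change, you can verify which data class the P table ends up with. A minimal sketch, assuming the InfoObject is ZTDB (so its P table is /BIC/PZTDB) and the new data class is ZMDPT:

REPORT z_check_p_table_dataclass.

* Read the data class (TABART) of the active P table from the technical
* settings (DD09L). /BIC/PZTDB and ZMDPT are the example names used in
* this post; replace them with your own objects.
DATA lv_tabart TYPE dd09l-tabart.

SELECT SINGLE tabart FROM dd09l
  INTO lv_tabart
  WHERE tabname  = '/BIC/PZTDB'
    AND as4local = 'A'.

IF lv_tabart = 'ZMDPT'.
  WRITE: / 'P table /BIC/PZTDB uses the new data class ZMDPT.'.
ELSE.
  WRITE: / 'P table /BIC/PZTDB data class:', lv_tabart.
ENDIF.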


Thursday, September 16, 2010

User Exits

User Exit Code for Transactional Datasources to populate appended Custom fields
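
* Placement note (assumption, verify in your system): a CASE I_DATASOURCE
* block of this kind conventionally lives in the customer include ZXRSAU01
* of enhancement RSAP0001 (component EXIT_SAPLRSAP_001, transaction data),
* which supplies I_DATASOURCE and the data package C_T_DATA used below.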

* Table Declaration

TABLES: bkpf, bseg, psguid, coep, cobk, payr.

* TYPES Declaration


TYPES:
  BEGIN OF t_seqnr,
    pernr TYPE pa0001-pernr,  "PERNR
    seqnr TYPE pc261-seqnr,   "PR sequence number
  END OF t_seqnr,

  t_it_seqnr TYPE STANDARD TABLE OF t_seqnr,

  BEGIN OF t_taxau,
    pernr TYPE pa0001-pernr,  "PERNR
    seqnr TYPE pc261-seqnr,   "PR sequence number
    cntr1 TYPE cntrn,         "Tax auth code
    taxau TYPE taxat,         "Tax auth text
  END OF t_taxau,

  t_it_taxau TYPE STANDARD TABLE OF t_taxau.

* Structures declaration corresponding to the Extract structures of respective datasources.



DATA:   l_s_fiap_4 LIKE dtfiap_3,
        l_s_fiar_4 LIKE dtfiar_3,
        l_s_figl_4 LIKE dtfigl_4,
        l_s_cats_1 LIKE cats_is_1,
        l_s_check_dt LIKE zoxde10195,
        l_s_co_om_nwa_2  LIKE icnwacsta1,
        l_s_co_om_wbs_6  LIKE icwbscsta1,
       
  l_tabix LIKE sy-tabix,

  l_s_dt TYPE TABLE OF zoxde10195,

        prev_pernr         TYPE pa0001-pernr,
        it_seqnr           TYPE t_it_seqnr,
        lwa_seqnr          TYPE t_seqnr,
        lwa_taxau          TYPE t_taxau,
        lref_pay_access    TYPE REF TO cl_hr_pay_access,
        lref_pay_result    TYPE REF TO cl_hr_pay_result,
        lref_pay_result_us TYPE REF TO cl_hr_pay_result_us,
        it_rgdir           TYPE hrpy_tt_rgdir,
        it_taxau           TYPE t_it_taxau.

FIELD-SYMBOLS:
  <lfs_lwa_rgdir>  TYPE pc261,
  <lfs_lwa_tax>    TYPE pc22t,
  <lfs_lwa_py_rec> TYPE hrms_biw_py_rec1,
  <lfs_lwa_seqnr>  TYPE t_seqnr,
  <lfs_lwa_taxau>  TYPE t_taxau.

* Custom Code for BW Extractors.

CASE i_datasource.

* Begin of Additional Accounts Payable Transaction data enhancement........

  WHEN '0FI_AP_4'.

* BKPF Field Structure declaration.
    TYPES: BEGIN OF s_bkpf,
          bukrs TYPE bkpf-bukrs,
          belnr TYPE bkpf-belnr,
          gjahr TYPE bkpf-gjahr,
          usnam TYPE bkpf-usnam,
          bvorg TYPE bkpf-bvorg,
          bktxt TYPE bkpf-bktxt,
          ppnam TYPE bkpf-ppnam,
          xref1_hd TYPE bkpf-xref1_hd,
          xref2_hd TYPE bkpf-xref2_hd,
          END OF s_bkpf.

* PRPS Field Structure declaration.

    TYPES: BEGIN OF s_prps,
          pspnr TYPE prps-pspnr,
          posid TYPE prps-posid,
          END OF s_prps.

*BSEG field structure declaration.

    TYPES: BEGIN OF s_bseg,
          bukrs TYPE bseg-bukrs,
          belnr TYPE bseg-belnr,
          gjahr TYPE bseg-gjahr,
          buzei TYPE bseg-buzei,
          koart TYPE bseg-koart,
          zuonr TYPE bseg-zuonr,
          sgtxt TYPE bseg-sgtxt,
          zkokrs TYPE bseg-kokrs,
          zkostl TYPE bseg-kostl,
          xref1 TYPE bseg-xref1,
          xref2 TYPE bseg-xref2,
          xref3 TYPE bseg-xref3,
          segment TYPE bseg-segment,
          END OF s_bseg.

** Internal Table declaration

    DATA: it_bkpf TYPE STANDARD TABLE OF s_bkpf,
          it_prps TYPE STANDARD TABLE OF s_prps,
          it_bseg TYPE STANDARD TABLE OF s_bseg.

    DATA: it_data TYPE STANDARD TABLE OF dtfiap_3.

** Work Area declaration.

    DATA: wa_bkpf   TYPE s_bkpf,
          wa_prps   TYPE s_prps,
          wa_bseg   TYPE s_bseg.

** Variables.

    DATA: v_index TYPE sytabix.  "For looping

    REFRESH it_data.

** Move the data from c_t_data to internal table it_data.
    it_data[] = c_t_data[].

** Sort the internal table it_data by BELNR.
    SORT it_data BY belnr.

** Refresh all internal tables.
    REFRESH: it_bkpf,
             it_prps,
             it_bseg.

    IF NOT it_data[] IS INITIAL.

** Select data from BKPF table for all entries in it_data.
      SELECT bukrs belnr gjahr usnam bvorg bktxt ppnam xref1_hd xref2_hd FROM bkpf
                INTO TABLE it_bkpf
                FOR ALL ENTRIES IN it_data
                WHERE bukrs = it_data-bukrs AND
                      belnr = it_data-belnr AND
                      gjahr = it_data-gjahr.

** Check sy-subrc and sort the internal table.
      IF sy-subrc = 0.
*       Sort in the same order as the BINARY SEARCH key below (bukrs belnr gjahr).
        SORT it_bkpf BY bukrs belnr gjahr.
        DELETE ADJACENT DUPLICATES FROM it_bkpf COMPARING ALL FIELDS.
      ENDIF.

** Select posid from prps table for all entries in it_data.
      SELECT pspnr posid
             FROM prps
             INTO TABLE it_prps
             FOR ALL ENTRIES IN it_data
             WHERE pspnr = it_data-projk.

** Check sy-subrc and sort the internal table.
      IF sy-subrc = 0.
        SORT it_prps BY pspnr.
        DELETE ADJACENT DUPLICATES FROM it_prps COMPARING ALL FIELDS.
      ENDIF.

**Select data from BSEG table for all entries in it_data table.
      SELECT bukrs belnr gjahr buzei koart zuonr sgtxt kokrs kostl xref1 xref2 xref3 segment
             FROM bseg
             INTO TABLE it_bseg
             FOR ALL ENTRIES IN it_data
             WHERE bukrs = it_data-bukrs AND
                   belnr = it_data-belnr AND
                   gjahr = it_data-gjahr.

** Check sy-subrc and sort the internal table.
      IF sy-subrc = 0.
        SORT it_bseg BY bukrs belnr gjahr buzei koart.

        DELETE ADJACENT DUPLICATES FROM it_bseg COMPARING ALL FIELDS.

      ENDIF.
    ENDIF.

    CLEAR v_index.

**Now loop at the c_t_data table and modify the table fields with the values from the
**above internal tables.

    LOOP AT c_t_data INTO l_s_fiap_4.

      v_index = sy-tabix.
      CLEAR wa_bkpf.
      READ TABLE it_bkpf INTO wa_bkpf WITH KEY bukrs = l_s_fiap_4-bukrs
                                               belnr = l_s_fiap_4-belnr
                                               gjahr = l_s_fiap_4-gjahr
                                               BINARY SEARCH.

      IF sy-subrc = 0.
        MOVE: wa_bkpf-xref2_hd   TO  l_s_fiap_4-zxref2_hd,
              wa_bkpf-xref1_hd   TO  l_s_fiap_4-zxref1_hd,
              wa_bkpf-usnam   TO  l_s_fiap_4-zusnam,
              wa_bkpf-bktxt   TO  l_s_fiap_4-zbktxt,
              wa_bkpf-ppnam TO  l_s_fiap_4-zppnam,
              wa_bkpf-bvorg TO  l_s_fiap_4-zbvorg.
      ENDIF.

      CLEAR wa_prps.
      READ TABLE it_prps INTO wa_prps WITH KEY pspnr = l_s_fiap_4-projk
             BINARY SEARCH.

      IF sy-subrc = 0.
        MOVE wa_prps-posid  TO  l_s_fiap_4-zposid.
      ENDIF.

      CLEAR wa_bseg.


      READ TABLE it_bseg INTO wa_bseg WITH KEY  bukrs = l_s_fiap_4-bukrs
                                                belnr = l_s_fiap_4-belnr
                                                gjahr = l_s_fiap_4-gjahr
                                                buzei = l_s_fiap_4-buzei
                                                BINARY SEARCH.

      IF sy-subrc = 0.
        MOVE: wa_bseg-xref2   TO  l_s_fiap_4-zzxref2_bseg,
              wa_bseg-xref1   TO  l_s_fiap_4-zzxref1_bseg,
              wa_bseg-xref3   TO  l_s_fiap_4-zzxref3_bseg,
              wa_bseg-segment TO  l_s_fiap_4-zzsegment_bseg,
              wa_bseg-zuonr   TO  l_s_fiap_4-zzzuonr_bseg,
              wa_bseg-zkostl  TO  l_s_fiap_4-zzkostl_bseg,
              wa_bseg-zkokrs  TO  l_s_fiap_4-zzkokrs_bseg.
      ENDIF.

      CLEAR wa_bseg.


*     The table is sorted by bukrs belnr gjahr buzei, so a binary search that
*     skips buzei and keys on koart is not reliable; read sequentially instead.
      READ TABLE it_bseg INTO wa_bseg WITH KEY  bukrs = l_s_fiap_4-bukrs
                                                belnr = l_s_fiap_4-belnr
                                                gjahr = l_s_fiap_4-gjahr
                                                koart = 'S'.

      IF sy-subrc = 0.
        MOVE:  wa_bseg-sgtxt   TO  l_s_fiap_4-zzsgtxt_bseg.

      ENDIF.

      MODIFY c_t_data FROM l_s_fiap_4 INDEX v_index.
      CLEAR l_s_fiap_4.

    ENDLOOP.

** End of Additional Accounts Payable Transaction data Enhancement....

** Begin of Additional Accounts Receivable Transaction data Enhancement....



  WHEN '0FI_AR_4'.

*Structure  declaration.

    DATA: BEGIN OF s_posid,
          posid TYPE prps-posid,
          END OF s_posid.

    CLEAR l_s_fiar_4.

**Now loop at the c_t_data table and modify the table fields with the values fetched from respective Tables.


    LOOP AT c_t_data INTO l_s_fiar_4.
      l_tabix = sy-tabix.

*SELECT statement for extracting field from PRPS table.
     
    CLEAR s_posid.

      SELECT SINGLE posid FROM prps INTO s_posid
      WHERE pspnr = l_s_fiar_4-projk.

      IF sy-subrc = 0.
        l_s_fiar_4-zposid = s_posid-posid.
        MODIFY c_t_data FROM l_s_fiar_4 INDEX l_tabix.
      ENDIF.

      CLEAR l_s_fiar_4.
    ENDLOOP.

** End of Additional Accounts Receivable Transaction data Enhancement....

** Begin of Additional CATTS Transaction data Enhancement....

  WHEN '0CA_TS_IS_1'.

*Structure  declaration.

    DATA: l_lgart LIKE catsdb-lgart.

    CLEAR l_s_cats_1.

**Now loop at the c_t_data table and modify the table fields with the values fetched from respective Tables.

    LOOP AT c_t_data INTO l_s_cats_1.

      l_tabix = sy-tabix.

      CLEAR l_lgart.

*SELECT statement for extracting LGART field from CATSDB table.

      SELECT SINGLE lgart FROM catsdb INTO l_lgart
                                         WHERE status = '30'  "l_s_cats_1-status
                                           AND pernr  = l_s_cats_1-pernr
                                           AND workdate = l_s_cats_1-workdate
                                           AND catshours = l_s_cats_1-catshours
                                           AND skostl = l_s_cats_1-skostl
                                           AND lstar  = l_s_cats_1-lstar
                                           AND lstnr = l_s_cats_1-lstnr
                                           AND rkostl = l_s_cats_1-rkostl
                                           AND raufnr = l_s_cats_1-raufnr
                                           AND rnplnr = l_s_cats_1-rnplnr
                                           AND rkdauf = l_s_cats_1-rkdauf
                                           AND rkdpos = l_s_cats_1-rkdpos
                                           AND rprznr = l_s_cats_1-rprznr
                                           AND reinr = l_s_cats_1-reinr
                                           AND waers = l_s_cats_1-waers
                                           AND kokrs = l_s_cats_1-kokrs
                                           AND meinh = l_s_cats_1-meinh
                                           AND tcurr = l_s_cats_1-tcurr
                                           AND price = l_s_cats_1-price
                                           AND werks = l_s_cats_1-werks
                                           AND autyp = l_s_cats_1-autyp
                                           AND apnam = l_s_cats_1-apnam
                                           AND ltxa1 = l_s_cats_1-ltxa1
                                           AND belnr = l_s_cats_1-belnr.

      IF sy-subrc = 0.
        l_s_cats_1-zzlgart = l_lgart.
        MODIFY c_t_data FROM l_s_cats_1 INDEX l_tabix.
      ENDIF.

      CLEAR l_s_cats_1.
    ENDLOOP.


* End of Additional CATTS Transaction data Enhancement....

*Begin of Additional GeneralLedger Line Item Transaction data Enhancement as per Julish Requirement on 22/05...

 
WHEN '0FI_GL_4'.

*Structure  declaration.

    TYPES: BEGIN OF s_bseg2,
          bukrs TYPE bseg-bukrs,
          belnr TYPE bseg-belnr,
          gjahr TYPE bseg-gjahr,
          buzei TYPE bseg-buzei,
          nplnr TYPE bseg-nplnr,
          aufpl TYPE bseg-aufpl,
          aplzl TYPE bseg-aplzl,

          END OF s_bseg2.

    DATA: BEGIN OF s_afvc,
        vornr  TYPE afvc-vornr,
          END OF s_afvc.

** Internal Table declaration

    DATA:   it_bseg2 TYPE STANDARD TABLE OF s_bseg2.

    DATA: it_data2 TYPE STANDARD TABLE OF dtfigl_4.

**Work Area declaration.

    DATA: wa_bseg2   TYPE s_bseg2.

**Variables.
    DATA: v_index2               TYPE  sytabix.  "For looping

    REFRESH it_data2.

**Move the data from c_t_data to internal table it_data.
    it_data2[] = c_t_data[].

** Sort the internal table it_data by BELNR.
    SORT it_data2 BY belnr.

**  Refresh all internal tables.
    REFRESH: it_bseg2.

    IF NOT it_data2[] IS INITIAL.

**Select data from BSEG table for all entries in it_data table.
      SELECT bukrs belnr gjahr buzei nplnr aufpl aplzl
             FROM bseg
             INTO TABLE it_bseg2
             FOR ALL ENTRIES IN it_data2
             WHERE bukrs = it_data2-bukrs AND
                   belnr = it_data2-belnr AND
                   gjahr = it_data2-gjahr.

** Check sy-subrc and sort the internal table.
      IF sy-subrc = 0.
        SORT it_bseg2 BY bukrs belnr gjahr buzei.

        DELETE ADJACENT DUPLICATES FROM it_bseg2 COMPARING ALL FIELDS.

      ENDIF.
    ENDIF.

**Now loop at the c_t_data table and modify the table fields with the values from the
**above internal tables.
    CLEAR v_index2.

    LOOP AT c_t_data INTO l_s_figl_4.

      v_index2 = sy-tabix.

      CLEAR wa_bseg2.

      READ TABLE it_bseg2 INTO wa_bseg2 WITH KEY  bukrs = l_s_figl_4-bukrs
                                                belnr = l_s_figl_4-belnr
                                                gjahr = l_s_figl_4-gjahr
                                                buzei = l_s_figl_4-buzei
                                                BINARY SEARCH.

      IF sy-subrc = 0.

        MOVE: wa_bseg2-nplnr   TO  l_s_figl_4-znetwork.

        IF wa_bseg2-nplnr IS NOT INITIAL.

          CLEAR s_afvc.

          SELECT SINGLE vornr INTO s_afvc FROM afvc WHERE aufpl = wa_bseg2-aufpl AND aplzl = wa_bseg2-aplzl.

          IF sy-subrc = 0.

            l_s_figl_4-zactivity = s_afvc-vornr.

          ENDIF.

        ENDIF.

      ENDIF.


      MODIFY c_t_data FROM l_s_figl_4 INDEX v_index2.

      CLEAR l_s_figl_4.


    ENDLOOP.

*End of Additional GeneralLedger Transaction data Enhancement....

* Begin of Additional Cheque Details Transaction data enhancement of Accounts Payable Application........


  WHEN 'ZBW_BSAK_DT'.

*Structure  declaration.

    DATA: check TYPE payr-chect.

**Now loop at the c_t_data table and modify the table fields with the values derived from Select Statements.

    CLEAR l_s_check_dt.
    LOOP AT c_t_data INTO l_s_check_dt.

      IF l_s_check_dt-augbl <> l_s_check_dt-belnr.
        APPEND l_s_check_dt TO l_s_dt.
      ENDIF.

      CLEAR l_s_check_dt.
    ENDLOOP.

    REFRESH c_t_data.

    CLEAR l_s_check_dt.
    LOOP AT l_s_dt INTO l_s_check_dt.
      CLEAR check.

*SELECT statement for extracting CHECT field from PAYR table.

      SELECT SINGLE chect FROM payr INTO check
      WHERE zbukr = l_s_check_dt-bukrs AND
            vblnr = l_s_check_dt-augbl.

*      IF sy-subrc = 0.

      l_s_check_dt-zzchect = check.

      MODIFY l_s_dt FROM l_s_check_dt TRANSPORTING zzchect.
      APPEND l_s_check_dt TO c_t_data.

*      ENDIF.

      CLEAR l_s_check_dt.
    ENDLOOP.

*End of Additional Cheque Details Transaction data enhancement…….

*Begin of Additional Netw. Activity Actual Costs Transaction data enhancement…

  WHEN '0CO_OM_NWA_2'.

*Structure  declaration.

    DATA: BEGIN OF s_bkpf_2,
            blart TYPE bkpf-blart,
          END OF s_bkpf_2,

          BEGIN OF s_bseg_2,
            umsks TYPE bseg-umsks,
          END OF s_bseg_2,


          BEGIN OF s_coep_2,
            vrgng TYPE coep-vrgng,
          END OF s_coep_2.

**Now loop at the c_t_data table and modify the table fields with the values derived from Select Statements.

    CLEAR l_s_co_om_nwa_2.
    LOOP AT c_t_data INTO l_s_co_om_nwa_2.

      l_tabix = sy-tabix.

      CLEAR s_bkpf_2.

*SELECT statement for extracting BLART field from BKPF table.

      SELECT SINGLE blart FROM bkpf INTO s_bkpf_2
      WHERE bukrs = l_s_co_om_nwa_2-bukrs AND
            belnr = l_s_co_om_nwa_2-belnr.

      IF sy-subrc = 0.
        l_s_co_om_nwa_2-zblart = s_bkpf_2-blart.
      ENDIF.

      CLEAR s_bseg_2.

*SELECT statement for extracting UMSKS field from BSEG table.

      SELECT SINGLE umsks FROM bseg INTO s_bseg_2
      WHERE bukrs = l_s_co_om_nwa_2-bukrs AND
            belnr = l_s_co_om_nwa_2-belnr AND
            buzei = l_s_co_om_nwa_2-buzei.

      IF sy-subrc = 0.
        l_s_co_om_nwa_2-zumsks = s_bseg_2-umsks.
      ENDIF.

      CLEAR s_coep_2.

*SELECT statement for extracting VRGNG field from COEP table.

      SELECT SINGLE vrgng FROM coep INTO s_coep_2
      WHERE kokrs = l_s_co_om_nwa_2-kokrs AND
            belnr = l_s_co_om_nwa_2-belnr AND
            buzei = l_s_co_om_nwa_2-buzei.

      IF sy-subrc = 0.
        l_s_co_om_nwa_2-zzvrgng = s_coep_2-vrgng.
      ENDIF.

      MODIFY c_t_data FROM l_s_co_om_nwa_2 INDEX l_tabix.

      CLEAR l_s_co_om_nwa_2.
    ENDLOOP.

*End of Additional Netw. Activity Actual Costs Transaction data enhancement…

*Begin of Additional Payroll Results Transaction data enhancement…


 WHEN '0HR_PY_REC_51'.

**Now loop at the c_t_data table and modify the table fields with the values derived from respective Tables.

    LOOP AT c_t_data ASSIGNING <lfs_lwa_py_rec>.
      CHECK <lfs_lwa_py_rec>-cntr1 <> '00'.
      lwa_seqnr-pernr = <lfs_lwa_py_rec>-pernr.
      lwa_seqnr-seqnr = <lfs_lwa_py_rec>-seqnr.
      COLLECT lwa_seqnr INTO it_seqnr.
    ENDLOOP.  "at c_t_data
    SORT it_seqnr BY pernr seqnr.
    CREATE OBJECT lref_pay_access.
    LOOP AT it_seqnr ASSIGNING <lfs_lwa_seqnr>.
      IF prev_pernr <> <lfs_lwa_seqnr>-pernr.
        CLEAR it_rgdir.

* Read list of payroll results for this pernr

        CALL METHOD lref_pay_access->read_cluster_dir
          EXPORTING
            pernr       = <lfs_lwa_seqnr>-pernr
          IMPORTING
            cluster_dir = it_rgdir
          EXCEPTIONS
            OTHERS      = 0.
        prev_pernr  = <lfs_lwa_seqnr>-pernr.
      ENDIF.
      READ TABLE it_rgdir ASSIGNING <lfs_lwa_rgdir>
        WITH KEY seqnr = <lfs_lwa_seqnr>-seqnr.
      CHECK sy-subrc = 0.

* Read single payroll result entry for this pernr

      CALL METHOD lref_pay_access->read_pa_result
        EXPORTING
          pernr                         = <lfs_lwa_seqnr>-pernr
          period                        = <lfs_lwa_rgdir>
          molga                         = '10'
        IMPORTING
          payroll_result                = lref_pay_result
        EXCEPTIONS
          no_authorization              = 1
          read_error                    = 2
          country_version_not_available = 3
          OTHERS                        = 4.
      CHECK sy-subrc = 0.
      TRY.
          lref_pay_result_us ?= lref_pay_result.
        CATCH cx_sy_move_cast_error.
*         Not a US payroll result; skip this entry.
          CONTINUE.
      ENDTRY.

* Read tax authority description from tax table

      LOOP AT lref_pay_result_us->natio-tax ASSIGNING <lfs_lwa_tax>.
        lwa_taxau-pernr = <lfs_lwa_seqnr>-pernr.
        lwa_taxau-seqnr = <lfs_lwa_seqnr>-seqnr.
        lwa_taxau-cntr1 = <lfs_lwa_tax>-cntr1.
        lwa_taxau-taxau = <lfs_lwa_tax>-taxau.
        COLLECT lwa_taxau INTO it_taxau.
      ENDLOOP.  "at lref_pay_result_us
    ENDLOOP.   "at it_seqnr
    LOOP AT c_t_data ASSIGNING <lfs_lwa_py_rec>.
      CHECK <lfs_lwa_py_rec>-cntr1 <> '00'.
      READ TABLE it_taxau
        WITH KEY pernr = <lfs_lwa_py_rec>-pernr
                 seqnr = <lfs_lwa_py_rec>-seqnr
                 cntr1 = <lfs_lwa_py_rec>-cntr1
        ASSIGNING <lfs_lwa_taxau>.
      CHECK sy-subrc = 0.
      <lfs_lwa_py_rec>-zztaxau = <lfs_lwa_taxau>-taxau.
    ENDLOOP.   "at c_t_data

*End of Additional Payroll Results Transaction data enhancement…

*Begin of Additional WBS Elements: Actual Costs Transaction data enhancement…

 WHEN '0CO_OM_WBS_6'.

*Structure  declaration.

    DATA: v_vrgng LIKE coep-vrgng.

**Now loop at the c_t_data table and modify the table fields with the values derived from Select Statements

    CLEAR l_s_co_om_wbs_6.
    LOOP AT c_t_data INTO l_s_co_om_wbs_6.

      l_tabix = sy-tabix.

      CLEAR v_vrgng.

*SELECT statement for extracting VRGNG field from COEP table.

      SELECT SINGLE vrgng FROM coep INTO v_vrgng
      WHERE kokrs = l_s_co_om_wbs_6-kokrs AND
            belnr = l_s_co_om_wbs_6-belnr AND
            buzei = l_s_co_om_wbs_6-buzei.

      IF sy-subrc = 0.
        l_s_co_om_wbs_6-zzvrgng = v_vrgng.
      ENDIF.

      MODIFY c_t_data FROM l_s_co_om_wbs_6 INDEX l_tabix.

      CLEAR l_s_co_om_wbs_6.

    ENDLOOP.

*End of Additional WBS Elements: Actual Costs Transaction data enhancement…

  WHEN OTHERS.
    EXIT.

ENDCASE.