From the Shell project's point of view:
*How many process chains do you have?* - 1800+ chains.
*How many loads per day?* - 3 schedules per day (G1, G2, G3). G1 is
for the Europe region (starts at 6:15 AM IST), G2 is for the Asia region
(starts at 7:30 PM IST), G3 is for the US region (starts at 12:30 PM IST).
*Modules of the project* -
FSS ( Financial Services ),
PGS ( Procurement of Goods and Supply ),
GAME ( Global Asset Management system ),
StBC ( Sell to Business Customers ),
StRC ( Sell to Retail Customers ),
HM ( Hydrocarbon Management ),
LSC ( Lubricants Supply Chain Management )
*Data sources which you have used -*
2LIS_11_VAITM - Sales Item
2LIS_11_VAHDR - Sales Header
2LIS_12_VCITM - Delivery Item
2LIS_12_VCHDR - Delivery Header
2LIS_13_VDITM - Billing Item
These are some of the SAP standard DataSources; search Google for "SAP
standard DataSources" and you can find many more.
*Title of project -* Shell BAM ( GSAP )
*Objective of the project* -
*Shell* is a global group of energy and petrochemicals companies, operating
in over 145 countries and employing more than 119,000 people, best known to
the public for its service stations and for exploring and producing oil and gas
on land and at sea. GSAP will replace Shell's fragmented Enterprise Resource
Planning systems with a harmonized global platform, critical to the delivery
of OP-One benefits. Shell aims to reduce the number of Enterprise Resource
Planning systems in oil products from 123 to fewer than 10.
*Flow of the project* - The question is not clear. You can't explain the
flow of the whole project, but you can explain the flow of a particular report:
from which source system the data is coming into BW, where you are storing it
in BW, etc.
E.g.: R/3 system --> 2LIS_11_VAITM --> PSA --> ABAP routine ( update rules )
--> ODS --> Cube --> MultiProvider --> Query ( explain the logic in the query )
--> Report name.
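As a language-agnostic sketch of that staging flow (Python here; the field names and records are invented for illustration, not real BW metadata — the real routine would be ABAP in the update rules): the PSA holds raw extractor records, an update routine transforms each record, the ODS overwrites by key, and the cube layer aggregates additively.

```python
# Hypothetical sketch of the BW staging flow: PSA -> routine -> ODS -> Cube.
# All object and field names here are made up for illustration.

# Raw records as extracted from R/3 into the PSA (sales-item style)
psa = [
    {"doc": "4711", "item": "10", "material": "M-01", "qty": 5, "status": ""},
    {"doc": "4711", "item": "20", "material": "M-02", "qty": 3, "status": ""},
    {"doc": "4711", "item": "10", "material": "M-01", "qty": 7, "status": ""},  # delta for the same key
]

def update_routine(rec):
    """Stand-in for an ABAP update-rule routine: derive/clean fields."""
    out = dict(rec)
    out["status"] = "OPEN" if out["qty"] > 0 else "CLOSED"
    return out

# ODS-like layer: overwrite by key (doc, item), as an ODS with key fields does
ods = {}
for rec in (update_routine(r) for r in psa):
    ods[(rec["doc"], rec["item"])] = rec

# Cube-like layer: additive aggregation by characteristic (material)
cube = {}
for rec in ods.values():
    cube[rec["material"]] = cube.get(rec["material"], 0) + rec["qty"]

print(ods[("4711", "10")]["qty"])  # the later delta record overwrote the earlier one
print(cube)
```

Note how the delta record for item 10 replaces the earlier one in the ODS (overwrite), while the cube totals are simply summed per material — which is the key behavioral difference between the two layers.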
*Roles and responsibilities -*
- Monitored data load activities and recovered failures in the Dev and Quality
systems as part of unit testing.
- Prepared process documents for each and every enhancement.
- Handled tickets of different severities by following the SOP (Standard
Operating Procedure) and never missed an SLA (Service Level Agreement).
- Involved in solving high-priority tickets regarding extractions,
performance issues and data load failures.
- Extensively worked on process chains to automate deleting indexes
on InfoCubes, processing of InfoPackages, creating indexes, ODS activation,
further update, PSA request deletion, attribute change run and so on.
- Experience using the Open Hub feature to export data from SAP BW to
external systems (flat files / database tables).
- Involved in unit testing, integration testing, regression testing and
user acceptance testing on different servers.
- Deleted the delta queues by running the programs for the different
applications, and filled the setup tables in production during the
downtime before upgrading to the BI 7.0 system.
- Experience in RMI & OPSM interface technology.
- Enhanced existing reports as per new requirements.
- Monitored the daily loads to various data targets in the system.
- Monitored the process chains on a daily, weekly and monthly basis.
- Manually loaded and rolled up data into data targets.
- Maintained logs for each manual load.
- Created aggregates to improve query response times.
- Performance tuning of queries using aggregates and indexing of InfoCubes.
- Responsible for the RevTrac system: maintaining and creating RevTracs
for the PGS area, used in transports. Shell uses the tool RevTrac to transfer
objects from Dev to Production systems.
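The index-handling steps mentioned in the responsibilities above follow a fixed order inside a load process chain. A minimal sketch of that ordering (Python, with hypothetical step names — real chains are modeled graphically in the BW Workbench, transaction RSPC, not in code):

```python
# Hypothetical sketch of the step order in a typical cube-load process chain.
# This only illustrates why the sequence matters: drop indexes before a mass
# load, rebuild them after, then roll up so aggregates see the new request.

steps_run = []

def step(name):
    """Record that a chain step ran, preserving the order."""
    steps_run.append(name)

def run_cube_load_chain():
    step("delete indexes")        # speeds up the mass insert into the cube
    step("execute infopackage")   # the actual data load
    step("create indexes")        # restore query performance
    step("roll up aggregates")    # make the new request visible in aggregates
    step("delete PSA requests")   # housekeeping of old staging data
    return steps_run

chain = run_cube_load_chain()
print(chain)
# Index deletion must precede the load; index creation must precede rollup.
assert chain.index("delete indexes") < chain.index("execute infopackage")
assert chain.index("create indexes") < chain.index("roll up aggregates")
```

In an interview, being able to justify this ordering (why indexes are dropped first, why the attribute change run comes after master data loads) matters more than listing the step names.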
The above details differ from project to project. Understand them and prepare
something similar for your own project as well.