
Sr. SAP BODS Consultant Resume

Dallas, TX


  • Over 6 years of IT experience in Data Warehousing, SAP Data Migration, and Data Cleansing using Business Objects Data Services (BODS), Informatica, and BO reporting tools (WebI, Crystal Reports).
  • Experience in implementing Business Intelligence and Data Warehousing solutions using Business Objects 4.1/4.0/XI R3.1/R2 and SAP BI for RDBMS, OLAP sources and SAP HANA.
  • Expertise in Data Analysis, Design, Modeling, Development, Testing and Implementation of Data Warehousing Applications, Data integration, Data cleansing, Data Governance.
  • Installed & configured Data Services XI 3.2 and Upgraded from Data Services XI 3.2 to 4.0.
  • Extensively worked on integration of Business Objects Data Services with SAP modules (Finance, HR & CRM) and non-SAP data sources like Oracle 11g/10g/9i, SQL Server 2008/2005/2000, DB2, JD Edwards, Mainframes, flat files and legacy system files.
  • Experienced in BODS/Data Quality design of workflows, data flows, scripts and complex transforms using various Data Integrator, Platform and Data Quality transforms.
  • Expertise in BODS code migration using Check-In/Out of Central/Local Repository.
  • Worked on ETL Code Reverse Engineering from IBM DataStage to Informatica and BODI/BODS.
  • Experience in debugging execution errors using Data Integrator logs (trace, monitor, and error) and by examining the target data.
  • Extensively used Data Services Management Console to schedule and execute jobs.
  • Expertise in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD) phenomenon, surrogate key assignment and Change Data Capture (CDC).
  • Involved in design, development, optimization and maintenance of BODS jobs to extract data from various sources to data marts/data warehouses and various SAP modules through IDocs.
  • Experience in working with ABAP dataflows to extract the data from SAP Applications.
  • Hands-on knowledge of Data Quality concepts and familiarity with setting up Data Cleanse, Address Cleanse, Match and Associate transforms.
  • Prepared test scripts and test cases to validate data and maintain data quality.
  • Experience working with Open Hub Services to push data out of SAP BI using Data Services.
  • Configured the Data services to connect to the SAP R/3, CRM and BI.
  • Have thorough knowledge on Legacy System Migration Workbench (LSMW).
  • Experience in Data migration and BI using Oracle, SQL Server, Actuate and BIRT.
  • Ability to understand the business environment and translate business requirements into technical solutions.
  • Excellent communication and exceptional problem-solving skills, including technical writing and presentation, with strong interpersonal skills and the ability to interact with end users and managers.


SAP Application: SAP BI 7.0, ECC 6, SAP BW 3.5, BO 4.0, XI 3.1.

ETL Tools: SAP BO Data Services 4.1, Informatica.

Languages: BI- ABAP.

Source Systems: R/3 4.7, ECC 6.0, Legacy System, Flat file.

Reporting Tools: Bex, BO Webi.

Operating Systems: Windows 95/98/XP/Vista/7, UNIX.


Confidential, Dallas, TX

Sr. SAP BODS Consultant


  • Worked as a Sr. BODS Developer to develop and support the Extraction, Transformation and Load (ETL) process using Business Objects Data Services to populate tables in the data warehouse and data marts.
  • Built Datastores and configurations to connect to SAP ECC, SAP BW, and other source and target systems.
  • Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.
  • Extracted SAP ECC tables and developed transformations that apply the business rules given by the client and loaded the data into the target database.
  • Extracted source files and loaded them into the SAP HANA database.
  • Built ETL logic on SAP HANA databases; built views and stored procedures using native HANA SQL.
  • Identified bugs in existing mappings by analyzing the data flow and evaluating transformations; fixed the bugs and redesigned the existing mappings to improve performance.
  • Used SAP Data Services Data Quality to develop components that would help to clean data elements like addresses and match entity names.
  • Worked on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Extensively used Query Transform, Map Operations, Table Comparison, lookup function, Merge, Case, SQL, and Validation Transforms in order to load data from source to Target Systems.
  • Created starting and ending scripts for each job, sent job notifications to users via scripts, and declared local and global variables.
  • Defined file format to use as source and target files.
  • Wrote functions to make the code reusable and used lookup functions to reference lookup table data.
  • Extensively used Try/Catch to handle exceptions and wrote scripts to automate the job process.
  • Experience in Migration of Jobs and workflows from Development to Test and to Production Servers to perform the integration and system testing.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Performed unit testing and data validation testing of all mappings end to end, and participated in UAT.
  • Used check-in/check-out procedure to import and export projects/jobs for secure access to central repository objects and maintaining version history.
  • Experience in Debugging and Performance Tuning of targets, sources and mappings.
  • Created several jobs for loading the data to Teradata 13.0 from flat files, data files, excel files, oracle views and SQL tables.
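The Slowly Changing Dimension Type 2 handling described above can be sketched as a minimal Python example. This is illustrative only: in BODS this logic is built with Table Comparison and History Preserving transforms, and the field names (`key`, `attr`, `valid_from`, `valid_to`) are hypothetical, not taken from the actual targets.

```python
from datetime import date

# Minimal SCD Type 2 sketch (illustrative; field names are hypothetical).
# Changed rows get their current version closed out and a new version
# inserted; brand-new keys simply get an open current row.
def scd2_merge(dimension, incoming, today=None):
    today = today or date.today()
    result = [row.copy() for row in dimension]
    current = {r["key"]: r for r in result if r["valid_to"] is None}
    for rec in incoming:
        existing = current.get(rec["key"])
        if existing is None:
            # new key: open a fresh current row
            result.append({**rec, "valid_from": today, "valid_to": None})
        elif existing["attr"] != rec["attr"]:
            # changed attribute: close the old version, insert a new one
            existing["valid_to"] = today
            result.append({**rec, "valid_from": today, "valid_to": None})
    return result
```

Unchanged incoming rows produce no action, mirroring an auto-correct style load that preserves history only when data actually changes.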

Environment: SAP Business Objects 4.2, Designer, SAP HANA, Teradata 13.0 (Teradata SQL Assistant), Oracle 10g

Confidential, Denver, CO

SAP BODS Consultant


  • BODS developer supporting ETL for Enterprise Data Warehouse (EDW) projects.
  • Defined Data Stores to allow Data Services to connect to the source or target database.
  • Created new mappings and updated old mappings according to changes in Business logic.
  • Designed Jobs and Complex Workflows and Dataflow for ETL to Enterprise Data Warehouse
  • Worked on developing BODS ETL code for initial, historical and delta loads to EDW.
  • Extensively used BODS ETL to load data from central Data Warehouse into Data Marts.
  • Scheduled various jobs for transferring the data to the enterprise data warehouse.
  • Created Data Flows to load data from flat file, CSV files with various formats into Data Warehouse.
  • Involved in creating batch jobs for data cleansing and address cleansing.
  • Involved in creating the batch jobs for de-duplicating the data for different sources.
  • Converted full-load jobs into incremental loads using Map Operation, Table Comparison and Auto Correct Load.
  • Worked on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Reverse engineered code built in IBM DataStage to convert it to BODI/BODS ETL code.
  • Extensively used Query Transform, Map Operations, Table Comparison, Merge, Case, SQL, and Validation Transforms in order to load data from source to Target Systems.
  • Extensively used lookup and lookup_ext functions in Data Integrator to load data from source to target using a lookup table.
  • Designed & developed error catching procedures in BODI jobs.
  • Scheduled and monitored jobs using the Data Integrator Management Console.
  • Defined separate data store for each database to allow Data Integrator to connect to the source or target database.
  • Created starting and ending scripts for each job, sent job notifications to users using scripts, and declared local and global variables.
  • Created and administered Local and Central Repositories for multi user environment.
  • Migrated and tested jobs in different instances and validated the data by comparing source and target.
  • Automated all error handling, error escalation, and email notification procedures.
  • Generated several Metadata reports for Data Integrator mapping, and job execution statistics.
  • Worked on Reporting using Actuate, Crystal Reports and BO WEBI XI 3.1, 3.2.
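The full-load-to-incremental-load conversion described above can be sketched as a Table Comparison-style diff in Python. This is a minimal stand-in for the BODS Map Operation/Table Comparison/Auto Correct Load transforms; the `id` key column and row shapes are illustrative assumptions.

```python
# Minimal Table Comparison sketch (illustrative; the real implementation
# used BODS transforms). Classifies each source row as INSERT or UPDATE
# against the target; identical rows produce no operation, which is the
# auto-correct-load behavior.
def table_comparison(source_rows, target_rows, key="id"):
    target_by_key = {r[key]: r for r in target_rows}
    ops = []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            ops.append(("INSERT", row))   # new key in source
        elif existing != row:
            ops.append(("UPDATE", row))   # key exists, data changed
    return ops
```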

Environment: SAP BODS 3.2, 4.0, SAP ECC, BW 7.3, BO 4.0, WebI, BEx Query Designer

TCS (Tata Consultancy Services) Hyderabad, India

Confidential, Boston, MA

SAP BODS Consultant


  • BODS Consultant supporting ETL development for data migration, data quality and Integration.
  • Extensively used ETL to load data from central Data Warehouse into Data Marts.
  • Worked with the engineering team to build proofs of concept to use as templates for extracting data from SAP sources using standard and custom SAP BI extractors and ABAP dataflows.
  • Built a batch-processing control mechanism using an ETL control schema on the target SQL database.
  • Delivered a successful proof of concept with an optimized setup of BODS extraction programs.
  • Migrated data objects from SAP BI 3.5/7.0 to a new SAP BODS 4.0 extraction.
  • Worked extensively on troubleshooting the implementation of the Custom Extractors using SAP BODS for SAP CC/CI and OER modules.
  • Successfully resolved several issues by working closely with SAP on the SAP BI extractor integration to BODS via ODQMON (Operational Delta Queue Monitor).
  • Implemented/customized pre-built projects for Plant Maintenance and Cost Center, the WSR project, the Scorecard project, and the data migration process.
  • Mentored the team on best practices and guidelines for process improvement, reliability and data correctness.
  • Analyzed data in source systems to assess data cleanliness and make recommendations for addressing within the source systems or as part of the ETL process, responsible for error handling, and performance tuning of Jobs.
  • Worked on a POC implementation of the HR Rapid Mart for PeopleSoft; performed gap analysis and configured BO Data Services to handle data in and out of the PeopleSoft ERP system.
  • Handled data quality for address cleansing, name cleansing and de-duplication based on business rules; installed and configured data quality address directories for cleansing data for various countries.

Environment: Business Objects Data Services (3.2) for ETL, SQL Server 2005/2008, Windows 2008/XP, ER-Studio.

TCS (Tata Consultancy Services) Hyderabad, India

Confidential, CA

Informatica Developer


  • Analyzed the business requirements and functional specifications.
  • Involved in the design phase of the Informatica mappings.
  • Reviewed the source-to-target mapping specifications and transformation rules prepared by the Business Analyst.
  • Coordinated with the Business Analyst on questions related to the requirements, mapping specifications and transformation rules.
  • Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extracted data from different sources like Oracle, Delimited and Fixed Width Flat files and load into ODS.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure for developing Informatica mappings.
  • Parameterized the mappings and increased their reusability.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Involved in writing Oracle stored procedures and functions called during the execution of Informatica mappings or as pre- or post-session tasks.
  • Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.
  • Created Unit Test documents for the mappings to test the data load.
  • Involved in data validation from Source to staging to Target.
  • Resolved Defects logged in the Mercury Quality Center during testing.
  • Identified sources, targets, mappings and sessions bottlenecks and tuned them to improve performance.
  • Supported the L2 team during failures and delays of ETL workflows and Autosys jobs.
  • Created SQL Queries for validating the target data against source data.
  • Extensively involved in writing unit Test Cases, queries for testing the target data against source data.
  • Involved in documentation describing program development, logic, coding, testing, changes and corrections, and in preparing the test plan, execution and test reports.
  • Worked with SVN (a software versioning and revision control system) to update all release-related documents for every release with the new release version number.
  • Guided the L2 team through DR in UAT and release in PROD during the release process to make releases successful.
  • Followed Informatica recommendations, methodologies and best practices.
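The target-vs-source validation described above (comparing target data against source data with SQL queries) can be sketched in Python as a count and checksum check over two result sets. This is an illustrative stand-in, not the actual validation queries; the row shapes are assumed.

```python
import hashlib

# Minimal source-vs-target validation sketch (illustrative; the real
# checks were SQL queries). Compares row counts and an order-independent
# checksum of the two result sets.
def validate(source_rows, target_rows):
    def checksum(rows):
        digest = hashlib.sha256()
        for row in sorted(repr(r) for r in rows):  # sort -> order-independent
            digest.update(row.encode())
        return digest.hexdigest()
    return {
        "count_match": len(source_rows) == len(target_rows),
        "data_match": checksum(source_rows) == checksum(target_rows),
    }
```

A count mismatch flags dropped or duplicated rows cheaply; the checksum then catches rows whose counts agree but whose contents drifted during transformation.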

Environment: Informatica 9.1.0, Oracle 9i, Oracle 10g, SQL, PL/SQL, DAC, IIR, OEDQ, Siebel, OBIEE, Microsoft Visio Standard 2010.

