
SAP BODS Developer Resume

San Francisco, CA

SUMMARY:

  • Extensive experience with the ETL tool SAP Business Objects Data Services, designing and developing jobs with complex mappings, transforms, workflows, and data flows, and configuring and scheduling workflows.
  • IT experience in Analysis, Design, Development and Implementation of Data Warehousing and Database applications using SAP Data Services (BODS - ETL).
  • Hands-on experience with SAP HANA Cloud Integration for Data Services (SAP HCI-DS) and SAP Integrated Business Planning (SAP IBP).
  • Used SAP Data Services to connect to SAP ECC 6.0 and SAP BW as source and target systems, and loaded data from PSA to data targets using transformations and DTPs.
  • Experienced with Data Quality Management, Metadata Management, Data Profiling, Master Data Management, and Data Model Standards.
  • Experienced in migrating jobs and workflows from development to test and production servers to perform integration and system testing.
  • Experienced in data conversion, data integration, and data migration with a specialization in BODS; expert in Slowly Changing Dimensions Type 1, Type 2, and Type 3 for inserting and updating target tables while maintaining history.
  • Proficient in optimizing, debugging, and testing SQL queries, views, and stored procedures in Teradata, Oracle, and SQL Server.
  • Worked with SMEs and Solution Architects to design and implement the best solutions.
  • Excellent Technical, Interpersonal, and Communication skills along with Time and Project management.

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco, CA

SAP BODS Developer

Responsibilities:

  • Participated in business requirements analysis, report design and modeling, development of technical specifications, testing, and implementation.
  • Responsible for extracting data from ECC (using BAPI calls and the RFC direct-read method) and other legacy systems into SAP BW, SQL Server, and SAP HANA using SAP Data Services 4.2.
  • Collaborated with business teams to develop BODS jobs that extract, transform, and load data, populating and refreshing DW tables from Oracle (IW) tables, flat files, and SQL table sources.
  • Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures that are consistent across all applications and systems.
  • Wrote ETL design documents per ETL coding standards, prepared mapping documents, and performed transformation reviews.
  • Worked with SAP Information Steward for data profiling and created custom cleansing packages.
  • Performed column, address, dependency, redundancy, and uniqueness profiling.
  • Created complex validation rules and rule bindings for data profiling according to business requirements.
  • Generated data quality reports on the dashboard for data analysis.
  • Worked with Data Quality Transforms such as Global Address Cleanse, Data Cleanse and Match Transform for cleansing the legacy records in customer master, material master, and vendor master.
  • Designed, implemented, and tested data flows, workflows, scripts, and jobs for multiple projects.
  • Profiled data to assess data quality.
  • Responsible for data loads into IBP from SAP ECC, flat files, and BW on HANA.
  • Performed data extracts from the SAP HANA IBP system to file formats, then loaded the data back to BW on HANA and pushed it to ECC via BAPI calls through BODS.
  • Extensively worked on creating HCI projects, tasks, and processes to perform ETL from SAP BW/SQL Server into SAP IBP, and used flat files to send the data back to SAP.
  • Tested the data loaded into IBP using the IBP Excel plug-in.
  • Scheduled HCI tasks to run weekly and fixed any failures when schedules failed.
  • Prepared and delivered functional and technical specification documents for all the HCI tasks.
  • Responsible for publishing BODS batch jobs as web services and via executable commands so they could be called from the third-party scheduler (MAESTRO).
  • Created starting and ending scripts for each job, scripts to send job notifications to users, and declared local and global variables.
  • Designed and wrote jobs to fetch data from SAP BW using Open Hub Destinations, kicking off process chains and pushing data to BW DSOs using external source system configurations.
  • Implemented SCD1, SCD2, and SCD3 ETL patterns for direct updates and history preservation using the Table Comparison, History Preserving, and Map Operation transforms (a SQL sketch of the SCD2 pattern follows this list).
  • Created several jobs for loading data into SQL Server 2008 from flat files, data files, Excel files, Oracle tables, and Oracle views.
  • Created jobs to perform full initial data loads, then switched these jobs to delta mode to append delta data to the existing data warehouse.
  • Involved in SAP ECC R/3 source extractions using SAP source extractors and SAP functions making BAPI calls in BODS 4.1/4.2.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Extensively worked on error handling and performance tuning of programs and processes by using concepts like parallel execution, recovery mechanism, degree of parallelism (DOP), increasing number of concurrent loaders and bulk-loading options.
  • Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
  • Loaded data from multiple data sources to multiple data targets with different transformations and DTPs by scheduling different updates.
  • Responsible for the execution, monitoring, and scheduling of Data Services jobs using the Data Services Management Console (Administrator) and used web services to publish the jobs.
  • Used Data Services Workbench to load and provision data into SAP HANA.
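
As referenced in the SCD bullet above, the history-preserving load can be made explicit in plain SQL. The sketch below is a minimal T-SQL illustration only; dim_customer, stg_customer, and the tracked columns are hypothetical names rather than objects from the actual project, and in BODS the same logic is produced by the Table Comparison, History Preserving, and Key Generation transforms.

```sql
-- Minimal SCD Type 2 sketch (T-SQL). Table and column names are assumptions.

-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE d
SET    d.valid_to   = CAST(GETDATE() AS DATE),
       d.is_current = 0
FROM   dim_customer AS d
JOIN   stg_customer AS s
  ON   s.customer_id = d.customer_id
WHERE  d.is_current = 1
  AND (s.customer_name <> d.customer_name OR s.city <> d.city);

-- Step 2: insert a current row for every new or just-expired customer.
INSERT INTO dim_customer (customer_id, customer_name, city, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.city,
       CAST(GETDATE() AS DATE), '9999-12-31', 1
FROM   stg_customer AS s
LEFT JOIN dim_customer AS d
  ON   d.customer_id = s.customer_id
 AND   d.is_current  = 1
WHERE  d.customer_id IS NULL;
```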

Environment: SAP BO Data Services (Ver 4.1/Ver 4.2), SAP Information Steward (Ver. 4.2), SAP Business Objects XI R3.1, Oracle 11g, Designer, SQL Server 2008, Windows Server 2008 R2, HANA 1.x, IBP 1708/1711, HCI DSoD

Confidential, Santa Clara, CA

Data Warehousing Analyst/ SAP Data Services (Solution Developer)

Responsibilities:

  • Extensively used SAP Data Services to create ETL jobs and put data into staging tables and transform it further to populate the data warehouse tables.
  • Responsible for Extracting Data from ECC and other legacy systems into SAP BW and SQL server using SAP Data Services (Data Integrator).
  • Responsible for the execution, monitoring and scheduling of data services jobs using Data Services Management Console (Administrator) and used web services to publish the jobs.
  • Defined datastores and configurations in SAP Data Services to connect to R/3 and SAP BW source and target systems.
  • Used SAP Business Objects Data Services to create work flows and data flows to load the data into SAP BW staging tables (PSA) and transform it further to populate the DW tables.
  • Created several jobs for loading data into SQL Server 2008 from flat files, data files, Excel files, Oracle tables, and Oracle views.
  • Created jobs to perform the full initial data load and then switched the jobs to delta mode to append delta data to the existing data warehouse (EDW); a watermark-based SQL sketch of this delta pattern follows this list.
  • Involved in SAP R/3 source extraction using SAP source extractors in BODS 4.0.
  • Responsible for pushing the daily delta data load to SAP APO.
  • Defined file formats to use as source and target files and developed several reusable transformations and custom functions.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Involved in migrating jobs from development to testing and finally to production.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Involved in error handling and performance tuning of data flows.
  • Expertise in dimensional data modeling, Star Schema Modeling, Snow-Flake Modeling, fact and dimensional table design, physical and logical data modeling.
  • Extensively used try/catch blocks to handle exceptions and wrote scripts to automate the job process.
  • Responsible for creating and maintaining database objects and structures (e.g., tables, indexes, views, triggers, functions, and procedures).
  • Designed and maintained Universes and exported them to the Repository for making resources available to the users.
  • Responsible for creating universes with Business Objects Universe Designer by building the data model, selecting and joining tables, and indicating cardinalities.
  • Designed, developed and managed Universes in the Designer for report generation.
  • Developed Web Intelligence Reports for End Users.
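
The delta-mode bullet above refers to appending only changed rows after the initial full load. A minimal watermark-based sketch of that pattern is shown below in T-SQL; etl_control, stg_sales, dw_sales, and the changed_at column are assumed names for illustration, not the project's actual objects.

```sql
-- Minimal watermark-driven delta append (T-SQL). Object names are assumptions.
DECLARE @last_load DATETIME;

-- Read the high-water mark saved by the previous run.
SELECT @last_load = last_loaded_at
FROM   etl_control
WHERE  job_name = 'SALES_DELTA';

-- Append only the rows that changed since the previous run.
INSERT INTO dw_sales (sales_id, customer_id, amount, changed_at)
SELECT s.sales_id, s.customer_id, s.amount, s.changed_at
FROM   stg_sales AS s
WHERE  s.changed_at > @last_load;

-- Advance the watermark; keep the old value if the staging table is empty.
UPDATE etl_control
SET    last_loaded_at = COALESCE((SELECT MAX(changed_at) FROM stg_sales), last_loaded_at)
WHERE  job_name = 'SALES_DELTA';
```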

Environment: SAP BO Data Services (Ver.4.0), SAP Business Objects XI R3.1, Oracle 11g, Designer, SQL Server 2008, Windows Server 2008 R2.

Confidential, San Francisco, CA

SAP BODS/ETL Developer

Responsibilities:

  • Developed and supported the Extraction, Transformation and Load (ETL) process using Business Objects Data Services to populate tables in the data warehouse and data marts.
  • Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Webi Reports.
  • Defined Data Stores and Configurations to connect to R/3 and SAP BW source and target systems.
  • Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.
  • Loaded SAP R/3 tables and developed transformations that apply the business rules given by the client and loaded the data into the target database.
  • Identified bugs in existing mappings by analyzing data flows and evaluating transformations, fixed the bugs, and redesigned the mappings to improve performance.
  • Used SAP Data Services Data Quality to develop components that would help to clean data elements like addresses and match entity names.
  • Expert on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Extensively used Query, Map Operation, Table Comparison, lookup function, Merge, Case, SQL, and Validation transforms to load data from source to target systems (a SQL sketch of the validation and lookup pattern follows this list).
  • Created starting and ending scripts for each job, scripts to send job notifications to users, and declared local and global variables.
  • Defined file formats to use as source and target files.
  • Performed ABAP data flow processing using Data Services.
  • Extensively used try/catch blocks to handle exceptions and wrote scripts to automate the job process.
  • Migrated jobs and workflows from development to test and production servers to perform integration and system testing.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Performed unit testing and end-to-end data validation testing of all mappings, and supported UAT.
  • Used check-in/check-out procedure to import and export projects/jobs for secure access to central repository objects and maintaining version history.
  • Experience in Debugging and Performance Tuning of targets, sources and mappings.
  • Created several jobs for loading data into Teradata 13.0 from flat files, data files, Excel files, Oracle views, and SQL tables.
  • Designed, developed, deployed and maintained universes using Business Objects designer.
  • Developed Custom hierarchies to support drill up/down reports, and also created cascading LOVs (List of Values) to be used for cascading prompts.
  • Developed Complex Web Intelligence Reports for End Users.
  • Designed and developed extensive Webi reports using multiple Queries, combination of chart and tables.
  • Used cascading prompts in the reports to create more interactive reports for the users.
  • Used Business Objects free hand SQL to generate Ad-hoc reports.
  • Built reports that have multiple display objects on a single page (tables and charts) in Web Intelligence for user friendly analysis of data.
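
The validation and lookup bullet above can be read as a two-way split: rows failing a rule go to a reject table, while passing rows are enriched through a lookup join and loaded to the target. The sketch below expresses that in plain SQL; stg_material, ref_plant, dw_material, and their columns are hypothetical names used only to show the shape of the pattern.

```sql
-- Minimal validation + lookup sketch in SQL. Object names are assumptions.

-- Failing rows: plant code not found in the reference table -> reject table.
INSERT INTO stg_material_reject (material_id, plant_code, base_uom, reject_reason)
SELECT m.material_id, m.plant_code, m.base_uom, 'Unknown plant code'
FROM   stg_material AS m
LEFT JOIN ref_plant AS p
  ON   p.plant_code = m.plant_code
WHERE  p.plant_code IS NULL;

-- Passing rows: enrich with the looked-up plant name and load the target.
INSERT INTO dw_material (material_id, plant_code, plant_name, base_uom)
SELECT m.material_id, m.plant_code, p.plant_name, m.base_uom
FROM   stg_material AS m
JOIN   ref_plant AS p
  ON   p.plant_code = m.plant_code;
```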

Environment: SAP Data Services XI (Ver. 3.2), SAP Business Objects 4.0, Designer, Web Intelligence, Info View, Crystal Reports 2008, Teradata 13.0 (Teradata SQL Assistant), Oracle 10g, MS-SQL, SQL Server, Toad, Windows 2003 Server.

Confidential

SQL Developer

Responsibilities:

  • Designed, developed, and maintained relational databases.
  • Documented the business requirements and technical requirements (database perspective) in Confluence.
  • Worked on creating databases and various SQL Server objects such as schemas, tables, indexes, and indexed views per the specifications and requirements to meet business functionality.
  • Worked with business users to design the database model and maintain referential relationships for data integrity.
  • Translated business needs into data analysis, business intelligence data sources and reporting solutions for the clients depending upon business requirement.
  • Created indexes to improve performance and retrieve data faster (an illustrative T-SQL index sketch follows this list).
  • Expert in using tools like MS SQL Profiler, Database Tuning Wizard, Windows Performance for monitoring and tuning SQL Server performance.
  • Recovered databases (from backups) to a specific point in time, per requests.
  • Troubleshot various problems that arose in day-to-day work and fixed the issues.
  • Created views and assisted the QA team in performing data validation.
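
The index bullet above is illustrated with a minimal T-SQL sketch; dbo.Orders and the index definitions are hypothetical and only stand in for the kind of covering and filtered indexes typically added for frequent query patterns.

```sql
-- Minimal index sketch (T-SQL). Table, column, and index names are assumptions.

-- Covering index for a frequent lookup by customer and order date;
-- INCLUDE avoids key lookups for the selected columns.
CREATE NONCLUSTERED INDEX IX_Orders_Customer_OrderDate
ON dbo.Orders (CustomerID, OrderDate)
INCLUDE (OrderTotal, OrderStatus);

-- Filtered index covering only the small, frequently queried subset of rows.
CREATE NONCLUSTERED INDEX IX_Orders_Open
ON dbo.Orders (OrderDate)
WHERE OrderStatus = 'OPEN';
```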

Confidential

SQL Developer

Responsibilities:

  • Analyze project information requirements, data relationships, and attributes, resulting in data flows, program structures, and outputs for the projects using Crystal Reports and other tools.
  • Provide support to the finance team in reconciling the financial systems.
  • Analysis, design, production support, tuning and maintenance of Oracle stored procedures and SQL Scripts.
  • Develop an in-depth knowledge of the company’s data-related process and systems. Work on problems of diverse scope where analysis of situations or data requires a review of a variety of factors.
  • Code Oracle PL/SQL procedures and operations, perform code reviews related to database objects such as views, tables, stored procedures, functions, and packages, and handle ad-hoc query writing, SQL tuning, data-loading tasks, and documentation (an illustrative PL/SQL sketch follows this list).
  • Provide direct support to the end-users.
  • Provide recommendations for improving processes and procedures.
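
A minimal sketch of the kind of Oracle PL/SQL procedure referenced above; rpt_daily_balance, gl_transactions, and the delete-and-reload logic are assumptions chosen only to show the pattern (a parameterized refresh with basic exception handling).

```sql
-- Minimal Oracle PL/SQL sketch. Object names and logic are assumptions.
CREATE OR REPLACE PROCEDURE load_daily_balances (p_as_of_date IN DATE) AS
BEGIN
  -- Refresh the reporting table for the requested business date.
  DELETE FROM rpt_daily_balance
  WHERE  as_of_date = p_as_of_date;

  INSERT INTO rpt_daily_balance (account_id, as_of_date, balance)
  SELECT t.account_id, p_as_of_date, SUM(t.amount)
  FROM   gl_transactions t
  WHERE  t.posting_date <= p_as_of_date
  GROUP BY t.account_id;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END load_daily_balances;
/
```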
