
SAP BODS Developer Resume


Waltham, MA

SUMMARY

  • 8 years of experience in Enterprise Data Warehousing; well versed in extracting, transforming, and loading data from various sources into Data Warehouses and Data Marts using the Business Objects EIM suite of products, including Data Services (3.x, 4.x), Information Steward 4.0, Data Quality, Address Directories, and Data Insight 3.1.
  • Expertise in implementing Data Quality (data enrichment, de-duplication, etc.) using components available in SAP Data Services (Global Address Cleanse, Merge, Match, etc.).
  • Excellent knowledge of reporting with SAP Business Objects XI R2, XI R3, and 4.0 (SP01, SP02, FP3, SP04) and SAP Integrated Planning; created reports and dashboards using RDBMS universes and OLAP universes (on top of BEx).
  • An enthusiastic and project-oriented team player with solid communication and leadership skills and the ability to develop creative solutions for challenging client needs.
  • Experience in creating and administering Data Services metadata repositories in a multi-user environment.
  • Experience working with BODI across different data sources (flat files, Oracle, SAP ECC, Microsoft SQL Server) and loading data into SAP BW and SAP ECC systems using IDocs.
  • Experience in tuning ETL jobs, recovery mechanisms, and error handling, as well as setting up validation rules.
  • Good working knowledge of Data Services 3.2; worked with different transport methods (direct download, shared directory, FTP, and custom transfer) to import data from Salesforce.com, and with Data Quality transforms such as Match, Data Cleanse, and Address Cleanse to parse and standardize data.
  • Exposure to Business Objects administration: CMS, CCM, Import Wizard, user-level access, and restriction scenarios.
  • Hands-on experience with migration scenarios and with scheduling and publishing reports.
  • Good experience in dimensional data modeling, including star and snowflake schemas.
  • Strong experience in creating ETL batch jobs using the SAP BODS Designer and in monitoring and scheduling jobs using the Management (Admin) Console.
  • BI reporting using SAP BO XI R2/R3 and 4.x (Web Intelligence, Crystal Reports, and Xcelsius) and SAP BI BEx.
  • Created semantic layers over OLAP cubes from scratch and integrated SAP BW and SAP applications with BO XI R3.1 and 4.0; knowledge of data modeling tools such as ERwin and database tools such as SQL Developer and TOAD.

Core Competencies

  • Dimensional modeling using ETL tools such as SAP BODS; exposure to SAP ECC systems and Oracle SQL and PL/SQL; implemented performance tuning on ETL data flows and semantic layers.
  • Provided consulting on multiple end-to-end SAP and non-SAP implementations, gathering and analyzing business requirements and designing the best possible solutions in terms of system performance, cost, and time.
  • Excellent hands-on experience across the Business Intelligence domain: DW architecture, data modeling, ETL design and development, and reporting design and development.
  • Exposure to loading data into the SAP HANA database from SAP and non-SAP systems using RFC connections.
  • Experience using the Import Wizard to create BIAR (Business Intelligence Archive Resource) files to migrate universes and reports from the development environment to all other environments.

TECHNICAL SKILLS

ETL Tools: SAP BO Data Services (BODS), SAP BO Data Integrator (BODI), Informatica PowerCenter.

Business Objects: SAP Business Objects, SAP Business Objects Integration kit, Xcelsius.

Reporting Tools: BEx Designer, Web Application Designer, Web Intelligence, Xcelsius Dashboards, Live Office, Crystal Reports

Databases: Oracle, SQL Server

Languages: SQL, PL/SQL, UNIX Shell Scripting

Workflow tools: MS-Excel, MS-VISIO, MS-PowerPoint, MS-Word

Operating Systems: Windows NT, Windows 2000 /XP, UNIX

IDE: TOAD

PROFESSIONAL EXPERIENCE

Confidential, Waltham, MA

SAP BODS Developer

Responsibilities:

  • Participated in project planning sessions with project managers, business analysts and team members to analyze business requirements and outline the proposed solution.
  • Responsible for talking to business users and finalizing any enhancements or new requirements needed by the reporting department.
  • Developed Source to Target Mapping as per the requirements.
  • ETL design (converting business functional specifications into mappings/workflows) and testing.
  • Exported Data Integrator jobs to different repositories for backup and code migration purposes.
  • Tuned performance for large data files by increasing block size and cache size, and implemented performance tuning logic on sources, workflows, and data flows.
  • Performance tuning using recovery mechanisms.
  • Implemented server groups to use advanced performance tuning features such as parallel data flows, splitting data flows into sub data flows, and Degree of Parallelism.
  • Involved in writing DS scripts; used built-in functions such as Search, Replace, and lookup_ext, and custom functions such as sending an email whenever an exception is raised.
  • Created Data Services mappings to load the data warehouse; the mappings involved extensive use of simple and complex transforms such as Key Generation, SQL Query, Table Comparison, Case, Validation, Merge, and Lookup in data flows (a SQL sketch of comparable upsert logic appears after this list).
  • Involved in System integration testing along with other team members using various source data scenarios.
  • Experienced in scheduling and monitoring of jobs using DI management console during pre-production phases.
  • Developed Data Integrator jobs as per requirement.
  • Worked on creating repositories and associating the Job Server with them.
  • Created batch jobs using the BODS ETL tool to extract SAP and non-SAP data and build the EDW.
  • Daily monitoring of regularly scheduled jobs.
  • Designed Business Objects semantic layers (universes) over SAP and non-SAP warehouses.
  • Developed critical reports such as drill-down, slice-and-dice, and master/detail reports for analysis.
  • Created complex reports using multiple data providers and synchronized the data providers by linking common objects.
  • Created reports containing charts showing revenue, bookings, backlog, and operating income by vertical and company code.
  • Implemented Business Objects security features such as row-level, object-level, and report-level security to keep data secure.
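
For illustration only: a minimal SQL sketch of the kind of insert/update (upsert) logic that a Table Comparison transform feeding a dimension target effectively produces. This is not code from the engagement; the table, column, and sequence names (STG_CUSTOMER, DIM_CUSTOMER, DIM_CUSTOMER_SEQ) are assumed for the example.

    -- Hypothetical upsert of a customer dimension from a staging table,
    -- comparable to Table Comparison + Key Generation output in a data flow.
    MERGE INTO dim_customer d
    USING stg_customer s
       ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET d.customer_name = s.customer_name,
                 d.city          = s.city,
                 d.last_update   = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (customer_key, customer_id, customer_name, city, last_update)
      VALUES (dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.city, SYSDATE);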

Environment: SAP BODS 4.0, SAP BO XI R3, MS SQL Server 2008, Oracle 10g

Confidential - Stamford, CT

SAP BODS Developer

Responsibilities:

  • Worked closely with business leaders to understand and analyze requirements, and set up the Dev, UAT, and Production environments.
  • Interacted with the client, managers, SMEs, BAs, and the database team.
  • Coordinated with other teams to ensure availability of source data.
  • Responded to customer requests in a timely manner.
  • No customer complaints regarding delivery or quality of deliverables.
  • Developed feeds efficiently according to the functional specification document.
  • Ensured completeness of deliverables without rework.
  • Involved in performance tuning, unit testing, and UAT testing.
  • Supported the UAT cycle for every major release.
  • Deployed releases into production and upgraded the production environment.
  • Defined new processes for the application.
  • Set up new security rules for users and developers.
  • Created complex workflows and promoted code from one environment to another.
  • Deployed code in production.
  • Handled production (PRD) and release issues.
  • Provided estimates for implementation work.
  • Business Objects Data Services configuration, repository creation, and server management.
  • Central Repository configuration and best practices for version control.
  • Data profile management.
  • Created custom Data Quality packages and rules using SAP Information Steward (an illustrative SQL validation check appears after this list).
  • Scheduled jobs and managed users in the Management Console.
  • ETL design and use of complex ETL routines to extract data from the ODS (converting business functional specifications into mappings/workflows).
  • Liaised with business users and business analysts regarding requirements.
  • Developed source-to-target mappings as per the requirements and exported Data Integrator jobs to different repositories for code migration purposes.
  • Tuned performance for large data files by increasing block size and cache size, and implemented performance tuning logic on sources, workflows, and data flows.
  • Performance tuning using recovery mechanisms.
  • Implemented server groups to use advanced performance tuning features such as parallel data flows, splitting data flows into sub data flows, and Degree of Parallelism.
  • Involved in writing DS scripts; used built-in functions such as Search, Replace, and lookup_ext, and custom functions such as sending an email whenever an exception is raised.
  • Created Data Services mappings to load the data warehouse; the mappings involved extensive use of simple and complex transforms such as Key Generation, SQL Query, Table Comparison, Case, Validation, Merge, and Lookup in data flows.
  • Involved in System integration testing along with other team members using various source data scenarios.
  • Experienced in scheduling and monitoring of jobs using DI management console during pre-production phases.
  • Developed Data Integrator jobs as per requirement.
  • Worked on creating repositories and associating the Job Server with them.
  • Involved in code enhancement, testing based on change request (CR) from end users.
  • Experience in debugging execution errors using Data Services logs (trace, statistics, and error) and by examining the target data.
  • Worked on an enterprise-wide Data Quality dashboard.
  • Guided the development team from requirements through deployment.
  • Led a team of 10 across onsite and offshore.
  • Handled project management for different assignments.
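
For illustration only: a minimal SQL sketch of the kind of completeness and format checks that a custom data quality rule encodes. This is not code from the engagement; the table and column names (STG_CUSTOMER, EMAIL, POSTAL_CODE) are assumed for the example.

    -- Hypothetical data quality check: flag rows that fail basic
    -- completeness and format validations before they reach the warehouse.
    SELECT customer_id,
           CASE
             WHEN email IS NULL
                  OR NOT REGEXP_LIKE(email, '^[^@]+@[^@]+\.[^@]+$') THEN 'INVALID_EMAIL'
             WHEN postal_code IS NULL
                  OR LENGTH(TRIM(postal_code)) < 5                  THEN 'INVALID_POSTAL_CODE'
             ELSE 'PASS'
           END AS dq_status
      FROM stg_customer;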

Environment: SAP Business Objects Data Services (BODS) 4.0, SAP Information Steward 4.0, MS SQL Server 2008, Oracle 10g

Confidential, Foster City, CA

SAP BODS Developer

Responsibilities:

  • Analyzed the requirement specifications.
  • Developed SQL queries to verify that data loaded correctly (a sample reconciliation query appears after this list), performed unit testing at various levels of the ETL, and developed and implemented BODS jobs.
  • Debugged data flows using the debugger and breakpoints.
  • Prepared Technical Specification and Resolution Document.
  • Performed technical unit testing and regression/volume testing.
  • Daily monitoring of active batch jobs that trigger BODS jobs and BOBJ schedules.
  • Involved in performance tuning, identifying and resolving performance bottlenecks at the BODS and database levels.
  • Within a span of 2 months, resolved most of the major performance and design issues, notably improving the load time for daily extracts.
  • Coordinated with the DBA, SAP Functional, SAP Technical, Universe, and Reporting teams for day-to-day work.
  • Followed client coding standards and SDLC.
  • Worked with the internal ticketing system and kept task delivery on schedule.
  • Developed project documents to be used in the future by new or existing users to understand the scope of ETL designs and load processes.
  • Handled ETL-related issues and was quick to propose solutions.
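
For illustration only: a minimal SQL sketch of the kind of source-to-target reconciliation query used during unit testing to confirm a load completed correctly. This is not code from the engagement; the table and column names (STG_SALES, FACT_SALES, SALE_AMOUNT) are assumed for the example.

    -- Hypothetical reconciliation: compare row counts and an amount total
    -- between the staging layer and the loaded warehouse fact table.
    SELECT 'STG_SALES' AS layer, COUNT(*) AS row_cnt, SUM(sale_amount) AS total_amount
      FROM stg_sales
    UNION ALL
    SELECT 'FACT_SALES' AS layer, COUNT(*) AS row_cnt, SUM(sale_amount) AS total_amount
      FROM fact_sales;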

Environment: SAP BODS, Rapid Marts, BOBJ WebI reporting, SQL Server, Windows Server 2008

Confidential, Charlotte, NC

Informatica Developer

Responsibilities:

  • Worked with heterogeneous sources from various channels like Oracle and flat files.
  • Used Informatica PowerCenter Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer).
  • Used transformations Source Qualifier, Router, Expression, Aggregator, Lookup, Update Strategy, and Sequence Generator.
  • Wrote SQL-Overrides in Source Qualifier.
  • Implemented Slowly Changing Dimension (SCD) Type II logic to maintain history (a SQL sketch of the expire-and-insert pattern appears after this list).
  • Extensively worked in the performance tuning of ETL mappings and sessions.
  • Used the Lookup transformation to access data from tables that are not sources in the mapping, and used Unconnected Lookups to improve performance.
  • Created reusable transformations and used them in different mappings.
  • Developed incremental loading through Informatica mappings.
  • Created simple Cognos Report Studio reports as part of the console-free solution.
  • Created a Cognos Framework Manager model using stored procedures.
  • Involved in fixing invalid mappings, testing of sessions and the target data.
  • Used TOAD to evaluate SQL execution.
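
For illustration only: a minimal SQL sketch of the expire-and-insert pattern behind SCD Type II history maintenance. This is not code from the engagement; the table, column, and sequence names (STG_PRODUCT, DIM_PRODUCT, DIM_PRODUCT_SEQ) are assumed for the example.

    -- Hypothetical SCD Type II step 1: close out the current dimension row
    -- when a tracked attribute has changed in the incoming data.
    UPDATE dim_product d
       SET d.effective_end_date = SYSDATE,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_product s
                    WHERE s.product_id   = d.product_id
                      AND s.product_name <> d.product_name);

    -- Hypothetical SCD Type II step 2: insert the new current version for
    -- changed products and first-time rows for brand-new products.
    INSERT INTO dim_product
      (product_key, product_id, product_name, effective_start_date, effective_end_date, current_flag)
    SELECT dim_product_seq.NEXTVAL, s.product_id, s.product_name, SYSDATE, NULL, 'Y'
      FROM stg_product s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_product d
                        WHERE d.product_id   = s.product_id
                          AND d.current_flag = 'Y');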

Environment: Informatica PowerCenter 8.1.1, Cognos Report Studio, Oracle 9i, Windows XP, UNIX and TOAD.

Confidential, Rockford, IL

Informatica Developer

Responsibilities:

  • Extensively involved in system study and business requirements analysis; translated business requirements into the data mart design.
  • Worked with the Informatica PowerCenter tools Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Repository Manager, and Workflow Manager; the objective was to extract data stored in Oracle.
  • The project involved extracting data from different data providers and transforming the data from these files before loading it into target (warehouse) Oracle tables.
  • Used Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.
  • Developed ETL procedures to ensure conformity, compliance with standards, and lack of redundancy; translated business rules and functionality requirements into ETL procedures.
  • Designed and developed complex aggregate, join, and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions, using Informatica PowerCenter.
  • Created the mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Set up batches and sessions to schedule loads at the required frequency using PowerCenter Server Manager; generated completion messages and status reports with Server Manager.
  • Developed scripts, PL/SQL stored procedures, and triggers to automate certain tasks outside of Informatica (a PL/SQL sketch appears after this list).
  • Configured sessions using Workflow Manager with multiple partitions on source data to improve performance.
  • Extensively used the Debugger.
  • Extensively worked on performance tuning of programs, ETL procedures, and processes.
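
For illustration only: a minimal PL/SQL sketch of the kind of stored procedure used to automate housekeeping outside of Informatica, such as writing a load-audit record. This is not code from the engagement; the procedure, table, and column names (LOG_LOAD_STATUS, ETL_LOAD_AUDIT) are assumed for the example.

    -- Hypothetical audit procedure: record the outcome of an ETL load so
    -- downstream jobs and operators can check status outside Informatica.
    CREATE OR REPLACE PROCEDURE log_load_status (
        p_job_name  IN VARCHAR2,
        p_row_count IN NUMBER,
        p_status    IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO etl_load_audit (job_name, row_count, status, load_date)
        VALUES (p_job_name, p_row_count, p_status, SYSDATE);
        COMMIT;
    END log_load_status;
    /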

Environment: Oracle 9i, Informatica PowerCenter 7.1.1, SQL, PL/SQL, TOAD
