BODS Developer Resume

SUMMARY

  • 10 years of professional IT experience in analyzing requirements and designing and building highly distributed, mission-critical products and applications.
  • Worked as an ETL and SQL developer and as an SAP BODS Technical Consultant using Business Objects Data Integrator/Data Services (BODI/DS 3.2, 4.0, and 4.2).
  • Expert in data extraction, transformation, and loading (ETL) using Business Objects Data Integrator XI (BODI), Data Services XI 3.2/4.0 (BODS), and Business Objects Web Intelligence reporting.
  • Implemented RFC connections and tested them to interact with SAP systems.
  • Implemented ABAP data flows in SAP BODS to extract data from the SAP ECC system.
  • Implemented Slowly Changing Dimension Type 2 (SCD Type 2) for the required dimensions using the Table Comparison, History Preserving, and Key Generation transforms.
  • Used the LOOKUP_EXT function to derive columns in BODS ETL by looking up values in validity-type lookup tables.
  • Experience in debugging execution errors using the Data Services logs (trace, statistics, and error).
  • Tested data flows and scripts against sample data and real data.
  • Worked on end-to-end implementations with the data warehousing team; strong understanding of data warehouse concepts such as star schemas, snowflake schemas, and multi-dimensional models for query and analysis requirements.
  • Experience in automating jobs through the Management Console and creating batch jobs based on client requirements.
  • Configured a local repository for each application environment.
  • Working knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of phases such as requirements, analysis/design, development, project management, and testing.
  • Knowledge of developing logical and physical database schemas using data modeling.
  • Experienced with SQL queries, views, and stored procedures.
  • Extensively worked on metadata management for user-provided files.
  • Knowledge of Repository Manager configuration, Job Server configuration, and the Management Console.
  • Experience in requirement gathering, fit-gap analysis, blueprinting, build, configuration, testing, cutover, and go-live activities for data migration.
  • Experience in the release management process across development, testing, and production.
  • Worked with SAP Information Steward on tasks such as creating rules, binding rules, scorecard setup, and data profiling.
  • Prepared technical documents such as data mapping documents and unit-testing evidence documents.
  • Worked on validations for preproduction and production deployments.
  • Troubleshot failed jobs in the preproduction and production environments.
  • Communicated and coordinated with the offshore team.
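
The SCD Type 2 pattern above (Table Comparison to detect changes, History Preserving to close out old versions, Key Generation for surrogate keys) can be sketched outside of BODS. The following is a minimal Python illustration of that logic, not the BODS implementation itself; all table and column names are hypothetical.

```python
from copy import deepcopy

def apply_scd2(dim, incoming, nat_key, compare_cols, load_date):
    """Emulate Table Comparison + History Preserving + Key Generation.

    dim: list of dict rows with 'sk' (surrogate key), the natural key,
    the compared columns, and 'valid_from'/'valid_to'. Open rows carry
    valid_to == '9999-12-31'. All names are illustrative.
    """
    dim = deepcopy(dim)
    next_sk = max((r["sk"] for r in dim), default=0) + 1  # Key Generation
    open_rows = {r[nat_key]: r for r in dim if r["valid_to"] == "9999-12-31"}
    for rec in incoming:
        cur = open_rows.get(rec[nat_key])
        changed = cur is not None and any(cur[c] != rec[c] for c in compare_cols)
        if cur is not None and changed:
            cur["valid_to"] = load_date  # History Preserving: close old version
        if cur is None or changed:      # Table Comparison: INSERT or new version
            row = {"sk": next_sk, nat_key: rec[nat_key],
                   **{c: rec[c] for c in compare_cols},
                   "valid_from": load_date, "valid_to": "9999-12-31"}
            dim.append(row)
            open_rows[rec[nat_key]] = row
            next_sk += 1
    return dim
```

An unchanged incoming row produces no new version; a changed row closes the current version and appends a new one with a fresh surrogate key.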

TECHNICAL SKILLS

ETL Tools: Business Objects Data Services XI 3.2/4.0/4.2

ERP: SAP R/3 (4.6C/4.7), ECC 5.0, ECC 6.0, SAP S/4HANA

Databases: SAP HANA, Teradata, Oracle 10g, SQL Server 2005/2008.

Reporting: SAP BusinessObjects, Web Intelligence (WebI) reports

Languages and Others: SQL; Big Data tools: Hadoop HDFS, MapReduce, Pig, Sqoop; BPM tool: Pega

Operating Systems: MS-DOS, Windows 9x, Windows Server 200x, and Linux

Packages/Tools: Remedy, HPQC

Programming Languages: C, C++, Java, SQL, PL/SQL

PROFESSIONAL EXPERIENCE

Confidential

BODS Developer

Responsibilities:

  • Designed, developed, unit tested, and implemented extract, transform, and load (ETL) processes, programs, and scripts.
  • Involved in preparing DMRs, built jobs for different subject areas, and tested them thoroughly.
  • Created project flows consisting of jobs, workflows, and data flows.
  • Monitored daily scheduled ETL jobs as part of production support.
  • Troubleshot issues and fixed them.
  • Worked on code migration from lower to higher environments (DEV and QA to PROD).
  • Scheduled jobs in ESP; set up the Data Services built-in and third-party schedules.
  • Ran month-end load jobs on an ad hoc basis.
  • Wrote SQL scripts for database changes.
  • Worked on incidents, debugging issues to resolution.
  • Created batch files and scheduled them in a third-party tool.
  • Performed ad hoc execution of jobs in the Management Console.
  • Used the LOOKUP_EXT function to derive columns in BODS ETL by looking up values in validity-type lookup tables.
  • Resolved issues based on requests submitted by users.
  • Created new datastores to connect to different servers.
  • Implemented a recovery mechanism for unsuccessful batch job executions.
  • Provided on-call support to fix job failures.
  • Worked with the Map Operation, Case, Table Comparison, History Preserving, and Key Generation transforms.
  • Identified metrics and measurements for data quality scorecards.
  • Automated SAP DS/IS jobs using built-in and third-party schedulers.
  • Attended daily meetings to discuss requirements and resolve open issues.
  • Used ABAP data flows in BODS while loading data from SAP systems.
  • Developed and executed advanced data profiling and data validation scenarios.
  • Prepared mapping documents and deployment instruction documents and shared them in business folders.
  • Worked with SAP Information Steward (rules development, profiling rules development, scorecard setup, and basic IS administration).
  • Created rules and bound them against the source data.
  • Developed scorecards, dashboards, and Data Quality mappings.
  • Maintained BODS security per client specifications and interacted with the client on daily status calls.
  • Reloaded data for periods with missing data.
  • Created and managed jobs for data extract and load.
  • Applied strong analysis and data-logic skills.
  • Identified bad records and their root causes and prevented them from loading into target tables.
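
The LOOKUP_EXT usage above, deriving a column from a validity-type lookup table, can be illustrated with a small Python sketch; it mirrors the lookup semantics only, and the table layout and names are hypothetical.

```python
def lookup_valid(lookup_rows, key_value, as_of, default=None):
    """Emulate a BODS lookup_ext() against a validity-type lookup table:
    return the value whose [valid_from, valid_to] interval covers as_of.
    Rows are dicts with 'key', 'valid_from', 'valid_to', 'value';
    dates are ISO strings, so string comparison is chronological."""
    for row in lookup_rows:
        if row["key"] == key_value and row["valid_from"] <= as_of <= row["valid_to"]:
            return row["value"]
    return default  # like lookup_ext's default when no interval matches
```

In the job itself this would be a lookup_ext call in a Query transform; the point here is only the interval-match rule.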

Confidential

BODS developer

Responsibilities:

  • Designed, developed, unit tested, and implemented extract, transform, and load (ETL) processes, programs, and scripts.
  • Involved in preparing data mapping reports, built jobs for different subject areas, and tested them thoroughly. Created project flows consisting of jobs, workflows, and data flows.
  • Monitored daily scheduled ETL and Informatica Cloud jobs as part of production support.
  • Troubleshot issues and fixed them.
  • Worked on code migration from lower to higher environments (DEV and QA to PROD).
  • Scheduled jobs in CA Workload Automation ESP Edition and in Informatica Cloud.
  • Created connections in Informatica Cloud and migrated code from lower environments to PROD.
  • Set up the Data Services built-in and third-party schedules.
  • Ran month-end load jobs on an ad hoc basis.
  • Wrote SQL scripts for database changes.
  • Good knowledge of Guidewire applications (ClaimCenter, PolicyCenter, and BillingCenter).
  • Knowledge of processing with DataHub and Guidewire InfoCenter.
  • Worked on incidents, debugging issues to resolution.
  • Created batch files and scheduled them in a third-party tool.
  • Performed ad hoc execution of jobs in the Management Console.
  • Used the LOOKUP_EXT function to derive columns in BODS ETL by looking up values in validity-type lookup tables.
  • Resolved issues based on requests submitted by users.
  • Created new datastores to connect to different servers.
  • Implemented a recovery mechanism for unsuccessful batch job executions.
  • Provided on-call support to fix job failures.
  • Developed and executed advanced data profiling and data validation scenarios.
  • Worked with the Map Operation, Case, Table Comparison, History Preserving, and Key Generation transforms.
  • Attended daily meetings to discuss requirements and resolve open issues.
  • Prepared mapping documents and deployment instruction documents and shared them in business folders.
  • Developed scorecards, dashboards, and Data Quality mappings.
  • Maintained BODS security per client specifications and interacted with the client on daily status calls.
  • Reloaded data for periods with missing data.
  • Created and managed jobs for data extract and load.
  • Identified bad records and their root causes and prevented them from loading into target tables.
  • Worked on daily and monthly manual loads.
  • Designed and implemented different types of data flows using transforms including Case, Merge, Validation, Map Operation, Data Transfer, Row Generation, and Table Comparison in Business Objects Data Services/Data Integrator.
  • Implemented SCD Type 2 for the customer-selected dimensions using the Table Comparison, History Preserving, and Key Generation transforms.
  • Worked with the source team to obtain the required source data.
  • Interacted daily with clients to deliver the results the business expected.
  • Executed complex jobs in debug mode and traced errors using logs.
  • Deployed jobs and validated them in the PPD and PROD environments.

Confidential

Consultant

Responsibilities:

  • Extracted data from SAP ECC extractors and tables and delivered it to Teradata targets and flat files using SAP BODS.
  • Implemented ABAP data flows in the SAP BODS environment to extract SAP ECC tables and IDocs.
  • Implemented RFC connections and tested them to communicate with SAP systems.
  • Defined a separate datastore for each database so Data Services could connect to the source or target database.
  • Strong experience in object development using extract and mapping/transform tools.
  • Experience writing SQL scripts.
  • Worked with SAP BW sources and targets such as Open Hub destinations and transfer structures in SAP BODS.
  • Designed, developed, unit tested, and implemented extract, transform, and load (ETL) processes, programs, and scripts.
  • Knowledge of executing LSMW scripts.
  • Worked with the Data Quality transforms.
  • Set up the DS and IS built-in and third-party schedules.
  • Performed test planning, processed/traced messages, researched errors, and conducted root cause analysis to identify corrective actions.
  • Implemented changes in jobs according to the change request documents.
  • Implemented parallel processing (multithreading) for better performance.
  • Extracted data from flat files and databases, moved the data into the staging area, transformed it, and loaded it into the Oracle target.
  • Developed different layers (Stage, SDL, and EDW - Enterprise Layer) to process the data.
  • Modified incoming data using the Platform and Data Integrator transforms.
  • Implemented SCD Type 2 for the customer-selected dimensions using the Table Comparison, History Preserving, and Key Generation transforms.
  • Designed and implemented different types of data flows using transforms including Case, Merge, Validation, Map Operation, Data Transfer, Row Generation, and Table Comparison in Business Objects Data Services/Data Integrator.
  • Used the Data Integrator Pivot and Reverse Pivot transforms to apply row-level and column-level transformations per reporting requirements, and also implemented the same logic without the Pivot transforms by writing SQL scripts.
  • Validated and executed batch jobs; tested data flows and scripts against sample and real data; created initial and incremental load jobs.
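
Pivoting rows to columns in plain SQL, as an alternative to the Pivot transform mentioned above, is usually done with conditional aggregation. A minimal sketch using SQLite as a stand-in database; the sales table, its columns, and the quarter values are hypothetical.

```python
import sqlite3

# CASE inside SUM turns each quarter's rows into its own column,
# achieving the pivot without any Pivot transform.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, quarter TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("East", "Q1", 100), ("East", "Q2", 150), ("West", "Q1", 80)])
rows = conn.execute("""
    SELECT region,
           SUM(CASE WHEN quarter = 'Q1' THEN amount ELSE 0 END) AS q1,
           SUM(CASE WHEN quarter = 'Q2' THEN amount ELSE 0 END) AS q2
    FROM sales
    GROUP BY region
    ORDER BY region
""").fetchall()
```

The same CASE-per-target-column pattern works on Oracle, Teradata, and SQL Server; a reverse pivot is the opposite direction, one UNION ALL branch per source column.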

Confidential

SAP BODS Consultant

Responsibilities:

  • Extracted data using BODS from SAP extractors, tables, SAP BW sources, flat files, Excel, and Oracle sources that are dynamic in structure.
  • Implemented dynamic SQL queries to load data into the target.
  • Designed, developed, unit tested, and implemented extract, transform, and load (ETL) processes, programs, and scripts.
  • Involved in preparing DMRs, built jobs for different subject areas, and tested them thoroughly.
  • Worked on requirement gathering, fit-gap analysis, blueprinting, build, configuration, testing, cutover, and go-live activities for data migration.
  • Worked extensively with the SAP BODI Designer components: projects, jobs, workflows, data flows, ABAP data flows, datastores, transforms, and formats.
  • Designed various mappings for extracting data from SAP source systems and relational tables.
  • Used the Designer to create source definitions, design targets, create jobs, and develop transformations.
  • Created different transforms for loading data into the Oracle database, e.g., the Query, Merge, Pivot, Case, Table Comparison, Key Generation, and Map Operation transforms.
  • Created embedded data flows to reuse logic.
  • Developed SCD Type 2 effective date ranges using the History Preserving transform according to the business requirements.
  • Implemented a recovery mechanism for unsuccessful batch job executions.
  • Worked with transforms such as Query, Table Comparison, SQL, CDC, Merge, Case, Key Generation, and Date Generation.
  • Used Merge, Merge Join, Union All, Derived Column, Audit, and other transforms to add columns with lineage metadata and other operational audit data.
  • Responsible for creating mapping and transformation specifications based on business requirements from various business teams and implementing them in Data Integrator jobs.
  • Migrated all of the team's projects from DEV to TEST to PPD to PROD and scheduled the jobs as part of the Administrator role.
  • Worked on standards, guidelines, and best-practice documentation, as well as functional and technical design specification documents.
  • Resolved tickets within the specified SLA time.
  • Implemented a conditional flow to run either a full load or a delta load.
  • Monitored jobs and reported failed jobs.
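
The full-versus-delta conditional above typically keys off the last successful run. A minimal Python sketch of that branching logic, with the timestamp column name purely illustrative:

```python
def plan_load(last_run_ts, source_rows, ts_col="updated_at"):
    """Mirror a conditional in the job: no previous successful run means
    a full load; otherwise take only rows changed since last_run_ts.
    Timestamps are ISO strings, so string comparison is chronological."""
    if last_run_ts is None:
        return "FULL", list(source_rows)          # initial/full load branch
    delta = [r for r in source_rows if r[ts_col] > last_run_ts]
    return "DELTA", delta                         # incremental branch
```

In BODS this would be a Conditional object whose expression checks a last-run global variable; the row filter corresponds to a WHERE clause on the source.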

Confidential

BODS Consultant

Responsibilities:

  • Designed, developed, unit tested, and implemented extract, transform, and load (ETL) processes, programs, and scripts.
  • Involved in preparing DMRs, built jobs for different subject areas, and tested them thoroughly.
  • Worked with Business Objects Data Integrator, using functions such as lookup, date, and conversion functions in mappings.
  • Created repositories and users and assigned security in the multi-team environment.
  • Worked on DI scripting and prepared a standard incremental-load template that enables job failure/success notifications.
  • Extracted data from relational databases including Oracle and Microsoft SQL Server and loaded it into different environments including BW, Oracle, SQL Server, and .txt and .csv files.
  • Worked with the BW Open Hub tables to extract data from BW.
  • Created global variables to read files from the shared location in the BODS Designer.
  • Conducted root cause analysis to resolve production problems and data issues; managed reporting metadata and data standardization for end-to-end integration.
  • Shared documents and test evidence in centralized shared folders.
  • Worked extensively on lifecycle management, moving reports across the development, testing, and production environments.
  • Validated and executed batch jobs; tested data flows and scripts against sample and real data; created initial and incremental load jobs.
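
The incremental-load template with success/failure notifications mentioned above can be sketched as a wrapper around the load step. This is an illustrative Python skeleton only; `load_fn` and `notify_fn` are hypothetical stand-ins for the job body and a mail/alert call.

```python
def run_incremental(load_fn, notify_fn, job_name):
    """Skeleton of an incremental-load template: run the load,
    then send a success or failure notification either way."""
    try:
        count = load_fn()                                   # the actual load step
        notify_fn(f"{job_name} succeeded: {count} rows loaded")
        return True
    except Exception as exc:                                # any load failure
        notify_fn(f"{job_name} FAILED: {exc}")
        return False
```

In DI scripting the equivalent lives in try/catch blocks around the data flows, with the notification in the catch and at the end of the job.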
