
Sr Etl Solutions Manager Resume


Irving, Texas

SUMMARY

  • IT professional with more than 10 years of development experience using SAP applications such as SAP Business Objects Platform, SAP Business Objects Data Services (BODS)/Data Integrator, Informatica, Information Steward, HCI - IBP and SAP HANA.
  • Highly experienced in creating tasks in SAP HANA Cloud Integration (HCI)/Cloud Platform Integration (CPI) with the ability to push data to IBP (Integrated Business Planning).
  • Strong experience in SAP BODS connecting SAP HANA and cloud technologies - GCP (Google Cloud Platform) using GCP buckets, and Azure for storage as well as blob containers.
  • Deep experience building BODS interfaces connecting Salesforce (Adapter) and performing bulk & delta loads.
  • Built and supported integrations managing data feeds between BW HANA and the Snowflake Data Warehouse.
  • 7 years of experience in MuleSoft, IBM Integration Bus, MQ/MFT, IBM DataPower, and SAP PO.
  • Experienced in Analysis, Design, Development and Implementation of Data Warehousing and Database applications using SAP Data Services (ETL), Business Objects Webi Reports.
  • Used SAP Data Services to connect to SAP ECC 6.0 and SAP BW as source and target systems; also loaded data from PSA to data targets using Transformations and DTPs.
  • Expertise in gathering business requirements, building Universes in the Designer, retrieving data using Universes and free-hand SQL, and creating complex ad-hoc reports using the Web Intelligence modules in Business Objects XI R3.
  • Perform unit and integrated systems testing, migrate and deploy Jobs and workflows to Production Servers, schedule and support ELS (Early Life Support) for integrations.
  • Experienced in Data Conversions, Data integration and Data Migration with specialization in BODS and expert on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Efficient in Optimizing, Debugging and testing SQL queries, calculation views and stored procedures written in HANA, Teradata, Oracle and SQL Server.
  • Excellent Technical, Interpersonal, and Communication skills along with Time and Project management.
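The Slowly Changing Dimension work called out above can be sketched as follows. This is an illustrative Python sketch of the Type 2 pattern only (in BODS this is done with the Table Comparison and History Preserving transforms); the field names are hypothetical:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "valid to" marker

def scd_type2_merge(target, source, today):
    """SCD Type 2: expire changed rows and append new versions,
    so the target table keeps full history.

    target: list of dicts with keys id, attrs, valid_from, valid_to
    source: dict mapping id -> latest attribute snapshot
    """
    current = {r["id"]: r for r in target if r["valid_to"] == HIGH_DATE}
    for key, attrs in source.items():
        row = current.get(key)
        if row is None:
            # brand-new key: insert its first version
            target.append({"id": key, "attrs": attrs,
                           "valid_from": today, "valid_to": HIGH_DATE})
        elif row["attrs"] != attrs:
            # changed: close out the old version, open a new one
            row["valid_to"] = today
            target.append({"id": key, "attrs": attrs,
                           "valid_from": today, "valid_to": HIGH_DATE})
    return target
```

Type 1 would simply overwrite `row["attrs"]` in place; Type 3 would keep the prior value in a separate "previous" column.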

PROFESSIONAL EXPERIENCE

Confidential, Irving, Texas

Sr ETL Solutions Manager

Responsibilities:

  • Participated in business requirements analysis, report designing and modeling, development of technical specifications, testing and Implementation.
  • Responsible for extracting data from ECC (using BAPI calls & the RFC direct read method) and other legacy systems into SAP BW, SQL Server and SAP HANA using SAP Data Services, MuleSoft & IIB.
  • Created the MULE ESB artifacts, created flows and configured the MULE configuration files and deployed the application.
  • Implemented integration flows using Mule Anypoint Studio to connect to REST, SOAP service, Oracle Database.
  • Designed and developed enterprise services using RAML-based APIs, used various transformers in Mule ESB based on the use case, and implemented custom transformations.
  • Worked on the MuleSoft Anypoint API Platform, designing and implementing Mule APIs.
  • Implemented JSON Logger functionality to check the JSON output in the API console logger.
  • Worked on the Mule API Gateway to apply policies to APIs as well as to manage security.
  • Also worked with proxy settings using the API Gateway for the APIs.
  • Extensively used Mule components including File, FTP, SFTP, Salesforce, Database, HTTPS, WMQ, Anypoint MQ and CloudHub.
  • Developed and consumed Amazon Web Services using DynamoDB, SQS queues, and SNS notifications to communicate with different departments.
  • Migrated Mule ESB 4.1.3 apps to Mule ESB 4.2.1 and updated all the dependencies.
  • Developed Mule flows to integrate Data from various sources into Database, from AWS Console topics and SQS queues.
  • Worked on Security Authorization parameters like OAuth, SSL, TLS Configurations and Secure properties using Vaulted Mule keys with Mule properties file editor.
  • Involved in creating http inbound and outbound flows, transformers, filtering, and Security of Mule Flows.
  • Created Mule flows using Endpoints, Connectors and Components in Mule ESB to communicate between client/server systems.
  • Deployed Mule ESB applications to the Anypoint Platform with API Gateway.
  • Designed jobs to get data from SAP BW using the Open Hub Destination, kicking off the process chains and also pushing the data to the BW DSOs using the external source system configurations.
  • Set up integrations connecting to GCP (Google Cloud Platform) buckets as storage and automated feeds from SAP HANA to GCP buckets.
  • Built integrations supporting sales/rebates/chargebacks data transfers to the Snowflake Data Warehouse.
  • Extensively used Azure Blob storage containers to deliver files and connections to Azure SQL Datawarehouse tables.
  • Developed SAP HANA Calculation views (both graphical and SQL Scripting), Stored Procedures and flowgraphs (SDI), using both SAP HANA Studio and Web IDE.
  • Involved in error handling and performance tuning of data flows.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Loaded data from multiple data sources to multiple data targets with different Transformations and DTPs by scheduling different updates.
  • Tuned performance for large data files by increasing block size, cache size and the target-based commit interval.
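The target-based commit interval mentioned in the last bullet can be illustrated with a small Python sketch (illustrative only, using SQLite in place of the actual targets; table and column names are hypothetical): committing once per batch rather than per row cuts transaction overhead on large files, and a failure loses at most one uncommitted batch.

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=1000):
    """Insert rows and commit every batch_size records,
    mimicking a target-based commit interval."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO target(id, val) VALUES (?, ?)",
                        rows[i:i + batch_size])
        conn.commit()  # commit per batch, not per row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(id INTEGER, val TEXT)")
load_in_batches(conn, [(i, f"row{i}") for i in range(2500)], batch_size=1000)
```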

Environment: Java 1.8, Mule ESB 4.1.3/4.2.1, Anypoint Studio, Mule Standalone Server, Splunk logs, CloudHub, GitHub, ActiveMQ, Salesforce, Postman, SOAP, REST, OAuth, Apache Maven 3.3.5, AWS, PuTTY, MUnit, RAML, JSON, DynamoDB, Runtime Fabric, GCP, EDI, SAP BO Data Services

Confidential, Santa Clara, CA

Data Warehousing Analyst/ SAP Data Services

Responsibilities:

  • Participated in business requirements analysis, report designing and modeling, development of technical specifications, testing and Implementation.
  • Extensively used SAP Data Services to create ETL jobs and put data into staging tables and transform it further to populate the data warehouse tables.
  • Responsible for Extracting Data from ECC and other legacy systems into SAP BW and SQL server using SAP Data Services (Data Integrator).
  • Responsible for the execution, monitoring and scheduling of data services jobs using Data Services Management Console (Administrator) and used web services to publish the jobs.
  • Defined Data Stores and Configurations in SAP Data services to connect to R/3, SAP BW, Source and Target systems.
  • Used SAP Business Objects Data Services to create work flows and data flows to load the data into SAP BW staging tables (PSA) and transform it further to populate the DW tables.
  • Developed complex mappings in BODS to load the data from various sources into the Data Mart, using different transformations like Query, Case Transformations, Table Comparison, Merge, Pivot, Reverse Pivot, Lookup, Key Generation, History Preserving, Date Generation, Map Operation and Validation etc.
  • Created several jobs for loading the data to SQL Server 2008 from flat files, data files, excel files, oracle tables, oracle views.
  • Created jobs in order to perform the full initial data load and then switched the jobs to Delta mode in order to append the delta data in the existing data warehouse (EDW).
  • Involved in SAP R/3 source extraction using SAP source extractors in BODS 4.0.
  • Responsible for pushing the daily delta data load to SAP APO.
  • Defined file format to use as source and target files and developed several reusable transformations and custom functions.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Involved in migrating jobs from development to testing and then finally to the production.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Involved in error handling and performance tuning of data flows.
  • Expertise in dimensional data modeling, Star Schema Modeling, Snow-Flake Modeling, fact and dimensional table design, physical and logical data modeling.
  • Extensively used TRY/CATCH blocks to Handle Exceptions and writing Scripts to automate the Job Process.
  • Responsible for creating and maintaining database objects and structures (e.g. tables, indexes, views, triggers, functions, procedures, etc.).
  • Designed and maintained Universes and exported them to the Repository for making resources available to the users.
  • Responsible for creating universes using Business Objects Universe Designer by building the Business Objects data model - selecting/joining tables and indicating cardinalities.
  • Designed, developed and managed Universes in the Designer for report generation.
  • Developed Web Intelligence Reports for End Users.
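The full-initial-then-delta pattern described above (load everything once, then append only new records) can be sketched in Python. This is an illustrative sketch, not the BODS implementation, which would typically use a Table Comparison transform; the key name is hypothetical:

```python
def delta_load(source_rows, warehouse, key="id"):
    """Append-only delta pass: insert only source rows whose key
    is not yet present in the warehouse, and report what was added."""
    existing = {row[key] for row in warehouse}
    new_rows = [r for r in source_rows if r[key] not in existing]
    warehouse.extend(new_rows)
    return new_rows
```

The initial full load is just this same function run against an empty warehouse; switching the job to delta mode means running it against the populated one.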

Environment: SAP BO Data Services (Ver.4.0), SAP Business Objects XI R3.1, Oracle 11g, Designer, SQL Server 2008, Windows Server 2008 R2.

Confidential, San Francisco, CA

SAP BODS/ETL Developer

Responsibilities:

  • Developed and supported in Extraction, Transformation and Load process (ETL) using Business Objects Data Services to populate the tables in Data warehouse and Data marts.
  • Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Webi Reports.
  • Defined Data Stores and Configurations to connect to R/3, SAP BW, Source and Target systems.
  • Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.
  • Loaded SAP R/3 tables and developed transformations that apply the business rules given by the client and loaded the data into the target database.
  • Identified bugs in existing mappings by analyzing the data flow and evaluating transformations, fixed the bugs, and redesigned the existing mappings to improve performance.
  • Used SAP Data Services Data Quality to develop components that would help to clean data elements like addresses and match entity names.
  • Expert on Slowly Changing Dimensions Type 1, Type 2 and Type 3 for inserting and updating Target tables for maintaining the history.
  • Extensively used Query Transform, Map Operations, Table Comparison, lookup function, Merge, Case, SQL, and Validation transforms in order to load data from source to target systems.
  • Created Scripts like Starting Script and Ending Script for each job, sending the job notification to the user scripts and declaring the Local and Global Variables.
  • Defined file format to use as source and target files.
  • Processed ABAP data flows using Data Services.
  • Worked on writing Functions to make the code reusable and used lookup functions to reference the reference table data.
  • Extensively used Try/Catch to handle exceptions and wrote scripts to automate the job process.
  • Experience in Migration of Jobs and workflows from Development to Test and to Production Servers to perform the integration and system testing.
  • Created physical design documents, technical specification documents for all ETL jobs.
  • Dealt with Unit Testing and data Validation testing of all the mappings end to end and also with UAT.
  • Used check-in/check-out procedure to import and export projects/jobs for secure access to central repository objects and maintaining version history.
  • Experience in Debugging and Performance Tuning of targets, sources and mappings.
  • Created several jobs for loading the data to Teradata 13.0 from flat files, data files, excel files, oracle views and SQL tables.
  • Designed, developed, deployed and maintained universes using Business Objects designer.
  • Developed Custom hierarchies to support drill up/down reports, and also created cascading LOVs (List of Values) to be used for cascading prompts.
  • Developed Complex Web Intelligence Reports for End Users.
  • Designed and developed extensive Webi reports using multiple Queries, combination of chart and tables.
  • Used cascading prompts in the reports to create more interactive reports for the users.
  • Used Business Objects free-hand SQL to generate ad-hoc reports.
  • Built reports that have multiple display objects on a single page (tables and charts) in Web Intelligence for user-friendly analysis of data.
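The reference-table lookups mentioned above (reusable functions that fetch a value from a reference table by key) follow the same pattern as the BODS `lookup_ext()` usage; here is a minimal Python sketch with hypothetical table and field names:

```python
def make_lookup(ref_table, key, default=None):
    """Build a reusable lookup function over a reference table,
    similar in spirit to lookup_ext(): index once, probe per row."""
    index = {row[key]: row for row in ref_table}
    def lookup(value, field):
        row = index.get(value)
        return row[field] if row else default
    return lookup

# enrich source rows with a description pulled from the reference table
ref = [{"code": "A", "desc": "Active"}, {"code": "I", "desc": "Inactive"}]
get_desc = make_lookup(ref, "code", default="Unknown")
enriched = [{**r, "status_desc": get_desc(r["status"], "desc")}
            for r in [{"id": 1, "status": "A"}, {"id": 2, "status": "X"}]]
```

Unmatched keys fall back to the default, mirroring the default-value argument of a BODS lookup.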

Environment: SAP Data Services XI (Ver. 3.2), SAP Business Objects 4.0, Designer, Web Intelligence, Info View, Crystal Reports 2008, Teradata 13.0 (Teradata SQL Assistant), Oracle 10g, MS-SQL, SQL Server, Toad, Windows 2003 Server.

Confidential, Seattle, WA

BODI / Business Objects Developer

Responsibilities:

  • Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across all applications and system.
  • Extracting data from different databases using Business Objects Data Integrator and transforming data using different transforms using logic, functions, scripts and loading in to target database.
  • Developed jobs to extract, transform and load data, populating and refreshing DW tables from flat files and SQL Server 2003 tables, views and Oracle view sources using BODI jobs.
  • Managed Repositories using administrator console as per business requirements.
  • Created mappings using Data Integrator jobs and designed workflows using Workflows to build DW as per business rules.
  • Created flat file formats and efficiently used FTP to get/put files for populating target tables.
  • Wrote ETL design documents, established ETL coding standards and performed transform reviews.
  • Prepared data extracts from the legacy system and populated the SQL database with a huge volume of historical data using bulk load, on which the data store for BO was created.
  • Tuned performance for large data files by increasing block size, cache size and target based commit interval.
  • Created jobs in order to perform the full initial data load and then switched the jobs to Delta mode in order to append the delta data in the existing data warehouse.
  • Created Scripts, Variables and workflows using BODI and scheduled workflows at specified frequency.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Loaded data from multiple data sources to multiple data targets with different Transformations and DTPs by scheduling different updates.
  • Loaded data from PSA to data targets using Transformations and DTPs.
  • Performed SAP R/3 source extraction using SAP source extractors in Data Integrator.
  • Interacted with Users to know their Business views while gathering the Report requirements and provided Several Report Mock-ups to finalize the requirements.
  • Created customized Web Intelligence reports from various different sources of data.
  • Designed universes and resolved join problems such as Loops, Chasm and Fan traps by creating Aliases, Contexts and using the features available in the Designer to separate queries on measures.
  • Wrote complex queries including Sub Queries, Unions, Intersect and Aliases.
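The set-based queries in the last bullet (UNION, INTERSECT) behave as sketched below; this is an illustrative example using SQLite in place of the actual databases, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE q1(region TEXT);
CREATE TABLE q2(region TEXT);
INSERT INTO q1 VALUES ('West'), ('East');
INSERT INTO q2 VALUES ('West'), ('South');
""")

# UNION deduplicates rows across both queries; INTERSECT keeps the overlap
union = conn.execute(
    "SELECT region FROM q1 UNION SELECT region FROM q2 ORDER BY region"
).fetchall()
overlap = conn.execute(
    "SELECT region FROM q1 INTERSECT SELECT region FROM q2"
).fetchall()
```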

Environment: Business Objects Data Integrator XI R2, Business Objects XI 3.1, BO Universe Designer, Web Intelligence XI 3.1, Info View, Data Integrator, Oracle 10g, SQL Server 2003, Windows Server 2003.
