
Integration / ETL Specialist Resume

SUMMARY

  • 7+ years of IT experience in all phases of Data Warehousing and Business Intelligence, including requirements gathering and analysis, design, development, testing, and production support.
  • Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions, and Workflows using Informatica Power Center.
  • Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica Power Center tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
  • Experience in designing/developing complex mappings using transformations such as Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup, and Update Strategy.
  • Extensive experience in Data Warehousing, ETL, Design, Development, Implementation, and Testing of Data Warehouse/Data Mart Systems.
  • Experience working in IDQ (Informatica Data Quality, via the Informatica Developer tool) to perform data cleansing, data matching, data conversion, exception handling, data profiling, reporting, and monitoring.
  • Used the Address Validator, Parser, and Join Analysis transformations to perform IDQ activities.
  • Used IDQ to perform unit testing and to create mapplets that are imported and used in PowerCenter Designer.
  • Involved in tuning Oracle PL/SQL queries using indexing, hints, and parallel execution.
  • Proficient in the integration of various data sources, including relational databases (Oracle, MS SQL Server, DB2, Greenplum, Teradata) as well as XML and flat files (fixed-width, delimited).
  • Worked with PL/SQL stored procedures, triggers, cursors, indexes, and functions.
  • Proficient in developing and maintaining PL/SQL and database objects such as packages, functions, stored procedures, triggers, tables, views, materialized views, indexes, sequences, and partitions.
  • Experience in developing SQL*Loader control programs and PL/SQL validation scripts for validating the data load from staging area to production.
  • Strong knowledge of Oracle architecture and database design using normalization and E/R diagrams.
  • Excellent conceptual knowledge of Oracle 10g/11g, Data Warehouse Concepts, ETL, Data modeling, and Dimensional Modeling.
  • Strong experience in writing UNIX Shell scripts, SQL Scripts for development, automation of ETL process, error handling, and auditing purposes.
  • Created different types of validation and reconciliation scenarios using the data from different databases, including Oracle, Teradata, MSSQL, and MySQL.
  • Good understanding of data modeling concepts: ER diagrams, UML, use cases, and normalization/de-normalization of tables.
  • Excellent analytical and problem-solving skills with a strong technical background and interpersonal skills.
  • Experience in scheduling Informatica sessions for automation of loads in Autosys, Control-M, and UC4.
  • Expertise in understanding business processes, writing test plans, test strategies, and test cases, and executing test scenarios as part of design.
  • Experience working with offshore and onsite coordination.
  • Flexible, enthusiastic, and project-oriented team player with solid communication and leadership skills, able to develop creative solutions for challenging client needs.
  • Able to work independently and collaborate proactively & cross-functionally within a team.
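
The cross-database validation and reconciliation scenarios mentioned above can be sketched roughly as follows. This is a minimal illustration, not production code: it uses in-memory SQLite databases as stand-ins for Oracle/Teradata/MSSQL connections, and the `orders` table name is hypothetical.

```python
import sqlite3

def reconcile_counts(src_conn, tgt_conn, table):
    """Compare row counts for one table between a source and a target system."""
    src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source": src, "target": tgt, "match": src == tgt}

# Demo: two in-memory SQLite databases stand in for the source and target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(3)])
    conn.commit()

print(reconcile_counts(src, tgt, "orders"))  # "match" is True when counts agree
```

In practice the same pattern extends to checksums or column-level aggregates per table, with the actual DB-API drivers (e.g. cx_Oracle, teradatasql, pyodbc) swapped in for sqlite3.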

TECHNICAL SKILLS

Data Warehousing Tools (ETL): Informatica PowerCenter 10.2/10.1.1/9.6.1/9.5.1/8.x, Informatica Data Quality 10.2/9.6.1/9.5.1/8.x, Informatica Data Validation Option (DVO), Informatica Cloud, Informatica Analyst, Informatica PowerExchange, Informatica Metadata Manager, Jitterbit Harmony

RDBMS: Amazon Redshift, Oracle 12c/11g/10g, DB2, Teradata, SQL Server, Netezza, PostgreSQL, MySQL

Reporting Tools: Power BI, Tableau, Business Objects, MicroStrategy

Languages: SQL, PL/SQL, UNIX Shell Scripting, Python, Scala (Apache Spark)

Operating Systems: Windows, UNIX, Linux, CentOS

DB Utilities: Aginity Workbench, TOAD, SQL*Loader, SQL Developer, SQL*Plus

Applications/Other Tools: AWS S3, AWS EC2, AWS SCT, AWS DMS, MS Office, Silectis Magpie (Data Lake), Salesforce, Workday, Composite Data Virtualization, Control-M, WinSCP, Autosys, Putty, JIRA, Confluence, ServiceNow, SharePoint, Cherwell Service Management, Nagios, Splunk.

PROFESSIONAL EXPERIENCE

Confidential

Integration / ETL Specialist

Responsibilities:

  • Deep understanding of the various Confidential data ecosystems, product stack, and integration among heterogeneous data source systems. Domain knowledge of the security services industry, including video surveillance, alarm monitoring, and health and safety. In-depth knowledge of billing and installation processes and security monitoring services.
  • Understanding of Confidential enterprise data warehouse systems, integration frameworks, processes, data, and their relationships.
  • Performed data modeling, data integration, data administration, and data governance practices; handled effort estimation and planning of work streams.
  • Understanding of Confidential data sources, integration points, and their relationships. Integrated Master Mind Billing System data and Kronos Workforce and Human Capital Management data into Oracle, loading the incremental data for BI reporting and testing the same.
  • Integrated data across various source systems using Jitterbit: Salesforce CPQ to Oracle NetSuite, Salesforce to Workday (contact, opportunity, and contract information data), and SAP Ariba.
  • Developed mappings in Altova MapForce, calling SOAP and REST web services directly within a mapping.
  • Created database mappings, including mappings between flat files, Excel, and other database formats, to convert data and provide options to automate transformations using Altova MapForce.
  • Involved in migration of the BI data warehouse from the Oracle data warehouse to Google Cloud Platform BigQuery.
  • Created ETL technical designs based on business requirements and developed code for critical data components.
  • Designed conceptual, logical, physical, and dimensional data models using Erwin/ER Studio and forward-engineered the databases; implemented and maintained the database design.
  • Extensively worked with DBAs to consistently improve the overall performance of the data warehouse load process.
  • Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity across environments. Worked on Data profiling to establish Data quality Rules using Informatica Data Quality (IDQ).
  • Developed customer lifetime value models to estimate future cash flows and identify high value customers for Score Card Dashboard.
  • Analyzed transactional data, Developed Customer Scoring Models and Provided analysis on customer segmentation for optimizing marketing campaign and catalogue content.
  • Performed trend analysis to identify customer spending patterns to uncover possible fraudulent activity using Transactional Data from various sources.
  • Created on-demand reports, including a CEO executive summary, financial breakdown, and project financials.
  • Automated data loads for Informatica and Tableau dashboard refreshes, with job dependencies managed in Tidal Automation.
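
A typical shape for the incremental loads mentioned above is a high-water-mark extract: keep the timestamp of the last successful load and pull only rows modified after it. The following is a minimal sketch under stated assumptions, using a hypothetical `billing` table with an `updated_at` column and SQLite in place of Oracle:

```python
import sqlite3

def incremental_extract(conn, last_watermark):
    """Fetch only rows modified after the previous load's high-water mark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM billing "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    # Advance the watermark to the newest timestamp just extracted.
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# Demo data standing in for the billing source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE billing (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO billing VALUES (?, ?, ?)",
    [(1, 100.0, "2023-01-01"), (2, 250.0, "2023-01-05"), (3, 75.0, "2023-01-09")],
)

rows, wm = incremental_extract(conn, "2023-01-04")  # only the two newer rows
```

The persisted watermark (`wm`) would be stored in a control table and passed back in on the next scheduled run, so each run loads only the delta.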

Environment: Master Mind Billing (MMB), Kronos Workforce and Human Capital Management, Salesforce CRM, Workday HRM, Altova MapForce, Jitterbit Harmony, ServiceNow, Python, Apache Spark SQL, Informatica PowerCenter 10.2, Informatica Metadata Manager, Oracle 12C, Oracle 19C, SQL Server Cloud, Flat files, JSON Files, Parquet Files, SOAP API, REST API, Composite Data Virtualization Studio, Toad for Oracle, Power BI, Tableau, SVN, Putty, FileZilla, Unix, Google Cloud BigQuery, JIRA, Tidal Automation, Confluence, SharePoint.

Confidential

Informatica Developer

Responsibilities:

  • Actively participated in requirements gathering and analyzed business requirements with business SMEs, project managers, users, data model architects, DBAs, and ETL leads to develop a data model for Dealer, Inventory, Finance, and Franchise in Amazon Redshift.
  • Migrated Used Car Listings (UCL) data from on-premises to the Amazon cloud, loading the incremental data to the data lake (AWS S3) and testing it using the Silectis Magpie tool for data analytics.
  • Integrated the data warehouse design for the numerous sources feeding the warehouse, such as Billing (Bill Immon), Finance, Sales, Call Details, Customer, Usage, and Product Integration (Confidential for Life, VHR, Used Car Listings).
  • Involved in migration of the BI data warehouse from Actian Matrix (formerly ParAccel) to Redshift.
  • Integrated data from the Salesforce system using the SnapLogic ETL tool, handling bulk loads into Redshift with the bulk writer.
  • Used the Bitbucket code-versioning tool; loaded and retrieved SnapLogic assets based on commit ID using snaps.
  • Introspected Salesforce data using Composite Data Virtualization Studio before loading it into the enterprise data warehouse.
  • Designed and built ETL processes to load core Confidential data from various source systems, performing data cleansing and transformations with Informatica transformations such as Aggregator, Update Strategy, Lookup, Joiner, and Transaction Control, along with mapplets and reusable transformations.
  • Implemented incremental data loads for daily loads using Informatica PowerCenter.
  • Implemented SCD Type 2 for a flat-file target using the MD5 function, and loaded the Informatica output file into enterprise data warehouse tables using insert/upsert scripts, which are cost-efficient, improve ETL runtimes, and load data in bulk.
  • Developed test cases and test plans; performed unit tests for new and/or modified ETL programs.
  • Utilized Control-M to automate daily, weekly, monthly, and billing jobs across the FTP server, Informatica, SnapLogic, and the data lake.
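
The MD5-based SCD Type 2 change detection described above can be sketched as follows. This is an illustrative sketch only, not the actual Informatica logic: the `cust_id`/`tier` columns are hypothetical, and Python's `hashlib.md5` stands in for Informatica's MD5() expression function.

```python
import hashlib

def row_hash(row, attr_cols):
    """MD5 over the concatenated tracked attributes (mirrors an MD5() expression)."""
    payload = "|".join(str(row[c]) for c in attr_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_scd2_changes(incoming, current_hashes, key_col, attr_cols):
    """Classify incoming rows as inserts (new keys) or updates (changed hashes)."""
    inserts, updates = [], []
    for row in incoming:
        key = row[key_col]
        if key not in current_hashes:
            inserts.append(row)                        # new dimension member
        elif current_hashes[key] != row_hash(row, attr_cols):
            updates.append(row)                        # changed -> new SCD2 version
    return inserts, updates

# Hypothetical customer dimension: C1 exists already, C2 is new.
current = {"C1": row_hash({"cust_id": "C1", "tier": "gold"}, ["cust_id", "tier"])}
incoming = [
    {"cust_id": "C1", "tier": "platinum"},  # attribute changed
    {"cust_id": "C2", "tier": "silver"},    # brand-new key
]
ins, upd = detect_scd2_changes(incoming, current, "cust_id", ["cust_id", "tier"])
```

The classified rows would then feed the flat-file output that the insert/upsert scripts bulk-load into the warehouse, with prior versions end-dated.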

Environment: Silectis Magpie (Data Lake), AWS S3, AWS EC2, Python, Apache Spark SQL, Informatica PowerCenter 10.2, Informatica PowerExchange, Informatica Metadata Manager, SnapLogic, Oracle 12C, AWS Redshift, MySQL, Flat files, JSON Files, Parquet Files, SOAP API, REST API, SAP, Composite Data Virtualization Studio, Workday, Salesforce, Toad for Oracle, Toad for MySQL, SAP Business Objects, MicroStrategy, Bitbucket, Box, Putty, FileZilla, Unix, JIRA, ServiceNow, Confluence, SharePoint.
