
ETL Consultant/Lead Resume


Plano, TX

SUMMARY

  • Worked on PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created various mappings using transformations such as Aggregator, Source Qualifier, Connected/Unconnected Lookup, Update Strategy, Joiner, Filter, and Router. Integrated data from sources such as Oracle, SQL Server, and flat files into the data warehouse; experienced in data cleansing and data analysis.
  • Knowledge of Informatica MDM, including designing, developing, testing, reviewing, and optimizing MDM implementations.
  • Knowledge of creating mappings, trust and validation rules, match paths, match columns, match rules, merge properties, and batch groups.
  • Implemented the land process of loading customer data sets into Informatica MDM from various source systems.
  • Configured match rule set properties by enabling search by rules in MDM according to business rules.
  • Experienced in upgrading Informatica from 9.6.1 to 10.1.1.
  • Actively involved in preparing ETL coding standards and checklists, conducting code reviews, and working with business users on UAT test scenarios.
  • Involved in the development, support, and maintenance of ETL (Extract, Transform, Load) processes using Informatica PowerCenter.
  • Created Tableau dashboards using stacked bars, bar graphs, scatter plots, geographic maps, Gantt charts, etc. via the Show Me functionality, and built dashboards and stories as needed using Tableau Desktop and Tableau Server.
  • Experience in the maintenance and scheduling of Tableau Data extracts using Tableau server and the Tableau Command utility.
  • Extensively used tabadmin and tabcmd commands to create and restore backups of the Tableau repository.
  • Performance tuning of targets, mappings, and sessions; task automation using UNIX shell scripts; job scheduling and server communication using pmcmd.
  • Provided data matching, data cleansing, exception handling, reporting and monitoring features to the data warehouse.
  • Used Informatica PowerExchange for loading and retrieving data from Oracle and MS SQL Server systems.
  • Installed and configured Informatica PowerExchange servers on Windows and UNIX for Oracle and MS SQL Server.
  • Collected and analyzed user requirements for new reports in the Data Warehouse.
  • Designed database objects and new ETL loads to user requirements, including table and index design and writing triggers.
  • Performance tuning and data sampling for Dev environment
  • Provided 24x7 on-call rotational support, including troubleshooting and priority issue resolution.
  • Designed a training plan for offshore tier2 support team.
  • Strong knowledge of Informatica administration, including installation, upgrades, configuration, maintenance, and troubleshooting.
  • Good communication, collaboration, and team-building skills, with a proficiency for grasping new technical concepts quickly and applying them productively.
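The pmcmd-based job automation mentioned above can be sketched as a small Python helper. This is a minimal sketch, not production code: the service, domain, folder, and workflow names are hypothetical placeholders, while the flags follow the standard `pmcmd startworkflow` syntax.

```python
import subprocess

def build_pmcmd_start(service, domain, user, password, folder, workflow):
    """Assemble a pmcmd startworkflow command line (Informatica CLI)."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,        # Integration Service name
        "-d", domain,          # Informatica domain name
        "-u", user,            # repository user
        "-p", password,        # repository password
        "-f", folder,          # repository folder
        "-wait",               # block until the workflow completes
        workflow,
    ]

def start_workflow(cmd):
    """Run the command; pmcmd returns a non-zero exit code on failure."""
    return subprocess.run(cmd).returncode == 0

# Hypothetical names for illustration only; start_workflow(cmd) would
# actually launch the workflow on a machine with pmcmd installed.
cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "secret",
                        "DWH_LOADS", "wf_daily_load")
```

In practice a shell script built this way is handed to a scheduler (cron, Autosys, Control-M) for the job-scheduling side of the bullet above.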

TECHNICAL SKILLS

ETL Tools: Informatica - Power Center, PowerExchange, IICS, Informatica BDM, Informatica MDM

Databases: Oracle 8i,9i,10g,11g, MS SQL Server, Teradata, MS Access, DB2

Programming Languages: SQL, PL/SQL, C, C++, Java, and Shell

Operating Systems: Windows 2012 R2 / 2008 R2 / 2000 / XP, Linux

PROFESSIONAL EXPERIENCE

Confidential

ETL Consultant/Lead

Responsibilities:

  • Created complex Informatica mappings using Unconnected/Connected Lookup, Joiner, Rank, Source Qualifier, Sorter, Expression, and Router transformations to extract, transform, and load data into the HE Enterprise system.
  • Worked on defects and change requests and Production Support issues.
  • Developed Informatica mappings, reusable sessions and master workflows.
  • Created physical data objects for relational databases and flat files.
  • Created Reconciliation queries and balancing queries to validate source and target data based on business Scenarios.
  • Worked on optimizing performance of various long running queries and stored procedures and mappings.
  • Reviewed ETL mappings, reconciliation and balancing queries and other deliverables to ensure that the standards and best practices are followed.
  • Worked with multiple functional-area SMEs to understand requirements and help them arrive at solutions.
  • Provided 24x7 on-call rotational production support, including troubleshooting and priority issue resolution.
  • Design and development of Tableau data sources and dashboards and visualization using different types of charts.
  • Set up Active Directory on the Tableau Server for effortless password management with single sign-on in Salesforce and for viewing embedded dashboards.
  • Demonstrated Tableau dashboards and their functionality to business users for self-service business intelligence.
  • Led Tableau workbook and data source release management to deploy from Dev to UAT to Prod environments.
  • Experience in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
  • Worked on escalated tasks related to interconnectivity issues and complex cloud-based identity management and user authentication, service interruptions with Virtual Machines (their host nodes) and associated virtual storage (Blobs, Tables, Queues).
  • Converted and developed reports from Power BI to Tableau. While designing the Power BI reports and dashboards, measures were created using the DAX language; converted all calculations using DAX formulas.
  • Developed Tableau data visualization using Cross tabs, Heat maps, Box and Whisker charts, Scatter Plots, Geographic Map, Pie Charts and Bar Charts and Density Chart.
  • Developed donut charts and implemented complex features in charts like creating bar charts in tooltip.
  • Worked extensively with Advance analysis Actions, Calculations, Parameters, Background images, Maps, Trend Lines, Statistics and table calculations.
  • Hands-on experience building groups, hierarchies, and sets to create detail-level and summary reports and dashboards using KPIs.
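The reconciliation and balancing checks described above boil down to comparing counts returned by source-side and target-side queries. A minimal Python sketch of that comparison, with invented table names and counts for illustration:

```python
def reconcile(source_counts, target_counts):
    """Compare per-table row counts between source and target;
    return a list of (table, source_count, target_count) mismatches."""
    mismatches = []
    for table, src in sorted(source_counts.items()):
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches.append((table, src, tgt))
    return mismatches

# Illustrative counts, as a reconciliation query would return them.
src = {"CUSTOMER": 1000, "ORDERS": 5000}
tgt = {"CUSTOMER": 1000, "ORDERS": 4998}
```

`reconcile(src, tgt)` flags only the ORDERS table, which is the business scenario the balancing queries above are meant to surface before users do.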

Environment: SSIS, SSRS, Visual Studio, Tableau, PowerBI 2.X, Python, Azure DB, ServiceNow, Agile, Alteryx (POC), Snowflake, TDM (Test Data Management), Azure SQL Database, Informatica 10.X

Confidential, Plano, TX

Lead ETL Data Analyst

Responsibilities:

  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Worked with the Data Steward Team on designing, documenting, and configuring Informatica Data Director (IDD) to support management of MDM data.
  • Successfully implemented IDD-MDM using hierarchy configuration; created subject area groups, subject areas, subject area children, and IDD display packages in the hub, along with search queries for finding crop, material, and breeder data in the IDD data tab.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Used different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Responsible for developing High-level Database Architectures and ensuring that database designs are consistent with internal information architecture and information management standards, as well as industry standards
  • Created architectural documentation for high level conceptual design, detailed specification, conceptual and logical data models, defining source system to target system mappings, establishing context diagrams, process architecture models and documenting data flows and appropriate technologies
  • Exerting architectural oversight over parallel development initiatives
  • Participated in in-depth evaluations of new technology platforms and made recommendations. Performed complex analysis on large data sets to identify data patterns, infer data trajectory, and recommend changes to program logic to fix or minimize discrepancies.
  • Led multiple modeling, simulation, and analysis efforts to uncover the best data solutions.
  • Researched, recommended, and introduced new database technologies such as Informatica Data Validation Option (DVO) and Test Data Management (TDM) based on PowerCenter; owned the implementation across 7 KP regions.
  • Responsible for maintaining Data management Standards, Data Governance, Data Dictionary.
  • Worked on anomaly/outlier detection on massive healthcare data sets; performed source system analysis, identification of key data issues, data profiling, and development of normalized and star/snowflake schemas.
  • Provided Database administration support including security, performance tuning and monitoring
  • Provided Database/SQL performance tuning on ETL, complex extract SQL and user reporting queries
  • Created data visualizations using Tableau dashboards for the allergy medication index and avoidable hospitalization days; customized reports for the patient metrics report and resolved data inconsistency issues.
  • Created complex SQL scripts using recursive CTE, stored procedures for data analysis and custom ETL jobs
  • Owned new enhancement projects from definition to implementation such as with Pharmacy Data warehouse to bring in new medication data to Caboodle Data Warehouse
  • Developed and enhanced data warehouses using SQL Server Integration Services (SSIS) and ETL tools to create new packages for implementation of change requests and new project requirements.
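As one illustration of the recursive-CTE pattern used in the SQL scripts above, here is a minimal, runnable sketch that uses Python's built-in sqlite3 as a stand-in for the warehouse database; the org table and hierarchy are invented for the example, not taken from the actual system.

```python
import sqlite3

# In-memory stand-in for the warehouse; names are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE org (id INTEGER, parent_id INTEGER, name TEXT)")
con.executemany("INSERT INTO org VALUES (?, ?, ?)",
                [(1, None, "Pharmacy"), (2, 1, "Retail"), (3, 2, "Store 42")])

# Recursive CTE: seed with the root row, then repeatedly join children
# onto the rows found so far, tracking depth in the hierarchy.
rows = con.execute("""
    WITH RECURSIVE tree(id, name, depth) AS (
        SELECT id, name, 0 FROM org WHERE parent_id IS NULL
        UNION ALL
        SELECT o.id, o.name, t.depth + 1
        FROM org o JOIN tree t ON o.parent_id = t.id
    )
    SELECT name, depth FROM tree ORDER BY depth
""").fetchall()
# rows -> [("Pharmacy", 0), ("Retail", 1), ("Store 42", 2)]
```

The same anchor-member/recursive-member shape carries over to SQL Server (`WITH` without the `RECURSIVE` keyword) for walking hierarchies in custom ETL jobs.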

Environment: Informatica, Informatica IDQ, MDM, Talend Big Data Platform 6.2, Siebel CRM, Tableau (Desktop Intelligence), PL/SQL Developer, Toad, Zeke, NoSQL, PowerBI 2.X, Python, Azure

Confidential

BI Lead Consultant

Responsibilities:

  • Responsible for requirements gathering, preparing the mapping document, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW), and performing data cleansing using various IDQ transformations.
  • Created complex mappings using connected / unconnected lookups.
  • Performed analysis of the data transformations and data types for transforming business rules into transformation logic for the ETL process.
  • Added and modified Informatica mappings and sessions to improve performance, accuracy, and maintainability of existing ETL functionality.
  • Extensively implemented performance tuning of target, source and mapping, used debugger to troubleshoot logical errors.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time with required frequency.
  • Extensively used Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules. Used workflow Monitor to monitor the performance of the Jobs.
  • Used Informatica Data Quality transformations to parse the “Financial Advisor” and “Financial Institution” information from Salesforce and Touchpoint systems and perform various activities such as standardization, labeling, parsing, address validation, address suggestion, matching and consolidation to identify redundant and duplicate information and achieve MASTER record.
  • Developed Physical Layer, Business Model & Mapping and Presentation Layer using Oracle BI Administration Tool-OBIEE
  • Created various session and repository variables and initialized them in the Initialization Blocks to change metadata dynamically as the environment changes.
  • Created ETL process using SSIS to transfer data from heterogeneous data sources.
  • Created logging for ETL load at package level and task level to log number of records processed by each package and each task in a package using SSIS
  • Maintained versions using CVS and sent release notes for each release.
  • Developed a wrapper in Python for instantiating multi-threaded applications and running them alongside other applications.
  • Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Extensively used SSIS transformations such as Lookup, Derived column, Data conversion, Aggregate, Conditional split, SQL task, Script task and Send Mail task etc.
  • Created reports with an Analysis Services cube (SSAS) as the data source using SSRS, and used MDX queries in building reports.
  • Extensively used EPIC Data cube in generating reports for Appointments, HB, PB, Denials, Census
  • Design and development of Tableau data sources and dashboards and visualization using different types of charts.
  • Lead the Tableau workbooks and data source release management to deploy from Dev to UAT to Prod environment.
  • Experience in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
  • Provided SQL performance optimization for Caboodle ETL and OLAP SlicerDicer reporting clients in SQL Server and Oracle by analyzing execution plans and statistics to find bottlenecks in IO/memory/sort/index scans and by fixing recursive hash joins.
  • Worked on design and development of Custom Informatica mappings, workflows to load data into staging area, data warehouse for Oracle OBIA
  • Worked with JDE Supply Chain Management, Value chain Planning
  • Configured dynamic settings for parallelism and load balancing, an intelligent task queue engine based on user-defined and computed scores, and index management for ETL and query performance with DAC.
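A minimal sketch of the kind of Python wrapper described above for fanning tasks out across worker threads; the function and parameter names here are illustrative, not from the original codebase.

```python
import queue
import threading

def run_parallel(tasks, workers=4):
    """Run callables from a shared queue across worker threads
    and collect their results (order not guaranteed)."""
    q = queue.Queue()
    for task in tasks:
        q.put(task)

    results = []
    lock = threading.Lock()  # guards the shared results list

    def worker():
        while True:
            try:
                task = q.get_nowait()
            except queue.Empty:
                return  # queue drained; worker exits
            out = task()
            with lock:
                results.append(out)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

In an ETL wrapper, each callable would typically shell out to a load step or run an extract query, so independent feeds proceed concurrently instead of serially.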

Environment: Microsoft SQL Server 2017/2012, SQL Server Integration Services (SSIS), SharePoint, OBIA, JDE (9.1), Control-M (BMC scheduler), Informatica PowerCenter 9.1/9.6, OBIEE 11.1.1.6, Oracle 11g, Autosys, Python, EPIC, Caboodle, Cogito, EMR healthcare
