
ETL Informatica Lead Resume


Vernon Hills, IL

SUMMARY

  • 7+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter, Informatica Cloud (IICS), and Informatica Power Exchange.
  • 7+ years of hands-on experience in Data Warehousing and ETL processes using Informatica.
  • 1+ years of experience with the Qlik reporting tool.
  • Excellent understanding of ETL, Dimensional Data Modeling techniques, Slowly Changing Dimensions (SCD), and Data Warehouse concepts - Star and Snowflake schemas, Fact and Dimension tables, Surrogate keys, and Normalization/Denormalization (a Type 2 SCD sketch follows this list).
  • Experience in Data Warehouse/Data Mart Development Life Cycle.
  • Expertise in DWH technical architecture, design, business requirement definition and Data Modeling. Responsible for designing, coding, testing, integrating the ETL processes and implementing the reporting requirements.
  • Good hands-on experience using all types of SQL Server constraints (Primary Keys, Foreign Keys, Defaults, Checks, and Unique), and writing Transact-SQL (T-SQL) and dynamic queries.
  • Well versed with ETL procedures to load data from different sources like Oracle, flat files, XML files into DWH using Informatica PowerCenter.
  • Expertise in maintaining data quality, data organization, metadata and data profiling
  • Designed distributed-queuing client/server architectures and configured MQ.
  • Integrated data from SFDC and Oracle into AWS S3 using Informatica Cloud, converting specifications into programs and data mappings in an ETL Informatica Cloud environment.
  • Design, Development, Testing and Implementation of ETL processes using Informatica Cloud.
  • Design and develop code in SQL and PL/SQL. Comfortable developing UNIX shell scripts to run SQL scripts and Informatica workflows from a UNIX server.
  • Designed, developed, and managed PowerCenter upgrades from v7.x to v8.6, migrated ETL code from Informatica v7.x to v8.6, and integrated and managed PowerExchange CDC workloads.
  • Created AWS DMS tasks using the "migrate existing data", "migrate existing data and replicate ongoing changes", and "replicate data changes only" migration types.
  • Experience working with Informatica Power Exchange integration with IICS to read data from condense files and load it into a Teradata data warehouse environment.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Experience building jobs and mappings in Informatica Cloud for cloud application data replication/extraction and data integration.
  • Extracted raw data from SFDC and CDL to staging tables in Oracle and Teradata using Informatica Cloud.
  • Created AWS DMS full-load tasks to migrate views.
  • Extensively used various Performance Tuning techniques to improve ETL performance.
  • Extensive knowledge of Teradata SQL Assistant, TPump, FastExport, FastLoad, MultiLoad, and BTEQ; coded complex scripts for the Teradata utilities and fine-tuned queries to enhance performance.
  • Experience doing a POC integrating Hadoop Big Data with an existing ETL application; in-depth knowledge of the Hadoop architecture and basic knowledge of Hive, Pig, Impala, and the integration of Big Data with a traditional data warehouse.
  • Built ETL pipelines into and out of the data warehouse using a combination of Informatica Cloud and Snowflake's SnowSQL, writing SQL queries against Snowflake.
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables (see the JSON-load sketch after this list).
  • Worked on AWS Data Pipeline to configure data loads from S3 into Redshift and used a JSON schema to define the table and column mapping from S3 data to Redshift.
  • Created Databases, Tables, Stored Procedures, DDL/DML Triggers, Views, User defined data types, functions, Cursors and Indexes using T-SQL.
  • Loaded data into Amazon Redshift and used AWS CloudWatch to collect and monitor AWS RDS instances within Confidential.
  • Proficient in understanding business processes/requirements and translating them into technical requirement specifications.
  • Developed and executed a migration strategy to move Data Warehouse from an Oracle platform to AWS Redshift.
  • Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work effectively as a team member as well as independently.
  • Developed effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions and effectively manage client expectations.
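
A minimal sketch of the Type 2 SCD pattern referenced above, using the common expire-then-insert approach in Oracle-style SQL. All table, column, and sequence names (customer_dim, stg_customer, customer_dim_seq) are illustrative assumptions, not objects from any project named here:

```sql
-- Type 2 SCD sketch (illustrative names): expire the current version
-- of any dimension row whose tracked attributes changed in staging.
UPDATE customer_dim d
SET    expiry_date  = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
                 AND  (s.city <> d.city OR s.status <> d.status));

-- Insert a fresh version for new and changed customers; the surrogate
-- key is assumed to come from a sequence.
INSERT INTO customer_dim
       (customer_sk, customer_id, city, status,
        effective_date, expiry_date, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.city, s.status,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                     AND  d.current_flag = 'Y');
```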
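
And a hedged sketch of the nested-JSON load into Snowflake mentioned above, assuming a hypothetical external stage over the S3 bucket and an illustrative raw_events table with a VARIANT column:

```sql
-- Snowflake sketch (stage and table names are assumptions): land raw
-- JSON from S3 into a VARIANT column, then flatten a nested array.
CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT);

COPY INTO raw_events
FROM @s3_landing_stage/events/          -- hypothetical external stage
FILE_FORMAT = (TYPE = 'JSON');

-- Produce one output row per element of the nested items array.
SELECT payload:order_id::STRING AS order_id,
       item.value:sku::STRING   AS sku,
       item.value:qty::NUMBER   AS qty
FROM   raw_events,
       LATERAL FLATTEN(INPUT => payload:items) item;
```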

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x/9.5/9.0/8.x/7.x, OLTP, Snowflake, Informatica Cloud (IICS), Informatica Power Exchange 9.5.1.

Databases: Teradata V2R12/V2R6/V2R5, Oracle 11g/10g/9i/8i, MS SQL Server

Languages: SQL, PL/SQL, Java, C, Shell Scripting, Perl, PHP, XML, HTML.

Data Modeling: MS Visio, Erwin.

Big Data: Hadoop, Hive, Pig, Map Reduce, HDFS, Sqoop.

Tools: SQL Developer, Toad, SQL*Plus, Autosys, MS Office.

Environment: Unix, Windows 7/XP/Vista, Linux.

IT Concepts: Data Structures and Algorithms, Software Development Life Cycle.

Amazon Console: Glue, Data Pipeline (DP), S3, EMR, Database Migration Services (DMS), Athena, Amazon Redshift, AWS CLI

PROFESSIONAL EXPERIENCE

Confidential, Vernon Hills, IL

ETL Informatica Lead

Responsibilities:

  • Involved in gathering and understanding business requirements and providing estimates to complete them.
  • Worked with the Data Modeler to restructure DDL (adding or removing columns and creating new tables/views) based on business needs, translating them into technical coding.
  • Conducted weekly meetings with the business to discuss new requirements, expectations, estimates, deployment dates, and the current progress of development.
  • Created views, triggers, stored procedures and functions for analyzing data.
  • Created and configured the sessions for workflow. Created and maintained parameter files for workflows in UNIX.
  • Experience working with the Salesforce and CDL connectors to read data from Salesforce objects into the cloud warehouse using IICS.
  • Worked with Informatica Cloud Data Loader for Salesforce to reduce the time taken to import or export critical business information between Salesforce CRM and Force.com.
  • Extracted raw data from SFDC and CDL to staging tables in Oracle and Teradata using Informatica Cloud.
  • Developed Cloud mappings to extract the data for different regions (APAC, UK and America) and created Filewatcher jobs to setup the dependency between Cloud and PowerCenter jobs.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Evaluated Snowflake design considerations for any change in the application and built the logical and physical data models for Snowflake as per the required changes.
  • Developed stored procedures/views in Snowflake, used them in Talend for loading Dimensions and Facts, and redesigned the views in Snowflake to increase performance (see the view sketch after this list).
  • Developed, enhanced, modified, and maintained existing applications.
  • Used Teradata SQL Assistant to write complex queries for validating ETL processes, and PL/SQL procedures and packages for business rules.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Provide analysis to the business with detailed data comparison of before and after change in excel sheet before moving code into production.
  • Deployed Informatica mappings and workflows, as well as Unix shell scripts and Teradata DDL, in SNOW and MKS.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
  • Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Set dependencies between tasks and executed the execution plans.
  • Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
  • Outlined the complete process flow and documented the data conversion, integration process and load mapping changes for data deployment.
  • Wrote complex queries in Teradata SQL Assistant to check the data between source and target (a reconciliation sketch follows this list).
  • Coordinated with offshore testing team to provide requirements and testing strategies.
  • Wrote Unix shell scripts for handling various files.
  • Involved in migrating objects from Teradata to Snowflake and developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
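
A hedged sketch of the kind of Snowflake view redesign described above; every schema, table, and column name here (rpt, dw, sales_fact, and so on) is an illustrative assumption:

```sql
-- Illustrative Snowflake reporting view: pre-join the fact to its
-- dimensions so report queries hit a single object.
CREATE OR REPLACE VIEW rpt.sales_by_region_v AS
SELECT d.region,
       c.calendar_month,
       SUM(f.sales_amt) AS sales_amt
FROM   dw.sales_fact   f
JOIN   dw.store_dim    d ON d.store_sk = f.store_sk
JOIN   dw.calendar_dim c ON c.date_sk  = f.date_sk
GROUP BY d.region, c.calendar_month;
```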
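
And a minimal example of the source-versus-target checks run in Teradata SQL Assistant; the database and table names are placeholders, and it assumes the staging table holds only the current load:

```sql
-- Reconciliation sketch: compare row counts and an amount total
-- between the staged source data and the loaded target.
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amt) AS amt_total
FROM   stg_db.orders_stg
UNION ALL
SELECT 'TARGET', COUNT(*), SUM(order_amt)
FROM   dw_db.orders_fact
WHERE  load_dt = CURRENT_DATE;
```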

Environment: Informatica PowerCenter, T-SQL, Teradata, Business Objects, Oracle 11g, Flat files, UNIX, Shell scripting, Toad, Windows, Snowflake, Informatica Cloud (IICS).

Confidential, WEST CHESTER, PA

ETL Informatica Developer

Responsibilities:

  • Involved in gathering and understanding business requirements.
  • Extensively worked on Informatica PowerCenter tools- Mapping Designer, Workflow Manager, Workflow Monitor.
  • Experience working with Informatica Power Exchange integration with IICS to read data from condense files and load it into a Teradata data warehouse environment.
  • Worked with the Data Modeler to restructure the data mart according to changes in the model, and made the corresponding Informatica changes, converting business needs into technical coding.
  • Created views, triggers, stored procedures and functions for analyzing data.
  • Created and configured the sessions for workflow. Created and maintained parameter files for workflows in UNIX.
  • Strong experience implementing data warehouse solutions in Confidential Redshift; worked on various projects to migrate data from on-premises databases to Confidential Redshift, RDS, and S3.
  • Designed and built multi-terabyte, full end-to-end data warehouse infrastructure from the ground up on Confidential Redshift for large-scale data, handling millions of records every day.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Developed PowerCenter workflows and sessions, set up Power Exchange connections to database and mainframe files, and created the report for the Informatica CDC workflow's performance.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Developed, enhanced, modified, and maintained existing applications.
  • Used Toad, Teradata SQL Assistant, and SQL*Plus to write complex queries for validating ETL processes, and PL/SQL procedures and packages for business rules.
  • Wrote stored procedures/Transact-SQL scripts and altered/re-indexed tables as per the business need (a T-SQL sketch follows this list).
  • Involved in Data Migration from different systems to Oracle system.
  • Coordinated deployments that included Informatica deployments as well as Unix shell script and database DDL deployments.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
  • Created Sessions and extracted data from various sources, transformed data according to the requirement and loading into data warehouse.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Created Implementation Plan for AutoSys DEV environment to address all jobs, calendars and job-related elements being implemented into AutoSys.
  • Wrote complex queries in Teradata SQL Assistant to check the data between source and target.
  • Created the project plan for development, the Informatica Power Exchange and production PowerCenter upgrade, and used the Power Exchange tool to analyze raw source data from legacy systems.
  • Experience building a metadata model using a data governance tool.
  • Worked on onsite and offshore team coordination.
  • Wrote Unix shell scripts for handling various files.
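
A small T-SQL sketch of the re-indexing described above; the index, table, and schema names are hypothetical:

```sql
-- T-SQL sketch (hypothetical names): rebuild a fragmented index and
-- refresh statistics after a large batch load.
ALTER INDEX IX_OrdersFact_OrderDate
    ON dbo.OrdersFact REBUILD;

UPDATE STATISTICS dbo.OrdersFact WITH FULLSCAN;
```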

Environment: Informatica PowerCenter, T-SQL, Informatica Power Exchange, Teradata, Business Objects, Oracle 11g, Flat files, UNIX, Shell scripting, Toad, Windows.

Confidential, King of Prussia, PA

ETL Informatica Developer

Responsibilities:

  • Involved in design, development and maintenance of database for Data warehouse project.
  • Involved in Business Users Meetings to understand their requirements.
  • Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 8.x
  • Optimized and tuned the Redshift environment, enabling queries to perform up to 100x faster for Tableau and SAS Visual Analytics.
  • Integrated data from SFDC and Oracle into AWS S3 using Informatica Cloud, converting specifications into programs and data mappings in an ETL Informatica Cloud environment.
  • Wrote various data normalization jobs for new data ingested into Redshift.
  • Advanced knowledge of Confidential Redshift and MPP database concepts.
  • Migrated on-premises database structures to the Confidential Redshift data warehouse.
  • Responsible for designing logical and physical data models for various data sources on Confidential Redshift.
  • Designed and Developed ETL jobs to extract data from Salesforce replica and load it in data mart in Redshift.
  • Converted existing Informatica mappings into complex SQL and executed it in AWS DP jobs to load data from AWS S3 into the Amazon Redshift database (a COPY sketch follows this list).
  • Created SQL views for dimension and fact tables in Amazon Redshift and gave the reporting team access to build reports.
  • Wrote unit test scripts for validating the data from Netezza versus Amazon Redshift.
  • Responsible for project deliverables, which may include Web Intelligence reports in the Business Objects product suite, as per the defined process.
  • Filtered bad data from legacy systems using T-SQL and implemented constraints and triggers in the new system for data consistency.
  • Created AWS DMS tasks using the "migrate existing data", "migrate existing data and replicate ongoing changes", and "replicate data changes only" migration types.
  • Created AWS DMS full-load tasks to migrate views.
  • Migrated the data from SQL Server and Oracle to an S3 bucket using AWS DMS.
  • Created source and target endpoints in AWS DMS to migrate source systems such as Oracle and SQL Server to S3.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Developed ETL processes, performance-tuned ETL programs, and analyzed ETL jobs using Informatica PowerCenter.
  • Created DMS jobs for loading full-load and incremental data from source systems to target S3 using the AWS console.
  • Experienced with installing the AWS CLI to control various AWS services through shell/Bash scripting.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, lookup, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed complex ETL Informatica PowerCenter mappings and their corresponding sessions, worklets, and workflows.
  • Developed Informatica PowerCenter batch architecture to extract, transform, and load data from different sources such as Oracle, flat files, and XML files sent by third parties.
  • Developed online applications using OOP as well as Python programming.
  • Programmed systems that give senders and receivers the ability to exchange both data and code.
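
A hedged sketch of the S3-to-Redshift load pattern referenced above; the bucket, prefix, table, and IAM role are placeholders, and the JSONPaths file is assumed to map JSON attributes to target columns:

```sql
-- Redshift COPY sketch (placeholder names): bulk-load gzipped JSON
-- from S3, mapping attributes to columns via a JSONPaths file.
COPY analytics.orders_fact
FROM 's3://example-bucket/orders/2019/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
JSON 's3://example-bucket/jsonpaths/orders_jsonpaths.json'
TIMEFORMAT 'auto'
GZIP;
```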

Environment: Informatica Power Center, Oracle, SQL Server, Teradata, MS Access, SQL, Netezza, XML/VSAM/Flat files, SOAP UI, UNIX, Microsoft Visual Studio, Jira, AWS DMS, AWS DP, Amazon Redshift, AWS CLI, AWS S3.

Confidential, Alpharetta, GA

ETL Informatica Developer

Responsibilities:

  • Translated the business processes into Informatica mappings.
  • Responsible for coordinating development and testing efforts with offshore team members.
  • Develop, test and maintain ETL procedures employing both ETL tools and custom PL/SQL.
  • Extensive involvement with the Quality Assurance team for building exhaustive set of test cases.
  • Implemented logic to control job dependencies between the workflows solely through the use of event-raise and event-wait tasks and entries made by ETLs in pilot database tables.
  • Identified and resolved the bottlenecks in source, target, transformations, mappings and sessions to improve performance.
  • Worked on different tasks in Workflow Manager such as Sessions, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklets, Assignment, Timer, and scheduling of the workflow.
  • Experience with data quality, metadata, and ETL tools.
  • Retrieve and modify data within the system
  • Design, code, test, debug, and document programs
  • Collaborate with business partners for system applications requirements
  • Involved in performance tuning of ETL Informatica Power Center code; performed peer reviews of code and unit test cases and end-to-end testing of data warehouse/data mart loads.
  • Created database objects such as tables, views, and procedures using Oracle tools like Toad, PL/SQL Developer, and SQL*Plus; created PL/SQL stored procedures for moving the data from the staging area to the data mart (see the sketch after this list).
  • Responsible for release management activities such as creating the implementation plan and the complete package for UAT and prod migration, including Informatica labels, Teradata DDL/DML, the Control-M DRF file, and UNIX scripts.
  • Responsible for creation and adaptation of Business Objects Universes.
  • Used Toad for DDLs and to validate the data.
  • Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the target database.
  • Worked on Informatica PowerCenter Big Data Edition and involved in high-speed data ingestion and extraction.
  • Performed the Unit and Integration Testing which validated that the data is mapped correctly which provides a qualitative check of overall data flow.
  • Created and Documented ETL Test Plans, Test Cases, Expected Results, Assumptions and Validations.
  • Prepared the coding standards and quality assurance policies and procedures.
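
A hedged PL/SQL sketch of the staging-to-mart stored procedures described above; all schema, table, and column names (stg, mart, etl, and so on) are invented for illustration:

```sql
-- Oracle PL/SQL sketch (illustrative names): move one day's staged
-- rows into the mart and record the row count in an audit table.
CREATE OR REPLACE PROCEDURE load_sales_mart (p_load_date IN DATE) AS
  v_rows PLS_INTEGER;
BEGIN
  INSERT INTO mart.sales_fact (sale_id, store_sk, sale_amt, sale_date)
  SELECT s.sale_id, d.store_sk, s.sale_amt, s.sale_date
  FROM   stg.sales s
  JOIN   mart.store_dim d ON d.store_id = s.store_id
  WHERE  s.sale_date = p_load_date;

  v_rows := SQL%ROWCOUNT;

  INSERT INTO etl.load_audit (proc_name, load_date, row_cnt, run_ts)
  VALUES ('LOAD_SALES_MART', p_load_date, v_rows, SYSTIMESTAMP);

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END load_sales_mart;
/
```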

Environment: Informatica PowerCenter, T-SQL, Teradata, Oracle, Toad, Business Objects, Shell Scripts, Autosys, Windows XP.

Confidential

ETL Informatica Developer

Responsibilities:

  • Involved in the source data analysis of client data from various McKesson clients such as Illinois Medicaid, Baptist Health, Medication Therapy Management (MTM), Care First, and Health & Wellness, which had various heterogeneous sources including flat files, XML, and unstructured data, and loaded it into Oracle.
  • Prepared ETL specifications and transformation rules for the various health care providers based on their business requirements and the Data Intake document provided by the business analyst.
  • Developed Informatica objects - mappings, sessions, and workflows based on the prepared low-level design documents.
  • Involved in developing the SQL used to apply all business rules to the data before loading it into the target tables (a rule-check sketch follows this list).
  • Performed Informatica ETL testing to validate end-to-end data from the source MS SQL Server system to the target Teradata environment.
  • Implemented optimization techniques for performance tuning and wrote the necessary pre- and post-session shell scripts.
  • Performed testing, knowledge transfer and provide technical support and hands-on mentoring in the use of Informatica.
  • Debug the Informatica mappings and validate the data in the target tables once it was loaded with mappings.
  • Analyzed and fixed defects raised during testing and tracked them to closure; also involved in production support and fixing production issues.
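
A minimal example of the kind of business-rule SQL applied before the target load; the rules, schema, and column names here are invented for illustration:

```sql
-- Sketch: flag staging rows that violate simple business rules so
-- they can be rejected or corrected before the target load.
SELECT s.claim_id,
       CASE
         WHEN s.member_id IS NULL         THEN 'MISSING_MEMBER'
         WHEN s.claim_amt < 0             THEN 'NEGATIVE_AMOUNT'
         WHEN s.service_dt > CURRENT_DATE THEN 'FUTURE_SERVICE_DATE'
       END AS reject_reason
FROM   stg.claims s
WHERE  s.member_id IS NULL
   OR  s.claim_amt < 0
   OR  s.service_dt > CURRENT_DATE;
```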

Environment: Informatica PowerCenter, Oracle, SQL, PL/SQL, Mercury Quality Center, Windows XP.
