
Sr. ETL Informatica/Teradata Developer Resume

Lafayette, LA

PROFESSIONAL SUMMARY:

  • Around 6 years of experience in Information Technology with an emphasis on Data Warehouse/Data Mart development, designing strategies for extraction, transformation, and loading (ETL) in Informatica Power Center 10.1.0/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1/7.1.2 from various database sources.
  • Strong work experience across the ETL life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Power Center Designer, Workflow Manager and Workflow Monitor.
  • Involved in Informatica upgrade projects from one version to another.
  • Involved in understanding Business Processes and identifying dimensions and Facts for OLAP applications.
  • Strong understanding of Data Modeling.
  • Comprehensive experience working with Type 1 and Type 2 methodologies for Slowly Changing Dimension (SCD) management.
  • Proficient in integrating various data sources with multiple relational databases like Oracle, MS SQL Server, DB2, MySQL and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
  • Extensive experience in developing Stored Procedures, Functions, Triggers and Complex SQL queries.
  • Used Informatica Power Connect for SAP to pull data from SAP R/3.
  • Experience using the Debugger to validate mappings and obtain troubleshooting information about data and error conditions.
  • Good working knowledge of various Informatica Designer transformations like Source Qualifier, Dynamic and Static Lookups, Connected and Unconnected Lookups, Expression, Data Masking, Filter, Router, Joiner, Normalizer and Update Strategy.
  • Extensively used Teradata utilities like TPump, FastLoad, MultiLoad, BTEQ and FastExport.
  • Worked on performance tuning in both SQL and Informatica, identifying and resolving performance bottlenecks.
  • Experience integrating Hadoop with Informatica Power Center and migrating Hadoop processes to Informatica; performed end-to-end development for the data the client needed and maintained master data using Informatica MDM.
  • Experience in task automation using UNIX scripts, job scheduling, and communicating with the server using pmcmd.
  • Extensively used Autosys, Control-M and Maestro for job monitoring and scheduling, along with production on-call support.
  • Highly motivated and goal-oriented individual with a strong background in SDLC project management and resource planning using Agile and Waterfall methodologies.
  • Excellent interpersonal and communication skills.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.1.0/9.6.1/9.5.1/8.6.1/8.1.1, Informatica Cloud

RDBMS: Oracle 10g/9i/8i, Teradata 14/12, DB2, SQL Server 2000/2005/2008, MySQL, Sybase

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio

QA Tools: Quality Center

Operating System: Windows, Unix, Linux

Reporting Tools: Cognos, Business Objects

Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL, Perl

PROFESSIONAL EXPERIENCE:

Confidential, Lafayette, LA

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Created BTEQ scripts to pre-populate the work tables prior to the main load process.
  • Created views on top of the final EDW tables to generate reports.
  • Created tables and shell views in the HealthSpring Teradata environment for the staging, current load and IDS layers.
  • Worked on the Claims, Provider, Drug, Patient and Eligibility dimensions to build the structures and process the data.
  • Involved heavily in writing complex SQL queries based on the given requirements.
  • Prepared optimized SQL queries to handle performance bottlenecks.
  • Extensively used SQL Overrides in Lookups, Source filter and Source Qualifiers.
  • Used FastLoad to load empty tables, and MultiLoad and FastLoad to load millions of records into the warehouse from different sources.
  • Performed complex defect fixes in various environments, such as UAT, to ensure proper delivery of the developed jobs into the production environment.
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Responsible for efficient monitoring of the process through regular ETL load status checks.
  • Proactively took responsibilities for internal organizational processes like Quality Management & Employee Performance management.
  • Created a series of macros for various applications in Teradata SQL Assistant.
  • Loaded data in the TDM process from SQL Server into Teradata CORE.
  • Created Sessions, command task, reusable worklets and workflows in Workflow Manager.
  • Used volatile tables and derived queries to break up complex queries into simpler ones.
  • Created and scheduled jobs on demand and on schedule using the scheduling tool ActiveBatch.
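As a rough illustration of the BTEQ pre-population step above, a wrapper script might generate and run a BTEQ file along these lines (the server, logon, database and table names here are hypothetical placeholders, not the actual HealthSpring objects):

```shell
#!/bin/sh
# Sketch: build a BTEQ script that pre-populates a work table before the
# main load. All names (tdprod, EDW_WORK, WRK_CLAIMS, STG_CLAIMS) are
# illustrative assumptions only.
BTQ_FILE=prep_work_tables.btq

cat > "$BTQ_FILE" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
DATABASE EDW_WORK;

DELETE FROM WRK_CLAIMS;

INSERT INTO WRK_CLAIMS
SELECT claim_id, member_id, svc_date
FROM   STG_CLAIMS
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Only run bteq when the Teradata client is actually installed.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTQ_FILE"
fi
```

In practice a scheduler such as ActiveBatch would call this wrapper ahead of the main load workflow.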

Environment: Informatica Power Center 10.1.0, Salesforce, Teradata, SQL Server, Flat files, Quality Center, GitLab, Jenkins, uDeploy

Confidential, Woonsocket, RI

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in coding, testing, implementing, debugging and documenting the complex programs.
  • Used the Data Masking transformation to change sensitive production data to realistic test data for non-production environments.
  • Created Mappings with shared objects, Reusable Transformations and Mapplets.
  • Worked with different databases: Teradata and DB2.
  • Used UDFs to handle different data rules.
  • Prepared optimized SQL queries to handle performance bottlenecks.
  • Worked with different sources like Oracle, SQL Server and Flat Files.
  • Extensively used SQL Overrides in Lookups, Source filter and Source Qualifiers.
  • Used Teradata Multiload to insert and update masked data.
  • Created UNIX shell scripts to kick off Informatica workflow in batch mode.
  • Invoked Informatica workflows using “pmcmd” utility from the UNIX script.
  • Developed Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server and Oracle PL/SQL.
  • Performed complex defect fixes in various environments, such as UAT, to ensure proper delivery of the developed jobs into the production environment.
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Responsible for efficient monitoring of the process through regular ETL load status checks.
  • Proactively took responsibilities for internal organizational processes like Quality Management & Employee Performance management.
  • Created Parameter files and validation scripts, Created Reusable and Non-Reusable command task in Workflow Manager.
  • Created Sessions, command task, reusable worklets and workflows in Workflow Manager.
  • Developed Cloud mappings to extract the data for different regions.
  • Refreshing the mappings for any changes/additions to CRM source attributes.
  • Developed the audit activity for all the cloud mappings.
  • During the course of the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
  • Provided production support for business users and documented problems and solutions for running the workflows.
  • Created and scheduled jobs on demand and on schedule using the scheduling tool Control-M.
  • Created UNIX shell scripts to FTP flat files from different ordering systems to the ETL server.
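The pmcmd batch invocation described in the bullets above can be sketched roughly as follows. The domain, integration service, folder and workflow names are made-up examples, and real credentials would come from a secured environment file rather than being hard-coded:

```shell
#!/bin/sh
# Sketch: kick off an Informatica workflow in batch mode via pmcmd.
# Domain_ETL, INT_SVC_PROD, CLAIMS_DM and wf_load_claims_daily are
# hypothetical names; INFA_USER/INFA_PWD are environment variables
# holding the credentials (pmcmd's -uv/-pv flags read env vars).
DOMAIN="Domain_ETL"
INTEG_SVC="INT_SVC_PROD"
FOLDER="CLAIMS_DM"
WORKFLOW="wf_load_claims_daily"

CMD="pmcmd startworkflow -sv $INTEG_SVC -d $DOMAIN -uv INFA_USER -pv INFA_PWD -f $FOLDER -wait $WORKFLOW"

# Execute only when the Informatica client is on PATH; otherwise show
# the command that would be run.
if command -v pmcmd >/dev/null 2>&1; then
    $CMD
else
    echo "$CMD"
fi
```

The `-wait` flag makes the call block until the workflow finishes, so the wrapper's exit status can be checked by the scheduler.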

Environment: Informatica Power Center 9.6.1, Salesforce, Teradata, Oracle 11g, Unix, SQL Server, Flat files, Quality Center, PuTTY, WinSCP3, Toad

Confidential, Southlake, TX

Sr. ETL Informatica / Teradata Developer

Responsibilities:

  • Involved in analysis of source systems, business requirements and identification of business rules; responsible for developing, supporting and maintaining the ETL process using Informatica Power Center.
  • Involved in the Informatica upgrade process from one version to another.
  • Created / updated ETL design documents for all the Informatica components changed.
  • Extracted data from heterogeneous sources like Oracle, DB2, XML and Flat Files, performed data validation and cleansing in the staging area, then loaded into the Teradata Data Warehouse using Teradata utilities and Informatica.
  • Wrote Teradata BTEQ, MultiLoad and FastLoad scripts, and used TPT in Informatica.
  • Loaded data from source to stage and from stage to core/base.
  • Made use of reusable Informatica transformations, shared sources and targets.
  • Created Informatica transformations, mapplets, mappings, tasks, worklets and workflows using Power Center, along with mapping parameters and variables.
  • Worked on Data Masking transformation to process the confidential claims data for the testing purposes.
  • Implemented various loads like daily, weekly, quarterly and on-demand loads using an incremental loading strategy and Change Data Capture (CDC) concepts.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Created mappings for Type1, Type2 slowly changing dimensions (SCD) / complete refresh mappings.
  • Extensively used various data cleansing and data conversion functions like LTRIM, RTRIM, TO_DATE, DECODE and IIF in the Expression Transformation.
  • Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
  • Designed and developed ETL jobs using HP Vertica, DataStage, Teradata and Linux shell scripts.
  • Involved in and experienced with the Hadoop environment - HiveQL.
  • Worked with the reporting team, helping with EDW data for their reports using Business Objects.
  • Worked with “pmcmd” command line program to communicate with the Informatica server, to start, stop and schedule workflows.
  • HP Vertica Event Log Processing on AWS project: designed and implemented all components of the MySQL event web log processing data mart.
  • Supported the EDW environment on a 24x7 rotation and proficient with the scheduling tool Maestro.
  • During the course of the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
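The FastLoad scripting mentioned above can be illustrated with a control file generated and run from a shell wrapper. The server, database, table, error-table, data-file and column names are all illustrative assumptions:

```shell
#!/bin/sh
# Sketch: build a Teradata FastLoad control file for loading an empty
# staging table from a pipe-delimited flat file. STG_MEMBER,
# member_extract.dat and the columns are hypothetical.
FL_FILE=load_stg_member.fl

cat > "$FL_FILE" <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE EDW_STG;

BEGIN LOADING STG_MEMBER
      ERRORFILES STG_MEMBER_ET, STG_MEMBER_UV;

SET RECORD VARTEXT "|";
DEFINE member_id   (VARCHAR(18)),
       member_name (VARCHAR(60)),
       eff_dt      (VARCHAR(10))
FILE = member_extract.dat;

INSERT INTO STG_MEMBER VALUES (:member_id, :member_name, :eff_dt);

END LOADING;
LOGOFF;
EOF

# Run only where the FastLoad client exists; FastLoad requires the
# target table to be empty, which is why it suits initial stage loads.
if command -v fastload >/dev/null 2>&1; then
    fastload < "$FL_FILE"
fi
```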

Environment: Informatica Power Center 9.6.1/9.5.1, Oracle 11g, DB2, XML, Flat Files, Teradata 14/12, Hadoop, HiveQL, Maestro, UNIX, Windows, Toad

Confidential

Informatica Developer

Responsibilities:

  • Developed ETL mappings, transformations and loading using Informatica Power Center 8.6.1. Extensively used ETL to load data from flat files and MS Excel, involving both fixed-width and delimited files, as well as from the relational database, Oracle 10g.
  • Developed and tested all the Informatica mappings, sessions and workflows - involving several Tasks.
  • Worked on Dimension as well as Fact tables, developed mappings and loaded data on to the relational database.
  • Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, update strategy, Lookup, sequence generator, joiner, Stored Procedure.
  • Analyzed the session, event and error logs for troubleshooting mappings and sessions.
  • Provided support for the applications after production deployment to take care of any post-deployment issues.
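Troubleshooting via session and error logs, as described above, often comes down to filtering the failure lines out of a large log. A minimal sketch, using fabricated sample log lines (the log name and message texts are illustrative, not taken from a real session):

```shell
#!/bin/sh
# Sketch: pull error/failure lines from an Informatica session log.
# The log file below is fabricated sample data for illustration.
LOG=s_m_load_dim_customer.log

cat > "$LOG" <<'EOF'
READER_1_1_1> DBG_21430 Reading data from input source file
WRITER_1_*_1> WRT_8167 Start loading table [DIM_CUSTOMER]
WRITER_1_*_1> WRT_8229 Database errors occurred: unique constraint violated
MANAGER> PETL_24031 Session run completed with failure.
EOF

# Keep only the lines that signal a problem so the root cause stands out.
grep -E "error|failure|WRT_8229" "$LOG" > session_errors.txt
cat session_errors.txt
```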

Environment: Informatica 8.6.1, Oracle 10g, Flat Files, SQL Programming, Unix, Windows, MS Excel, SQL *Plus.
