
Sr. ETL Informatica/Teradata Developer & Support Engineer Resume


Tampa, FL

SUMMARY:

  • 9+ years of experience in Information Technology building and supporting Data Warehouses/Data Marts using Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1.
  • Strong working experience across the Data Warehouse lifecycle.
  • Involved in understanding business processes, identifying the grain, and identifying dimensions and measures (facts).
  • Extensive knowledge of data modeling (ER and dimensional modeling), data integration, and data migration.
  • Extensive experience working with different RDBMSs (Oracle, Teradata, MySQL, SQL Server, DB2, Sybase) as well as file-based sources (flat files and XML files).
  • Extensive experience in designing and developing complex mappings using transformations such as Lookup (connected and unconnected), Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Java, and Update Strategy.
  • Expert in implementing Slowly Changing Dimensions Type 1, Type 2, and Type 3 for inserting and updating target tables while maintaining history (a sketch follows this list).
  • Expert in implementing Change Data Capture (CDC) for handling incremental loads.
  • Experience with the Mapping Debugger to validate mappings and gather troubleshooting information about data and error conditions.
  • Experience with pre-session and post-session shell scripts for tasks such as merging flat files after creation, deleting temporary files, and renaming files to reflect the generation date.
  • Extensively used Informatica Mapping Parameters and variables.
  • Extensively worked on Informatica performance tuning, identifying and eliminating bottlenecks.
  • Experience integrating Informatica with Teradata and using Teradata-specific features.
  • Extensive experience with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, and TPump, as well as TPT.
  • Proficient in Teradata EXPLAIN plans, the Collect Stats option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and volatile, global temporary, and derived tables.
  • Expertise in performance tuning and query optimization of Teradata SQL.
  • Experience with unit testing, working with QA teams on system testing, and participating in UAT.
  • Experience with ETL migrations and code deployments, including post-production validations.
  • Solid experience in writing SQL queries and stored procedures.
  • Experience working with UNIX shell scripts for automatically running sessions, aborting sessions, and creating ad hoc parameter files; wrote a number of shell scripts to run various batch jobs.
  • Involved in supporting a heavy DWH production system on a rotational basis.
  • Experienced in quickly fixing production failures without affecting business operations or dependencies.
  • Experienced in working with business users to resolve DWH data issues raised through the ticketing system.
  • Implemented Data Warehouse projects in both Agile and Waterfall methodologies, with a good understanding of the Scrum process.
  • Excellent interpersonal and communication skills, capable of driving the DWH projects independently.
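A minimal Teradata SQL sketch of the SCD Type 2 pattern referenced above; the tables (STG_CUSTOMER, DIM_CUSTOMER) and columns are illustrative placeholders, not from any specific project:

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE DIM_CUSTOMER
    SET EFF_END_DT   = CURRENT_DATE - 1,
        CURRENT_FLAG = 'N'
    WHERE CURRENT_FLAG = 'Y'
      AND CUSTOMER_ID IN (
            SELECT s.CUSTOMER_ID
            FROM   STG_CUSTOMER s
            JOIN   DIM_CUSTOMER d
              ON   d.CUSTOMER_ID  = s.CUSTOMER_ID
             AND   d.CURRENT_FLAG = 'Y'
            WHERE  d.CUST_NAME <> s.CUST_NAME
                OR d.CUST_ADDR <> s.CUST_ADDR);

    -- Step 2: insert a fresh current row for new and changed customers
    -- (changed customers no longer have a current row after step 1).
    INSERT INTO DIM_CUSTOMER
      (CUSTOMER_ID, CUST_NAME, CUST_ADDR, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    SELECT s.CUSTOMER_ID, s.CUST_NAME, s.CUST_ADDR,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    LEFT JOIN DIM_CUSTOMER d
      ON   d.CUSTOMER_ID  = s.CUSTOMER_ID
     AND   d.CURRENT_FLAG = 'Y'
    WHERE  d.CUSTOMER_ID IS NULL;

A Type 1 load is the same insert paired with a simple in-place UPDATE of the changed attributes instead of step 1, since no history is kept.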

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1

Databases: Teradata 15/14/13/12, Oracle, MS SQL Server, MySQL, DB2

Reporting: Cognos, Business Objects, Dashboard Reporting

Tools: Toad, SQL Developer, SQL Assistant

Languages: UNIX Shell Scripting, SQL, PL/SQL, Java, Perl

Process/Methodologies: Waterfall Methodology, Agile Methodology (Rally/Jira)

Operating Systems: Windows, Unix & Linux

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

Sr. ETL Informatica/Teradata Developer & Support Engineer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and Source-to-Target Mapping documents (STM) to shape the ETL process before development.
  • Used Informatica Power Center to pull data from different source systems and files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Worked on the Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created reusable transformations, mapplets, and worklets using the Transformation Developer, Mapplet Designer, and Worklet Designer.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, connected and unconnected Lookups, Joiner, Java, Update Strategy, and Stored Procedure.
  • Extensively used Pre-SQL and Post-SQL scripts for loading data into the targets according to the requirements.
  • Used Command tasks extensively to execute UNIX scripts from Informatica.
  • Developed mappings for SCD Type 1 and SCD Type 2 dimensions.
  • Created a process to handle Change Data Capture (CDC) for incremental loads (see the sketch after this list).
  • Involved in source data profiling and tuning of queries for source data extraction.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Involved in Enhancing existing Production Objects for additional reporting requirements.
  • Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.
  • Worked extensively on Informatica partitioning when dealing with huge data volumes, and partitioned Teradata tables for optimal performance.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Extensively worked on performance tuning of Teradata BTEQ scripts.
  • Worked extensively with the index, data, and lookup caches (static, dynamic, and persistent) while developing mappings.
  • Involved in moving big data HiveQL processes to Informatica and Teradata.
  • Involved in analysis for a future migration from Informatica to SSIS; gained working knowledge of the SSIS tool.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Involved in DWH data analysis for reporting and wrote SQL for ad hoc reporting purposes.
  • Wrote UNIX wrapper scripts for various purposes: invoking the built-in pmcmd utility, archiving logs, purging data, and FTPing files.
  • Coordinated with the reporting team for a better understanding of DWH data.
  • Involved in writing technical design documentation along with deployment activities.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Maestro.
  • Involved in heavy production support on a rotational basis.
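A minimal sketch of the CDC incremental-load pattern referenced in the list above, assuming a hypothetical ETL_LOAD_CONTROL watermark table that records the last successful load time per source:

    -- Extract only the rows changed since the last successful load.
    SELECT t.*
    FROM   SRC_TRANSACTIONS t
    WHERE  t.UPDATE_TS > (SELECT LAST_LOAD_TS
                          FROM   ETL_LOAD_CONTROL
                          WHERE  TABLE_NAME = 'SRC_TRANSACTIONS');

    -- After a successful load, advance the watermark for the next run.
    UPDATE ETL_LOAD_CONTROL
    SET    LAST_LOAD_TS = CURRENT_TIMESTAMP
    WHERE  TABLE_NAME   = 'SRC_TRANSACTIONS';

In Informatica the same watermark is often carried as a mapping variable (for example via SetMaxVariable on the change timestamp) that is written back to the repository after each successful run.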

Environment: Informatica Power Center 10.2/9.6.1, SSIS 2017, Teradata 15/14, Oracle 11g, Sybase, MySQL, SQL Server, Flat Files, SQL, PL/SQL, Maestro, crontab, UNIX, Agile, SVN, SQL Assistant, Toad 9.0, Cognos.

Confidential, San Antonio, TX

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Worked with Business Analysts on the HLD and LLD documents along with the BRD.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Worked with source databases like Oracle, SQL Server and Flat Files.
  • Worked on extracting data from SFDC.
  • Extensively worked with the Teradata utilities BTEQ, FastLoad, MultiLoad, and TPT to load data into the Teradata warehouse.
  • Created complex mappings using Unconnected and Connected lookup Transformations.
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval.
  • Responsible for performance tuning of Teradata scripts using EXPLAIN plans, indexing, and statistics (see the sketch after this list).
  • Implemented slowly changing dimension Type 1 and Type 2.
  • Worked with the various lookup caches: dynamic, static, persistent, re-cache from database, and shared.
  • Worked extensively with the Update Strategy transformation for implementing inserts and updates.
  • Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks.
  • Implemented auditing and balancing on the transactional sources per business requirements, so that every record read is either captured in the maintenance tables or written to the target tables.
  • Auditing is captured in an audit table, and an EOD snapshot of the daily entries is sent to the distribution list to check for any abnormalities.
  • Extensively used Email tasks to deliver generated reports to mailboxes and Command tasks to run pre-session and post-session commands.
  • Extensively used debugger to test the logic implemented in the mappings.
  • Performed error handling using session logs.
  • Involved in production support when required.
  • Monitored workflows and sessions using the Power Center Workflow Monitor.
  • Used Informatica Scheduler for scheduling the workflows in dev for testing.
  • Provided 24*7 support for Production environment jobs.
  • Monitored data extraction and loading processes, and wrote UNIX shell scripts to automate jobs.
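A short sketch of the Teradata tuning loop mentioned above (EXPLAIN plans plus statistics); the fact and dimension names are illustrative:

    -- Refresh optimizer statistics on the join and filter columns.
    COLLECT STATISTICS ON CLAIMS_FACT COLUMN (MEMBER_ID);
    COLLECT STATISTICS ON CLAIMS_FACT COLUMN (SERVICE_DT);

    -- Re-check the plan: look for full-table scans, product joins,
    -- and skewed redistribution steps before and after the change.
    EXPLAIN
    SELECT m.MEMBER_ID, SUM(c.PAID_AMT)
    FROM   CLAIMS_FACT c
    JOIN   MEMBER_DIM  m ON m.MEMBER_ID = c.MEMBER_ID
    WHERE  c.SERVICE_DT >= DATE '2014-01-01'
    GROUP  BY m.MEMBER_ID;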

Environment: Informatica Power Center 9.5.1/9.1.1, IDQ 9.5.1, Power Exchange, Teradata 14, Oracle 11g, Mainframe, MySQL, SFDC, Flat Files, Autosys, Toad, TextPad, SQL Assistant, PuTTY, UNIX, Windows.

Confidential, Phoenix, AZ

ETL Informatica Developer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Developed source-to-target mappings using Informatica Power Center Designer from Oracle and flat file sources to the Teradata database, implementing the business rules.
  • Modified BTEQ scripts to load data from the Teradata staging area to the Teradata warehouse (a sketch follows this list).
  • Gathered requirements and discussed the design plan with the Architect.
  • Created a series of macros for various applications in Teradata SQL Assistant.
  • Worked closely with Architects and the Lead on application assessments for the Data Masking team on the proxy server, and provided support for the databases and applications.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirements.
  • Extracted and transformed data from various sources such as flat files and Oracle 11g, and transferred it to the target Teradata data warehouse.
  • Responsible for building Teradata temporary tables, indexes, macros and BTEQ Scripts for loading/transforming data.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Tested raw data and executed performance scripts.
  • Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version of Informatica.
  • Managed post-production issues and delivered all assignments/projects within the specified timelines.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Supported QA for each region's testing using Health Rules and Health Answers.
  • Wrote the technical design document and application workbook, and handed applications over to the production team.
  • Worked on the production support team.
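A minimal BTEQ sketch of the staging-to-warehouse load described above; the logon placeholder and table names are illustrative:

    -- Placeholder logon; real credentials would come from a secured logon file.
    .LOGON tdpid/etl_user,password

    -- Move the current day's rows from staging into the warehouse table.
    INSERT INTO EDW.SALES_FACT
    SELECT *
    FROM   STG.SALES_DAILY
    WHERE  LOAD_DT = CURRENT_DATE;

    -- Exit with a non-zero return code on failure so the scheduler flags the job.
    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0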

Environment: Informatica Power Center 9.5.1/9.1.1, Oracle 11g, DB2, Teradata 14.0, Flat Files, Teradata SQL Assistant, Toad, WinSCP, PuTTY, Tivoli Workload Scheduler, UNIX.

Confidential

Data Integration Engineer

Responsibilities:

  • Involved in business requirement analysis and prepared functional requirement document.
  • Involved in the ETL technical design discussions and prepared ETL high level technical design document.
  • Involved in migrating legacy data into databases using SnapLogic.
  • Involved in analyzing the source-to-target mappings provided by data analysts and prepared functional and technical design documents.
  • Extracted data from flat files, high-volume data sets, and Oracle using Informatica ETL mappings and SQL/PLSQL scripts, and loaded it into the data store area.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup, and Router transformations to extract, transform, and load data into the data mart area.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Wrote Unit Test cases and executed unit test scripts successfully.
  • Involved in performance tuning of Informatica code using standard Informatica tuning steps.
  • Involved in performance tuning of SQL/PLSQL scripts based on explain plans (see the sketch after this list).
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Scheduled daily backups of the Informatica environment and restored the environment when required.
  • Administered projects, roles, users, and privileges across all three environments; configured and set up LDAP security integration with Informatica; and handled load balancing of the ETL services.
  • Involved in code Reviews as per ETL/Informatica standards and best practices.
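The SQL/PLSQL tuning noted in the list above starts from the optimizer plan; a minimal Oracle sketch against an illustrative ORDERS table:

    -- Capture the execution plan for a candidate query.
    EXPLAIN PLAN FOR
    SELECT o.ORDER_ID, o.ORDER_TOTAL
    FROM   ORDERS o
    WHERE  o.CUSTOMER_ID = :cust_id;

    -- Display it: check access paths, join methods, and row estimates.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);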

Environment: Informatica 9.6.1 (Repository Manager, Admin Console, Designer, Workflow Manager, Workflow Monitor), Toad, Oracle 10g, UNIX, SnapLogic, AWS, Windows.

Confidential

ETL Developer

Responsibilities:

  • Interacted with the Business Users to define/map Requirements.
  • Extensively worked on Power Center Client Tools like Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Analyzed the source data coming from different sources (Oracle, DB2, XML, QCARE, Flat files) and worked on developing ETL mappings.
  • Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis.
  • Implemented mapping-level optimization along the best route possible without compromising business requirements.
  • Created Sessions, reusable worklets and workflows in Workflow Manager and Scheduled workflows and sessions at specified frequency.
  • Worked on fixing invalid Mappings, testing of Stored Procedures and Functions, and Integration Testing of Informatica Sessions.
  • Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
  • Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
  • Generated XML files for delivery to Thomson Reuters.
  • Performed data profiling for data quality purposes (see the sketch after this list).
  • Maintained accountability through professional documentation and weekly status reports.
  • Performed quantitative and qualitative data testing.
  • Documented flowcharts for the ETL (Extract, Transform, and Load) data flow using Microsoft Visio, and created metadata documents for the reports and mappings developed, along with unit test scenario documentation for them.
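A small sketch of the kind of profiling SQL behind the data quality checks above, with illustrative table and column names:

    -- Basic column profile: volume, distinctness, and null rate.
    SELECT COUNT(*)                                           AS row_cnt,
           COUNT(DISTINCT PATIENT_ID)                         AS distinct_ids,
           SUM(CASE WHEN RESULT_DT IS NULL THEN 1 ELSE 0 END) AS null_result_dt
    FROM   STG_STUDY_RESULTS;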

Environment: Informatica Power Center 8.6.1, UNIX, Shell Scripting, Oracle, SQL Profiler, SQL, PL/SQL.

Confidential

Informatica Developer

Responsibilities:

  • Used the Informatica client tools - Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - for defining source and target definitions and designing mappings of the data flow from the source system to the data warehouse.
  • Experienced in the logical and physical design of dimensional data models.
  • Implemented Slowly Changing Dimensions - Type II in mappings as per the requirements.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Used Debugger to troubleshoot the mappings.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables.
  • Prepared Unit Test Cases.
  • Set Standards for naming conventions and best practices for Informatica mapping development.

Environment: Informatica Power Center 8.6.1, Oracle, PL/SQL, Toad, Autosys, UNIX, Erwin.
