
Sr. ETL Informatica/Teradata Developer Resume

Tampa, FL

SUMMARY

  • 6+ years of experience in Information Technology, building & supporting Data Warehouses/Data Marts using Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1
  • Strong work experience in Data Warehouse lifecycle process.
  • Involved in understanding business processes, identifying the grain, and identifying dimensions and measures (facts).
  • Proficiency in logical data modeling, including developing and/or using UML (Unified Modeling Language) Class Diagrams
  • Extensive knowledge of data modeling (ER & dimensional modeling), data integration, and data migration.
  • Extensive experience working with different RDBMSs (Oracle, Teradata, MySQL, SQL Server, DB2, Sybase) as well as file-based sources - flat files & XML files.
  • Extensive experience in designing and developing complex mappings using transformations: Lookup (connected & unconnected), Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Java, and Update Strategy.
  • Expert in implementing Slowly Changing Dimensions (Type 1, Type 2 and Type 3) to insert and update target tables while maintaining history.
  • Expert in implementing Change Data Capture (CDC) to handle incremental loads.
  • Experience using the Mapping Debugger to validate mappings and obtain troubleshooting information about data and error conditions.
  • Experience writing pre-session and post-session shell scripts for tasks such as merging flat files after creation, deleting temporary files, and renaming files to reflect the generation date.
  • Extensively used Informatica Mapping Parameters and variables.
  • Extensively worked with Informatica performance tuning involving Identifying and eliminating bottlenecks.
  • Experience with Integrating Informatica with Teradata and using Teradata features.
  • Extensive experience with Teradata utilities: BTEQ, FastExport, FastLoad, MultiLoad, TPump, and TPT.
  • Proficient in Teradata EXPLAIN plans, the Collect Stats option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and volatile, global temporary, and derived tables.
  • Expertise in performance tuning and query optimization of the Teradata SQLs.
  • Experience with unit testing, working with the QA team on system testing, and participating in UAT.
  • Experience with ETL migrations & code deployments, including post-production validations.
  • Solid experience in writing SQL queries, Stored Procedures.
  • Experience working with UNIX shell scripts to automatically run sessions, abort sessions, and create ad-hoc parameter files; wrote a number of shell scripts to run various batch jobs.
  • Involved in supporting a heavily used DWH production system on a rotational basis.
  • Experienced in quickly fixing production failures without affecting the business or downstream dependencies.
  • Expertise in UML, Rational Unified Process (RUP) and Rational Rose.
  • Experienced in working with business users to resolve DWH data issues raised through the ticketing system.
  • Implemented Data Warehouse projects in both Agile and Waterfall methodologies, with a good understanding of the Scrum process.
  • Excellent interpersonal and communication skills, capable of driving the DWH projects independently.
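As a minimal illustration of the pre/post-session scripting described above, the sketch below merges per-partition flat files into one output and stamps it with the generation date. All file and directory names here are hypothetical examples, not taken from any actual project.

```shell
#!/bin/sh
# Sketch of a post-session script: merge partition output files,
# rename the result with the run date, and delete the temporaries.
set -e

OUT_DIR=$(mktemp -d)
# Simulate two partition output files produced by a session.
printf 'row1\n' > "$OUT_DIR/target.out.part1"
printf 'row2\n' > "$OUT_DIR/target.out.part2"

STAMP=$(date +%Y%m%d)
MERGED="$OUT_DIR/target_${STAMP}.dat"

# Merge the partition files, then remove the temporary pieces.
cat "$OUT_DIR"/target.out.part* > "$MERGED"
rm -f "$OUT_DIR"/target.out.part*

echo "merged file: $MERGED"
```

In practice such a script would run as a post-session command task, with the directory and file pattern supplied through session parameters.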

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1

Databases: Teradata 15/14/13/12, Oracle, MS SQL Server, MySQL, DB2

Reporting: Cognos, Business Objects, Dashboard Reporting

Tools: Toad, SQL Developer, SQL Assistant

Languages: Unix Shell Script, SQL, PL/SQL, Java, Perl

Process/Methodologies: Waterfall Methodology, Agile Methodology (Rally/Jira)

Operating Systems: Windows, Unix & Linux

PROFESSIONAL EXPERIENCE

Confidential, Tampa, FL

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) & Source to Target Mapping Document (STM) for a better ETL Process before the development.
  • Used Informatica PowerCenter to pull data from different source systems/files, then cleanse, transform and load the data into Teradata using Teradata utilities.
  • Worked with Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, connected and unconnected Lookup, Joiner, Java, Update Strategy, and Stored Procedure.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Used command task extensively to execute the Unix scripts from Informatica.
  • Developed mappings for SCD Type 1 and SCD Type 2 dimensions.
  • Created process to handle Change Data Capture (CDC).
  • Involved in source data profiling and tuning of queries for source data extraction.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Involved in Enhancing existing Production Objects for additional reporting requirements.
  • Tuned the performance of mappings by following Informatica best practices and also applied several methods to get the best performance by decreasing the run time of workflows.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Teradata for optimal performance.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Involved in UML data modeling, the Rational Unified Process (RUP), and Rational Rose.
  • Extensively worked in the performance tuning of Teradata BTEQ.
  • Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.
  • Involved in migrating Big Data HiveQL processes to Informatica & Teradata.
  • Involved in analysis for a future migration from Informatica to SSIS; gained working knowledge of the SSIS tool.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Performed DWH data analysis for reporting and wrote SQL for ad-hoc reporting purposes.
  • Wrote UNIX wrappers for various purposes - invoking the built-in pmcmd utility, archiving logs, purging data, and FTPing files.
  • Coordinated with the reporting team for a better understanding of DWH data.
  • Involved in writing technical design documentation along with deployment activities.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Maestro.
  • Involved in Heavy Production Support on a rotational basis.
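A minimal sketch of the kind of pmcmd wrapper mentioned above. The service, domain, folder, and workflow names are placeholders, and the command is printed rather than executed so the sketch runs without an Informatica client; a real wrapper would run it and check the exit code.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper. Credentials would normally come from
# encrypted environment variables (pmcmd's -uv/-pv options), never
# from plain text. All names below are hypothetical.
INFA_SERVICE="IS_DEV"          # hypothetical Integration Service
INFA_DOMAIN="DOM_DEV"          # hypothetical domain
FOLDER="EDW_LOADS"             # hypothetical repository folder
WORKFLOW="${1:-wf_daily_load}" # workflow name, with a demo default

CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN \
-uv INFA_USER -pv INFA_PASS -f $FOLDER -wait $WORKFLOW"

# Print the command instead of executing it, to keep the sketch
# runnable anywhere; a real wrapper would `eval` it and inspect $?.
echo "$CMD"
```

Such wrappers are typically what a scheduler like Maestro invokes, so the return code of `-wait` decides whether downstream dependencies run.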

Environment: Informatica Power Center 10.2/9.6.1, SSIS 2017, Teradata 15/14, Oracle 11g, Sybase, MySQL, SQL Server, Flat Files, SQL, PL/SQL, Maestro, crontab, UNIX, Agile, SVN, SQL Assistant, Toad 9.0, Cognos.

Confidential, San Antonio, TX

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Worked with the Business Analyst on the HLD & LLD along with the BRD.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input, and Mapplet Output transformations.
  • Worked with source databases like Oracle, SQL Server, and Flat Files.
  • Worked with extracting data from SFDC.
  • Extensively worked with the Teradata utilities BTEQ, FastLoad, MultiLoad & TPT to load data into the Teradata warehouse.
  • Completed all required information within the PI Builder file, ran validation, and received the "Validation Complete" indicator on each sheet.
  • Utilized the Data Validation Checker on each sheet within the PI Builder file.
  • Created complex mappings using Unconnected and Connected lookup Transformations.
  • Responsible for the performance tuning of the ETL process at the source level, target level, mapping level, and session-level.
  • Tuned the performance of Informatica sessions for large data files by increasing the block size, data cache size, and target-based commit interval.
  • Responsible for Performance Tuning of Teradata scripts using explain plans, indexing, and Statistics.
  • Implemented slowly changing dimension Type 1 and Type 2.
  • Worked with various lookup caches: Dynamic Cache, Static Cache, Persistent Cache, recache from database, and Shared Cache.
  • Worked extensively with update strategy transformation for implementing inserts and updates.
  • Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks.
  • Implemented auditing and balancing on the transactional sources per business requirements, so that every record read is either captured in the maintenance tables or written to the target tables.
  • Captured audit details in the audit table; an EOD snapshot of daily entries is sent to the distribution list to check for any abnormalities.
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post-session and pre-session commands.
  • Extensively used the debugger to test the logic implemented in the mappings.
  • Performed error handling using session logs.
  • Involved in production support when required.
  • Monitored workflows and sessions using the PowerCenter Workflow Monitor.
  • Used Informatica Scheduler for scheduling the workflows in dev for testing.
  • Provided 24*7 support for Production environment jobs.
  • Monitored data extraction and loading processes and wrote UNIX shell scripts to automate the jobs.
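The auditing and balancing described above boils down to comparing records read against records written and flagging any mismatch for the EOD snapshot. A minimal sketch, using flat files to stand in for the source extract and the loaded target; the paths are illustrative only.

```shell
#!/bin/sh
# Sketch of a read/written balancing check: compare the record
# count of a source extract with the count loaded to the target
# (both simulated here with flat files) and report the status.
set -e
WORK=$(mktemp -d)

printf 'a\nb\nc\n' > "$WORK/source_extract.dat"   # 3 records read
printf 'a\nb\nc\n' > "$WORK/target_loaded.dat"    # 3 records written

READ_CNT=$(wc -l < "$WORK/source_extract.dat")
LOAD_CNT=$(wc -l < "$WORK/target_loaded.dat")

if [ "$READ_CNT" -eq "$LOAD_CNT" ]; then
    STATUS="BALANCED"
else
    STATUS="MISMATCH"   # in practice this would page the support rotation
fi
echo "read=$READ_CNT loaded=$LOAD_CNT status=$STATUS"
```

In a live system the counts would come from the audit/maintenance tables rather than `wc -l`, and the status line would be mailed to the distribution list.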

Environment: Informatica Power Center 9.5.1/9.1.1, IDQ 9.5.1, Power Exchange, Teradata 14, Oracle 11g, Mainframe, MySQL, SFDC, Flat Files, Autosys, Toad, TextPad, SQL Assistant, PuTTY, UNIX, Windows.

Confidential

ETL Informatica Developer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Developed source-to-target mappings using Informatica Power Center Designer from Oracle and flat-file sources to the Teradata database, implementing the business rules.
  • Modified BTEQ scripts to load data from the Teradata staging area to the Teradata warehouse.
  • Gathered requirements and discussed the design plan with the Architect.
  • Created a series of macros for various applications in Teradata SQL Assistant.
  • Worked closely with architects and leads on application assessment for the Data Masking team on the proxy server, providing support for the databases and applications.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirements.
  • Extracted and transformed high volumes of data from various sources, such as flat files and Oracle 11g, and transferred the data to the target data warehouse, Teradata.
  • Responsible for building Teradata temporary tables, indexes, macros and BTEQ Scripts for loading/transforming data.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Tested raw data and executed performance scripts.
  • Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version of Informatica.
  • Managed postproduction issues and delivered all assignments/projects within specified timelines.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Supported QA for each region's testing using Health Rules and Health Answers.
  • Wrote the technical design document and application workbook, and handed applications over to the production team.
  • Worked in the Production support team.

Environment: Informatica Power Center 9.5.1/9.1.1, Oracle 11g, DB2, Teradata 14.0, Flat Files, Teradata SQL Assistant, Toad, WinSCP, PuTTY, Tivoli Workload Scheduler, UNIX.

Confidential

Data Integration Engineer

Responsibilities:

  • Involved in business requirement analysis and prepared functional requirement document.
  • Involved in the ETL technical design discussions and prepared ETL high-level technical design documents.
  • Involved in migrating legacy data into databases using SnapLogic.
  • Involved in the analysis of source to target mapping provided by data analysts and prepared function and technical design documents.
  • Extracted high volumes of data from flat files and Oracle using Informatica ETL mappings and SQL/PL-SQL scripts, and loaded it into the data store area.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup, and Router transformations to extract, transform and load data into the data mart area.
  • Created re-usable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Wrote Unit Test cases and executed unit test scripts successfully.
  • Involved in the performance tuning of Informatica code using standard Informatica tuning steps.
  • Involved in the performance tuning of SQL/PLSQL scripts based on the explain plan.
  • Supported during QA/UAT/PROD deployments and bug fixes.
  • Scheduled a daily backup of the Informatica environment and restored the environment when required.
  • Administered projects, roles, users, and privileges across all three environments; configured and set up LDAP security integration with Informatica; load-balanced the ETL services.
  • Involved in Code Reviews as per ETL/Informatica standards and best practices.
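The daily backup duty described above is typically a cron-driven wrapper around Informatica's `pmrep` utility. A minimal sketch; the repository, domain, and credential names are placeholders, and the `pmrep` commands are printed rather than executed so the sketch runs without an Informatica client.

```shell
#!/bin/sh
# Sketch of a nightly repository backup job. A real job would
# execute these commands and check their exit codes; here they
# are echoed so the script is runnable anywhere.
REPO="REP_DEV"                      # hypothetical repository name
BACKUP_DIR="${BACKUP_DIR:-/tmp}"    # would be a dedicated backup area
STAMP=$(date +%Y%m%d)
BACKUP_FILE="$BACKUP_DIR/${REPO}_${STAMP}.rep"

# pmrep must connect before it can back up; -f overwrites any
# existing backup file of the same name.
echo "pmrep connect -r $REPO -d DOM_DEV -n INFA_USER -x INFA_PASS"
echo "pmrep backup -o $BACKUP_FILE -f"
```

Date-stamping the backup file name keeps a rolling history, with an age-based purge (e.g. `find ... -mtime +30 -delete`) handling retention.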

Environment: Informatica 9.6.1 (Repository Manager, Admin Console, Designer, Workflow Manager, Workflow Monitor), Toad, Oracle 10g, UNIX, SnapLogic, AWS, Windows.
