
Sr. Informatica/teradata Developer & Support Engineer Resume


Los Angeles, CA

SUMMARY:

  • Eight Years of experience in Analysis, Design, Development, Testing, Implementation, Enhancement and Support of Data Warehousing as a Data Warehouse Consultant.
  • Proficiency in utilizing the ETL tool Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1 for developing Data Warehouse loads, with work experience focused on Data Integration per client requirements.
  • Expertise in designing ETL architecture involving databases such as Teradata, Oracle, DB2, MySQL, and SQL Server, as well as flat files (fixed width, delimited) and XML.
  • Having knowledge of Dimensional Modeling, Star and Snowflake schema.
  • Loaded Fact and Dimension Tables as per the reporting requirements and ease of future enhancements.
  • Knowledge in Data Flow Diagrams, Process Models, and ER diagrams with modeling tools like ERWIN & VISIO.
  • Expertise in working with relational databases such as Oracle 12c/11g/10g/9i, SQL Server 2005/2008/2017, DB2, Teradata 15/14/13/12 and MySQL.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers and Complex SQL.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations.
  • Extensive experience in designing and developing complex mappings applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner and sorter transformations and Mapplets.
  • Extensive experience in developing and configuring Workflows, Worklets, Sessions, and Mappings.
  • Proficient in delivering high data quality by designing, developing, and automating audit processes and implementing the corresponding reconciliation process.
  • Excellent knowledge in identifying performance bottlenecks and tuning Mappings and Sessions using techniques such as partitioning.
  • Developed Teradata load and unload scripts using utilities such as FastExport, FastLoad, MultiLoad, TPump, and TPT.
  • Extensive knowledge with Teradata SQL Assistant.
  • Developed BTEQ scripts to load data from the Teradata staging area to the Data Warehouse, and from the Data Warehouse to data marts for specific reporting requirements (a minimal BTEQ sketch follows this list).
  • Tuned existing BTEQ scripts to enhance performance using the Explain plan.
  • Extensive knowledge of indexes, collect stats, and identifying skewness.
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
  • Strong understanding of Data warehouse project development life cycle. Expertise in documenting all the phases of DWH projects.
  • Expertise in ETL production support, both monitoring and fixing failures and resolving data issues raised by customers in the reporting layer. Worked with ServiceNow and other trouble-ticket management tools.
  • Worked in both Waterfall and Agile methodologies.
  • Expertise with scrum meetings, sprint planning, and the Agile tools Rally and Jira.
  • Excellent team player and self-starter with good ability to work independently and possess good analytical, problem solving and logical skills.
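
For illustration, a minimal BTEQ sketch of the staging-to-warehouse load pattern described above; the logon, database, and table names are placeholders, not actual project objects:

    .LOGON tdprod/etl_user,password;

    /* Review the optimizer plan first, as in the Explain-based tuning noted above */
    EXPLAIN
    INSERT INTO edw.sales_fact (sale_id, store_id, sale_dt, amount)
    SELECT sale_id, store_id, sale_dt, amount
    FROM   stg.sales_stage
    WHERE  load_dt = CURRENT_DATE;

    /* Load from the staging area into the warehouse table; the .IF below stops
       the script with a non-zero return code if the insert fails */
    INSERT INTO edw.sales_fact (sale_id, store_id, sale_dt, amount)
    SELECT sale_id, store_id, sale_dt, amount
    FROM   stg.sales_stage
    WHERE  load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;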

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 10.2/9.6.1/9.5.1/9.1.1/8.6.1

Databases: Teradata 15/14/14.10/13/12 (FastLoad, MultiLoad, TPump, TPT and BTEQ), Oracle 12c/11g/10g/9i, SFDC, Sybase 12.0, DB2, MySQL

BI Tools: Cognos, Business Objects, MicroStrategy, Report Builder

Data Modeling: MS Visio, ERWIN

Job Scheduling: CA AutoSys, BMC Control-M, Maestro (TWS), crontab, Informatica Scheduler

Programming: Unix Shell Scripting, XML, SQL and PL/SQL

Other Tools: TD SQL Assistant, TD Viewpoint, Toad, WinSQL, SQL Developer, SQL Navigator, PuTTY, WinSCP, SVN, Rally, Jira.

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA

Sr. Informatica/Teradata Developer & Support Engineer

Environment: Windows, Unix, Linux

Responsibilities:

  • Involved in requirement gathering along with Business Analysts, and in business analysis, design, development, testing, and implementation of business rules.
  • Translate customer requirements into formal requirements and design documents.
  • Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Wrote MultiLoad, FastLoad, and BTEQ scripts for loading data into stage tables and then processing it into BID.
  • Developed scripts to load data from source to staging and from the staging area to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.
  • Writing scripts for data cleansing, data validation, data transformation for the data coming from different source systems.
  • Developed mappings in Informatica to load data from various sources using transformations such as Lookup (connected and unconnected), Normalizer, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
  • Implemented Type 1 & Type 2 Slowly Changing Dimensions.
  • Used Target Load Plan in Informatica effectively.
  • Used Mapping Parameters to parameterize the objects for migration.
  • Created Different Tasks (Session, Command, Email, Event Wait, File Watcher) in Informatica for various purposes.
  • Implemented CDC using mapping variable concept.
  • Developed and reviewed Detail Design Documents and Technical Specification documents for the end-to-end ETL process flow of each source system.
  • Involved in Unit Testing and Preparing test cases.
  • Modification of views on Databases, Performance Tuning and Workload Management.
  • Maintained access rights and role rights, priority scheduling, Dynamic Workload Manager, Database Query Log, database administration, Partitioned Primary Indexes (PPI), multi-value compression analysis, usage collections and reporting of re-usage, AMP usage, security administration setup, etc., and led a team of developers working with different users on complicated technical issues.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts.
  • Performed query analysis using Explain to check for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
  • Collected multi-column statistics on all the non-indexed columns used during join operations and on all columns used in residual conditions.
  • Extensively used derived tables, volatile tables, and global temporary tables (GTT) in many of the ETL scripts.
  • Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc. (a minimal sketch follows this list).
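
A minimal sketch of the collect-statistics and Explain-based tuning pattern above; the table and column names are illustrative placeholders:

    /* Multi-column statistics on the join columns, single-column statistics on a residual-condition column */
    COLLECT STATISTICS ON edw.claim_fact COLUMN (member_id, claim_dt);
    COLLECT STATISTICS ON edw.claim_fact COLUMN (claim_status);

    /* Review the plan for product joins, join order, and confidence levels before and after collecting stats */
    EXPLAIN
    SELECT f.claim_id, d.member_nm
    FROM   edw.claim_fact  f
    JOIN   edw.member_dim  d
      ON   f.member_id = d.member_id
    WHERE  f.claim_dt >= CURRENT_DATE - 30;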

Environment: Informatica Power Center 10.2/9.6.1, Teradata 15/14, Oracle, MySQL, Flat Files, Maestro, SVN, Rally, Windows, Unix.

Confidential, San Antonio, TX

Sr. ETL Informatica Developer / Support Engineer

Responsibilities:

  • Involved in all phases of the SDLC, from requirements, design, development, and testing through support of the production environment.
  • Extensively used Informatica Client tools like Informatica Repository Manager, Informatica Designer, Informatica Workflow Manager and Informatica Workflow Monitor.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Created Sources, Targets in shared folder and developed re-usable transformations, mapplets and user defined function (UDF) to re-use these objects in mappings to save the development time.
  • Developed mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created mappings involving Slowly Changing Dimensions Type 1 and Type 2 to implement business logic and capture deleted records in the source systems (a SQL sketch of the Type 2 pattern follows this list).
  • Involved in migration projects to move data from Oracle/DB2 data warehouses to Teradata.
  • Experience with high volume data sets from various sources like Oracle, Text Files and Teradata Tables.
  • Used debugger extensively to identify the bottlenecks in the mappings.
  • Modified PL/SQL stored procedures for Informatica mappings.
  • Created Sessions and Workflows to load data from the SQL server, flat file and Oracle sources that exist on servers located at various locations all over the country.
  • Configured session properties, e.g., a high commit interval value, to increase performance.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Involved in Migrating the Informatica objects using Unix SVN from Dev to QA Repository.
  • Worked on developing workflows and sessions and monitoring them to ensure data is properly loaded on to the target tables.
  • Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
  • Performance tuning on sources, targets mappings and SQL (Optimization) tuning.
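
As a rough illustration of the Type 2 slowly changing dimension pattern referenced above, written in Teradata SQL with placeholder table and column names:

    /* Expire the current version of a customer whose tracked attribute changed */
    UPDATE tgt
    FROM edw.customer_dim AS tgt, stg.customer_stage AS src
    SET end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE tgt.customer_id = src.customer_id
      AND tgt.current_flag = 'Y'
      AND tgt.address <> src.address;

    /* Insert new and changed customers as the current version
       (changed rows no longer join, having just been expired above) */
    INSERT INTO edw.customer_dim (customer_id, address, start_dt, end_dt, current_flag)
    SELECT src.customer_id, src.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg.customer_stage AS src
    LEFT JOIN edw.customer_dim AS tgt
      ON  tgt.customer_id = src.customer_id
      AND tgt.current_flag = 'Y'
    WHERE tgt.customer_id IS NULL;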

Environment: Informatica Power Center 9.6.1/9.5.1, Flat Files, Oracle 11g, Teradata 13/12, SQL, PL/SQL, TOAD, SQL Assistant, Windows XP, Unix, Maestro, SVN.

Confidential, Buffalo, NY

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in gathering user requirements along with business analysts.
  • Developed Informatica mappings by usage of aggregator, SQL overrides in lookups, source filter and source qualifier and data flow management into multiple targets using router transformations.
  • Extensively involved in performance tuning of the ETL process by determining the bottlenecks at various points like targets, sources, mappings, sessions or systems. This led to a better session performance.
  • Extracted data from various sources, including flat files, and loaded it into DB2 UDB.
  • Wrote UNIX scripts for various purposes such as scheduling and archiving.
  • Documented the process for further maintenance and support.
  • Worked on test cases and use cases for unit testing.
  • Used caching optimization techniques in Aggregator, Lookup and Joiner transformation.
  • Created reusable transformations and used in various mappings.
  • Used various debugging techniques to debug the mappings.
  • Involved in Informatica Upgrade testing from 9.1.1 to 9.5.1.
  • Developed mapping to implement type 2 slowly changing dimensions.
  • Developed Informatica parameter files to filter the daily source data.
  • Used various indexing techniques to improve query performance (an illustrative index sketch follows this list).
  • Created test documents for unit and integration testing to check data quality.
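
For illustration, the kind of secondary index that could support such queries, in generic SQL; the index, schema, table, and column names are placeholders:

    /* Non-unique index on a frequently filtered column */
    CREATE INDEX idx_clm_status
        ON staging.claim_detail (claim_status);

    /* Composite index supporting the daily incremental filter */
    CREATE INDEX idx_clm_load_dt
        ON staging.claim_detail (load_dt, source_system);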

Environment: Informatica 9.5.1/9.1.1, IBM Mainframes, DB2 UDB, Flat Files, UNIX, Windows, Oracle 10g, SQL Server, T-SQL, WinSQL, Mercury Quality Center.

Confidential, Louisville, KY

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Worked with Business Analyst & Data Modelers for reviewing Requirements & Data Model.
  • Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files.
  • Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
  • Developed mappings to load Fact and Dimension tables, SCD Type I and SCD Type II dimensions and Incremental loading and unit tested the mappings.
  • Prepared low-level technical design documents, participated in the build/review of BTEQ, FastExport, MultiLoad, and FastLoad scripts, and reviewed Unit Test Plans and System Test cases.
  • Created new and enhanced existing stored-procedure SQL used for semantic views and load procedures for materialized views.
  • Created BTEQ scripts to extract data from the EDW to the business reporting layer (a BTEQ export sketch follows this list).
  • Developed BTEQ scripts for validation and testing of the sessions, for data integrity checks between source and target databases, and for report generation.
  • Worked on import and export of data from sources to staging and target using Teradata MultiLoad, FastExport, TPump, and BTEQ.
  • Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
  • Analyzed and designed USI and NUSI based on the columns used in join during data retrieval.
  • Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
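
A minimal BTEQ export sketch of the kind of EDW-to-reporting-layer extract mentioned above; the file path and object names are illustrative:

    .LOGON tdprod/etl_user,password;

    .EXPORT REPORT FILE = /data/extracts/claims_extract.txt;

    /* Pipe-delimited extract for the reporting layer */
    SELECT TRIM(CAST(claim_id AS VARCHAR(20)))
        || '|' || claim_status
        || '|' || CAST(claim_dt AS CHAR(10))  (TITLE '')
    FROM   edw.claim_fact
    WHERE  claim_dt >= CURRENT_DATE - 7;

    .EXPORT RESET;
    .LOGOFF;
    .QUIT 0;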

Environment: Informatica Power Center 9.5.1/9.1.1, Informatica Power Exchange, Oracle 11g, PL/SQL, SQL Server, DB2, Teradata 13.10, ClearCase, Erwin, Business Objects InfoView, Windows, UNIX.

Confidential

ETL/Teradata Developer

Responsibilities:

  • Worked closely with end users and business analysts for requirement gathering and analysis.
  • Worked with Data Modeler for logical and physical data modeling with dimensional data modeling techniques.
  • Worked on Informatica - Source Analyzer, Target Designer, Mapping Designer and transformations.
  • Extensively used Normal Join, Full Outer Join, Master Outer Join, Detail Outer Join in the Joiner Transformation.
  • Responsible for analyzing and comparing complex data from multiple sources (Teradata, Oracle, flat files).
  • Selected Primary Indexes (UPI, NUPI) for tables to ensure uniform data distribution across the AMPs, and compressed columns for efficient utilization of available disk space (a DDL sketch follows this list).
  • Developed Teradata BTEQ, MLOAD to populate data into Teradata target tables.
  • Worked on debugging using session log messages.
  • Used Informatica Power Center Workflow manager for session management, and scheduling of jobs to run in batch process.
  • Loaded load-ready files into Teradata tables using Teradata ODBC, FastLoad, and MultiLoad connections; the load date and time are also captured in the Teradata tables.
  • Designed complex Teradata SQL to pull data from source systems and populate target tables.
  • Performance tuning of Informatica sessions for large data files has been done by increasing the block size, data cache size, and sequence buffer length.
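
An illustrative Teradata DDL sketch of the primary index and compression choices described above, including the load date/time columns; the table, columns, and compression values are placeholders:

    /* Unique primary index for even row distribution across the AMPs;
       multi-value compression on a low-cardinality code column */
    CREATE TABLE edw.store_dim
    (
        store_id   INTEGER NOT NULL,
        store_nm   VARCHAR(100),
        region_cd  CHAR(2) COMPRESS ('NE', 'SE', 'MW', 'SW', 'WE'),
        load_dt    DATE,
        load_ts    TIMESTAMP(0)
    )
    UNIQUE PRIMARY INDEX (store_id);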

Environment: Informatica Power Center 9.1.1/8.6.1, Oracle 9i, Teradata 12, BTEQ, MultiLoad, Teradata SQL Assistant, Flat Files, UNIX, Windows.

Confidential

ETL Developer

Responsibilities:

  • Coordinated with source data team and developed all ETL mappings using various transformations like Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Update Strategy, Expression, SQL Transformation, Aggregator, Lookup, Sorter Transformation etc.
  • Developed Mappings based on mapping document.
  • Developed reusable transformations.
  • Developed ETL stage mappings to pull the data from the source system to the staging area.
  • Collected statistics to speed up data loads and retrieval.
  • Involved in internal code review.
  • Involved in Unit test and documenting the Test results.
  • Participated in weekly status meetings.
  • Developed error tables and an audit table for capturing bad records (a sketch follows this list).
  • Created source and target definitions and imported them into a reusable folder.
  • Maintained good quality on deliveries, even in complex tasks. Moved the code to Functional Testing without any defects and helped the testing team to understand the functionality.
  • Identified and prepared data for test execution.
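
A rough sketch of the audit and error tables described above, in generic SQL; the structure and names are illustrative, not the actual project design:

    /* Audit table capturing row counts and timing for each load run */
    CREATE TABLE etl_audit.load_audit
    (
        batch_id       INTEGER NOT NULL,
        mapping_nm     VARCHAR(100),
        src_row_cnt    INTEGER,
        tgt_row_cnt    INTEGER,
        err_row_cnt    INTEGER,
        load_start_ts  TIMESTAMP,
        load_end_ts    TIMESTAMP
    );

    /* Error table holding rejected records with the reason for rejection */
    CREATE TABLE etl_audit.load_errors
    (
        batch_id     INTEGER NOT NULL,
        src_record   VARCHAR(4000),
        error_desc   VARCHAR(500),
        load_ts      TIMESTAMP
    );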

Environment: Informatica Power Center 8.5.1, Oracle, SQL Server, UNIX scripts, PuTTY, Windows.
