ETL Informatica/Teradata Developer Resume

Cincinnati, OH

SUMMARY

  • Software Professional with 8+ years of experience in Development, Design, Enhancement, Migration, Testing and Data Analysis projects.
  • Good knowledge of ETL and Data Warehousing concepts; Informatica 8.1, 8.6, 9.1 and 9.5; Teradata V2R6, V2R12 and V2R13; Oracle 9i/10g/11g; and SQL, PL/SQL, SQL*Plus and SQL*Loader. Well versed in the TOAD, PL/SQL Developer and SQL Developer 3.0 interfaces.
  • Proficiently used Informatica as the ETL tool for data transformation and processing in a corporate-wide ETL solution, pulling data from source systems and cleansing, transforming and loading it into the target database.
  • Used parallelism and multi-file-system techniques in Informatica and improved performance using Pushdown Optimization.
  • Proficient in various data warehouse techniques - efficient data integration, source data profiling, identifying data dependencies, data cleansing, slowly changing dimension, change data capture, incremental/full load design.
  • Very good experience with Informatica PowerCenter and all of its client interfaces: Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Implemented Slowly Changing Dimension Type 1, Type 2 and Type 3 methodologies; a minimal Type 2 sketch appears after this summary.
  • Expertise in tuning the performance of Mappings and sessions in Informatica and determining the performance bottlenecks.
  • Proficient in applying Performance tuning concepts to Informatica Mappings, Session Properties and Databases.
  • Hands-on experience handling data from various source systems such as Flat Files, XML, Oracle, MS SQL Server, IBM DB2, Teradata, Excel Files, Web Services and SAP.
  • Proven record in both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation, Data Integration and Data Mining.
  • Actively involved in Performance Tuning, Error and Exception handling on various ETL processes.
  • Clear understanding of Data Warehousing and Business Intelligence concepts, with emphasis on ETL and life-cycle development using PowerCenter Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Extensive experience in designing and developing complex mappings with varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and many more.
  • Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Center.
  • Expertise in Data Warehousing projects using Teradata Development.
  • Worked extensively on error handling, performance analysis and performance tuning of Informatica ETL components, Teradata utilities, UNIX scripts and SQL scripts.
  • Expert developer skills in the Teradata RDBMS, including initial Teradata environment setup and development.
  • Experience with a large data warehouse containing 75 TB of data.
  • Sound working experience with databases such as Oracle and the Teradata RDBMS, and with utilities such as FastLoad, TPump, BTEQ, MultiLoad and FastExport.
  • Good knowledge of TERADATA PARALLEL TRANSPORTER (TPT); presented a white paper on it to the customer.
  • Profound working knowledge of Teradata Parallel Transporter scripting and Teradata SQL performance tuning.
  • Expertise in query analysis, performance tuning and testing.
  • Exceptional skills in writing complex SQL queries and procedures involving multiple tables, constraints and relationships for efficient data retrieval from relational databases, and in data validation using SQL and TOAD.
  • Data Quality & Analysis: 6+ years of strong Business Analysis experience in Data Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
  • Analyzed data from multiple sources, profiled the data, gathered it per the business use cases, and worked closely with the business to understand their needs and help them with data analysis.
  • Performed unit testing, using SoapUI, of data services built on queries written to pull data from a variety of sources.
  • Good knowledge of Talend; presented a POC to the customer on using it in the project and implemented a working project model on Talend.
  • Worked on SAP and Informatica integration projects; solid understanding of the SAP/ALE IDoc Reader, SAP/ALE IDoc Writer, IDoc processing and IDoc scheduling.
  • Worked on projects with UNIX, FTP, SSH and SFTP requirements in combination with Informatica.
  • Worked on three projects, managed one single-handedly through to completion, and earned customer appreciation on all three.
  • Programming: Well experienced in Oracle PL/SQL scripts and UNIX shell scripting. Well-versed with Configuration Management using version control systems and working under structured and controlled Change Management system.
  • Expertise in preparing Functional & Technical Specifications besides User Guides. Good experience in providing ETL Effort duration estimates with the WBS (Work Breakdown Structure) activities involved.
  • Experienced in the use of the Agile Scrum approach and certified in Agile Delivery.
  • Experienced in coordinating and leading offshore development teams.
  • Can perform ad hoc data analysis and communicate effectively with business users.
  • Expertise in fine tuning of Physical data Models for Oracle and Teradata DB systems.
  • Extensive experience in developing Unit, Integration and UAT test plans and cases, and in generating/executing SQL test scripts and test results.
  • Expertise in Off-shore/On-site work culture, leading and mentoring Data Warehouse teams.
  • Provided 24x7 support for the projects during go-live and post-production support activities.
  • Received the PAT ON BACK (POB) award from Tech Mahindra along with Customer Appreciation Certificates.
  • Received the Associate of the Month (AOM) award from Tech Mahindra.
  • Received a BRAVO Certificate from Tech Mahindra.
  • STAR performer for all 4 years at Tech Mahindra
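
To illustrate the Slowly Changing Dimension work called out above, here is a minimal SCD Type 2 sketch written as a BTEQ script run from a Unix shell wrapper. It is a sketch only: the server (tdprod), the logon, and the stg.customer / dw.customer_dim tables and their columns are hypothetical placeholders, not objects from any project described below.

    #!/bin/sh
    # Hypothetical SCD Type 2 sketch in BTEQ. stg.customer (cust_id, addr) is
    # the incoming load; dw.customer_dim adds eff_dt, end_dt and curr_flag.
    bteq <<EOF
    .LOGON tdprod/dw_user,dw_password

    /* Step 1: expire the current row for keys whose tracked attribute changed */
    UPDATE d
    FROM dw.customer_dim AS d, stg.customer AS s
    SET end_dt = CURRENT_DATE, curr_flag = 'N'
    WHERE d.cust_id = s.cust_id
      AND d.curr_flag = 'Y'
      AND d.addr <> s.addr;

    /* Step 2: insert a fresh current row for new keys and just-expired keys */
    INSERT INTO dw.customer_dim (cust_id, addr, eff_dt, end_dt, curr_flag)
    SELECT s.cust_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg.customer AS s
    LEFT JOIN dw.customer_dim AS d
      ON d.cust_id = s.cust_id AND d.curr_flag = 'Y'
    WHERE d.cust_id IS NULL;

    .LOGOFF
    .QUIT
    EOF

Running the UPDATE before the INSERT matters: expiring the changed rows first is what lets the anti-join in step 2 pick those keys up again as new current versions.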

TECHNICAL SKILLS

Primary Skill category: Informatica 8.1/8.6/9.1/9.5, Talend, Teradata V2R6/V2R12/V2R13, Oracle 9i/10g/11g, SQL, PL/SQL, SQL*Plus, SQL*Loader, SoapUI

Project Acquired skills: Unix, FTP, SFTP, SSH, Shell Scripting

Trained Skills: C, RDBMS, SQL/PL-SQL, ETL and DW Concepts, Informatica 8.1/8.6/9.1/9.5 and Teradata V2R6/V2R12/V2R13

Operating Systems: Windows 9x/XP/Vista/7/8, DOS, UNIX, Linux

Databases: ORACLE, Teradata

Tools: Informatica 7.x/8.x/9.x, Talend, TOAD for Oracle, SQL Developer 3.1/3.5, Teradata SQL Assistant, FastLoad, MultiLoad, BTEQ, FastExport, TPump, Teradata Parallel Transporter, PuTTY, Telnet, SoapUI, HP Quality Center, MS Office.

PROFESSIONAL EXPERIENCE

Confidential, Cincinnati, OH

ETL Informatica /Teradata Developer

Responsibilities:

  • Working on a project that requires Siebel data residing in an Oracle database to be loaded to flat files; because these requirements are ad hoc, they must be handled with a thorough understanding of the data.
  • Working with different teams to build the data services and web services that cater to the requirements.
  • Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump, along with DDL and DML (SQL) commands.
  • This is essentially ETL reporting, in which the requirements change frequently based on the flat-file data.
  • Used Teradata SQL Assistant, Teradata Administrator, PMON and the data load/export utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT) in UNIX/Windows environments, and ran the batch processes for Teradata.
  • Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Oracle, ASCII delimited Flat Files, EBCDIC files, COBOL & DB2.
  • Performance-tuned Teradata SQL queries and Informatica mappings.
  • Worked with Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint and BTEQ scripts.
  • Experienced in migrating the project's application interfaces.
  • Implemented PL/SQL queries, triggers and Stored Procedures as per the design and development related requirements of the project.
  • Developing Informatica mappings, bringing together sources, targets and transformations in Informatica Designer.
  • Developing mappings with various transformations such as Expression, Normalizer, Union, Filter, Router, Joiner and Lookup for better data massaging and to migrate clean, consistent data.
  • Extracting data from various sources across the organization (Oracle, SQL Server and Flat files) and loading into staging area.
  • Developing Informatica mappings and workflows to extract data and load it into the Teradata staging area using the FastLoad/TPump utilities; a minimal FastLoad sketch appears after this list.
  • Handling file transfers through SFTP scripts run from Informatica, along with Unix shell scripts that mail the clients on success or failure, per the business requirements.
  • As this is a single-resource project, taking complete ownership of the code and requirements, from preparing the TDD through code migration.
  • Moved the code from DEV to QA to PROD, and performed complete Unit, Integration and User Acceptance Testing across the three environments.
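
As a concrete illustration of the FastLoad staging load mentioned above, here is a minimal sketch driven from a Unix shell wrapper. The logon, file path and the stg.ord_stage table are hypothetical placeholders; a pipe-delimited extract and an empty staging table are assumed.

    #!/bin/sh
    # Hypothetical FastLoad sketch: bulk-load a pipe-delimited extract into an
    # empty Teradata staging table. All names and paths are placeholders.
    fastload <<EOF
    SESSIONS 4;
    ERRLIMIT 50;
    LOGON tdprod/etl_user,etl_password;

    /* FastLoad needs an empty target and no leftover error tables */
    DELETE FROM stg.ord_stage;

    /* VARTEXT input: every DEFINEd field must be VARCHAR */
    SET RECORD VARTEXT "|";
    DEFINE ord_id  (VARCHAR(18)),
           ord_dt  (VARCHAR(10)),
           ord_amt (VARCHAR(18))
    FILE = /data/in/orders.dat;

    BEGIN LOADING stg.ord_stage
          ERRORFILES stg.ord_stage_err1, stg.ord_stage_err2;
    INSERT INTO stg.ord_stage (ord_id, ord_dt, ord_amt)
    VALUES (:ord_id, :ord_dt, :ord_amt);
    END LOADING;
    LOGOFF;
    EOF

    # fastload exits non-zero on failure; surface that to the scheduler
    exit $?

For trickle loads where the table cannot be locked exclusively, TPump would follow the same wrapper pattern with its own script syntax.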

Environment: Oracle, Teradata, Informatica 9.1, Toad, SQL Developer, Putty, UNIX, Windows

Confidential, Cincinnati, OH

ETL/Teradata Developer

Responsibilities:

  • Working on a project that requires Siebel data residing in an Oracle database to be loaded to flat files; because these requirements are ad hoc, they must be handled with a thorough understanding of the data.
  • Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump, along with DDL and DML (SQL) commands.
  • The role requires an understanding of Unix shell scripting, along with FTP and SFTP of files from the development server to the corporate server.
  • This is essentially ETL reporting, in which the requirements change frequently based on the flat-file data.
  • Used Teradata SQL Assistant, Teradata Administrator, PMON and the data load/export utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT) in UNIX/Windows environments, and ran the batch processes for Teradata.
  • Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Oracle, ASCII delimited Flat Files, EBCDIC files, COBOL & DB2.
  • Performance-tuned Teradata SQL queries and Informatica mappings.
  • Worked with Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint and BTEQ scripts.
  • Experienced in migrating the project's application interfaces.
  • Implemented PL/SQL queries, triggers and Stored Procedures as per the design and development related requirements of the project.
  • Developing Informatica mappings, bringing together sources, targets and transformations in Informatica Designer.
  • Developing mappings with various transformations such as Expression, Normalizer, Union, Filter, Router, Joiner and Lookup for better data massaging and to migrate clean, consistent data.
  • Extracting data from various sources across the organization (Oracle, SQL Server and Flat files) and loading into staging area.
  • Developing Informatica mappings and workflows to extract data and load it into the Teradata staging area using the FastLoad/TPump utilities.
  • Handling file transfers through SFTP scripts run from Informatica, along with Unix shell scripts that mail the clients on success or failure, per the business requirements; a minimal sketch of this transfer-and-notify step appears after this list.
  • As this is a single-resource project, taking complete ownership of the code and requirements, from preparing the TDD through code migration.
  • Moved the code from DEV to QA to PROD, and performed complete Unit, Integration and User Acceptance Testing across the three environments.
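
A minimal sketch of the SFTP-and-notify step described above, as a Unix shell script. The host, user, paths and mail addresses are hypothetical placeholders, and key-based SSH authentication is assumed so no password prompt occurs.

    #!/bin/sh
    # Hypothetical sketch: push the day's extract to the corporate server via
    # SFTP, then mail a success or failure notification. Names are placeholders.
    SRC=/data/out/extract_$(date +%Y%m%d).dat

    # -b - reads batch commands from stdin; sftp exits non-zero on any failure
    sftp -b - etluser@corp-server <<EOF
    cd /inbound
    put $SRC
    bye
    EOF

    if [ $? -eq 0 ]; then
        echo "File $SRC delivered successfully." \
            | mailx -s "Extract transfer SUCCESS" client-team@example.com
    else
        echo "Transfer of $SRC failed; please investigate." \
            | mailx -s "Extract transfer FAILURE" support-team@example.com
        exit 1
    fi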

Environment: Oracle, Teradata, Informatica 9.1, Toad, SQL Developer, Putty, UNIX, Windows

Confidential

ETL/Teradata Developer

Responsibilities:

  • Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
  • Developed scripts to load data into the EDW base tables using the Teradata FastLoad, MultiLoad and BTEQ utilities.
  • Familiar with the tool's compatibility problems with older versions.
  • Dealt with both incremental and migration data loads into Teradata.
  • Involved in designing the ETL process to extract, transform and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Created appropriate Primary Indexes (PIs), taking into consideration both planned access and even distribution of data across all the available AMPs.
  • Migrated jobs from Informatica 8.5 to Informatica 9.1.
  • Involved in Coding, Unit Testing, User Acceptance Testing.
  • Analyzed the requirements and prepared design documents.
  • Involved in writing FastLoad, MultiLoad and TPT API scripts to load data into the Teradata database using the Teradata utilities.
  • Checked Teradata EXPLAIN plans to analyze and improve query performance.
  • Involved in writing complex SQL queries based on the given requirements and for handling various business tickets.
  • Created several SQL queries and reports from the data mart for UAT and user reporting.
  • Tuned Teradata SQL statements using EXPLAIN: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, rewriting correlated subqueries, using hash functions, etc.
  • Created several Informatica jobs to populate the dimension and fact tables, developing jobs to load data from various sources using transformations such as Lookup, Joiner, Aggregator, Union and Expression.
  • Extensively worked on Power Center Designer (Source Analyzer, Warehouse designer, Transformation Developer, Mapping Designer and Mapplet Designer).
  • Worked on the Performance improvement of the jobs.
  • Used Teradata utilities such as BTEQ, FastLoad and MultiLoad to load both history and incremental data for faster performance.
  • Extensively worked on Unix shell scripting and pmcmd commands; a minimal pmcmd wrapper sketch appears after this list.
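
To make the pmcmd work above concrete, here is a minimal shell wrapper sketch around PowerCenter's pmcmd startworkflow command. The Integration Service, domain, folder, workflow and credentials are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical pmcmd wrapper: start a workflow, wait for completion, and
    # propagate the result. All names and credentials are placeholders.
    INFA_USER=etl_user
    INFA_PWD=etl_password

    # -wait blocks until the workflow finishes so the exit code is meaningful
    pmcmd startworkflow \
        -sv IS_PROD -d Domain_PROD \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f EDW_FOLDER -wait wf_daily_load

    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_daily_load failed (pmcmd return code $rc)" >&2
        exit $rc
    fi
    echo "wf_daily_load completed successfully"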

Environment: Oracle, Teradata, Informatica 8.6.0/9.1, Toad, TD SQL Assistant, Power Center, UNIX, Windows

Confidential

ETL/Teradata Developer

Responsibilities:

  • Understood the functionality and data flow of the aviation domain.
  • Gathered the requirements from business teams based on the BRD.
  • Created new tables and designed the databases; created new indexes on tables to speed up database access.
  • Involved in gathering the business requirements from the Business Analyst.
  • Used SQL to query the databases, doing as much crunching as possible in Teradata and applying complex SQL query optimization (EXPLAIN plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Created appropriate Teradata Primary Indexes (PIs), taking into consideration both the planned access of data and even distribution of data across all the available AMPs; considering the business requirements and these factors, also created appropriate Teradata NUSIs for smooth (fast and easy) data access. A minimal DDL-and-statistics sketch appears after this list.
  • Have worked in understanding and translating business requirements to ETL processes.
  • Extensively worked on Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator Transformations in Informatica.
  • Debugged the Informatica mappings and validated the data in the target tables once loaded.
  • Transformed and loaded data from the legacy systems into Enterprise Data Warehouse tables using Informatica, loading the targets through the ETL process by scheduling the workflows.
  • Developed Informatica Objects - Mappings, sessions, Workflows based on the design documents.
  • Used parameter files, variables and related concepts in Informatica.
  • Developed Informatica SCD Type 1, Type 2 and Type 3 mappings and tuned them for better performance.
  • Worked with different data sources including SAP, MDB, Teradata, flat files, XML, Oracle and SQL Server databases.
  • These objects extract data from SAP and Teradata and store it in an Oracle backend.
  • Worked in life cycle development including Design, ETL strategy, troubleshooting and Reporting.
  • Identified facts and dimensions; proficient in writing MultiLoad, FastLoad and TPump scripts in Windows, UNIX and mainframe environments.
  • Performed Code review and wrote Unit Test Cases.
  • Tested the code in DEV, QA and Production and addressed issues at all levels.
  • Performed code promotion from Development to QA and on to Production.
  • Worked on the scheduling of PowerCenter workflows.
  • Worked on the design of the error handling process in ETL.
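
A minimal BTEQ sketch, run from a Unix shell, of the PI/NUSI and statistics work described above. The edw.flight_fact table, its columns and the logon are hypothetical placeholders chosen only to echo the aviation domain.

    #!/bin/sh
    # Hypothetical BTEQ sketch: choose a PI for even AMP distribution, add a
    # NUSI for a frequent access path, refresh statistics, and check the plan.
    bteq <<EOF
    .LOGON tdprod/dw_user,dw_password

    /* PI on flight_id: near-unique, so rows hash evenly across the AMPs */
    CREATE TABLE edw.flight_fact
    ( flight_id  INTEGER NOT NULL,
      carrier_cd CHAR(2),
      dep_dt     DATE,
      pax_cnt    INTEGER
    ) PRIMARY INDEX (flight_id);

    /* NUSI on carrier_cd supports frequent carrier lookups without scans */
    CREATE INDEX idx_carrier (carrier_cd) ON edw.flight_fact;

    /* Fresh statistics help the optimizer choose the indexes */
    COLLECT STATISTICS ON edw.flight_fact COLUMN (flight_id);
    COLLECT STATISTICS ON edw.flight_fact COLUMN (carrier_cd);

    /* EXPLAIN verifies the access path before a query goes to production */
    EXPLAIN SELECT carrier_cd, SUM(pax_cnt)
    FROM edw.flight_fact
    WHERE dep_dt = CURRENT_DATE
    GROUP BY carrier_cd;

    .LOGOFF
    .QUIT
    EOF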

Environment: Oracle, Informatica 8.6.0, Toad, SQL Developer, Putty, UNIX, Windows
