
ETL Informatica/Teradata Developer Resume


Texas

SUMMARY:

  • ETL Informatica/Teradata Developer / Architect / Data Analyst.
  • 11 years of experience in the design, development, and implementation of data warehousing, data integration, and data migration solutions.
  • Extensive experience in ETL development using Teradata, Informatica PowerCenter 9.x/8.x/7.x, and Vertica.
  • Extensive experience in resolving production data issues.
  • Extensive experience in data modelling with Erwin.
  • Extensive experience in designing and developing ETL flows with source databases Oracle and flat files (fixed-width, delimited) and target databases Teradata, Oracle, SQL Server 2005, and flat files (fixed-width, delimited).
  • Strong experience building and maintaining SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Developed complex mapplets and mappings, SQL stored procedures, and triggers.
  • Extensively used Informatica Server and client tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Experience implementing complex business rules by creating transformations and reusable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookup, Router, Rank, Joiner, Update Strategy, Normalizer).
  • Extensive knowledge of Extract, Transform, Load (ETL) architecture design using Informatica PowerCenter.
  • Worked on all phases of the data warehouse development lifecycle, from requirements gathering through testing, implementation, and support, using Informatica PowerCenter.
  • Strong experience with Teradata loading and unloading utilities: FastExport, FastLoad, MultiLoad, BTEQ, and TPump.
  • Experience with UNIX shell scripting for file validation, SQL programming, and workflow automation.
  • Practical understanding of star schema and snowflake schema methodologies using the data modelling tool Erwin 4.0/4.2.
  • Experience attending project meetings and working with project managers, team members, business analysts, statisticians, and internal customers/end users while handling multiple tasks.
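The file-validation work called out above can be sketched as a small shell step that checks a delimited feed before the ETL load picks it up. The feed name, delimiter, and column count here are illustrative assumptions, not from the resume:

```shell
#!/bin/sh
# Sketch of a delimited-file validation step: confirm the feed is
# non-empty and every record has the expected field count before
# handing it to the load job. File name and column count are hypothetical.

FEED=subscriber_feed.dat
EXPECTED_COLS=4

# Create a small sample feed for the demonstration.
cat > "$FEED" <<'EOF'
1001|John|Dallas|ACTIVE
1002|Mary|Austin|ACTIVE
1003|Raj|Houston|CLOSED
EOF

# Fail fast on an empty file.
[ -s "$FEED" ] || { echo "ERROR: $FEED is empty"; exit 1; }

# Count records whose field count differs from the expectation.
BAD=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n' "$FEED" | wc -l | tr -d ' ')

if [ "$BAD" -eq 0 ]; then
    echo "VALIDATION OK: $(wc -l < "$FEED" | tr -d ' ') records"
else
    echo "VALIDATION FAILED: $BAD malformed records"
    exit 1
fi
```

In practice a wrapper like this would run ahead of the Informatica session and abort the schedule on a bad feed.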

TECHNICAL SKILLS:

Hardware/OS: Windows (Windows NT/2000/XP Professional), UNIX.

Databases: Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, Teradata, Vertica.

Technologies: JavaScript, VBScript, IIS 4.0, HTML, XML, UNIX Shell Scripting.

Tools/Utilities: Informatica PowerCenter 9.x/8.6/8.1, TOAD, Hive, HBase, HDFS, PL/SQL Developer, Oracle SQL Developer, Teradata SQL Assistant, FTP, NDM/Connect:Direct, OLTP, IBM Tivoli Workload Scheduler, Mercury Quality Center (QC), MS Office tools.

PROFESSIONAL EXPERIENCE:

Confidential, Texas

ETL Informatica/Teradata Developer

Responsibilities:

  • Designed, developed, and implemented the ETL process end to end.
  • Led source attribute analysis and ETL technical design discussions; prepared the ETL high-level technical design and detailed design documents.
  • Created dimension/fact tables using Erwin data modelling.
  • Analysed source data volumes and estimated the UNIX server space and Oracle space on eCDW Alliance servers.
  • Extracted data from Oracle GoldenGate tables and flat files using Informatica ETL mappings and loaded it into the data mart.
  • Loaded data for various subject areas (Subscriber, Account, Product, Party, Address) into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.
  • Identified and eliminated spool space and skewness issues during testing.
  • Extracted aggregated month-to-date data at the dimension level, with measures and activities, using TPump and sent it downstream for reporting.
  • Designed and developed ETL jobs for the data management teams in various projects.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Developed complex Source Qualifier and Lookup override queries to improve the performance of the ETL load jobs.
  • Interacted with the onsite and offshore teams to assign development tasks and held weekly status calls with the offshore team.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
  • Worked extensively on shell scripting for file management and parameter file creation.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Automated scheduling of UNIX scripts and Informatica workflows using IBM Tivoli Workload Scheduler.
  • Performance-tuned Informatica code using standard Informatica tuning steps.
  • Supported QA/UAT/PROD deployments and bug fixes.
  • Conducted code reviews per ETL/Informatica standards and best practices.
  • Worked on production tickets and production support.
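The BTEQ loads described above commonly follow one pattern: a shell wrapper writes (or templates) a BTEQ script and submits it to Teradata. A minimal sketch, with hypothetical TDP, credential, table, and file names; the actual `bteq` invocation is left as a comment since it needs a live Teradata system:

```shell
#!/bin/sh
# Generate a BTEQ import script that bulk-inserts a pipe-delimited
# feed into a staging table. All object names are placeholders.

SCRIPT=load_subscriber.bteq

cat > "$SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = subscriber_feed.dat;
.QUIET ON
.REPEAT *
USING (sub_id VARCHAR(10), sub_name VARCHAR(50))
INSERT INTO edw_stg.subscriber_stg (sub_id, sub_name)
VALUES (:sub_id, :sub_name);
.QUIT;
EOF

echo "Generated $SCRIPT"
# On a host with the Teradata utilities installed, this would run as:
# bteq < "$SCRIPT" > load_subscriber.log 2>&1
```

In a real job the logon string would come from a secured credentials file rather than being embedded in the script.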

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g, Teradata, Flat Files, TOAD 9.1, SQL Developer, UNIX Shell Scripting, Windows 7, Teradata SQL Assistant, IBM Requisite Pro, Prism, IBM Tivoli Workload Scheduler, MS Visio, Erwin Data Modeler.

Confidential

ETL Informatica/Teradata Developer

Responsibilities:

  • Designed, developed, and implemented the ETL process end to end.
  • Led source attribute analysis and ETL technical design discussions; prepared the ETL high-level technical design and detailed design documents.
  • Analysed source data volumes and estimated the UNIX server space and Oracle space on eCDW Alliance servers.
  • Extracted data from flat files using Informatica ETL mappings and loaded it into the data mart.
  • Designed and developed ETL jobs for the data management teams in various projects.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Loaded data for various subject areas (Subscriber, Account, Product, Party, Address) into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.
  • Identified and eliminated spool space and skewness issues during testing.
  • Developed complex Source Qualifier and Lookup override queries to improve the performance of the ETL load jobs.
  • Interacted with the onsite and offshore teams to assign development tasks and held weekly status calls with the offshore team.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
  • Worked extensively on shell scripting for file management and parameter file creation.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Automated scheduling of UNIX scripts and Informatica workflows using IBM Tivoli Workload Scheduler.
  • Performance-tuned Informatica code using standard Informatica tuning steps.
  • Supported QA/UAT/PROD deployments and bug fixes.
  • Conducted code reviews per ETL/Informatica standards and best practices.
  • Worked on production tickets and production support.
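The parameter-file creation mentioned above usually emits a file in Informatica PowerCenter's parameter-file format: a `[Folder.WF:workflow.ST:session]` section header followed by `$$` mapping parameters. A sketch with made-up folder, workflow, session, and parameter names:

```shell
#!/bin/sh
# Generate a PowerCenter parameter file for a daily load run.
# Folder, workflow, session, directory, and parameter names are
# illustrative assumptions.

RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=wf_daily_load.par

cat > "$PARAM_FILE" <<EOF
[EDW_FOLDER.WF:wf_daily_load.ST:s_m_load_subscriber]
\$\$LOAD_DATE=$RUN_DATE
\$\$SRC_FILE_DIR=/data/inbound
\$\$TGT_SCHEMA=edw_stg
EOF

echo "Wrote $PARAM_FILE for run date $RUN_DATE"
```

The session would then reference this file via its parameter filename property, so each scheduled run picks up the current load date without a code change.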

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g, Teradata, Flat Files, TOAD 9.1, SQL Developer, UNIX Shell Scripting, Windows 7, Teradata SQL Assistant, IBM Requisite Pro, Prism, IBM Tivoli Workload Scheduler, MS Visio, Erwin Data Modeler.

Confidential, Texas

ETL Informatica/Teradata Developer

Environment: Informatica 8.6.1, Teradata

Responsibilities:

  • Led source attribute analysis and ETL technical design discussions; prepared the ETL high-level technical design and detailed design documents.
  • Analysed source data volumes and estimated the UNIX server space and Oracle space on eCDW Alliance servers.
  • Extracted data from Oracle GoldenGate tables and flat files using Informatica ETL mappings and loaded it into the data mart.
  • Designed and developed ETL jobs for the data management teams in various projects.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Developed complex Source Qualifier and Lookup override queries to improve the performance of the ETL load jobs.
  • Interacted with the onsite and offshore teams to assign development tasks and held weekly status calls with the offshore team.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
  • Worked extensively on shell scripting for file management and parameter file creation.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Automated scheduling of UNIX scripts and Informatica workflows using IBM Tivoli Workload Scheduler.
  • Performance-tuned Informatica code using standard Informatica tuning steps.
  • Supported QA/UAT/PROD deployments and bug fixes.
  • Conducted code reviews per ETL/Informatica standards and best practices.
  • Worked on production tickets and production support.
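FastLoad, used for the bulk loads above, only loads into an empty table, so a typical job drops the previous error tables and reloads the stage from scratch. A sketch of a generated FastLoad control script; server, credentials, and object names are placeholders, and the `fastload` call is commented out since it needs a live Teradata system:

```shell
#!/bin/sh
# Generate a Teradata FastLoad control script for a staging-table load.
# All database, table, and file names are hypothetical.

SCRIPT=fload_account.ctl

cat > "$SCRIPT" <<'EOF'
LOGON tdprod/etl_user,etl_password;
DROP TABLE edw_stg.account_err1;
DROP TABLE edw_stg.account_err2;
BEGIN LOADING edw_stg.account_stg
    ERRORFILES edw_stg.account_err1, edw_stg.account_err2;
SET RECORD VARTEXT '|';
DEFINE acct_id (VARCHAR(10)), acct_name (VARCHAR(50))
    FILE = account_feed.dat;
INSERT INTO edw_stg.account_stg (acct_id, acct_name)
VALUES (:acct_id, :acct_name);
END LOADING;
LOGOFF;
EOF

echo "Generated $SCRIPT"
# With the Teradata utilities installed: fastload < "$SCRIPT"
```

The two error tables capture constraint and duplicate-row rejects, which is where the spool/skew investigation mentioned above usually starts.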

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g, Teradata, Flat Files, TOAD 9.1, SQL Developer, UNIX Shell Scripting, Windows 7, Teradata SQL Assistant, IBM Requisite Pro, Prism, IBM Tivoli Workload Scheduler, MS Visio, Erwin Data Modeler.

Confidential

Data Analyst

Environment: Informatica 8.6.1, Oracle

Responsibilities:

  • Developed and implemented ETL projects end to end.
  • Led source attribute analysis and ETL technical design discussions; prepared the ETL high-level technical design and detailed design documents.
  • Analysed source data volumes and estimated the UNIX server space and Oracle space on eCDW Alliance servers.
  • Extracted data from Oracle GoldenGate tables and flat files using Informatica ETL mappings and loaded it into the data mart.
  • Designed and developed ETL jobs for the data management teams in various projects.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Developed complex Source Qualifier and Lookup override queries to improve the performance of the ETL load jobs.
  • Interacted with the onsite and offshore teams to assign development tasks and held weekly status calls with the offshore team.
  • Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
  • Worked extensively on shell scripting for file management and parameter file creation.
  • Created reusable transformations/mapplets and used them across various mappings.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Automated scheduling of UNIX scripts and Informatica workflows using IBM Tivoli Workload Scheduler.
  • Performance-tuned Informatica code using standard Informatica tuning steps.
  • Supported QA/UAT/PROD deployments and bug fixes.
  • Conducted code reviews per ETL/Informatica standards and best practices.
  • Worked on production tickets and production support.
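The Tivoli-driven workflow launches described above are typically wrapped in a shell script that the scheduler invokes; the wrapper starts the workflow through Informatica's `pmcmd` utility and logs the outcome. A sketch with placeholder service, domain, folder, and workflow names; the `pmcmd` call itself is commented out since it needs a live PowerCenter domain:

```shell
#!/bin/sh
# Wrapper a scheduler (e.g. IBM Tivoli Workload Scheduler) would invoke
# to start a PowerCenter workflow and record its outcome.
# Service, domain, folder, and workflow names are hypothetical.

INFA_SERVICE=IS_EDW_PROD
INFA_DOMAIN=Domain_EDW
INFA_FOLDER=EDW_FOLDER
WORKFLOW=wf_daily_load
LOG=/tmp/${WORKFLOW}_$(date +%Y%m%d).log

echo "$(date '+%Y-%m-%d %H:%M:%S') starting $WORKFLOW" > "$LOG"

# On a host with the PowerCenter client installed, the launch would be:
# pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
#     -uv INFA_USER -pv INFA_PASS -f "$INFA_FOLDER" -wait "$WORKFLOW"
RC=$?   # pmcmd exits 0 on success; here it is 0 because the call is stubbed

if [ "$RC" -eq 0 ]; then
    echo "$WORKFLOW completed, rc=$RC" >> "$LOG"
else
    echo "$WORKFLOW FAILED, rc=$RC" >> "$LOG"
fi
# Propagate the status back to the scheduler:
# exit "$RC"
```

Using `-uv`/`-pv` makes `pmcmd` read credentials from environment variables instead of embedding them in the script, which is the usual practice under a scheduler.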

Environment: Informatica PowerCenter 9.x/8.x, Oracle 11g, Teradata, Flat Files, TOAD 9.1, SQL Developer, UNIX Shell Scripting, Windows 7, Teradata SQL Assistant, IBM Requisite Pro, Prism, IBM Tivoli Workload Scheduler, MS Visio, Erwin Data Modeler.
