
Sr. Teradata Developer Resume


Westchester, PA

SUMMARY:

  • Around 8 years of experience in ETL (Extract, Transform, Load), Data Integration, and Data Warehousing using Informatica, Ab Initio, Teradata, and Oracle technologies.
  • Around 1 year of hands-on experience with Hadoop tools such as Hive, Sqoop, HBase, and MapReduce.
  • Excellent understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Extensive experience across business domains including Healthcare, Financial, Investment, and Retail.
  • Strong expertise in Analysis, Design, Development, Implementation, Modeling, Testing, and support for Data warehousing applications.
  • Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Proficient in Teradata EXPLAIN plans, the COLLECT STATISTICS option, Primary Indexes (UPI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and volatile, global temporary, and derived tables.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts such as Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional, and multidimensional data modeling.
  • Proficient in database performance analysis and SQL query tuning.
  • Extensive experience implementing slowly changing dimensions to maintain historical data and to support change data capture (CDC).
  • Sound knowledge of data migration from DB2 and Oracle to Teradata using automated UNIX shell scripting, Oracle/Teradata SQL, and Teradata macros and procedures.
  • Proficient in ER and dimensional modeling, identifying fact and dimension tables with the data modeling tools Erwin and ER Studio.
  • In-depth expertise in the Teradata cost-based query optimizer, identifying potential bottlenecks with queries from the aspects of query writing, skewed redistributions, join order, optimizer statistics, and physical design considerations (PI/USI/NUSI/JI).
  • Substantial experience developing mappings and workflows using Informatica PowerCenter (Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Extensive experience in developing workflows with Worklets, Event Waits, Assignments, Conditional Flows, and Email and Command Tasks using the Workflow Manager.
  • Experience in developing complex mappings using Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility.
  • Experience with various source and target systems such as flat files, XML, COBOL files, and web services.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Performed system analysis and QA testing, and was involved in production support.
  • Scheduled automated daily, weekly, and monthly jobs using UNIX shell scripts under Autosys (a scheduling sketch follows this list).
  • Very strong in shell scripting (ksh, Bourne shell) and in scheduling with crontab.
  • Experience in UNIX working environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
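
As a brief illustration of the scheduling work above, here is a minimal sketch of a cron-driven ksh wrapper of the kind used for these nightly loads; the script name, paths, and BTEQ script are hypothetical placeholders, not the actual production objects.

#!/bin/ksh
# run_daily_load.ksh -- hypothetical wrapper for a nightly warehouse load.
# A crontab entry such as the following runs it at 2:00 AM every day:
#   0 2 * * * /apps/etl/bin/run_daily_load.ksh >> /apps/etl/logs/daily_load.log 2>&1

LOGDIR=/apps/etl/logs
STAMP=$(date +%Y%m%d_%H%M%S)

echo "[$STAMP] starting daily load"

# Run the load step (a BTEQ script here; a TPT or pmcmd call would look similar).
bteq < /apps/etl/sql/daily_load.btq > $LOGDIR/daily_load_$STAMP.out 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    echo "[$STAMP] daily load FAILED with return code $rc"
    exit $rc          # a non-zero exit lets Autosys or cron flag the failure
fi

echo "[$STAMP] daily load completed successfully"
exit 0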

TECHNICAL SKILLS:

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPT, TPump, SQL Assistant, Viewpoint, Query Monitor.

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x (Source Analyzer, Repository Manager, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Warehouse Designer and Informatica Server), Informatica Data Quality (IDQ).

Databases: Teradata 14.10/14/13.10/13, Oracle 11g/10g/8i, DB2/UDB, SQL Server

Languages: SQL, PL/SQL, XML, UNIX Shell Scripting

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX

Data Modeling: Erwin, ER Studio

Tools/Utilities: PL/SQL Developer, TOAD, Hadoop, Hive, Pig, Sqoop, SQL Developer, Erwin, Microsoft Visio, Talend, DataStage, Mainframes

Scheduler: UC4, Control M, Autosys

PROFESSIONAL EXPERIENCE:

Confidential, Westchester, PA

Sr. Teradata Developer

Responsibilities:

  • Responsible for requirements gathering for an enhancement requested by the client. Involved in the analysis and implementation of an intranet-based management information system.
  • Responsible for designing ETL strategy for both Initial and Incremental loads.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables, then moved data from staging to journal and from journal into base tables (a load sketch follows this list).
  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the DataMart.
  • Provided scalable, high speed, parallel data extraction, loading and updating using TPT.
  • Performed query optimization with the help of EXPLAIN plans, collected statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration of Teradata scripts and shell scripts on the UNIX box.
  • Developed UNIX scripts to transfer the data from operational data sources to the target warehouse.
  • Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files, as per the requirements.
  • Made extensive use of derived tables, volatile tables, and global temporary (GTT) tables in many of the ETL scripts.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to DataMart.
  • Used mapplets within mappings, saving valuable design time and effort.
  • Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, Worklets and workflows.
  • Analyzed and identified complex, underperforming mappings that were good candidates for PDO. Implemented source-side Pushdown Optimization (PDO) to resolve performance issues in these complex mappings involving numerous transformations.
  • Implemented full pushdown Optimization (PDO) for Semantic layer implementation for some of the complex aggregate/summary tables instead of using the ELT approach.
  • Extensively worked on the Informatica Cloud integration tool to move data from the Salesforce application and other source systems.
  • Worked on Informatica PowerCenter tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor - and IDQ.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Extracted data from Teradata into HDFS using Sqoop, and exported the analyzed patterns back to Teradata using Sqoop (a Sqoop sketch follows this list).
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Extensively developed UC4 jobs to schedule PowerCenter workflows and Data Quality workflows.
  • Provided 24/7 on-call production support for various applications; provided resolutions for night-time production job failures and attended conference calls with business operations and system managers to resolve issues.
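
As a hedged sketch of the staging-to-journal-to-base pattern above, the BTEQ script below shows the shape of the logic; all table, column, and logon names are hypothetical placeholders, and the production loads were packaged as Teradata macros and stored procedures rather than inline SQL.

#!/bin/ksh
# load_accounts.ksh -- hypothetical staging -> journal -> base promotion.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_pwd;

/* Capture the incremental rows into the journal table. */
INSERT INTO edw.account_jrnl (acct_id, balance, load_dt)
SELECT acct_id, balance, CURRENT_DATE
FROM   stg.account_stage;

/* Apply the journal to the base table: update matching keys... */
UPDATE base
FROM edw.account_base base, edw.account_jrnl jrnl
SET   balance = jrnl.balance
WHERE base.acct_id = jrnl.acct_id;

/* ...and insert keys not yet present. */
INSERT INTO edw.account_base (acct_id, balance)
SELECT jrnl.acct_id, jrnl.balance
FROM   edw.account_jrnl jrnl
WHERE  NOT EXISTS (SELECT 1 FROM edw.account_base base
                   WHERE base.acct_id = jrnl.acct_id);

.LOGOFF;
.QUIT;
EOF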
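
The Sqoop transfers mentioned above followed the general shape below. This is a sketch assuming the generic JDBC driver form of the connect string (a dedicated Teradata connector would differ), and the table names, directories, and credentials are hypothetical.

#!/bin/ksh
# td_hdfs_roundtrip.ksh -- hypothetical Sqoop round trip between Teradata and HDFS.

# Pull a Teradata table into HDFS for analysis.
sqoop import \
  --connect jdbc:teradata://tdprod/DATABASE=edw \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user --password-file /user/etl/.td_pwd \
  --table account_base \
  --target-dir /data/edw/account_base \
  --num-mappers 4

# Push the analyzed patterns back to a Teradata landing table.
sqoop export \
  --connect jdbc:teradata://tdprod/DATABASE=edw \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user --password-file /user/etl/.td_pwd \
  --table account_patterns \
  --export-dir /data/edw/account_patterns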

Environment: Teradata 14, Informatica PowerCenter 9.1/9.5, Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Informatica Cloud, Informatica Data Quality (IDQ), UC4, Control-M, UNIX, SSH (secure shell), TOAD, Erwin.

Confidential

Sr. ETL / Informatica Developer

Responsibilities:

  • Analyzed the requirements of United Health Group to identify the necessary tables to be populated into the staging database.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Union, Stored Procedure, and XML transformations.
  • Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.
  • Converted existing PL/SQL Packages to ETL Mappings using Informatica Power Center.
  • Used Error handling strategy for trapping errors in a mapping and sending errors to an error table.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range, and Pass-through partitions.
  • Worked on topics related to the Exchange Management Console (EMC) and the Exchange Management Shell (EMS).
  • Extensively used Informatica PowerCenter to extract data from various sources, which included flat files, SQL Server, Oracle, MS Access, and XML.
  • Extensively involved in performance tuning, recommending SQL queries for better performance.
  • Developed in Scrum iterations using Agile methodology, iterative development, and sprint burndowns with story boards.
  • Estimated and planned development work using Agile software development practices.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Used Mapping variables for Incremental Extraction of operational data.
  • Wrote UNIX shell scripts to automate workflows (a pmcmd sketch follows this list).
  • Scheduled and ran extraction and load processes, monitoring tasks and workflows using the Workflow Manager and Workflow Monitor.
  • Used the Workflow Manager to create workflows, worklets, and email and command tasks.
  • Used Informatica features to implement Type I and Type II changes in slowly changing dimension tables (a Type II sketch also follows this list).
  • Used FTP services to retrieve Flat Files from the external sources.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files.
  • Extracted data from MELD environment for data profiling.
  • Worked on Migration Strategies between Development, Test and Production Repositories.
  • Supported the Quality Assurance team in testing and validating the Informatica workflows.
  • Performed unit and development testing of my mappings at the ETL level.
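
As a sketch of the workflow-automation scripts above, a pmcmd wrapper along these lines starts a PowerCenter workflow and passes its status back to the scheduler; the service, domain, folder, and workflow names are hypothetical.

#!/bin/ksh
# run_wf_stage_load.ksh -- hypothetical pmcmd wrapper for one workflow.

INT_SVC=INT_SVC_DEV          # integration service name (hypothetical)
DOMAIN=Domain_ETL            # Informatica domain name (hypothetical)
FOLDER=UHG_STAGE             # repository folder (hypothetical)
WORKFLOW=wf_stage_load       # workflow to run (hypothetical)

# -wait blocks until the workflow finishes so the exit code is meaningful.
pmcmd startworkflow -sv $INT_SVC -d $DOMAIN \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f $FOLDER -wait $WORKFLOW
rc=$?

if [ $rc -ne 0 ]; then
    echo "workflow $WORKFLOW failed with pmcmd return code $rc"
    exit $rc
fi
exit 0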
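
The Type II handling itself was built with Informatica transformations; purely as an illustration of the equivalent set-based logic, the BTEQ sketch below expires the current row and inserts a new version. All table and column names are hypothetical.

#!/bin/ksh
# scd2_member_dim.ksh -- hypothetical set-based Type II dimension maintenance.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_pwd;

/* Expire the current version of any member whose tracked attribute changed. */
UPDATE dim
FROM edw.member_dim dim, stg.member_stage stg
SET   eff_end_dt = CURRENT_DATE - 1,
      current_flag = 'N'
WHERE dim.member_id = stg.member_id
  AND dim.current_flag = 'Y'
  AND dim.plan_cd <> stg.plan_cd;

/* Insert a fresh current version for new and just-expired members. */
INSERT INTO edw.member_dim
      (member_id, plan_cd, eff_start_dt, eff_end_dt, current_flag)
SELECT stg.member_id, stg.plan_cd, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.member_stage stg
LEFT JOIN edw.member_dim dim
       ON dim.member_id = stg.member_id
      AND dim.current_flag = 'Y'
WHERE  dim.member_id IS NULL;   /* changed rows were expired above */

.LOGOFF;
.QUIT;
EOF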

Environment: Informatica PowerCenter 9.5.1 (PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Teradata 14.10, Oracle 11g, PL/SQL Developer, SQL, PL/SQL, UNIX Shell Scripting, Autosys, Hive.

Confidential

Teradata Developer/ETL Developer

Responsibilities:

  • Developed scripts to load data into the base tables in the EDW, and to load data from source to staging and from the staging area to the target tables, using the FastLoad, MultiLoad, and BTEQ utilities of Teradata (a FastLoad sketch follows this list).
  • Writing scripts for data cleansing, data validation, data transformation for the data coming from different source systems.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; expert in SQL queries for cross-verification of data.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
  • Performed Space Management for Perm & Spool Space.
  • Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Dealt with initial, delta, and incremental data, as well as migration data, loaded into Teradata.
  • Analyzed data and implemented multi-value compression for optimal use of space (a compression and statistics sketch follows this list).
  • Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which the tables are joined.
  • Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
  • Made extensive use of the Teradata Analyst Pack, including Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • Made extensive use of derived tables, volatile tables, and global temporary (GTT) tables in many of the ETL scripts.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, and using hash functions.
  • Loaded flat files into the databases using FastLoad, then used them in queries to perform joins.
  • Used SQL to query the databases, doing as much crunching as possible in Teradata and relying on SQL query optimization (explain plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Used PMON and Teradata Manager to monitor the production system during the online day.
  • Excellent experience in performance tuning and query optimization of the Teradata SQLs.
  • Developed Ab Initio graphs to load data from various sources using components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather, and Merge.
  • Created checkpoints and phases to avoid deadlocks, tested the graphs with sample data, then committed the graphs and related files into the repository from the sandbox environment.
  • Scheduled the graphs using Autosys and loaded data into target tables from the staging area using SQL*Loader.
  • Implemented data parallelism using the Multi-File System and Partition and De-partition components, and performed repartitioning to improve overall performance.
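
A minimal FastLoad sketch of the flat-file staging loads referenced above; the file layout, table, and logon are hypothetical placeholders, and FastLoad requires the target table to be empty.

#!/bin/ksh
# fload_account_stage.ksh -- hypothetical FastLoad of a pipe-delimited file.
fastload <<'EOF'
LOGON tdpid/etl_user,etl_pwd;

/* Drop the error tables from any prior run. */
DROP TABLE stg.account_stage_e1;
DROP TABLE stg.account_stage_e2;

SET RECORD VARTEXT "|";

DEFINE acct_id  (VARCHAR(18)),
       balance  (VARCHAR(20))
FILE = /data/in/account_stage.dat;

BEGIN LOADING stg.account_stage
      ERRORFILES stg.account_stage_e1, stg.account_stage_e2;

INSERT INTO stg.account_stage (acct_id, balance)
VALUES (:acct_id, :balance);

END LOADING;
LOGOFF;
EOF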
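
And a hedged sketch of the compression and statistics work: the BTEQ script below shows multi-value compression in the DDL, statistics collection, and an EXPLAIN check of a partitioned query. All object names, compressed value lists, and date ranges are hypothetical.

#!/bin/ksh
# tune_txn_detail.ksh -- hypothetical compression, statistics, and plan check.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_pwd;

/* Multi-value compression of the most frequent codes saves perm space. */
CREATE TABLE edw.txn_detail
( txn_id    DECIMAL(18,0) NOT NULL
, txn_type  CHAR(2) COMPRESS ('PO', 'IN', 'CR')
, status_cd CHAR(1) COMPRESS ('A', 'C')
, txn_dt    DATE
)
PRIMARY INDEX (txn_id)
PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2010-01-01'
                      AND DATE '2015-12-31' EACH INTERVAL '1' MONTH);

/* Fresh statistics keep the optimizer's estimates and join plans honest. */
COLLECT STATISTICS ON edw.txn_detail COLUMN (txn_id);
COLLECT STATISTICS ON edw.txn_detail COLUMN (txn_dt);

/* EXPLAIN exposes join order, redistribution, and confidence levels. */
EXPLAIN SELECT txn_type, COUNT(*)
FROM   edw.txn_detail
WHERE  txn_dt BETWEEN DATE '2015-01-01' AND DATE '2015-01-31'
GROUP  BY 1;

.LOGOFF;
.QUIT;
EOF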

Environment: Teradata V2R6, Informatica 8.6/8.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Oracle 10g, UNIX, Citrix, TOAD, PuTTY

Confidential

 Informatica Developer

Responsibilities:

  • Developed Informatica mappings and reusable transformations; developed and wrote procedures for moving data from the source systems to staging and on to the data warehouse.
  • Extensively used transformations such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner to implement the business logic; designed complex mappings involving target load order and constraint-based loading.
  • Created, built, ran, and scheduled workflows and worklets using the Workflow Manager.
  • Extensively worked on performance tuning of programs, ETL procedures, and processes. Coded database triggers, functions, and stored procedures, and wrote many SQL queries. Helped code shell scripts for administration activities such as daily backups (a backup sketch follows this list). Involved in physical schema implementation and objects such as tablespaces, tables, and rollback segments; created database structures and objects and modified them as needed.
  • Tuned the performance of Informatica mappings by examining explain plans, cutting query costs with Oracle hints, and changing mapping designs.
  • Responsible for tuning ETL procedures and star schemas to optimize load and query performance.
  • Optimized and tuned mappings for better performance and efficiency; created and ran batches and sessions using the Workflow Manager; made extensive use of UNIX shell scripts for conditional execution of workflows; optimized the performance of mappings, workflows, and sessions by identifying and eliminating bottlenecks.
  • Performed unit testing at the development level, source code migration, and documentation.
  • Managed users and roles for database security; maintained system security and controlled and monitored user access to the database.
  • Involved in full life cycle development including Design, ETL strategy, troubleshooting Reporting, and Identifying facts and dimensions.
  • Managed the Metadata associated with the ETL processes used to populate the data warehouse.
  • Assigned predefined profiles and roles to users to maintain database security, and managed CPU activity, idle time, and quotas on tablespaces.
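
The daily backup scripting mentioned above generally looked like the following. This is a sketch using Oracle's classic exp utility shipped with 8i; the schema name, paths, and retention period are hypothetical.

#!/bin/ksh
# nightly_export.ksh -- hypothetical nightly schema export for the warehouse.
# Scheduled from cron, e.g.:  0 1 * * * /apps/dba/bin/nightly_export.ksh

STAMP=$(date +%Y%m%d)
DUMP=/backup/exports/dw_$STAMP.dmp
LOG=/backup/exports/dw_$STAMP.log

# exp performs a logical export of the named schema to a dump file.
exp userid=dw_owner/"$DW_PWD" owner=dw_owner file=$DUMP log=$LOG

if [ $? -ne 0 ]; then
    echo "export failed; see $LOG"
    exit 1
fi

# Keep two weeks of dump files.
find /backup/exports -name 'dw_*.dmp' -mtime +14 -exec rm {} \;
exit 0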

Environment: Informatica PowerCenter 7.1, Oracle 8i, PL/SQL, UNIX (AIX), Erwin, and TOAD
