Teradata/Informatica Developer Resume
IL
SUMMARY
- Around 5 years of experience in ETL, Data Integration and Data Warehousing using Informatica, Teradata and Relational Databases.
- Experience working with Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
- Proficient in Teradata EXPLAIN plans, the Collect Stats option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and Volatile, Global Temporary, and Derived tables (a brief sketch follows this list).
- Good programming experience in BTEQ, macros, and triggers.
- Hands-on experience in monitoring and managing the varying mixed workload of an active data warehouse using tools like Teradata Workload Analyzer, Teradata Dynamic Workload Manager, and Teradata Manager.
- Assisted Business Analysts with requirements analysis and with the design and development of various ETL components for multiple applications.
- Strong experience with large and midsize data warehousing implementations using Informatica PowerCenter/PowerMart, Oracle, SQL Server, UNIX, and Windows platforms.
- Experience as an ETL developer using Informatica PowerCenter, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Familiar with creating Secondary indexes and join indexes.
- Good experience in data warehousing applications; responsible for extraction, cleansing, transformation, and loading of data from various sources to the data warehouse.
- Implemented performance tuning techniques and error handling at various levels, such as source and target mappings.
- Experience in creating ETL mappings using XML Source Qualifier, XML Parser and XML Generator to load data from XML Applications to downstream systems.
- Good expertise in performance tuning of mappings, ETL procedures, and processes.
- Comfortable with technical and functional applications of data mapping, data management, data transport, data staging, and the design and development of relational database systems (RDBMS).
- Extensive experience with Star and Snowflake schemas, Fact and Dimension tables, and Slowly Changing Dimensions. Interacted with clients and users to understand their requirements.
- Good knowledge of GitHub for maintaining code versioning.
- Provided extensive production support, delivering organized and proactive solutions to maintain high performance of the ETL batch processes.
- Involved in implementing and translating business requirements into high-level and low-level designs for ETL processes in Teradata. Experience in issue tracking and agile project management using JIRA.
- Sound knowledge of all phases of the Software Development Life Cycle (SDLC) and Agile methodologies.
- Experience in UNIX working environments, writing UNIX shell scripts for Informatica pre- and post-session operations.
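As a brief, hedged illustration of the EXPLAIN and Collect Stats work mentioned above, the sketch below shows a UNIX shell wrapper around BTEQ. The TDPID, credentials, and the dw.sales_fact table and its columns are hypothetical placeholders, not details from any engagement described in this resume.

```sh
#!/bin/sh
# Minimal sketch: refresh optimizer statistics on a PPI table and
# inspect the EXPLAIN plan. All object names and logon details below
# are hypothetical placeholders.

TDPID="tdprod"          # hypothetical Teradata system name
TD_USER="etl_user"      # hypothetical credentials; in practice these
TD_PASS="********"      # would come from a secured source, not a script

bteq <<EOF
.LOGON ${TDPID}/${TD_USER},${TD_PASS};

/* Refresh statistics on the primary index and partitioning column */
COLLECT STATISTICS ON dw.sales_fact COLUMN (sale_id);
COLLECT STATISTICS ON dw.sales_fact COLUMN (sale_date);

/* Review the plan for product joins and low-confidence steps */
EXPLAIN
SELECT sale_date, SUM(amount)
FROM   dw.sales_fact
WHERE  sale_date BETWEEN DATE '2015-01-01' AND DATE '2015-03-31'
GROUP  BY sale_date;

.LOGOUT;
.QUIT;
EOF
```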
TECHNICAL SKILLS
Data Warehousing and ETL Tools: Teradata Manager, DBQL, TDQL, DBW, Priority Scheduler
Databases: Teradata 12.0/13.x/14.10, SQL Server/MySQL, and Oracle
Informatica Tools: Informatica PowerCenter 9.x, PowerExchange, IDQ, GitHub
Programming Skills: SQL, PL/SQL, Teradata SQL, Oracle SQL
Operating Systems: Windows, UNIX, LINUX
Teradata Utilities: BTEQ, SQL Assistant, Database Window, FastLoad, MultiLoad, FastExport, TPT
Teradata DBA Utilities: Viewpoint, Query Monitor
Scheduling tool: Control-M Scheduler
Scripting Languages: UNIX Shell Scripting, BTEQ
PROFESSIONAL EXPERIENCE
Confidential, IL
Teradata/Informatica Developer
Responsibilities:
- Gathered requirements and created functional and technical design documents covering requirements study, business rules, data mappings, and workflows.
- Responsible for creating technical design documents, source-to-target mapping documents, and test case documents to reflect the ETL process.
- Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
- Created databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
- Extracted data from various source systems like Oracle, SQL Server, and flat files as per the requirements.
- Extensively used the Teradata Analyst Pack, including Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
- Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.
- Implemented data parallelism using the Multi-File System and partition and de-partition components, and performed repartitioning to improve overall performance.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
- Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
- Tuned Teradata SQL statements using EXPLAIN: analyzed data distribution among AMPs and index usage, collected statistics, defined indexes, revised correlated subqueries, and applied hash functions.
- Loaded flat files into empty tables using FastLoad and then used those tables in join queries (see the FastLoad sketch after this list).
- Scheduled automated daily, weekly, and monthly jobs using UNIX shell scripts with Control-M.
- Worked on different Workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Assignment, and Timer, as well as Worklets and workflow scheduling.
- Designed and developed mappings, transformation logic, and processes in Informatica to implement business rules and standardize source data.
- Worked on UNIX shell scripting for automating Informatica workflows.
- Developed Informatica mappings and reusable transformations, and wrote procedures for moving data from the source systems to staging and on to the data warehouse.
- Extensively used transformations such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner to implement the business logic. Designed complex mappings involving target load order and constraint-based loading.
- Performance-tuned Informatica mappings by reviewing EXPLAIN plans, cutting query costs with Oracle hints, and revising mapping designs.
- Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance.
- Maintained and deployed the source code using GitHub version control.
- Well versed in all stages of the Software Development Life Cycle (SDLC).
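As a hedged sketch of the FastLoad pattern noted above (loading flat files into empty tables), the script below loads a pipe-delimited file into an empty staging table. The staging table, error tables, file path, and logon are hypothetical placeholders; with VARTEXT input, every DEFINEd field must be VARCHAR.

```sh
#!/bin/sh
# Minimal sketch: FastLoad a delimited flat file into an EMPTY staging
# table. All names and paths are hypothetical placeholders.

fastload <<EOF
LOGON tdprod/etl_user,********;

/* Clear error tables left over from a prior run */
DROP TABLE stg.customer_stg_err1;
DROP TABLE stg.customer_stg_err2;

SET RECORD VARTEXT "|";

DEFINE
  cust_id   (VARCHAR(18)),
  cust_name (VARCHAR(100)),
  cust_dob  (VARCHAR(10))
FILE = /data/inbound/customer_20150101.txt;

BEGIN LOADING stg.customer_stg
  ERRORFILES stg.customer_stg_err1, stg.customer_stg_err2;

INSERT INTO stg.customer_stg (cust_id, cust_name, cust_dob)
VALUES (:cust_id, :cust_name, :cust_dob);

END LOADING;
LOGOFF;
EOF
```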
Environment: Informatica PowerCenter 9.1/9.5, SAS DI Studio 9.4, Teradata 1 .x/12.x, Oracle 10g/11g, UNIX, SQL Server 2012/2008, GitHub, Control-M, and Windows.
Confidential, Dallas, TX
Sr. Informatica Developer.
Responsibilities:
- Set and followed Informatica best practices, such as creating shared objects in shared folders for reusability and standard naming conventions for ETL objects; designed complex Informatica transformations, mapplets, mappings, reusable sessions, worklets, and workflows.
- Evaluated business requirements to come up with Informatica mapping design that adheres to Informatica standards.
- Implemented performance tuning on a variety of Informatica maps to improve their throughput.
- Worked on Informatica PowerCenter tools (Designer, Repository Manager, Workflow Manager, and Workflow Monitor) and IDQ.
- Performed data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Built reusable transformations and mapplets wherever logic would otherwise be duplicated.
- Performed performance tuning at the mapping level as well as the database level to increase data throughput.
- Used Informatica PowerCenter 9.x for extraction, transformation, and load (ETL) of data into the data warehouse from various sources.
- Participated in performance tuning and optimized various complex SQL queries.
- Used Informatica Designer to create complex mappings using transformations like Filter, Router, connected and unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
- Resolved issues related to the enterprise data warehouse (EDW) and stored procedures in the OLTP system; analyzed, designed, and developed ETL strategies. Analyzed and reviewed functional requirements and mapping documents; performed troubleshooting and problem solving.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Extracted, transformed, and loaded data into the data warehouse using Informatica PowerCenter, and generated various reports on a daily, weekly, monthly, and yearly basis.
- Worked on Oracle PL/SQL to create stored procedures, packages, and various functions.
- Used PL/SQL procedures within Informatica mappings to truncate data in target tables at run time.
- Optimized and tuned mappings for better performance and efficiency; created and ran batches and sessions using the Workflow Manager; extensively used UNIX shell scripts for conditional execution of workflows. Optimized the performance of mappings, workflows, and sessions by identifying and eliminating bottlenecks.
- Involved in full life-cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
- Developed UNIX shell scripts for automation of daily loading with CONTROL-M scheduling.
- Built UNIX shell scripts to process and cleanse incoming text files and to archive processed files after a retention period (see the sketch after this list).
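A minimal sketch of the conditional-execution and archiving wrappers described above, assuming a PowerCenter Integration Service reachable through pmcmd; the domain, service, folder, workflow, file paths, and the INFA_USER/INFA_PASS environment variables are hypothetical placeholders.

```sh
#!/bin/sh
# Minimal sketch: run a PowerCenter workflow only when an input file is
# present, then archive the processed file. All names are hypothetical.

INFILE=/data/inbound/orders.txt
ARCHIVE_DIR=/data/archive

if [ -s "$INFILE" ]; then
    # -wait blocks until the workflow completes, so $? reflects success
    pmcmd startworkflow \
        -sv IS_DEV -d Domain_Dev \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f FOLDER_SALES -wait wf_load_orders
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_load_orders failed with rc=$rc" >&2
        exit $rc
    fi
    # Archive the input only after a successful load
    mv "$INFILE" "$ARCHIVE_DIR/orders.$(date +%Y%m%d).txt"
else
    echo "No input file found; skipping workflow."
fi
```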
Environment: Informatica PowerCenter 9.1/9.5, Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Informatica Cloud, Informatica Data Quality (IDQ), Control-M, UNIX, SSH (Secure Shell)
Confidential
Informatica/Teradata Developer
Responsibilities:
- Designed and developed Mapplets, reusable transformations, and reusable sessions, and created parameter files wherever necessary to facilitate reusability.
- Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.
- Involved in data profiling and in understanding how data quality is carried through the end-to-end ETL process.
- Fixed invalid mappings and troubleshot technical database problems.
- Developed various transformations like Source Qualifier, Sorter, Joiner, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into target tables.
- Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables, then move data from staging to journal tables, and finally from journal into base tables.
- Worked on Informatica PowerCenter tools (Designer, Repository Manager, Workflow Manager, and Workflow Monitor) and IDQ.
- Extensively used transformations such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner to implement the business logic. Designed complex mappings involving target load order and constraint-based loading.
- Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
- Developed the FRD (Functional Requirements Document) and the data architecture document, and communicated them to the concerned stakeholders. Conducted impact and feasibility analysis.
- Experience in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
- Worked with heterogeneous source to extract data from Oracle database, XML and flat files then loaded to a relational Oracle warehouse.
- Implemented documentation standards and practices to make mappings easier to maintain; gained experience in performance tuning of Informatica mappings using various components.
- Involved in creating UNIX scripts for triggering stored procedures and macros.
- Coded in Teradata BTEQ SQL and wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment (a BTEQ sketch follows this list).
- Compared actual test results with expected results using FTP and UNIX scripts.
- Extensive experience with different types of transformations such as Aggregator, Lookup, Source Qualifier, Expression, Filter, Update Strategy, Stored Procedure, Sequence Generator, Joiner, and XML.
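As a hedged sketch of the BTEQ coding and UNIX validation described above, the script below moves rows from a staging table through a journal table into a base table and validates the BTEQ return code in the shell. The table names, logon, and key column are hypothetical placeholders.

```sh
#!/bin/sh
# Minimal sketch: BTEQ staging -> journal -> base load, with the return
# code checked in the shell. All object names are hypothetical.

bteq <<EOF
.LOGON tdprod/etl_user,********;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;

INSERT INTO dw.customer_jrnl
SELECT * FROM stg.customer_stg;

/* Insert only journal rows not already present in the base table */
INSERT INTO dw.customer_base
SELECT *
FROM   dw.customer_jrnl j
WHERE  NOT EXISTS (
  SELECT 1 FROM dw.customer_base b
  WHERE  b.cust_id = j.cust_id
);

.LOGOUT;
.QUIT;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "BTEQ load failed with return code $rc" >&2
    exit $rc
fi
```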
Environment: Informatica PowerCenter 9.5.1 (PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Teradata 14.10, Oracle 11g, PL/SQL Developer, SQL, PL/SQL, UNIX Shell Scripting.
Confidential
Informatica/Teradata Developer
Responsibilities:
- Performed unit testing at various levels of ETL and actively involved in team code reviews.
- Developed data mappings from source systems to warehouse components, and created and monitored batches and sessions using the Informatica PowerCenter Server.
- Created reusable transformations and mappings using Informatica Designer, and processing tasks using Workflow Manager, to move data from multiple sources into targets.
- Extensively used transformations such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Lookup (target as well as source), Update Strategy, Source Qualifier, and Joiner to implement the business logic. Designed complex mappings involving target load order and constraint-based loading.
- Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance.
- Optimized and tuned mappings for better performance and efficiency; created and ran batches and sessions using the Workflow Manager; extensively used UNIX shell scripts for conditional execution of workflows. Optimized the performance of mappings, workflows, and sessions by identifying and eliminating bottlenecks.
- Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport (a FastExport sketch follows this list).
- Dealt with initial, delta, and incremental data loads into the Teradata tables.
- Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.
- Extensively used derived tables, volatile tables, and global temporary (GTT) tables in many of the ETL scripts.
- Proficient in Teradata EXPLAIN plans, the Collect Stats option, Primary Indexes (PI, NUPI), Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), Join Indexes (JI), and Volatile, Global Temporary, and Derived tables.
- Developed UNIX shell scripts for automation of daily loading with CONTROL-M scheduling.
- Performed data conversion using Teradata utilities, SQL, and UNIX shell scripts, and loaded the converted data into Teradata tables.
- Responsible for tuning ETL procedures and STAR schemas to optimize load and query performance; involved in creating universes and generating reports using the Star schema.
- Used Teradata as a source system and prepared Technical Design documents and Test cases.
- Used Autosys to schedule various data cleansing scripts and loading processes, and maintained the batch processes using UNIX scripts.
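To illustrate the FastExport usage mentioned above, here is a minimal sketch that unloads one day of data to a flat file; the logtable, source table, output path, and logon are hypothetical placeholders.

```sh
#!/bin/sh
# Minimal sketch: FastExport query results to a delimited flat file.
# The logtable, table, and path below are hypothetical placeholders.

fexp <<EOF
.LOGTABLE etl_wrk.daily_export_log;
.LOGON tdprod/etl_user,********;

.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/outbound/daily_sales.txt MODE RECORD FORMAT TEXT;

/* One VARCHAR expression per record, pipe-delimited */
SELECT CAST(sale_date AS CHAR(10)) || '|' ||
       TRIM(CAST(amount AS VARCHAR(18)))
FROM   dw.sales_fact
WHERE  sale_date = CURRENT_DATE - 1;

.END EXPORT;
.LOGOFF;
EOF
```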
Environment: Teradata 13.10, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, Teradata SQL Assistant, Teradata Manager, and Teradata Performance Monitor), Control-M, Hadoop, Windows, and UNIX.