Senior Teradata/ETL Developer Resume
SUMMARY
- Teradata/ETL Developer with 8+ years of experience analyzing, designing, and developing applications and integrating them with various databases and data sources, per requirements, in a Data Warehousing environment.
- Strong data warehousing experience using Informatica Power Mart, Power Center, Power Exchange, and Power Connect as ETL tools.
- Experienced with different Relational databases like Teradata, Oracle and SQL Server.
- Hands on experience on Big Data/Hadoop components like HDFS, MapReduce, YARN, Sqoop, Spark, HIVE and OOZIE.
- Strong Knowledge in Data Warehousing using fact & dimension tables, star and snowflake schema modeling.
- Expert in Teradata SQL and Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, TD Administrator, Teradata SQL Assistant).
- Extensive experience in gathering functional requirements from business users, and in system design, coding, testing, and production support for a number of large projects. Participated in development and enhancement of large projects using Informatica.
- Perform data validation, data integrity, and data quality checks before delivering data to operations, business, and financial analysts.
- Participated in various stages of the Software Development Life Cycle (SDLC), including analysis, design, development, debugging, conversion, testing, implementation, and production support.
- Experience in performance tuning of ETL Sources, Targets, Mappings, transformations & Sessions.
- Experience troubleshooting Teradata scripts and complex SQL queries, fixing bugs, addressing production issues, and performance tuning.
- Experienced in writing and tuning complex SQL queries, Triggers and Stored procedures in SQL Server, Oracle, Teradata.
- Strong experience in designing and developing Business Intelligence solutions in Data Warehousing/Decision Support Systems using Informatica Power Center and the SQL Server suite of products.
- Experience with UNIX Shell Scripting (KSH - Korn Shell Scripting).
- Good understanding of entity-relationship modeling and data models.
- Worked on code changes to increase the performance of the process and enable re-usability.
- Schedule and run extraction and load processes; monitor and support ETL batch jobs.
- Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.
- Used SQL and stored procedures to write complex queries to retrieve and manipulate data from various tables to match business functionality.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Triggers.
- Created reports using SQL Server Reporting Services (SSRS) for customized and ad-hoc queries.
- Extracted and transferred source data from different databases such as Oracle, SQL Server, and DB2, and from flat files, into Oracle.
- Data modeling experience using Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.x.
- Exceptional communication, collaboration, analytical, interpersonal, and team-building skills, with proficiency at grasping new technical concepts quickly and applying them productively.
- Expertise in querying and testing RDBMS such as Oracle, MS SQL Server using SQL for data integrity.
- Complex ETL development using Informatica Power Center Transformations - Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.
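A minimal sketch of the data-validation checks described above, assuming a hypothetical staging/base table pair and logon file: a shell wrapper generates a BTEQ reconciliation script that compares row counts before data is released downstream.

```shell
#!/bin/sh
# Sketch of a row-count reconciliation check between a staging table
# and its base table. Database, table, and logon file names are
# hypothetical; in production the script would be submitted via bteq.
BTEQ_SCRIPT=reconcile_counts.btq

cat > "$BTEQ_SCRIPT" <<'EOF'
.RUN FILE=/secure/tdlogon.txt;      /* hypothetical logon file */
.SET ERROROUT STDOUT;

/* Flag a mismatch between staging and base row counts */
SELECT CASE WHEN s.cnt = b.cnt THEN 'MATCH' ELSE 'MISMATCH' END
FROM (SELECT COUNT(*) AS cnt FROM stg_db.orders_stg) s,
     (SELECT COUNT(*) AS cnt FROM base_db.orders) b;

.IF ERRORCODE <> 0 THEN .QUIT 1;
.QUIT 0;
EOF

echo "Generated $BTEQ_SCRIPT:"
cat "$BTEQ_SCRIPT"
# In production: bteq < "$BTEQ_SCRIPT"
```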
TECHNICAL SKILLS
- Operating Systems: Unix, Windows
- Databases: Teradata, Oracle, SQL Server, DB2.
- Teradata Tools & Utilities: Query facilities: SQL Assistant, BTEQ; Load & export: FastLoad, MultiLoad, TPump, TPT, FastExport, DataMover.
- Data Modeling: Erwin, Visio
- ETL tools: Informatica, Hadoop (HDFS, MapReduce, Hive, Sqoop, Pig).
- Programming languages: C, C++, Shell scripting (Korn shell, C shell), SQL.
- Reporting Tools: MicroStrategy, Tableau
- Scheduling tools: ESP, Control-M, Autosys, One Automation.
PROFESSIONAL EXPERIENCE
Confidential, FL
Senior Teradata/ ETL developer
Responsibilities:
- The purpose of the project is to load data into different layers (Source Stage, Stage Journal, Journal Base, and Base Semantic) using an existing TPT/BTEQ framework.
- Understand the business requirement and the current system implementation of that functionality, and perform system analysis of the proposed changes.
- Responsible for development, support, and maintenance of ETL processes using Informatica Power Center client tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager) along with Teradata load utilities such as TPT, FastLoad, and MultiLoad.
- Experience in integrating heterogeneous data sources such as Oracle, SQL Server, and flat files (fixed-width and delimited) into the staging area using Informatica mappings, workflows, and session objects.
- Extensively worked with various lookup caches: Static, Dynamic, Persistent, and Shared.
- Extensively worked on several types of transformations, such as Expression, Filter, Aggregator, Lookup, Stored Procedure, Sequence Generator, and Joiner.
- Created parameter files and Unix Scripts for operations like file validation, file processing, report generation and Data reconciliation check between the source and target.
- Created wrapper shell scripts to connect to Informatica repositories and the Autosys server for job execution and report generation.
- Implemented NDM, SFTP, and FTP services to retrieve flat files from external sources; created shell scripts to connect to external servers using NDM, SFTP, and FTP.
- Increased Code Reusability using Shortcuts, Mapplets, Reusable Transformations and Reusable Sessions to reduce redundancy in code.
- Create/modify Teradata Tables, Views and Stored Procedures. Create Semantic layer views for business users.
- Prepare Teradata stored procedures to mask data when it is copied from the production environment to lower environments for development.
- Prepare a reusable macro to switch the view layer to maintain referential integrity between parent and child profiles during data processing.
- Prepare Teradata BTEQ Scripts for Change Data Capture (CDC) between Historical Data and Current Data.
- Coding and unit testing of Teradata SQL, TPT (Load, Update, Stream, SQL Selector, SQL Inserter, ODBC, Export, and Data Connector operators), FastLoad, MultiLoad, FastExport, and BTEQ scripts.
- Create Autosys jobs invoking UNIX scripts to run TPT scripts using the command line, a local job variable file, and a global job variable file.
- Apply the Teradata Multi-Value Compression feature to save space in the Teradata target database.
- Create XML scripts to send reconciliation reports and daily job success/failure status to business users.
- Create Join Indexes to improve the performance of Teradata queries used by the user-interface transaction-monitoring system.
- Interact with the business in defect resolution and user support in terms of any design/requirement questions.
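The Autosys/TPT wrapper pattern described above can be sketched as follows, with hypothetical job names, paths, and variable-file contents; the `tbuild` invocation is composed from a global and a local job variable file but only echoed here, since it requires a Teradata installation to run.

```shell
#!/bin/sh
# Sketch of an Autosys-invoked wrapper: build a local TPT job variable
# file and compose the tbuild command line. All names are hypothetical.
JOB_NAME=ld_orders_stage
LOCAL_VARS=${JOB_NAME}.jobvars
GLOBAL_VARS=/etl/tpt/global.jobvars   # hypothetical shared variable file

cat > "$LOCAL_VARS" <<'EOF'
SourceFileName  = '/etl/inbound/orders.dat',
TargetTable     = 'stg_db.orders_stg',
LoadInstances   = 2
EOF

# Compose (but do not execute) the tbuild invocation; the global file is
# read first, then the job-specific file overrides it.
CMD="tbuild -f ${JOB_NAME}.tpt -v $GLOBAL_VARS -v $LOCAL_VARS -j $JOB_NAME"
echo "$CMD"
```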
Environment: Teradata, Informatica, Teradata Utilities (TPT, FastLoad, MultiLoad, BTEQ export and import), Oracle, UNIX shell scripting, ESP, OOZIE.
Confidential, Des Moines, IA
Senior Teradata/ETL Developer
Responsibilities:
- Working as onsite coordinator to coordinate requirements between business teams and offshore teams.
- Involved in preparation of Functional and Technical Specification Documents.
- Involved in design, development and common error handling solution for the enterprise.
- Extensively worked on Informatica for creating multiple interfaces to satisfy the business requirements.
- Extensively worked on loading data into Teradata from legacy systems and flat files using complex MultiLoad and BTEQ scripts; created and modified databases, performed capacity planning, allocated space, and granted rights for all objects within databases.
- Extensively used Pushdown Optimization (PDO) against Teradata to gain maximum performance with Full PDO.
- Worked on FastLoad, MultiLoad, TPump, and FastExport loading techniques through Informatica into Teradata.
- Expert developer skills in Teradata RDBMS: initial Teradata DBMS environment setup, development and production DBA support, and use of the FastLoad, MultiLoad, Teradata SQL, and BTEQ utilities.
- Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
- Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
- Extensively handled support of hundreds of interfaces from various releases.
- Coordinated testing in multiple environments with multiple teams.
- Extensively worked with Informatica Tools like Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Repository server and Informatica server to load data from flat files, legacy data.
- Worked on UNIX scripting for data preprocessing before loading them to DB.
- Created Teradata BTEQ scripts and MultiLoad scripts for ETL loads and automation processes.
- Worked on Oracle SQL queries for data validations, verifications, and testing purposes.
- Coordinated between Dev, QA, and production migration teams.
- Resolved critical production issues on numerous occasions while also handling production support.
- Created and ran sessions using Informatica power center workflow manager.
- Created Informatica mappings with PL/SQL Procedures / Functions/ Triggers to build business rules to load data.
- Used PMCMD command to start and run the workflow from the UNIX environment.
- Involved in regular monitoring, exception reporting as part of production support project.
- Involved in code review of the interfaces developed and defining best practices.
- Developed various documentation, including onboarding, troubleshooting, and migration guides.
- Involved in creation and deployment of automated tools which have significantly reduced turnaround time for our support process.
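The `pmcmd` usage mentioned above can be sketched as a small UNIX wrapper; the domain, service, folder, and workflow names here are hypothetical, and the command is composed and recorded rather than executed so the sketch runs without an Informatica installation.

```shell
#!/bin/sh
# Sketch of starting an Informatica workflow from UNIX via pmcmd.
# All Informatica object names below are hypothetical placeholders.
INFA_SERVICE=IS_ETL
INFA_DOMAIN=Domain_ETL
INFA_FOLDER=FIN_DW
WORKFLOW=wf_load_claims

# -uv/-pv tell pmcmd to read the user and password from environment
# variables, keeping credentials out of the script itself.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PWD -f $INFA_FOLDER -wait $WORKFLOW"

echo "$CMD" > pmcmd_start.cmd   # recorded for the job log
cat pmcmd_start.cmd
```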
Environment: Informatica Power Center, Teradata Utilities (TPT, FastLoad, MultiLoad, BTEQ export and import), Teradata SQL Assistant, One Automation, Control-M, UNIX Shell Scripting.
Confidential
Informatica / Teradata Developer
Responsibilities:
- Participated in requirement gathering, Business Analysis, user meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.
- Created ER diagram of the data model using Erwin data modeler to transform business rules into logical model.
- Created mappings to load incremental data into Teradata; also involved in production support.
- Imported data from flat files using Teradata load utilities such as FastLoad, MultiLoad, and TPump.
- Used BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business Requirements to Teradata RDBMS.
- Involved in loading data into Teradata from legacy systems and flat files using complex MultiLoad and BTEQ scripts; created and modified databases, performed capacity planning, allocated space, and granted rights for all objects within databases.
- Creating BTEQ (Basic Teradata Query) scripts to generate Keys.
- Extensively used MultiLoad, BTEQ, FastExport, and FastLoad to design and develop data-flow paths for loading, transforming, and maintaining the data warehouse.
- Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to target tables.
- Worked with Informatica Power Center Mapping Designer, Workflow Manager, Workflow Monitor and Admin Console.
- Used Source Qualifier, Aggregator, Lookup, Expression, Stored Procedure Transformations.
- Extensively worked with Informatica Designer. Designed and developed Informatica mappings for data loads and data cleansing.
- Tuned Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Created Sessions and Batches to run Workflows.
- Responsible for writing unit test cases and performing the unit test.
- Extensively used Debugger to identify data transfer problems in mappings.
- Supported user acceptance testing.
- Ensuring that all production changes are processed according to release management policies and procedures.
- Ensuring that application changes are fully documented and supportable.
- Proactively identifying opportunities for change within the production environment.
- Created shell scripts for file archiving and emailing to end users.
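The file-archiving and notification scripting mentioned above can be sketched as below; directory names, the sample file, and the mail recipient are hypothetical, and the mail step is shown as a comment since it depends on a site-specific mail setup.

```shell
#!/bin/sh
# Sketch of a nightly archive step: compress processed files into a
# date-stamped tarball. All paths below are hypothetical.
SRC_DIR=./processed
ARCHIVE_DIR=./archive
STAMP=$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"
touch "$SRC_DIR/claims_feed.dat"          # stand-in for a processed file

tar -czf "$ARCHIVE_DIR/etl_files_$STAMP.tar.gz" -C "$SRC_DIR" .
echo "Archived $(ls "$SRC_DIR" | wc -l) file(s) to $ARCHIVE_DIR"

# In production, an email notification would follow, e.g.:
#   mailx -s "ETL archive $STAMP complete" ops_team@example.com < summary.txt
```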
Environment: Informatica Power Center, Informatica Multi Domain, Oracle, PL/SQL, Teradata, DB2, UNIX Shell Scripting.
Confidential
Informatica Developer
Responsibilities:
- Interacting with business users for requirement gathering, analyzing the business requirements, and translating them into functional and technical specifications.
- Involved in taking requirements from business users and project planning.
- Understanding existing business model and customer requirements.
- Analyzing issues, if any, in relational databases and flat files, and communicating with the concerned person to resolve them.
- Worked with Informatica Power Center Mapping Designer, Workflow Manager, Workflow Monitor and Admin Console
- Requirement analysis and generation of the Business Requirement Specification; involved in data extraction from Oracle and flat files using Informatica.
- Extensively used Debugger to identify data transfer problems in mappings.
- Designed and developed Expression, Lookup, Filter, Router, and Update Strategy transformation rules to generate consolidated data identified by dimensions using the Informatica (Power Center) ETL tool.
- Created mappings to load incremental data into Teradata; also involved in production support.
- Created sessions, database connections and batches using Informatica Server Manager.
- Scheduled and monitored transformation process using Informatica Server Manager.
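The incremental-load mappings above typically rely on a watermark kept outside the database; a minimal sketch of that control-file pattern follows, with hypothetical file names, and with the actual Informatica/BTEQ extract step left as a comment.

```shell
#!/bin/sh
# Sketch of an incremental-load watermark: the last successful extract
# timestamp is kept in a small control file and fed to the next run.
CTL_FILE=last_run.ctl

# Seed the control file on the very first run
[ -f "$CTL_FILE" ] || echo "1900-01-01 00:00:00" > "$CTL_FILE"

LAST_RUN=$(cat "$CTL_FILE")
echo "Extract rows changed after: $LAST_RUN"
# ... invoke the Informatica session or BTEQ extract here,
#     filtering on a change timestamp > "$LAST_RUN" ...

# On success, advance the watermark for the next run
date '+%Y-%m-%d %H:%M:%S' > "$CTL_FILE"
echo "Watermark advanced to: $(cat "$CTL_FILE")"
```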
Environment: Informatica, Oracle, Teradata, Teradata SQL Assistant, PL/SQL, Teradata Utilities (BTEQ, MultiLoad, FastLoad, FastExport, and TPT), UNIX.