Teradata/ETL Developer Resume
Hillsboro, Oregon
SUMMARY
- Six years of experience in the design, development, implementation, testing, maintenance, and administration of Teradata data warehouses, Informatica, and Oracle, with strong knowledge of big data technologies such as the Hadoop ecosystem.
- Experience in using Teradata tools and utilities like BTEQ, SQL Assistant, Teradata Manager, Teradata Workload Analyzer, Teradata Visual Explain, Teradata Index Wizard and Informatica Workflow design.
- Designed, developed, tested, and supported the Extract, Transform, and Load (ETL) processes necessary to load and validate the data warehouse.
- Strong knowledge of Teradata database architecture, database features, and Teradata tools.
- Experience with the ETL tool Informatica PowerCenter in OLAP and OLTP environments for healthcare clients.
- Strong knowledge of data modeling for OLTP relational databases and data warehouse applications using Ralph Kimball and Bill Inmon design principles, including fact and dimension tables, Slowly Changing Dimensions (SCD), Change Data Capture (CDC), and dimensional modeling with star and snowflake schemas.
- Strong hands-on experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, FastExport, and TPT (Teradata Parallel Transporter).
- Expertise in creating databases, users, tables, views, functions, packages, joins, and hash indexes in the Teradata database.
- Used Erwin for logical and physical database modeling of the warehouse; responsible for database schema creation based on the logical models.
- Working knowledge of data warehouse techniques, including the ETL process, dimensional star and snowflake schemas, fact and dimension tables, OLTP, and OLAP.
- Good knowledge of Hadoop ecosystem components such as HDFS, MapReduce, Pig, Hive, and HBase.
- Developed mappings in Informatica to load data from various sources into the data warehouse using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
- Extensive knowledge of Teradata SQL Assistant. Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse and from the data warehouse to data marts for specific reporting requirements. Tuned existing BTEQ scripts to enhance performance.
- Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and de-normalization.
- Good knowledge of the Amazon cloud.
- Experience with the software development life cycle (SDLC) and with Waterfall and Agile project management methodologies.
- Interacted extensively with business users to gather requirements, design tables, standardize interfaces for receiving data from multiple operational sources, and define strategies for loading the staging area and data marts. Handled SCD Type 1/Type 2/Type 3 loads.
- Generated SQL and PL/SQL scripts to install, create, and drop database objects including tables, views, primary keys, indexes, and constraints.
- Generated reports and created dashboards in the Tableau reporting tool.
- Tuned various queries by collecting statistics (COLLECT STATISTICS) on columns used in WHERE and JOIN expressions.
- Involved in troubleshooting production issues and providing production support.
- Extensive ability to debug, troubleshoot, and solve complex PL/SQL issues, and to reverse engineer existing PL/SQL code into functional requirements.
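The statistics-collection tuning mentioned above can be sketched as a small helper. This is a minimal illustration, not code from any project: the table and column names are hypothetical, and in practice the generated statements would be run through BTEQ or SQL Assistant.

```python
# Hypothetical helper: generate Teradata COLLECT STATISTICS statements
# for the columns used in WHERE and JOIN predicates.
# Table and column names below are illustrative only.

def collect_stats_statements(table, columns):
    """Build one COLLECT STATISTICS statement per column on `table`."""
    return [
        f"COLLECT STATISTICS ON {table} COLUMN ({col});"
        for col in columns
    ]

if __name__ == "__main__":
    # Example: join/filter columns of a hypothetical fact table.
    for stmt in collect_stats_statements("DW.SALES_FACT", ["ORDER_ID", "CUST_ID"]):
        print(stmt)
```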
TECHNICAL SKILLS
Programming Languages: SQL, PL/SQL, shell scripting, Python
Data warehouse and databases: Teradata, Oracle, Hadoop ecosystem (HDFS, MapReduce, Pig, Hive)
Teradata Load and Unload Utilities: FLOAD, FASTEXPORT, MLOAD, TPUMP, TPT
Query management tools: BTEQ, SQL Assistant
ETL Tools: Informatica PowerCenter, Informatica PowerExchange
Reporting Tool: Tableau
Data Modelling Tool: Erwin
Operating Systems: Linux and Windows
PROFESSIONAL EXPERIENCE
Confidential, Hillsboro, Oregon
Teradata/ETL developer
Responsibilities:
- Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
- Extracted data from multiple sources through Informatica into Oracle and then into the Teradata DWH.
- Used Informatica PowerCenter to extract and load data for system-to-system transfers.
- Granted privileges to roles and added users to roles based on requirements.
- Prepared ETL specifications, mapping documents, and ETL framework specifications.
- Wrote complex SQL using joins, subqueries, and correlated subqueries; expertise in SQL queries for cross-verification of data.
- Extracted data from Oracle and used Teradata for data warehousing.
- Implemented slowly changing dimension methodology to access the full history of data.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
- Prepared functional and technical specifications and design documents.
- Responsible for data profiling, data cleansing, and data conformance.
- Proficient in understanding Teradata EXPLAIN plans, the COLLECT STATS option, secondary indexes (USI, NUSI), partitioned primary indexes (PPI), and volatile, global temporary, and derived tables.
- Used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager.
- Developed various complex mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Handled jobs Recovery, Failed Jobs and alerting mechanisms while Batch jobs running.
- Developed and created new database objects including tables, views, indexes, stored procedures, and functions.
- Created PL/SQL packages to load data from the Oracle staging area to the Oracle outbound area.
- Optimized and tuned SQL queries used in the Source Qualifier of certain mappings to eliminate full table scans.
- Compared Teradata objects in QA and PROD using a Python script.
- Performed unit testing, database testing, and data verification and validation of the developed code, comparing results between the Oracle and Teradata systems.
- Identified performance bottlenecks in production processes and key places where SQL could be tuned to improve the overall performance of the production process.
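The QA-vs-PROD object comparison described above can be sketched roughly as follows. This is an assumption-laden illustration: in practice the two mappings would be populated by querying the data dictionary on each system, whereas here they are plain dicts of object name to DDL text.

```python
# Sketch of a QA-vs-PROD Teradata object comparison.
# Inputs map object name -> DDL text; real runs would fetch these
# from each environment's data dictionary (names here are illustrative).

def compare_objects(qa, prod):
    """Report objects missing from either environment and objects
    whose definitions differ between QA and PROD."""
    missing_in_prod = sorted(set(qa) - set(prod))
    missing_in_qa = sorted(set(prod) - set(qa))
    changed = sorted(
        name for name in set(qa) & set(prod) if qa[name] != prod[name]
    )
    return {
        "missing_in_prod": missing_in_prod,
        "missing_in_qa": missing_in_qa,
        "changed": changed,
    }

if __name__ == "__main__":
    qa = {"CUST_DIM": "CREATE TABLE ... v2", "SALES_FACT": "CREATE TABLE ..."}
    prod = {"CUST_DIM": "CREATE TABLE ... v1", "ORDER_DIM": "CREATE TABLE ..."}
    print(compare_objects(qa, prod))
```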
Environment: Teradata 16.0, SQL Assistant, Oracle 12c, Informatica PowerCenter 9.6, Alteryx, UNIX, shell scripting, FastExport, Python, GitHub, VSS, Tableau, SAP, Autosys
Confidential
Teradata/ETL Developer
Responsibilities:
- Coordinated with business analysts, data architects, DMs, and users to understand business rules.
- Designed and developed mapping, transformation logic and processes in Informatica for implementing business rules and standardization of source data from multiple systems into the data warehouse.
- Involved in preparing Technical design documents and getting approvals from business team.
- Created MLOAD scripts to load data related to ECOM orders into staging tables.
- Developed scripts to load data into the base tables in the EDW, from source to staging, and from the staging area to target tables, using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Created stored procedures to load staging data into corresponding target tables.
- Involved in error handling, performance tuning of SQLs, testing of Stored Procedures.
- Strong knowledge of primary, secondary, partitioned primary (PPI), and join indexes.
- Involved in scheduling Teradata and UNIX objects to run the jobs on daily/weekly basis depending on business requirement.
- Worked on various tables to create Indexes to improve query performance. Involved in writing several queries and procedures.
- Wrote complex SQL queries involving multiple tables with joins, and generated queries to check the consistency of data in the tables and to update the tables per business requirements.
- Created SQL and BTEQ scripts in Teradata to load data into the EDW as per the business logic.
- Prepared Validation scripts to Validate data in target tables with source data.
- Involved in writing scripts to load data into the target data warehouse using BTEQ and MultiLoad.
- Involved in performance tuning through collecting statistics and creating join indexes in different scenarios according to requirements.
- Helped testing team to prepare test cases with various test scenarios.
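The target-vs-source validation scripts mentioned above can be sketched as a row-level comparison keyed by primary key. This is a minimal illustration with hard-coded sample rows; a real run would fetch both row sets via the Teradata and source-system drivers, and the column names used here are hypothetical.

```python
# Sketch of a source-vs-target data validation check: compare row sets
# keyed on a primary key and report keys that are missing from the
# target or whose row contents differ. Sample data is illustrative.

def validate(source_rows, target_rows, key):
    """Return (keys missing from target, keys with mismatched rows)."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

if __name__ == "__main__":
    source = [{"order_id": 1, "amt": 10}, {"order_id": 2, "amt": 20}]
    target = [{"order_id": 1, "amt": 10}, {"order_id": 2, "amt": 99}]
    print(validate(source, target, "order_id"))
```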
Environment: Teradata 15.0, SQL Assistant, BTEQ, FastLoad, MultiLoad, Oracle 12c, FastExport, Control-M, UNIX, Informatica PowerCenter, Tableau