Sr. ETL / Teradata Developer Resume
Chicago, IL
SUMMARY:
- 8+ years of IT experience and technical proficiency in Data Warehousing, covering business requirements analysis, application design, data modeling, development, testing, and documentation.
- Proficient in Teradata database design, implementation, and maintenance, mainly in large-scale Data Warehouse environments.
- Hands-on experience in Teradata RDBMS using FastLoad, MultiLoad, TPump, TPT, FastExport, Teradata SQL Assistant, Teradata Administrator, and BTEQ.
- Expert in tuning Complex Teradata SQL Queries including Joins, correlated sub queries and Scalar Sub queries.
- Development, implementation, and testing of Data Warehouse and database applications for Healthcare, Telecommunications, Retail, Commercial, and Financial Services using Informatica, Oracle PL/SQL, Teradata and MySQL.
- Strong understanding of the Relational database concepts.
- Excellent understanding of concepts like star schema and snowflake schema using fact and dimension tables, relational databases (Oracle, Teradata) and Client/Server applications.
- Skilled in Tableau Desktop for Data Visualization, Reporting and Analysis, Cross Map, Scatter Plots, Geographic Map, Pie charts and Bar charts.
- Proficient in designing and developing ETL objects using Informatica PowerCenter with transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Router, Rank, and Normalizer.
- Hands on experience in monitoring and managing varying mixed workload of an active data warehouse using various tools like Teradata Workload Analyzer, Teradata Dynamic Workload Manager and Teradata Manager.
- Involved in Data Migration projects from DB2 and Oracle to Teradata. Created automated scripts to do the migration using UNIX shell scripting, Oracle/TD SQL.
- Expert in Coding Teradata SQL, Teradata Stored Procedures, Macros and Triggers.
- Involved in full Life Cycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support.
- Collected statistics every week on the tables to improve performance.
- Expertise in Query Analyzing, performance tuning and testing.
- Proficiency in Microsoft Business Intelligence technologies like MS SQL Server Integration Services (SSIS) and MS SQL Server Reporting Services (SSRS).
- Experience in writing UNIX Korn shell scripts to support and automate the ETL process.
- Technical expertise in ETL methodologies, Informatica PowerCenter, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and Power Exchange.
- Expertise in testing applications and creating test cases for Unit Testing and Integration Testing, verifying that data extracted from the source is loaded into the target correctly and in the right format.
- Experience working with clients of all sizes in the Financial, Healthcare, Retail and Manufacturing industries.
- Very good understanding of Reporting tools like Cognos, Tableau and Business Objects.
- Strong analytical, interpersonal, and communication skills.
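As an illustration of the weekly statistics-collection routine mentioned above, a job of that kind might look like the sketch below. All database, table, and column names (edw.customer, edw.sales_fact) are hypothetical placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Sketch of a weekly statistics-refresh job; object names are hypothetical.
# The script generates a SQL file that, on a host with the Teradata client
# tools installed, would be submitted through BTEQ.
cat > collect_stats.sql <<'EOF'
COLLECT STATISTICS ON edw.customer COLUMN (customer_id);
COLLECT STATISTICS ON edw.sales_fact INDEX (sale_date);
EOF
# bteq < collect_stats.sql   # would run where Teradata utilities exist
echo "collect_stats.sql generated"
```

Fresh statistics on join and index columns give the optimizer accurate demographics, which is what makes EXPLAIN-based query tuning effective.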
TECHNICAL SKILLS:
Operating Systems: Sun Solaris 2.6/8/9/5.1/5.2, RedHat Linux RHEL 2.1/3.x/4.x, Windows 10/8/7/XP/Vista.
Programming Languages: SQL, PL/SQL, C, C++, C#, JAVA.
Databases: Teradata (V2R5/V2R6/12/13/13.10/14/14.10), MS SQL Server (2005), Oracle (9i/10g/11g), Netezza.
Software/Tools/Applications: TASM, BTEQ, MultiLoad, FastLoad, FastExport, TPump, TPT, Teradata Manager, SQL Assistant, Teradata Administrator, TSET, Index Wizard, Statistics Wizard, Viewpoint, TOAD 7.6/7.5, Real Application Clusters (RAC), STATSPACK, ERWIN, EMC Storage, Trillium 7.6/6.5, Informatica PowerCenter (9.x/8.x/7.x), SSIS, SSRS, Data Mover.
Reporting Tools: Tableau, Cognos, Business Objects
Process/Methodologies: Waterfall, Agile Methodology
Scripting Languages: UNIX Shell Scripting, JavaScript.
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Sr. ETL / Teradata Developer
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing, data transformation, data loading, SQL, and performance tuning.
- Expertise in writing scripts for Data Extraction, Transformation and Loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
- Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
- Performed query optimization with the help of EXPLAIN plans, collected statistics, and Primary and Secondary Indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration of Teradata scripts and shell scripts on the UNIX box.
- Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Developing new mappings and modifying existing ones for business-requirement enhancements, loading data into staging tables and then into target tables in the EDW. Also created mapplets for reuse across different mappings.
- Working on different tasks in workflows such as sessions, event raise, event wait, e-mail, command, and worklets, as well as scheduling of the workflow.
- Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
- Running and monitoring daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
- Worked with the Teradata Campaign Management tool.
- Investigating failed jobs and writing SQL to debug data load issues in Production.
- Interacting with the Source Team and Business to get the Validation of the data.
- Writing SQL Scripts to extract the data from Database and for Testing Purposes.
- Supported the code after postproduction deployment.
- Implemented flexible view of Dashboard by using Tableau.
- Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting to live/dynamic and static datasets.
- Involved in Transferring the Processed files from mainframe to target system.
- Familiar with Agile software methodologies (scrum).
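The volatile-table technique mentioned in the tuning bullet above can be sketched as the BTEQ script below. This is illustrative only; the table, column, and logon names are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch: break a complex query into steps with a volatile table, of the
# kind described above. All object names are hypothetical. The generated
# script would be submitted with BTEQ on a host with Teradata client tools.
cat > monthly_rollup.btq <<'EOF'
.LOGON tdprod/etl_user,<password>;

/* Step 1: materialize the expensive filter once */
CREATE VOLATILE TABLE vt_open_orders AS (
  SELECT order_id, customer_id, order_amt
  FROM   edw.orders
  WHERE  order_status = 'OPEN'
) WITH DATA PRIMARY INDEX (order_id)
  ON COMMIT PRESERVE ROWS;

/* Step 2: the remaining join is now small and simple */
SELECT c.region, SUM(v.order_amt)
FROM   vt_open_orders v
JOIN   edw.customer c ON c.customer_id = v.customer_id
GROUP  BY c.region;

.LOGOFF;
EOF
# bteq < monthly_rollup.btq   # submitted this way on the UNIX box
echo "monthly_rollup.btq generated"
```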
Environment: Teradata v14, UNIX, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, Shell Scripts, Informatica PowerCenter 9.6, Tableau, Agile.
Confidential, Long Beach, CA
Sr. ETL / Teradata Developer
Responsibilities:
- Involved in requirement gathering, business analysis, design, development, testing, and implementation of business rules.
- Design, development and documentation of the ETL (Extract, Transform and Load) strategy to integrate different interfaces with Exigent.
- Work with the data modelers, architects and Cognos developers to create a consumption layer in Teradata and Cognos.
- Performance optimization on the data and consumption layer in Teradata.
- Create and Maintain Teradata Tables, Views, Macros, Triggers and Stored Procedures.
- Coding using Teradata analytical functions; writing UNIX scripts to validate, format, and execute SQL in the UNIX environment.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Involved in Performance tuning for the long running queries.
- Developed UNIX shell scripts to run batch jobs in production.
- Used Informatica Designer exclusively to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, etc.
- Apply broad in-depth business and technical knowledge to resolve production support and sustainment activities.
- Developed shell scripts to perform tasks such as starting Informatica workflows from the script, used looping techniques, and named output files with the date they were generated.
- Extensive work experience in design and development using UNIX shell scripting and SQL.
- Effectively used Persistent caches, Informatica Lookup Overrides and other optimizing techniques to improve the Session performance considerably.
- Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.
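Workflow-launch scripts of the kind described above typically combine a date-stamped file name with an Informatica pmcmd call. A hedged sketch follows; the domain, integration service, folder, and workflow names are hypothetical placeholders, not the actual project's objects.

```shell
#!/bin/sh
# Sketch: name the output file with the generation date, then start the
# Informatica workflow. All Informatica object names are hypothetical.
RUN_DATE=$(date +%Y%m%d)
OUT_FILE="claims_extract_${RUN_DATE}.dat"
echo "$OUT_FILE" > run_target.txt   # record the date-stamped target name

# On a host with the Informatica client installed, the workflow would be
# started with pmcmd, roughly like this (password passed via environment):
# pmcmd startworkflow -sv IntSvc -d Domain_ETL -u etl_user -p "$PMPASS" \
#       -f CLAIMS -wait wf_claims_daily
echo "run target recorded: $OUT_FILE"
```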
Environment: Teradata v13/v14, Cognos, UNIX Shell Scripts, Informatica, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, Agile.
Confidential, NYC, NY
Sr. ETL Developer
Responsibilities:
- Migrated data using Informatica PowerCenter Designer with transformations such as Source Qualifier, Expression, Filter, Router, Joiner, Sequence Generator, Update Strategy, Lookup, Sorter, Aggregator, Normalizer, XML Source Qualifier, and Stored Procedure for extraction, transformation, and loading of data.
- Worked on Netezza as the relational source database, dimensionally modeled, to generate the existing BO reports.
- Assisted in support functions and maintenance of Netezza data warehousing applications as per program requirements.
- Created complex jobs and implemented Slowly Changing Dimensions (Type 1, Type 2 and Type 3) for data loads.
- Extensively used Erwin for Logical and Physical data modeling and designed Star Schemas.
- Extensively Used Environment SQL commands in workflows prior to extracting the data in the ETL tool.
- Worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from the Netezza source database.
- Created databases, tables, triggers, macros, views, stored procedures, functions, Packages, joins and hash indexes in Teradata database.
- Expertise using ERWIN for Data modeling and Database design.
- Experience working with Informatica Power Exchange for Netezza and migrating the data using Talend Studio into Teradata.
- Hands-on Business Intelligence experience using Business Objects.
- Migrated reports from Business Objects to Tableau.
- Experience in Informatica providing Data warehouse solutions using Power Center and data modeling with ERWIN tool.
- Used Erwin tool for physical and logical data modeling.
- Used Informatica power center to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
- Used Talend extensively for profiling the data and assessing the data quality of the source data.
- Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Migrated Informatica folders, mappings from one environment to another and in the process of change control using Talend and yaml files.
- Implemented Flexible view of dashboard by using Tableau.
- Created jobs in Talend Studio in the development environment, tested them in QA, and moved them into the Production environment.
- Created and used volatile tables for testing purposes.
- Involved in Code Reviews & Unit Testing for the ETL code developed.
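The Type 2 slowly changing dimension loads mentioned above generally follow a close-then-insert pattern. The sketch below writes out illustrative SQL under hypothetical table and column names; it is not the actual project's code.

```shell
#!/bin/sh
# Sketch of the SCD Type 2 pattern described above; dim_customer and
# stg_customer are hypothetical. The file would be run via the ETL tool
# or a SQL client against the warehouse.
cat > scd2_customer.sql <<'EOF'
-- Step 1: expire the current row for customers whose attributes changed
UPDATE dim_customer
SET    eff_end_dt = CURRENT_DATE, current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
               AND    s.addr <> dim_customer.addr);

-- Step 2: insert a fresh current row for each changed or new customer
INSERT INTO dim_customer
  (customer_id, addr, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
EOF
echo "scd2_customer.sql generated"
```

Because step 1 flips changed rows to 'N' first, step 2's NOT EXISTS test picks up both brand-new customers and those whose history was just closed.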
Environment: Teradata v13/v14, Netezza, ERWIN, Informatica, Tableau, Oracle, Informatica Workflow Manager, BTEQ, FastExport, FastLoad, MultiLoad.
Confidential
Sr. ETL Developer
Responsibilities:
- Designed the ETL processes using Informatica to load data from Oracle, SQL Server, flat files (fixed width), XML files, and Excel files into the target Teradata database.
- Responsible for developing, supporting, and maintaining the ETL (Extract, Transform and Load) processes using Informatica Power Center 7.x and 8.x.
- Actively involved in data gathering, recognizing, and confirming the data sources/elements.
- Extensively used ETL to load data from flat files into the staging area and then into the Data Warehouse.
- Involved heavily in writing complex SQL queries based on the given requirements, using volatile tables, temporary tables, and derived tables to break complex queries into simpler ones.
- Implemented various Teradata-specific features like selection of PI, USI/NUSI, PPI, and Compression based on requirements.
- Implemented an access rights mechanism using restricted views, macros, and roles.
- Provided ongoing Teradata system performance management using Teradata Manager and PMON.
- Reviewed and monitored priority scheduler and TDQM settings.
- Gathered information from different data warehouse systems and loaded it into the warehouse using FastLoad, FastExport, MultiLoad, BTEQ, and UNIX shell scripts.
- Involved in unit testing, systems testing, integrated testing and user acceptance testing.
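The restricted-view-and-role access mechanism described above can be sketched as the DDL below, written to a file that would be applied through BTEQ or SQL Assistant. All database, view, role, and user names are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch of the view/role access pattern described above; every object
# name here is a hypothetical placeholder.
cat > access_layer.sql <<'EOF'
-- Restricted view: expose only non-sensitive columns, with row filtering
REPLACE VIEW sec_vw.customer_v AS
SELECT customer_id, region, status
FROM   edw.customer
WHERE  status <> 'DELETED';

-- Role-based grants: report users read through the view, never the base table
CREATE ROLE report_reader;
GRANT SELECT ON sec_vw.customer_v TO report_reader;
GRANT report_reader TO rpt_user01;
EOF
echo "access_layer.sql generated"
```

Granting on the view rather than the base table means column or row restrictions can be tightened later without revisiting every user's grants.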
Environment: Teradata v13, ETL tool, Informatica Powercenter 7.x/8.x, SQL Server, Oracle, Teradata Manager, BTEQ, Fast Export, Fast Load, Multi Load, Shell Scripting.
Confidential
ETL Developer
Responsibilities:
- Involved in analysis and understanding of end user requirements and business rules.
- Involved in calculating project estimates.
- Development of scripts for loading the data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Identify all potential issues during the requirement understanding phase and to describe actions to address those issues.
- Performed tuning and optimization of complex SQL queries using Teradata Explain
- Created a BTEQ script for pre-population of the work tables prior to the main load process
- Creation of HLD and LLD design of the system.
- Created the study and TDD for the new requirements.
- Developed MLOAD scripts to load data from load-ready files into the Teradata Warehouse.
- Involved heavily in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
- Designed the technical solution for business requirements.
- Loading data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad), and working with loader logs.
- Created and automated the loading process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.
- Recommend solutions for data issues.
- Streamlined the Teradata scripts and shell scripts migration process on the UNIX box
- Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system
- Fine-tune the existing process to achieve increased performance and reduced load times for faster user query performance.
- Creating and maintaining source-target mapping documents for ETL development team.
- Developed unit test plans and was involved in system testing of customer information. The main scope of this project was an Enterprise Data Warehouse (EDW) solution supporting ad-hoc reporting and analysis.
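An MLOAD script for a load-ready file, as described above, typically pairs a layout with a DML label. The sketch below generates such a control script; the log table, target table, field names, and logon are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch of a MultiLoad control script for a load-ready file, of the kind
# described above; all object names are hypothetical. On a host with the
# Teradata load utilities the generated file would be fed to mload.
cat > load_ready.ml <<'EOF'
.LOGTABLE work_db.lg_orders;
.LOGON tdprod/etl_user,<password>;
.BEGIN IMPORT MLOAD TABLES edw.orders;
.LAYOUT order_layout;
  .FIELD order_id   * VARCHAR(12);
  .FIELD order_amt  * VARCHAR(18);
.DML LABEL ins_orders;
  INSERT INTO edw.orders (order_id, order_amt)
  VALUES (:order_id, :order_amt);
.IMPORT INFILE orders_ready.dat
  LAYOUT order_layout FORMAT VARTEXT '|'
  APPLY ins_orders;
.END MLOAD;
.LOGOFF;
EOF
# mload < load_ready.ml   # run where Teradata load utilities are installed
echo "load_ready.ml generated"
```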
Environment: Teradata v13, ETL tools, Teradata SQL Assistant, BTEQ, Fast Export, Fast Load, Multi Load, Shell Scripting.