Sr. Informatica/Sr. Teradata Developer Resume
Boston, MA
SUMMARY
- Teradata professional with 6+ years of experience, combining strong technical and problem-solving skills in Business Intelligence, ETL development with Informatica PowerCenter and Teradata database development.
- Worked with multiple Teradata versions, using Informatica PowerCenter as the ETL tool for extracting, transforming and loading data from various source inputs to various targets.
- Certified Teradata Consultant with experience in Teradata Physical implementation and Database Tuning.
- Worked extensively with Teradata utilities - FastLoad, MultiLoad, TPump and Teradata Parallel Transporter (TPT) - to load large volumes of data from flat files into the Teradata database.
- Extensively used FastExport to export data from Teradata tables.
- Wrote BTEQ scripts to invoke load utilities, transform data and run queries against the Teradata database.
- Extensive experience in Administration and Maintenance of Dev, Stage, prod and standby databases for DSS and Data Warehousing environments.
- Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
- Solid experience in designing and developing Extraction, Transformation and Loading (ETL) processes using Ab Initio.
- Experience with different database architectures, including Shared-Nothing and Shared-Everything, with a very good understanding of SMP and MPP architectures.
- Experience in working with various heterogeneous Source Systems like Mainframe OLTP, MySQL, Oracle, SQL Server, DB2, ERP, Flat files and Legacy systems.
- Expert in Coding Teradata SQL, Teradata Stored Procedures, Macros and Triggers.
- Expertise in Query Analyzing, performance tuning and testing.
- Hands on experience in monitoring and managing varying mixed workload of an active data warehouse using various tools like PMON, Teradata Workload Analyzer and Teradata Visual Explain.
- Extensively worked on Query tools like SQL Assistant, TOAD and PLSQL Developer.
- Good Knowledge in Logical and physical modeling using Erwin. Hands on experience in 3NF, Star/Snowflake schema design and De-normalization techniques.
- Proficient in converting logical data models to physical database designs in Data warehousing Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts and Data Analysis.
- Extensively used various Informatica PowerCenter and Data Quality transformations - source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, web services consumer, XML parser, address validator, comparison, consolidation, decision, parser, standardizer, match and merge - to perform various data loading and cleansing activities.
- Extensively used Autosys to schedule jobs, perform initial data loads and copy data between environments during initial environment setup.
- Experience in extracting source data from Mainframe OLTP systems by writing several COBOL and JCL scripts.
- Experience in writing UNIX shell and PERL scripts to support and automate the ETL process.
- Involved in Unit Testing, Integration Testing and preparing test cases.
- Provided 24/7 on-call production support and resolved database issues.
- Involved in full lifecycle of various projects, including requirement gathering, system designing, application development, enhancement, deployment, maintenance and support.
- Strong problem-solving, analytical, interpersonal and communication skills, with the ability to work both independently and in a team.
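
A BTEQ script of the kind described above - invoking a transform and querying against the Teradata database with basic error handling - typically follows this shape. This is a minimal sketch; the logon details, database and table names are placeholders, not taken from any actual project:

```sql
.LOGON tdprod/etl_user,password;       -- placeholder host/credentials

-- Surface SQL failures to the scheduler via the return code.
.SET ERROROUT STDOUT;

DATABASE edw_stg;                      -- assumed staging database

-- Transform staged rows and load them into an assumed target table.
INSERT INTO edw.sales_fact (sale_id, sale_dt, amount)
SELECT sale_id,
       CAST(sale_dt AS DATE FORMAT 'YYYY-MM-DD'),
       amount
FROM   edw_stg.sales_raw
WHERE  amount IS NOT NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;       -- nonzero exit on failure
.LOGOFF;
.QUIT 0;
```

A wrapper shell script would normally invoke this file with `bteq < script.btq` and check the exit status before triggering downstream jobs.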
TECHNICAL SKILLS
Databases: Teradata, Oracle, DB2, MS SQL Server.
DB Tools/Utilities: Teradata SQL Assistant 13.1, BTEQ, Fastload, Multiload, FastExport, TPump, Teradata Visual Explain, Teradata Administrator, PMON, SQL Loader, TOAD 8.0.
BI Tools: MicroStrategy reports.
Programming Languages: C, SQL, PL/SQL, UNIX and PERL Shell Scripting.
ETL Tools: Informatica
Data Modelling: Logical/Physical/Dimensional, Star/Snowflake, OLAP, ERWIN.
Scheduling Tools: UC4, Autosys, Crontab.
Version Control Tools: GIT, Clear Case.
Operating Systems: Sun Solaris, Linux, Windows, UNIX.
PROFESSIONAL EXPERIENCE
Confidential, Boston, MA
Sr. Informatica/Sr. Teradata Developer
Responsibilities:
- Involved in requirement gathering, business analysis, design and development, implementation of business rules.
- Developed scripts to load data into EDW tables, from source to staging and from staging to target tables.
- Developed the ETL load process and framework for the warehouse.
- Implemented logical and physical data models with star and snowflake schemas using Erwin in the data mart. Created source-to-target data mappings.
- Extensively used the Teradata utilities like BTEQ, Fast load, Multiload, TPump, DDL Commands and DML Commands (SQL).
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and statistics collection.
- Worked with product management, engineering, design, policy and senior executives to rapidly execute, learn and iterate.
- Worked hands-on with large quantities of data using SQL, Teradata and other data and statistical tools, as well as reporting tools such as Tableau.
- Wrote multithreading and synchronization scripts.
- Developed TPT scripts to load data from Load Ready Files to Teradata Warehouse.
- Used BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
- Used SQL to query the databases, pushing as much processing as possible into Teradata and applying query optimization techniques (EXPLAIN plans, statistics collection, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
- Monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on highly skewed tables.
- Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases and casting errors.
- Created UNIX scripts to find, create, and check file sizes and to change permissions on the files.
- Performed administrator duties migrating code from development and testing environments to production.
- Worked in Supporting Production environment for Informatica and Wherescape.
- Provided 24/7 production support for the Teradata ETL jobs on daily, weekly and monthly schedules.
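
The skew monitoring described above can be sketched as a query against the DBC data dictionary. The 'EDW' database name is an assumption for illustration; a skew percentage near zero means space is evenly spread across AMPs, while a high value flags a table whose Primary Index deserves review:

```sql
-- CurrentPerm in DBC.TableSizeV is reported per AMP, so comparing the
-- average per-AMP space to the maximum exposes uneven distribution.
SELECT DatabaseName,
       TableName,
       SUM(CurrentPerm)                                        AS TotalPerm,
       CAST(100 * (1 - AVG(CurrentPerm) / NULLIF(MAX(CurrentPerm), 0))
            AS DECIMAL(5,2))                                   AS SkewPct
FROM   DBC.TableSizeV
WHERE  DatabaseName = 'EDW'        -- assumed database name
GROUP  BY 1, 2
ORDER  BY SkewPct DESC;
```

Tables surfacing at the top of this list are the candidates to take to the data modeling team for a Primary Index change.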
Confidential, Atlanta, GA
ETL Developer/Teradata Developer
Responsibilities:
- Wrote clear requirements and bridged the gap between IT teams and business teams.
- Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
- Handled both incremental and migration data loads into Teradata.
- Experienced in SQL performance tuning and in writing complex SQL.
- Participated in data model (Logical/Physical) discussions with Data Modelers and creating both logical and physical data models.
- Investigated and resolved data issues across platforms and applications, including discrepancies of definition, format and function.
- Extensively used the Teradata utilities BTEQ, FastLoad, MultiLoad and TPump, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve business analysts.
- Wrote complex SQL queries based on given requirements, including complex Teradata joins, stored procedures and macros.
- Extensively used Teradata SET tables, MULTISET tables, global temporary tables and volatile tables for loading/unloading.
- Worked closely with CA7 schedulers to set up job streams through CA7 to run daily, weekly and monthly process jobs.
- Involved in creating UNIX Shell Wrappers to run the deployed Ab Initio scripts.
- Worked efficiently with Teradata Parallel Transporter and its generated code.
- Created a series of Teradata macros and stored procedures for various applications in Teradata SQL Assistant.
- Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
- Wrote SQL queries and produced reports from the data mart for UAT and end-user reporting.
- Used SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE and NULL handling.
- Responsible for troubleshooting, identifying and resolving data problems.
- Chose proper Primary Indexes, taking into consideration both planned access paths and even distribution of data across all available AMPs.
- Loaded and transferred large data volumes from different databases into Teradata using MultiLoad and FastLoad.
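
A Teradata macro of the kind created for business analysts above might look like the following sketch; the database, macro, table and column names are invented for illustration. A macro packages parameterized SQL that analysts can run from SQL Assistant with a single EXEC:

```sql
-- Parameters are referenced with a leading colon inside the macro body.
REPLACE MACRO edw.daily_order_summary (run_dt DATE) AS (
  SELECT region_cd,
         COUNT(*)       AS order_cnt,
         SUM(order_amt) AS total_amt
  FROM   edw.order_fact          -- assumed fact table
  WHERE  order_dt = :run_dt
  GROUP  BY region_cd
  ORDER  BY region_cd;
);
```

An analyst would then run it as `EXEC edw.daily_order_summary (DATE '2015-06-30');`, getting a consistent report without touching the underlying SQL.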
Confidential
Teradata /ETL Developer
Responsibilities:
- Created detailed design documents, source-to-target mapping documents, test plans and technical design documents, and implemented them per client requirements.
- Wrote clear requirements and bridged the gap between IT teams and business teams.
- Responsible for redesign, performance tuning and enhancement of the existing ETL process, and created new ETL processes for reporting requirements.
- Coordinated with users and reporting teams on development efforts and served as the subject matter expert on data warehouse and ETL processes.
- Worked with mappings using expressions, aggregators, filters, lookup, update strategy and stored procedures transformations.
- Worked across the full development life cycle, including design, ETL strategy, troubleshooting and reporting; identified facts and dimensions.
- Proficient in writing MultiLoad, FastLoad and TPump scripts in Windows, UNIX and mainframe environments.
- Expertise in writing Teradata procedures using BTEQ and SQL assistant.
- To implement the Type 2 process across multiple tables, created a dynamic procedure driven by a metadata layer that inserts into and updates the tables on the fly.
- Fine-tuned existing Teradata procedures, macros and queries to increase performance. Redesigned table structures with per-AMP data distribution in mind and reorganized the primary index scheme.
- Fine-tuned existing SQL queries for reports and ETL jobs to reduce processing time, and reduced the number of procedures by creating dynamic procedures.
- Worked extensively in the UNIX environment using shell scripts and wrapper scripts; responsible for writing the wrapper scripts that invoke the deployed Informatica mappings.
- Performed transformations at the staging area, such as cleansing data (handling missing elements, parsing into standard formats), combining data from multiple sources, de-duplicating data and assigning surrogate keys.
- Worked on developing various parameterized mappings in Designer.
- Wrote SQL queries for cross-verification of data.
- Involved in unit testing, system testing and debugging during the testing phase. Enhanced the test validation procedure by developing stored procedures using XML plans, and developed Perl scripts to automate the query plan comparison testing procedure.
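
The metadata-driven Type 2 procedure described above generates statements along these lines. This is a simplified, static sketch of the slowly-changing-dimension pattern; the table and column names (customer_dim, customer_stg, attr_hash) are assumed for illustration:

```sql
-- Step 1: expire the current dimension row when a tracked attribute
-- changed, using Teradata's joined-UPDATE syntax.
UPDATE tgt
FROM   edw.customer_dim tgt, stg.customer_stg src
SET    eff_end_dt  = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  tgt.customer_id = src.customer_id
  AND  tgt.current_flg = 'Y'
  AND  tgt.attr_hash  <> src.attr_hash;

-- Step 2: insert a new current-version row for new or changed keys
-- (changed keys no longer have a 'Y' row after step 1).
INSERT INTO edw.customer_dim
       (customer_id, cust_name, attr_hash, eff_start_dt, eff_end_dt, current_flg)
SELECT src.customer_id, src.cust_name, src.attr_hash,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.customer_stg src
LEFT JOIN edw.customer_dim tgt
       ON  tgt.customer_id = src.customer_id
       AND tgt.current_flg = 'Y'
WHERE  tgt.customer_id IS NULL;
```

In the actual metadata-driven version, the target table, key columns and tracked-attribute hash would be read from a metadata layer and the SQL assembled dynamically inside a stored procedure.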