Teradata/ETL Developer Resume
Richmond, VA
PROFESSIONAL SUMMARY:
- Around 7 years of IT experience in Data warehousing with emphasis on Business Requirements Analysis, Application Design, Development, coding, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems.
- Worked on Data Warehouse and Business Intelligence projects as part of an Informatica (ETL) team.
- Experience in the design and development of ETL methodology supporting data migration, data transformation, and processing in a corporate-wide ETL solution using Teradata 14.0/13.0/12.0.
- Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.
- Involved in all phases of the Systems Development Life Cycle (SDLC), from analysis and planning through development and deployment, using both Waterfall and Agile approaches.
- Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.
- Experience in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts, and scheduling tools.
- Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring capabilities.
- Practical understanding of data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Experience in UNIX shell scripting for processing large volumes of data from varied sources and loading into databases like Teradata and Vertica.
- Data and database migration, including mainframe-to-PC database conversions, with data mapping, retrieval, cleansing, consolidation, and reporting for client review.
- Experience in OLTP Modeling (2NF, 3NF) and OLAP Dimensional modeling (Star and Snow Flake) using Erwin (conceptual, logical and physical data models).
- Significant experience with the ETL (Extract, Transform, and Load) tool Informatica Power Center (9.5.1/9.x/8.x/7.x/6.x) in analyzing, designing, and developing ETL processes for data warehousing projects.
- Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformations, Informatica Repository Manager and Informatica Server Manager.
- Worked on technologies like Oracle, Netezza, and SQL Server.
- Designed and developed interfaces to load the data from multiple sources like Relational databases, flat files into oracle database.
- Knowledge of implementing quality logical and physical ETL designs in the MuleSoft framework, optimized to meet operational performance requirements.
- Hands on experience in debugging and performance tuning of sources, targets, mappings and sessions.
- Experience in integrating various data source definitions such as SQL Server, Oracle, Teradata, MySQL, flat files, XML, and XSDs.
- Experience with Teradata tools and utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump).
- Experience in T-SQL, PL/SQL, UNIX shell scripting, Perl scripting, C++, and C.
- Working knowledge of Java.
- Worked with Joiner, Sorter, Aggregator, Java, Update Strategy, Filter, and Router transformations.
- Profound knowledge of the Teradata database architecture and experience with Teradata unloading utilities such as FastExport.
- Strong skills in SQL, PL/SQL packages, functions, stored procedures, triggers and materialized views to implement business logic in oracle database.
- Experience with relational databases such as Oracle 8i/9i/10g, SQL SERVER 2005/2008.
- Worked with various SQL Editors such as TOAD, SQL Plus, and Query Analyzer.
- Experienced in UNIX shell scripting (KSH/Korn shell) and Perl.
- Experience in identifying and resolving ETL production root-cause issues.
- Experience in maintenance, enhancements, performance tuning of ETL code.
- Involved in unit testing and system testing to check whether data loads into the target are accurate.
- Good working experience in writing SQL and PL/SQL scripts including views and materialized views.
- Experience working on Informatica Scheduler for job scheduling.
- Strong analytical and problem solving skills.
- An excellent team member with strong interpersonal and communication skills, a high level of motivation, and the ability to work effectively both independently and as part of a team.
- Desire to learn new skills, technologies, and adapt to new information demands.
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.x, Informatica Data Quality 8.6, B2B Data Transformation, Informatica Power Exchange 9.x, Web Services
BI Tools: Business Objects XI R2 (Desktop Intelligence, WEBI, Designer)
Databases: Oracle 11g/10g/9i/8, MS Access, Teradata 15/14/13/12, Netezza, DB2, MS SQL Server 2000/2005/2008
Languages: SQL, PL/SQL, JAVA, UNIX
Data Modeling: Star Schema Modeling, Snow Flake Modeling, Erwin and Dimensional modeling
Operating systems: Windows 95/98/00/07/NT/XP, UNIX (Sun-Solaris, HP/UX), LINUX
Other Tools: Toad, SQL Developer, WinSCP, Putty, Clear Case, Build Forge, Control-M, WLM, Teradata SQL Assistant
Scripting: UNIX shell scripting, Perl, Batch Script, FTP
Process Models: Waterfall, Agile (Kanban board)
PROFESSIONAL EXPERIENCE:
Confidential, Richmond, VA
Teradata/ETL Developer
Responsibilities:
- Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
- Developed various graphs to process Contract, Group, Member and Pharmacy Claims data based on business requirements utilizing functionalities like Rollup, Lookup, Scan, etc.
- Extracted data from different sources such as MVS data sets, flat files (pipe-delimited or fixed-length), Excel spreadsheets, and databases.
- Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Managed all development and support efforts for the Data Integration/Data Warehouse team.
- Used Informatica Power Center 9.6.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
- Involved in migrating mappings from IDQ to Power Center.
- Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data.
- Developed and deployed ETL job workflow with reliable error/exception handling and rollback within the MuleSoft framework.
- Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Provided on-call support during releases of the product from lower-level to higher-level production environments.
- Followed Agile methodology, with repeated testing in each iteration.
- Worked with Tidal scheduling tool for jobs scheduling.
- Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
- Developed UNIX KORN Shell wrappers to initialize variables, run graphs and perform error handling.
Environment: Informatica Power Center 9.6.1, Informatica Developer Client 10.1.1, Informatica BDM, IDQ 9.1/9.5.1, Teradata 14, TPT, SQL Assistant, MySQL, UNIX, Oracle 11g, SQL Server, Clear Case, WLM, FTP.
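As an illustration of the Korn-shell wrappers and BTEQ scripting described in this role, the sketch below generates a BTEQ script with basic error handling. It is a minimal, hypothetical example: the host, user, and table names (tdprod, etl_user, stg.claims_work, stg.claims_daily) are placeholders, not taken from any actual environment.

```shell
# Hypothetical sketch of a shell wrapper that builds a BTEQ load script.
# All host/user/table names below are illustrative placeholders.
BTQ=/tmp/claims_load.btq
LOG=/tmp/claims_load.log

# Generate the BTEQ script; .IF ERRORCODE checks give step-level error handling.
cat > "$BTQ" <<'EOF'
.LOGON tdprod/etl_user,password;
DELETE FROM stg.claims_work;
.IF ERRORCODE <> 0 THEN .QUIT 8;
INSERT INTO stg.claims_work (claim_id, member_id, paid_amt)
SELECT claim_id, member_id, paid_amt
FROM   stg.claims_daily;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "wrote BTEQ script to $BTQ"
# bteq < "$BTQ" > "$LOG" 2>&1 || echo "BTEQ load failed, see $LOG"
```

The actual bteq submission is left commented out since it requires a Teradata client; a real wrapper would also check the exit code and archive the log.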
Confidential, Richmond, VA
ETL/Teradata Developer
Responsibilities:
- Developed mappings, reusable objects, transformations, and Mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica Power Center 9.5/9.1.
- Involved in Design and development of new data warehouse (Analytical Data Warehouse) for better reporting and analysis.
- Worked with several vendors in sending outbound extracts and load in bound files into the warehouse.
- Worked on developing Informatica mappings, Mapplets, sessions, Worklets, and workflows for data loads.
- Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
- Good knowledge of B2B Data Transformation and B2B Data Exchange.
- Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Performed tuning and optimization of complex SQL queries using Teradata Explain.
- Expert-level knowledge of complex SQL using Teradata functions, macros, and stored procedures.
- Worked closely with Facets 4.48 and 4.51 and different EDI transaction files such as 837, 834, and 835 to understand the source structure and source data patterns.
- Involved in Business process and Analysis of the EDI Implementation Transactions related to our customers.
- Analyzed the source data and made decisions on appropriate extraction, transformation, and loading strategies.
- Analyzed complex ETL requirements/tasks and provided estimates/ETCs.
- Developed FastLoad jobs to load data from various data sources and legacy systems into Teradata staging.
- Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Applied the rules and profiled the source and target table's data using IDQ.
- Extensively worked on data extraction, transformation, and loading from XML files, large-volume data, and Adobe PDF files into the EDW using B2B Data Transformation and B2B Data Exchange.
- Worked with BTEQ in the UNIX environment and executed TPT scripts from the UNIX platform.
- Extensively used Data Stage for extracting, transforming and loading databases from sources including Oracle, DB2 and Flat files.
- Executed sessions by defining parameter values through ABC (Audit Balance Control) to track and easily re-run failed sessions.
- Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
- Worked with Tidal scheduling tool for jobs scheduling.
- Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
- Created and developed reusable transformations, mappings, and Mapplets conforming to the business rules.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results.
Environment: Informatica Power Center 9.5/9.1, Informatica IDQ 9.1/9.5.1, Teradata 13, MultiLoad, FastLoad, TPT, FastExport, TPump, BTEQ, UNIX, Queryman, Teradata Manager, Java, SQL Assistant, Oracle 11g, SQL Server.
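The FastLoad staging loads described in this role could be driven by a script of the following shape. This is a hedged sketch only: the host, user, database, table, and input-file names (tdprod, etl_user, stg.claims_stg, /tmp/claims.dat) are hypothetical placeholders.

```shell
# Hypothetical sketch: generate a Teradata FastLoad script for a staging load.
# All names are illustrative; a real job would parameterize them.
OUT=/tmp/claims_stg.fload

cat > "$OUT" <<'EOF'
LOGON tdprod/etl_user,password;
DATABASE stg;
BEGIN LOADING claims_stg
  ERRORFILES claims_stg_e1, claims_stg_e2
  CHECKPOINT 100000;
SET RECORD VARTEXT "|";
DEFINE
  claim_id  (VARCHAR(18)),
  member_id (VARCHAR(18)),
  paid_amt  (VARCHAR(12))
  FILE = /tmp/claims.dat;
INSERT INTO claims_stg (claim_id, member_id, paid_amt)
VALUES (:claim_id, :member_id, :paid_amt);
END LOADING;
LOGOFF;
EOF

echo "wrote FastLoad script to $OUT"
# fastload < "$OUT"   # actual submission requires the Teradata client utilities
```

FastLoad's two error tables (claims_stg_e1/e2) and the CHECKPOINT interval are what make failed loads restartable, which is why staging loads typically use it over plain BTEQ inserts.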
Confidential, Richmond, VA
Teradata/Informatica Developer
Responsibilities:
- Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
- Developed various graphs to process Contract, Group, Member and Pharmacy Claims data based on business requirements utilizing functionalities like Rollup, Lookup, Scan, etc.
- Extracted data from different sources such as MVS data sets, flat files (pipe-delimited or fixed-length), Excel spreadsheets, and databases.
- Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Managed all development and support efforts for the Data Integration/Data Warehouse team.
- Used Informatica Power Center 9.0.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
- Worked on development of Big Data POC projects using Hadoop, HDFS, MapReduce, and Hive.
- Understood the structure of the data, built the data architecture, implemented the data model in Vertica, and carried out data mapping from the legacy Oracle system to Vertica.
- Created TPT scripts to transfer data from the Oracle system to Teradata.
- Saved resources in a claims-tracking process by modeling a Claims data mart containing aggregated and rolled-up claims, implemented with SQL stored procedures, Informatica (ETL), and UNIX scripts.
- Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Prioritized requirements to be developed according to Agile methodology.
- Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Provided on-call support during releases of the product from lower-level to higher-level production environments.
- Worked with Tidal scheduling tool for jobs scheduling.
- Involved in unit testing and user acceptance testing to verify that data extracted from the different source systems loaded into the target according to user requirements.
Environment: Informatica Power Center 9.5.1, Informatica BDM, Teradata 14, SQL Assistant, MySQL, UNIX, Oracle 11g, TPT, Vertica 5.1, Java, SQL Server, WLM, Clear Case, FTP.
Confidential
Informatica Developer
Responsibilities:
- Worked on Informatica - Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplets and Transformations.
- Involved in the development of Informatica mappings and performance tuning.
- Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, and Sequence Generator.
- Designed the procedures for getting data from all source systems into the data warehousing system.
- Extensively used ETL to load data from different databases and flat files to Oracle.
- Performed GAP analysis for various EDI transactions such as 810, 850, and 856.
- Extensively worked on database triggers, stored procedures, functions, and database constraints; wrote complex stored procedures and triggers and optimized them to maximize performance.
- Created and ran sessions and batches using Server Manager to load data into the target database.
- Used Export and Import utilities and SQL*Loader to refresh data from production to the development environment.
- Involved in forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions whenever possible.
- Troubleshot issues while testing was in progress in the development environment, including monitoring the alert log and trace files and fixing software bugs.
Environment: Informatica 8.x, Erwin, Oracle 10g, EDIFACT, SQL Server 2005, DB2, SQL*Plus, SQL Loader, SQL Developer, AUTOSYS, Flat files, UNIX, Windows 2000.
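The SQL*Loader refreshes mentioned in this role are typically driven by a control file; the sketch below generates one. It is illustrative only: the table, schema, and file names (dev.orders, /tmp/orders.dat) are hypothetical placeholders.

```shell
# Hypothetical sketch: generate a SQL*Loader control file for a
# production-to-development data refresh. All names are illustrative.
CTL=/tmp/orders_refresh.ctl

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/tmp/orders.dat'
BADFILE '/tmp/orders.bad'
APPEND
INTO TABLE dev.orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(order_id, customer_id, order_dt DATE "YYYY-MM-DD", amount)
EOF

echo "wrote SQL*Loader control file to $CTL"
# sqlldr userid=dev/password control="$CTL" log=/tmp/orders_refresh.log
```

The sqlldr invocation is commented out since it requires an Oracle client; BADFILE captures rejected rows so a refresh can be audited after the run.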
Confidential
SQL /Informatica Developer
Responsibilities:
- Used the correlated sub queries to fetch the data from the database tables.
- Used BULK COLLECT, bind variables, and REF CURSORs.
- Created database triggers to insert data into audit tables.
- Created database objects such as tables, views, synonyms, indexes, sequences, procedures, triggers, and functions.
- Wrote complex queries to fetch data using analytic functions.
- Documented the Unit test cases and provided the Unit test data for the testing team.
- Provided UAT support.
- Gave knowledge-transfer sessions to the support team.
- Wrote PL/SQL packages, functions, and procedures to implement the business logic.
- Worked on the XML File to load the data into the Database.
Environment: Informatica Power Center 8.6, Windows XP, Toad, SQL Developer, Oracle 9i, Flat files, SQL, Relational Tools, Clear Case, UNIX (HP-UX, Sun Solaris, AIX), UNIX Shell Scripts.
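The BULK COLLECT, REF CURSOR, and audit-trigger work described in this role can be sketched as a PL/SQL block, generated here by a shell step as the ETL jobs above would do. All schema and table names (orders, orders_audit) are hypothetical placeholders.

```shell
# Hypothetical sketch: emit a PL/SQL block using BULK COLLECT and FORALL
# to copy new order IDs into an audit table. Names are illustrative.
SQL=/tmp/audit_load.sql

cat > "$SQL" <<'EOF'
DECLARE
  TYPE t_ids IS TABLE OF orders.order_id%TYPE;
  v_ids t_ids;
  CURSOR c_new IS
    SELECT order_id FROM orders WHERE status = 'NEW';
BEGIN
  OPEN c_new;
  FETCH c_new BULK COLLECT INTO v_ids LIMIT 1000;  -- bulk fetch, capped batch
  CLOSE c_new;
  FORALL i IN 1 .. v_ids.COUNT                     -- single bulk DML round trip
    INSERT INTO orders_audit (order_id, audited_at)
    VALUES (v_ids(i), SYSDATE);
  COMMIT;
END;
/
EOF

echo "wrote PL/SQL block to $SQL"
# sqlplus -s user/password @"$SQL"   # execution requires an Oracle client
```

BULK COLLECT with a LIMIT plus FORALL keeps context switches between the SQL and PL/SQL engines to a minimum, which is the usual reason for preferring it over row-by-row cursor loops.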