ETL/Teradata Developer Resume
White Plains, NY
PROFESSIONAL SUMMARY:
- Around 5 years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica PowerCenter 9.x (Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor).
- Proficient in developing SQL against relational databases such as Oracle, SQL Server, and Greenplum.
- Extensive experience with Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Normalizer, Union.
- Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
- Strong understanding of OLAP and OLTP Concepts.
- Knowledge of IDQ 9.6.1 standards, guidelines, and best practices.
- Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Understanding & Working knowledge of Informatica CDC (Change Data Capture).
- Good knowledge of Data modeling techniques like Dimensional/ Star Schema, Snowflake modeling, slowly changing Dimensions.
- Flexible, enthusiastic, and project-oriented team player with solid communication and leadership skills, able to develop creative solutions for challenging client needs.
- Hands - on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
- Experience in UNIX shell scripting and file management in various UNIX environments.
- Responsible for migrating the workflows from development to production environment.
- Able to work independently and collaborate proactively & cross functionally within a team.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump), Teradata parallel support, and UNIX shell scripting.
- Proficient in coding of optimized Teradata batch processing scripts for data transformation, aggregation and load using BTEQ.
- Experience in managing teams/On Shore-Off Shore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
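As an illustration of the BTEQ batch-processing pattern mentioned above, a minimal incremental-load script might look like the following; the logon details, table names, and columns are all hypothetical, not from an actual engagement:

```sql
.LOGON tdpid/etl_user,etl_password;

/* Load only rows changed since the last successful run.
   STG.SALES_STG, EDW.SALES_DIM and EDW.LOAD_CONTROL are illustrative. */
INSERT INTO EDW.SALES_DIM (sale_id, amount, load_ts)
SELECT s.sale_id, s.amount, CURRENT_TIMESTAMP
FROM   STG.SALES_STG s
WHERE  s.updated_ts > (SELECT MAX(last_run_ts)
                       FROM   EDW.LOAD_CONTROL
                       WHERE  job_name = 'SALES_DIM_LOAD');

/* Fail the batch with a non-zero return code on SQL error. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

UPDATE EDW.LOAD_CONTROL
SET    last_run_ts = CURRENT_TIMESTAMP
WHERE  job_name = 'SALES_DIM_LOAD';

.LOGOFF;
.QUIT 0;
```

The `.IF ERRORCODE` check is what lets a scheduler such as Control-M detect a failed step and stop the downstream jobs.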
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.6/9.5, Informatica Data Quality
Databases: Oracle 12c/11g/10g, Greenplum, Teradata V15
Operating Systems: UNIX, Linux, Windows 2012/2008/NT, Windows 7/8
Languages: C, C++, SQL, HTML, PHP, UNIX Scripting
Reporting Tools: Business Objects XIR2, SAP BI 7.0, Power BI
Other tools: Toad, PuTTY, Autosys, SQL Assistant, Control-M
PROFESSIONAL EXPERIENCE:
Confidential, White Plains, NY
ETL/ Teradata Developer
Responsibilities:
- Involved in client meetings on daily basis for understanding and analyzing the business requirements.
- Worked with BTEQ in the UNIX environment and executed TPT scripts from the UNIX platform.
- Used SQL Assistant to query Teradata tables.
- Worked with complex SQL queries to test the data generated by the ETL process against the target database.
- Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
- Used volatile tables and derived queries to break complex queries into simpler ones.
- Created a cleanup process to remove all intermediate temp files used prior to the loading process. Streamlined the migration of Teradata scripts and shell scripts onto the UNIX box.
- Wrote the incremental logic for the dimensional tables to run Incremental Load in BTEQ scripts.
- Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files
- Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
- Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace both in Teradata as well as Oracle.
- Wrote Teradata macros and used various Teradata analytic functions. Monitored Teradata queries using Teradata Viewpoint.
- Created proper Primary Indexes (PI) taking into consideration both planned access of data and even distribution of data across all available AMPs.
- Creating and maintaining source-target mapping documents for ETL development team.
- Scheduled the BTEQ scripts using the Control-M tool. Wrote shell scripts to run collect-statistics steps from Control-M jobs.
- Provided queries to the Power BI reporting team to support their reports.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Designed and developed physical models with appropriate Primary, Secondary, PPI, and Join Indexes, taking into consideration both planned access of data and even distribution of data across all available AMPs.
- Designed and developed workflows and automated with UNIX shell scripts and ESP. Created shell scripts to drop and re-create indexes to load data efficiently and decrease load time.
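A cleanup step like the one described above (removing intermediate temp files after a load) can be sketched as a small shell script. The staging directory and the `*.tmp` naming convention are assumptions for illustration; the sketch sets up its own demo files:

```shell
#!/bin/sh
# Sketch of the intermediate-temp-file cleanup step described above.
# The staging directory and the *.tmp naming convention are assumptions.
WORK_DIR="$(mktemp -d)"

# Simulate intermediate files left behind by a load, plus a log to keep.
touch "$WORK_DIR/stage1.tmp" "$WORK_DIR/stage2.tmp" "$WORK_DIR/load.log"

# Remove only the intermediate .tmp files, leaving logs untouched.
find "$WORK_DIR" -type f -name '*.tmp' -exec rm -f {} +

ls "$WORK_DIR"   # only load.log remains
```

Restricting the `find` to `-name '*.tmp'` keeps the step safe to rerun: logs and source extracts in the same directory are never touched.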
Environment: Teradata V14, Teradata SQL Assistant, Informatica PowerCenter 9.1, BTEQ, MLOAD, FLOAD, FastExport, UNIX shell scripts, Control-M scheduling tool.
Confidential, N.A, CA, LA
ETL/ Informatica Developer/ Support Engineer
Responsibilities:
- Involved in client meetings on daily basis for understanding and analyzing the business requirements.
- Coordinating with client and offshore to complete work on time.
- Regular touch-base with Informatica admins, Greenplum admins, and the Autosys team.
- Attending weekly meetings with UDW L3 team, EDH L3 team and with offshore.
- Attending KT (knowledge transfer) sessions for new deployments to production.
- Working on queries raised by offshore and helping them resolve issues to maintain smooth execution of the batch process.
- Working on creating new approaches for enhancing the regular process
- Attending ad hoc meetings.
- Responding to ad hoc queries raised by upstream/downstream/application teams.
- Helping offshore team in testing code changes and reviewing them
- Deploying tested changes to production by raising a CM.
- Preparing the technical documents (like design documents, HLD’s, LLD’s) from business requirement documents.
- Prepare test strategies and execution plan. Monitoring progress of work.
- Extensively developed informatica mappings, mapplets and workflows.
- Used informatica Workflow manager to develop, schedule sessions and send pre-and post-session emails to communicate success or failure of session execution.
- Extensively used Greenplum SQL
- Maintaining the existing GL application.
Environment: Informatica Power Center 9.6.1, Oracle 11g, Linux, Autosys, Toad, Greenplum, Wherescape Red, Powerexchange.
Confidential, N.A, CA, LA
ETL/ Informatica Developer
Environment: Informatica Power Center 9.6.1, Oracle 11g, Linux, Autosys, Toad, Greenplum
Responsibilities:
- Extensively used Informatica Client tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager
- Extracted data from Oracle, flat files, and Excel files, and used Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
- Extracted data from various source systems into the landing zone area by creating Informatica mappings using Teradata FastLoad connections.
- Created Sessions, Tasks, Workflows and worklets using Workflow manager.
- Worked with Data modeler in developing STAR Schemas
- Used TOAD, SQL Developer to develop and debug procedures and packages.
- Involved in developing deployment groups for deploying code between environments (Dev, QA).
- Worked on performance-improvement areas; debugged issues and delivered solutions that reduced processing times.
- Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Wrote Fast export scripts to export data
- Good experience with IDQ analyst tool and Developer tool.
- Good experience in Glossary and Rule specifications.
- Good experience in creating IDQ transformations such as Standardizer, Labeler, and Match/Merge.
- Good experience in creating mapplets for Powercenter in IDQ tool.
- Configured AddressDoctor content on both PowerCenter and IDQ servers and helped users in building scenarios.
- Experience developing and supporting complex DW transformations
- Excellent understanding of Star Schema Data Models; Type 1 and Type 2 Dimensions.
- Created pre-SQL and post-SQL scripts to be run at the Informatica level.
- Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading.
- Experience on scheduling the workflows using AUTOSYS.
- Experience in creating dependencies in AUTOSYS using JIL scripts.
- Experience in ON-HOLD, OFF-HOLD, ON-ICE, OFF-ICE commands in AUTOSYS.
- Expertise in using both connected and unconnected Lookup Transformations.
- Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
- Monitored and improved query performance by creating views, indexes, hints and sub queries
- Created mappings by cleansing the data and populating it into staging tables, then moving it from staging to archive and on to the Enterprise Data Warehouse, transforming the data per business needs and populating the Data Mart with only the required information.
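Autosys dependencies of the kind mentioned above are defined in JIL. A minimal sketch follows; the job names, machine, owner, and script paths are all hypothetical:

```
/* Run the load job only after the extract job succeeds. */
insert_job: edw_extract_src   job_type: c
command: /apps/etl/scripts/extract_src.sh
machine: etlhost01
owner: etluser

insert_job: edw_load_dim      job_type: c
command: /apps/etl/scripts/load_dim.sh
machine: etlhost01
owner: etluser
condition: s(edw_extract_src)
```

The `condition: s(...)` clause expresses the success dependency; ON_HOLD/ON_ICE commands then control these jobs at runtime without editing the JIL.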
Confidential
ETL Developer
Responsibilities:
- Developed ETL mappings, transformations using Informatica Power Center 9.5.1
- Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
- Designed and created complex source-to-target mappings using various transformations including, but not limited to, Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression, and Router transformations.
- Created various transformations such as Union, Aggregator, Lookup, Update Strategy, Joiner, Filter and Router Transformations.
- Used the Dynamic look up to implement the CDC.
- Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).
- Used reusable Session for different level of workflows.
- Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait and so on.
- Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
- Involved in unit testing and User Acceptance Testing (UAT) to verify that the data loaded into targets, extracted from different source systems, was accurate according to user requirements.
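The SCD Type 2 logic referenced above (expire the current row, then insert the new version) can be sketched in plain SQL; `customer_dim` and `customer_stg` are illustrative tables, and exact UPDATE syntax varies by database:

```sql
-- Step 1: close out the current version where a tracked attribute changed.
UPDATE customer_dim d
SET    current_flag = 'N',
       end_date     = CURRENT_DATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

-- Step 2: insert a current version for new and changed customers
-- (after step 1, neither group has a row with current_flag = 'Y').
INSERT INTO customer_dim
       (customer_id, address, start_date, end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1 FROM customer_dim d
                   WHERE  d.customer_id  = s.customer_id
                   AND    d.current_flag = 'Y');
```

In an Informatica mapping, a dynamic Lookup on `customer_dim` plays the role of the EXISTS checks, routing rows to the update or insert path.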
Environment: Informatica PowerCenter 9.5.1, SQL server 2012, DB2, PL/SQL, Toad.
Confidential
Application ETL Developer
Responsibilities:
- Responsible for design and development of Sales Data Warehouse.
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Extracted data from heterogeneous sources like Oracle, SQL Server
- Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.
- Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
- Created Packages and Procedures to automatically drop table indexes and create indexes for the tables.
- Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Created Unix Shell Scripts to automate sessions and cleansing the source data.
- Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
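A drop-and-recreate-index step like the one described above can be sketched as a PL/SQL procedure; the index and table names are hypothetical:

```sql
-- Rebuild an index around a bulk load; names are illustrative.
CREATE OR REPLACE PROCEDURE rebuild_sales_idx IS
BEGIN
  -- Drop the index so the bulk load avoids index-maintenance overhead.
  EXECUTE IMMEDIATE 'DROP INDEX sales_fact_dt_idx';

  -- ... the bulk load into SALES_FACT would run here ...

  -- Recreate the index once the load completes.
  EXECUTE IMMEDIATE
    'CREATE INDEX sales_fact_dt_idx ON sales_fact (sale_date)';
END rebuild_sales_idx;
/
```

DDL inside PL/SQL must go through `EXECUTE IMMEDIATE`, which is why the drop and create statements are dynamic strings rather than plain SQL.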