Informatica and Teradata Specialist / Data Analyst Resume
Dallas, Texas
SUMMARY:
- Lead ETL Informatica & Teradata Developer / Data Analyst
- Nine-plus years of IT industry experience across domains including Insurance, Retail, Finance, Library Management, and Telecom.
- Strong data warehousing experience using Informatica PowerCenter versions 7.1.2/8.5.1/8.6.1/9.1/9.5.1/9.6.1.
- Strong experience with database systems including Oracle, Oracle GoldenGate, Teradata 12.0/13.0/14.0/14.10, DB2, MySQL, and Sybase.
- Strong experience integrating Informatica with SFDC and extracting data from it.
- Experience writing SQL and PL/SQL programs: stored procedures, triggers, and functions.
- Seven years of strong experience with the Teradata database and Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts.
- Good understanding of Teradata protection and recovery concepts such as Fallback, RAID, and Journals.
- Working knowledge of Teradata administrative activities.
- Strong experience tuning Teradata queries (BTEQ) for better performance.
- Strong experience using Informatica and Teradata together in enterprise warehouse environments.
- Strong experience working with flat files and XML objects.
- Experience rewriting Hadoop HiveQL and MapReduce scripts in Informatica.
- Expertise with Vertica scripts and loads.
- Expertise in handling DWH Unix-to-Linux (U2L) migrations.
- Experience debugging and implementing performance tuning techniques on sources, targets, mappings, and workflows in Informatica ETL mappings.
- Created unit test cases and documented unit test results.
- Strong expertise in understanding warehouse data models.
- Solid expertise using ETL scheduling tools.
- Strong experience writing UNIX shell scripts.
- Supported the enterprise warehouse in critical environments, working 24x7 to avoid any impact to the business.
- Experience profiling source data and producing data models (LDM, PDM) and ETL business requirements (STM, Source-to-Target Mapping documents).
- Attended classroom training on Agile methodology techniques and earned Certified Scrum Master (CSM) certification.
- Excellent verbal and written communication skills, a clear understanding of business procedures, and the ability to work independently as well as part of a team.
- Excellent analytical and problem-solving skills.
TECHNICAL SKILLS:
Data Warehousing: Informatica PowerCenter 9.6.1/9.5.1/9.1/8.6.1/8.5.1/7.1.2
Databases: Teradata 12.0/13.0/14.0/14.10 (FastLoad, MultiLoad, TPump, TPT, and BTEQ), Oracle 11g/10g/9i, SFDC, Sybase 12.0, DB2 UDB 8.0 / DB2 BLU, MySQL, Vertica, Hadoop (HiveQL, MapReduce)
BI Tools: Cognos 8.0/7.1, Report Builder (Custom)
Data Modeling: MS Visio, ERWIN
Job Scheduling: CA AutoSys, BMC Control-M, Informatica Scheduler, Maestro (TWS) 8.5/9.2
Programming: Unix Shell Scripting, XML, SQL and PL/SQL
Environment: Windows, Unix, Linux
Other Tools: Teradata SQL Assistant, Teradata Viewpoint, Toad, WinSQL, SQL Navigator, PuTTY, WinSCP, SVN, Gromit, Rally, Jira
PROFESSIONAL EXPERIENCE:
Confidential, Richardson, Dallas, Texas
Informatica and Teradata Specialist / Data Analyst
Responsibilities:
- Working with business users to understand the requirements.
- Working with data modelers to implement logical and physical data models for Vertica.
- Analyzed existing Informatica code and Teradata scripts to understand how data was loaded into the application database in Teradata, and built a process to handle the data missing from it.
- Used the transformations available in Informatica effectively to optimize performance and increase throughput.
- Tuned existing Teradata BTEQ scripts to load the missing historical data faster and more efficiently.
- Reviewed the development spec document (Gromit) and provided suggestions to make sure it was understandable to all other development teams.
- Reused and modified the Unix driver scripts used for migrating history data from Teradata to Vertica.
- Profiled Teradata data for better implementation of initial loads and delta logic.
- Handled very large data volumes as part of the initial loads.
- Used Teradata TPT to export data into files and load it from there into Vertica as part of the initial load process.
- Wrote Vertica scripts to load data from files into staging and from there insert/upsert into the Vertica targets as part of the delta loads (a sketch of this pattern follows this list).
- Experienced with the Unix-to-Linux (U2L) migration of the DWH environment.
- Part of the Sustainment Team (Production Support Level 3), fixing data issues raised by the business in the production environment.
- Experienced using the BMC Remedy AOTS support ticketing tool.
- Performed thorough unit testing and reviewed code and test results with the entire development team.
- Documented the development process and Deployment Guide for handover to System Testing.
- Walked the testing team through the development process and Deployment Guide.
- Fixed bugs raised in System Testing.
- Performed code migration using SVN.
- Performed gap analysis for changes in the source system.
- Used the Maestro scheduler to schedule ongoing loads.
- Handled release management for phased code deployments into production.
- Worked in an Agile methodology: attending scrums, using Rally for task management, and holding retrospectives for improvements; also executed development projects using the Waterfall methodology.
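A minimal sketch of the staging-then-upsert delta pattern referenced above, assuming hypothetical file, schema, and table names rather than the actual project objects:

    -- Bulk-load the exported TPT file into a Vertica staging table (names are illustrative).
    COPY stage.customer_stg FROM '/data/extracts/customer_delta.dat' DELIMITER '|' DIRECT;

    -- Apply the delta to the target: update matching keys, insert new ones.
    MERGE INTO edw.customer_dim tgt
    USING stage.customer_stg src
       ON tgt.cust_id = src.cust_id
    WHEN MATCHED THEN UPDATE SET
         cust_name = src.cust_name,
         region_cd = src.region_cd,
         updt_ts   = src.updt_ts
    WHEN NOT MATCHED THEN INSERT
         (cust_id, cust_name, region_cd, updt_ts)
         VALUES (src.cust_id, src.cust_name, src.region_cd, src.updt_ts);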
Environment: Informatica PowerCenter 9.6.1/8.6.1, Teradata 14, Vertica, Oracle, Oracle GoldenGate, UNIX, Linux, SVN, Gromit, Maestro, SQL Assistant, Toad, Windows XP Professional
Confidential, Columbus, Ohio
Lead Informatica Developer / Data Analyst
Responsibilities:
- Worked with business users and the ETL architect to gather and understand the requirements.
- Worked with Hadoop developers to understand the existing ETL process written in HiveQL and MapReduce for OCLC Circulation.
- Analyzed the existing Hadoop code written in HiveQL and MapReduce and created the mapping documents.
- Rewrote the ETL process in Informatica PowerCenter from the mapping documents created from the Hadoop process.
- Extracted data from flat files, MySQL, and Oracle and loaded it into DB2 BLU.
- Created the initial process to migrate the existing warehouse data from legacy MySQL into DB2 BLU.
- Created the ongoing process to load data into the various facts and dimensions in DB2 BLU.
- Created complex Informatica processes for slowly changing dimensions, combining Type 1 and Type 2 or applying each individually based on the attributes (the equivalent SQL logic is sketched after this list).
- Used most of the transformations available in Informatica PowerCenter.
- Performed performance tuning of the Informatica processes and worked with the DB2 admin to improve performance at the database level.
- Monitored and guided other Informatica developers to keep development on track to meet delivery timelines.
- Wrote Unix shell scripts for various purposes.
- Created technical design documents to obtain sign-off on development and hand over to production.
- Conducted knowledge transfer (KT) sessions to hand the process over to the client.
- Scheduled Informatica workflows in Control-M.
- Executed the project using the Agile methodology.
- Participated in daily status meetings and sprint planning meetings.
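The SCD logic referenced above was built with Informatica Lookup, Expression, Router, and Update Strategy transformations; expressed as plain SQL against hypothetical DB2 BLU tables, the core pattern looks roughly like this:

    -- Type 1: overwrite a non-tracked attribute in place on the current row (names illustrative).
    UPDATE dw.member_dim AS d
       SET email_addr = (SELECT s.email_addr FROM stg.member s WHERE s.member_id = d.member_id)
     WHERE d.curr_ind = 'Y'
       AND EXISTS (SELECT 1 FROM stg.member s WHERE s.member_id = d.member_id);

    -- Type 2: expire the current row when a tracked attribute changes.
    UPDATE dw.member_dim AS d
       SET curr_ind   = 'N',
           eff_end_dt = CURRENT DATE - 1 DAY
     WHERE d.curr_ind = 'Y'
       AND EXISTS (SELECT 1 FROM stg.member s
                    WHERE s.member_id = d.member_id
                      AND (s.branch_cd <> d.branch_cd OR s.status_cd <> d.status_cd));

    -- Insert a new current version for changed members and for brand-new members.
    INSERT INTO dw.member_dim
           (member_id, branch_cd, status_cd, email_addr, eff_start_dt, eff_end_dt, curr_ind)
    SELECT s.member_id, s.branch_cd, s.status_cd, s.email_addr, CURRENT DATE, DATE('9999-12-31'), 'Y'
      FROM stg.member s
      LEFT JOIN dw.member_dim d
        ON d.member_id = s.member_id AND d.curr_ind = 'Y'
     WHERE d.member_id IS NULL;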
Environment: Informatica PowerCenter 9.6.1, Oracle, Flat Files, MySQL, DB2 BLU, Hadoop, HiveQL, UNIX, Control-M, Jira, Windows XP Professional
Confidential, Austin, TX
Lead Informatica and Teradata Specialist / Data Analyst
Responsibilities:
- Worked with business users to gather the requirements.
- Worked with data modelers to implement logical and physical data models.
- Pulled data from sources such as Oracle, MySQL, applications (SFDC), and flat files using Informatica PowerCenter 9.5.1/8.6.1 and loaded it into Teradata.
- Applied a strong understanding of XML objects in processing them with Informatica.
- Extensively used the transformations available in Informatica for EDW development.
- Extensively used the Teradata utilities TPT, MultiLoad, and FastLoad for most of the ETL loads, and FastExport for extracting data from Teradata.
- Wrote Teradata BTEQ scripts for loading Hoovers site usage data (a minimal BTEQ sketch follows this list).
- Profiled source data and reported on data behavior in the source.
- Performed gap analysis for changes in the source system.
- Implemented a UNIX FTP process to transfer the source files from the application server to the local Informatica server and vice versa.
- Implemented cleansing techniques to clean the input files for the ETL process.
- Implemented archiving and purging processes in accordance with business requirements for better management of the Teradata environment.
- Implemented audits in the EDW for better tracking of EDW loads and processes.
- Implemented various types of change data capture according to source data behavior and business requirements.
- Analyzed warehouse data for business purposes.
- Implemented performance tuning techniques in ETL and Teradata BTEQ for efficient development and performance.
- Used Teradata Viewpoint to monitor the performance of jobs (BTEQ) in Teradata.
- Performed error handling using session logs and workflow logs.
- Migrated ETL, Teradata, and UNIX objects to different environments.
- Performed unit testing and integration testing in both Dev and QA.
- Created Unix shell scripts to run the jobs, FTP files from local servers, and schedule the workflows.
- Developed ETL templates for POCs and participated in high-level discussions on ETL architecture.
- Worked with external vendors (Omniture, Adobe) to understand their data feeds for the Hoovers site usage file structure, and worked extensively on profiling the data and designing the ETL layout.
- Performed analysis, document preparation, and testing for upgrading the Teradata database from TD12 to TD14.
- Tested the ETL environment for the upgrade from Informatica 8.6 to 9.1 and then to 9.5.1.
- Served on call to support warehouse production data.
- Analyzed HiveQL scripts in Hadoop written to load data from MySQL to the data mart, and rebuilt the mapping document used to rewrite the logic in Informatica to load data from MySQL to DB2 BLU.
- Generated ad hoc reports in the custom Report Builder for business users.
- Worked with the reporting team to generate accurate reports for business users from the EDW.
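A minimal BTEQ sketch of the kind of site-usage load referenced above; the table names and logon file path are illustrative, not the production objects:

    /* Logon credentials are kept outside the script in a protected file. */
    .RUN FILE = /home/etl/.tdlogon;
    .SET ERRORLEVEL UNKNOWN SEVERITY 8;

    /* Aggregate the staged site-usage feed into the fact table. */
    INSERT INTO edw.site_usage_fact (visit_dt, page_id, visit_cnt)
    SELECT visit_dt, page_id, COUNT(*)
    FROM   stg.hoovers_site_usage
    GROUP  BY visit_dt, page_id;

    /* Exit non-zero so the scheduler flags the failure. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;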
Environment: Informatica PowerCenter 9.6.1/9.5.1/9.1.0/8.6.1, Oracle 10g, MySQL, Teradata 14.10/12.0, SFDC, Flat Files, XML, HiveQL, UNIX, Cognos 8.0, Report Builder, Windows XP Professional
Confidential
Informatica Developer
Responsibilities:
- Worked with the business analyst to gather requirements and prepared source-to-target mappings used in designing and developing the Enterprise Information Store (EIS).
- Prepared design documents that helped production support analyze how data was migrated from OLTP to the EIS/EDW.
- Extracted data from Oracle and flat files using Informatica PowerCenter 8.6.1/8.5.1 and loaded it into Teradata.
- Extensively used Teradata utilities (TPump, MultiLoad, and FastLoad) to load large volumes of data into the EIS.
- Used the Teradata FastExport utility to extract data quickly from Teradata.
- Worked with hundreds of master tables and fact tables to load the EDW.
- Created a shell script to release loader locks (MultiLoad, FastLoad) in case of failures (a recovery sketch follows this list).
- Used BTEQ scripting to execute SQL as part of post-session success commands.
- Wrote Oracle stored procedures per business requirements.
- Extensively used Informatica transformations such as Source Qualifier, Aggregator, Expression, Joiner, Union, Router, Filter, Lookup, and Normalizer while developing the mappings.
- Worked extensively with the Task Developer to create reusable sessions and tasks such as email tasks.
- Created worklets using the Worklet Designer.
- Created mapping parameters and variables.
- Implemented various performance tuning techniques in Informatica.
- Performed error handling using session logs and workflow logs.
- Migrated developed mapping objects between folders in DEV using the import/export facility, and from Dev to QA and Prod repositories by creating labels.
- Worked tight schedules for emergency fixes while on call for production support.
- Created test plans and wrote test cases for unit and integration testing, and worked with the QA team to get the code tested in the QA repository.
- Worked with Quality Center to create, track, and assign defects during testing.
- Used AutoSys to schedule and monitor jobs in QA and PROD.
- Created Unix shell scripts to schedule the workflows via AutoSys in the QA and Prod environments.
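The lock-release script referenced above wrapped statements along these lines in BTEQ; the database and table names follow a common WT_/ET_/UV_ convention and are illustrative, not the actual project objects:

    /* Release the MultiLoad lock on the target table; the IN APPLY form
       is used instead only when the job failed during the apply phase. */
    RELEASE MLOAD eis_stg.policy_stg;
    /* RELEASE MLOAD eis_stg.policy_stg IN APPLY; */

    /* Drop the work and error tables left behind by the failed load. */
    DROP TABLE eis_stg.WT_policy_stg;
    DROP TABLE eis_stg.ET_policy_stg;
    DROP TABLE eis_stg.UV_policy_stg;

    /* A locked FastLoad target is recovered differently: drop its error
       tables and run an empty BEGIN LOADING / END LOADING job.         */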
Environment: Informatica PowerCenter 8.6.1/8.5.1/7.1.2, Oracle 10g, Flat Files, Teradata 12.0, Cognos 8.3, UNIX, F-Secure SSH Client, Erwin, SQL Assistant, BTEQ, TOAD 9.0.1, SQL*Plus, AutoSys, Windows XP Professional
Confidential, Buffalo, New York
Informatica Developer
Responsibilities:
- Worked with business data analysts (BDAs) to understand the requirements for EDW development.
- Extracted data from heterogeneous sources such as Sybase, flat files, and COBOL (VSAM) using Informatica PowerCenter and loaded the data into the target DB2 database.
- Involved in DWH upgrades for source system changes.
- Utilized the features of the Source Qualifier transformation, such as filter, join, sort, and SQL override, to push work down to the source level (a sample override follows this list).
- Used the Normalizer transformation for COBOL (VSAM) sources.
- Extensively worked with transformations such as Expression, Aggregator, Sorter, Joiner, Router, Lookup, Filter, and Union in developing the mappings to migrate data from source to target.
- Worked with external stored procedures for data cleansing purposes.
- Extensively used the Lookup transformation and lookup caches to look up data from relational tables and flat files.
- Created mapping parameters and variables and wrote parameter files.
- Created UNIX shell scripts for various needs.
- Performed unit testing, integration testing, and data validation.
- Implemented performance tuning techniques by finding bottlenecks at the source, target, mapping, and session levels and optimizing them.
- Worked with the Debugger Wizard in debugging the mappings.
- Extensively worked with session logs and workflow logs for error handling and troubleshooting.
- Worked with the Control-M scheduling team to schedule Informatica jobs per the required frequency.
- Worked with the Cognos team in generating various reports.
- Prepared the migration documents.
- Followed Informatica procedures and standards while developing and testing the Informatica objects.
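A sample of the kind of Source Qualifier SQL override referenced above, pushing the join, filter, and sort into the relational source; the table and column names are illustrative:

    -- Override placed in the Source Qualifier so the source database performs
    -- the join, filter, and sort instead of the Informatica server.
    SELECT a.acct_id,
           a.acct_type_cd,
           c.cust_nm,
           a.open_dt
    FROM   accounts a
    JOIN   customers c
      ON   c.cust_id = a.cust_id
    WHERE  a.status_cd = 'A'
    ORDER  BY a.acct_id;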
Environment: Informatica PowerCenter 7.1.2, Sybase, DB2, Flat Files, COBOL (VSAM), WinSQL, Toad, UltraEdit-32, SQL Advantage, Control Center, PowerDesigner SQL Modeler, CDMA, MS Access, MS Visio, Cognos, UNIX, Windows XP Professional.