- 7 years of extensive experience in Data warehousing involving Data Analysis, ETL design and development, database coding, unit testing, conducting UAT, and implementation of code using migration tools.
- Extensive ETL experience using Informatica PowerCenter and Informatica PowerExchange with various source systems such as Teradata, Oracle, Mainframe, XML files, flat files, and SAS datasets.
- Experience in performance tuning of Informatica sources, targets, transformations, and sessions; implementing database partitions; optimizing SQL queries; and implementing pushdown optimization.
- Expertise in performing unit testing and system integration testing. Performed user acceptance testing as part of the LOB for supported applications.
- Expertise in implementing SCD logic and in using the Normalizer, SQL, Transaction Control, and Rank transformations; skilled with the Informatica debugger. Expertise in reading mainframe files using data maps.
- Excellent analytical skills to understand data and arrive at the most optimized solution.
- Versatile team player with excellent analytical, presentation, and interpersonal skills and an aptitude for learning new technologies.
- Solid time management and multitasking skills, applied in conducting project meetings, reviews, and requirement walkthroughs.
- Experienced in Teradata MPP architecture, load utilities, TPT connections with ETL, and query analysis through explain plans.
- Strong understanding of banking concepts.
- Managed the ETL production support team responsible for the 24x7 ETL cycles of the CORE Customer Data Mart processing system, utilizing US and offshore resources.
- Proficient in supporting client data warehouse applications from both technical and domain perspectives.
- Actively participated in disaster recovery exercises involving Teradata, Informatica, and Linux environments.
- Experienced in UNIX work environments, file transfer protocols, and job scheduling tools.
ETL and Database Tools: Informatica 8.6.1/9.5.1, Informatica PowerExchange, Teradata Studio, Toad Data Point
Database: Teradata, Oracle 10g, Teradata Unity Data Mover
Scripting: UNIX Shell Scripting, PL/SQL
ITSM Tool: ServiceNow
Migration Tools: Harvest, IBM uDeploy
ETL Production Support
- Interacted with the Line of Business on a regular basis to discuss user-reported issues and small work efforts.
- Provided on-call support to review any production ABENDs and fix them according to priority.
- Actively worked with business users to analyze their queries and provide feedback.
- Carried out maintenance activities involving less than 300 hours of development and testing effort, including LOB requests to modify production code for changed requirements and to automate manual work.
- Actively participated in disaster recovery exercises as part of PNC’s business continuity plan, involving Teradata, Informatica, and UI servers.
- Extensive experience working with PNC vendors to troubleshoot transmission-related or data-related issues.
- Performed analysis of applications to identify the scope for automation and implemented it.
- Performed reviews of new development for our data mart; support guide reviews were important in preparing the team to support new source data.
- Proactively reviewed primary and backup databases to identify any data differences between them. Expertise in Teradata Unity Data Mover to sync data between primary and backup.
- Actively participated in platform migration events and validated the results.
- Engaged regularly with development teams to guide them on the standards to be followed for new projects.
- Worked with mainframe developers to get files and data in the required format.
- Created data maps to read mainframe files and set up the connector.
- Worked with the Line of Business to understand the requirements for setting up paperless alerts for different customer alert types.
- Worked with the online banking team to understand the data model for customers’ alert profiles.
- Developed ETL code to read core data and look up online banking to collect alert profiles, and used a SQL transformation to delete alerts. Created a mainframe file and worked with Written Communications to load the data and create paperless alerts.
- After successful execution for the NSF alert, reused the ETL code to modify customers’ overdraft alerts.
- As the changes directly impacted the customer experience, worked with different stakeholders to minimize impact.
- Monitored the execution of the ETL code, as the change was a one-time production run.
- Performed data analysis on the enterprise database and CATS source systems to assess the possibility of data elements being passed to different tables.
- Worked with the EDW team to publish a new table after reviewing it with the Scoring team.
- Created low-level and high-level design documents for LOB review.
- Worked with the application DBA to create an optimized approach for the code development.
- Developed ETL code from the STM document.
- Worked with the LOB on new roles created for the new table.
- Worked with the LOB to validate the data.