- Data Warehouse Analyst with 4+ years of experience on several database platforms (Oracle, SQL Server, Teradata) and ETL methodologies.
- Proficient in using the Informatica platform to extract, transform, and load data. Developed, tested, integrated, and deployed ETL routines using SQL, Informatica, and UNIX shell scripting.
- Developed and implemented best technical practices for data movement, data quality, data cleansing, and other ETL-related activities. Designed ETL processes and created source-to-target data mappings, integration workflows, and load processes.
- Implemented Type 1 and Type 2 slowly changing dimensions in ODS table loading to keep historical data in data warehouse.
- Performed ETL procedures to load data from different sources into data marts and data warehouse.
- 3+ years of experience in SQL and database development, with an emphasis on data warehouse methodologies and techniques.
- Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Strong understanding of ER modelling design goals and normalization techniques.
- Created UNIX shell scripts to extract data from various sources and load it into a NoSQL database (MongoDB).
- Developed data models as per business specific requirements.
- Conducted training sessions for developers, and technical groups on best ETL practices.
- Worked with developers, DBAs, and systems support personnel to automate the promotion of tested code to production. Have been a part of the deployment procedure for multiple projects.
- Successfully closed many SIT issues within a short span of time during the SIT phase.
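The Type 2 slowly-changing-dimension loading described above can be sketched as follows; this is a minimal illustration using an in-memory SQLite database, and the table and column names (`dim_customer`, `eff_date`, `is_current`) are hypothetical, not taken from the actual ODS schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical dimension table with Type 2 history-tracking columns.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Dallas', '2015-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Type 2 change: expire the current row, then insert a new current row."""
    cur.execute(
        "UPDATE dim_customer SET end_date = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (load_date, customer_id, new_city),
    )
    if cur.rowcount:  # insert only when the tracked attribute actually changed
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, load_date),
        )

apply_scd2(cur, 1, "Austin", "2016-06-01")

rows = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 ORDER BY eff_date"
).fetchall()
print(rows)  # history row kept alongside the new current row
```

A Type 1 dimension, by contrast, would simply overwrite `city` in place, keeping no history.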
ETL Tools: Informatica Power Center 9.6.1
Programming: C++, C#, HTML/XML, Shell Scripts
Operating Systems: MS Windows, Unix
Databases: Oracle 10g/11g, Teradata 14.00.07.10, MongoDB
Database Tools: TOAD
- Created mappings and UNIX scripts to submit fee details to ENABLER using API for PPV purchases.
- Purchases with successful submission were moved to history tables.
- An alert was triggered if the API failed to submit the fee for a particular BAN more than the configured API retry count.
- Developed, tested, integrated, and deployed ETL routines using SQL, Informatica, UNIX shell scripting.
- Converted PL/SQL procedures to Informatica mappings and, at the same time, created procedures at the database level for optimum mapping performance.
- Created worklets, workflows, and tasks in Workflow Manager to schedule the loads at the required frequency.
- Developed Mapplets, reusable transformations, source and target definitions, mappings using Informatica 9.1.0.
- Worked with Pre-Session and Post-Session UNIX scripts for automation of ETL jobs. Was also involved in migration/conversion of ETL processes from development to production.
- Generated queries using SQL to check for consistency of data in the tables and to update the tables as per business requirements.
- Performed performance tuning at the functional level and the mapping level. Used relational SQL wherever possible to minimize data transfer over the network.
Tools used: Informatica 9, Oracle 10g, Teradata
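The consistency-check queries mentioned above can be sketched like this; SQLite stands in for the Oracle/Teradata target, and the staging and warehouse table names (`stg_orders`, `dw_orders`) are illustrative assumptions, not the project's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging and warehouse tables for the reconciliation check.
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

# Anti-join: rows present in staging but missing from the warehouse target.
missing = cur.execute("""
    SELECT s.order_id
    FROM stg_orders s
    LEFT JOIN dw_orders d ON d.order_id = s.order_id
    WHERE d.order_id IS NULL
""").fetchall()
print(missing)  # order_ids that failed to load and need follow-up
```

A check like this, run post-load, flags dropped rows before they reach downstream reports.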
- Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, and Joiner transformations to implement complex logic within a mapping.
- Developed and maintained ETL (Extract, Transform, and Load) mappings to extract data from heterogeneous data sources, including flat files, XML, and Excel, and load it into Oracle.
- Used Teradata utilities such as BTEQ, FastLoad, FastExport, and MultiLoad for data conversion.
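The flat-file extraction and load described above can be sketched as a minimal example; an in-memory CSV and SQLite stand in for the real flat files and the Oracle target, and the `employees` table and its columns are hypothetical:

```python
import csv
import io
import sqlite3

# A small flat-file extract; in practice this would be a file on disk.
flat_file = io.StringIO("emp_id,name\n101,Smith\n102,Jones\n")

conn = sqlite3.connect(":memory:")  # stands in for the Oracle target
cur = conn.cursor()
cur.execute("CREATE TABLE employees (emp_id INTEGER, name TEXT)")

# DictReader yields one dict per row, matched to the named bind parameters.
reader = csv.DictReader(flat_file)
cur.executemany(
    "INSERT INTO employees VALUES (:emp_id, :name)",
    list(reader),
)

count = cur.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(count)  # number of rows loaded from the flat file
```

XML and Excel sources would follow the same pattern with a different parser in place of `csv.DictReader`.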