- 8 years of extensive experience in Information Technology, with special emphasis on the design and development of data warehouses using Informatica Power Center.
- Informatica Power Center Certified Developer V9.5.
- Involved in Full Life Cycle Development (Waterfall & Agile) of building a Data Warehouse on Windows and Unix Platforms for Healthcare Insurance, Investment Banking, Financial, and Retail Industries.
- Strong expertise in relational database systems such as Oracle, SQL Server, and MS Access.
- Designed and developed mappings and mapplets in Mapping Designer, and sessions and workflows in Workflow Manager, to extract data and load it into Oracle databases.
- Very strong knowledge of relational databases (RDBMS), data modeling, and building Data Warehouses and Data Marts using Star and Snowflake schemas; experienced in logical, physical, and Star Schema data modeling with Erwin.
- Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and integrating data from flat files, both fixed-width and delimited.
- Worked extensively on development supporting data extraction, transformation, and loading using Informatica Power Center.
- Experience in integration of various data sources like SQL Server, Oracle, and Flat Files.
- Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
- Worked on MDM Hub configurations - Data modeling & Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director (IDD).
- Performed all kinds of MDM Hub jobs - Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.
- Extensively used ETL methodology for data migration, extraction, transformation, and loading using SSIS, and designed data conversions from a wide variety of source systems.
- Strong ability in defining queries for generating drill-down reports, handling parameterized reports, and creating polished report layouts in SSRS and SAS.
- Performed various kinds of testing, including Integration, Unit, and User Acceptance Testing (UAT).
- In-depth understanding of Star Schema, Snowflake Schema, normalization (1NF, 2NF, 3NF), fact tables, and dimension tables.
- Experience with SQL*Plus and TOAD as interfaces to databases to analyze, view, and alter data.
- Extensively used SQL statements for data processing and ad hoc querying.
- Experience in working with scheduling tools like Autosys and UC4.
- Well acquainted with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Worked extensively with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator, and reusable transformations.
- Worked on Deployment Groups and Migrations.
- Experience in using the Informatica command line utilities like pmcmd to execute workflows in non-Windows environments.
- Experience in debugging mappings; Identified bugs in existing mappings by analyzing the data flow and evaluating transformations
- Hands on experience in Performance Tuning of sources, targets, transformations and sessions
- Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow
- Worked with Informatica server and client tools; experienced in data quality analysis, design, development, implementation, testing, and production support of database/data warehousing and legacy systems for various industries, using data modeling, data extraction, Informatica B2B Data Transformation and Data Exchange, data loading, and Informatica MDM.
- Experience in UNIX shell scripting, FTP and file management in various UNIX environments
- Highly motivated to take on independent responsibility as well as to contribute as a productive team member, with excellent verbal and written communication skills and a clear understanding of business procedures.
Operating System: UNIX, Linux, Windows 7/XP/2000/98/95
Programming Languages: C, C++, HTML, PL/SQL, ASP.NET, Java, UNIX Shell Scripting, VB
Virtualization Tools: VMware vSphere, ESX/ESXi, vCenter Server
Databases: Oracle 11g/10g/9i, Teradata V2R4, DB2, Netezza, TOAD, MS SQL Server 2008/2012, SAP BW, SAP Logon
ETL Tools: Informatica Power Center 10.1.1/9.6.1/9.1/8.6, Informatica Data Quality (IDQ), Informatica MDM Data Director 10.0/9.7/9.5
BI Tools: Business Objects 6.x, Cognos 8.0
Database Tools: Oracle 10g/11g/12c RAC, Cassandra, MongoDB, DB2, MS Access, PL/SQL, MySQL, PostgreSQL
FTP Tools: Ipswitch, WinSCP
Web Technologies: Servlets, JDBC, JSP, HTML, CSS, XML
- Responsible for Business Analysis and Requirements Collection.
- Worked efficiently in all phases of the System Development Life Cycle (SDLC), using methodologies including Agile and Waterfall.
- Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and Data Integration projects.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source-to-Target mappings to load data into the Data Warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Extracted data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets to use them in different mappings.
- Developed mappings to load data into staging tables and then into Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Worked on different tasks in Workflows, such as Session, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, as well as scheduling of the workflows.
- Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
- Utilized Informatica Data Quality (IDQ v10.1) for data profiling, matching/removing duplicate data, and fixing bad data and NULL values.
- Exported IDQ (v10.1) mappings into Informatica Designer, created tasks in Workflow Manager, and scheduled the mappings to run with scheduling tools.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Imported the IDQ (v10.1) address standardization mappings into Informatica Designer as mapplets.
- Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
- Responsible for ensuring jobs set up in the EDW/DIH/ETL integration environment completed on a daily basis.
- Created UNIX shell scripts to execute SAS programs through Autosys scheduler
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- MDM development included creating base object tables, staging tables, and landing tables per the requirements in the LLD (low-level design).
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Performed performance testing with different node configurations, queue settings, and data volumes.
- Extracted data from various sources such as Teradata, SQL Server, and Oracle using the PROC SQL CONNECT (pass-through) facility and loaded it into SAS datasets.
- Used the Tortoise SVN version control tool for versioning and movement of code to higher environments such as SIT, UAT, Pre-Production, and Production.
Environment: Informatica Power Center 10.1.1, Informatica IDQ 10.1.1, MDM Data Director 9.1.0, Oracle 11g, Toad 11.5.0, UNIX Shell Scripting, DIH, Flat Files, Microsoft Office, Citrix, MSBI Tools, SSIS, SSRS.
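The Type 2 SCD work above can be sketched as the SQL equivalent of what such a mapping implements. This is a minimal sketch only: the `customer_dim`, `stg_customer`, and sequence names are hypothetical placeholders, not objects from the actual project.

```shell
#!/bin/sh
# Minimal sketch of Type 2 SCD logic, written out as the equivalent SQL.
# All table/column names (customer_dim, stg_customer) are placeholders.
cat > scd2_sketch.sql <<'EOF'
-- Step 1: expire the current version of rows whose tracked attribute changed
UPDATE customer_dim d
   SET d.eff_end_dt  = SYSDATE,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND s.cust_addr <> d.cust_addr);

-- Step 2: insert a fresh current version for new or changed customers
INSERT INTO customer_dim
       (cust_key, cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flg)
SELECT customer_dim_seq.NEXTVAL, s.cust_id, s.cust_addr,
       SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                    WHERE d.cust_id = s.cust_id
                      AND d.current_flg = 'Y');
EOF
echo "SCD2 sketch written: $(grep -c ';' scd2_sketch.sql) statements"
```

The expire-then-insert order matters: closing the old version first keeps exactly one row per customer flagged current.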
Confidential, Long Island, NY
- Used Informatica, Tidal, and UNIX extensively to fix jobs and load data successfully per requirements.
- Used ETL Informatica client tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Cleaned up the QA environment, refreshed it with new data, and used it for extensive testing.
- Worked extensively with transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Sequence Generator, and Rank.
- Worked extended hours and weekends to prepare environments for testing and to fix logical or structural problems in mappings that caused failed jobs.
- Partitioned workflows for better and faster performance.
- Helped other teams finish their work whenever our team was waiting on another team's deliverables.
- Coordinated with team members across environments to load data successfully into our environment and send data on to other environments.
- Used Debugger by making use of Breakpoints to monitor data movement, identified and fixed the bugs.
- Wrote complex SQL queries to check the validity of data.
- Worked as a tester, identifying both medium- and high-severity defects that would potentially affect downstream systems before the release of the project, and fixed the defects before moving the jobs into production.
- Worked on migrating all existing development from our servers to Microsoft Azure, a cloud-based service.
- Designed and developed ETL processes using SSIS to load data from various sources such as Oracle, MS Access, and DB2 into SQL Server 2014 and 2008 R2 databases.
- Configured SSIS packages using package logging, Breakpoints, Checkpoints and Event handling to fix the errors.
- Worked with the Star Schema and identified all the logic needed to successfully fulfill the project requirements.
- Carefully chose data to fill different tables and databases for testing.
- Ran monthly tie-out jobs and SQL queries to validate data.
- Experience working with IDQ transformations such as Address Validator, Labeler, Parser, Standardizer, Match, and Exception.
- Imported the IDQ address standardization mappings into Informatica Designer as mapplets.
- Created queries, procedures and packages in MDM Hub for displaying and updating the data.
- Created mapping document and mappings in MDM HUB ORS using various cleanse list and cleanse functions.
- Prepared estimates, tracked every task, and strictly adhered to the estimated deadlines.
- Coordinated with the offshore team in India for help with overnight job runs and issues.
- Extensive knowledge in promoting packages (code) across development, test, pre-production, and production environments.
- Preparation of Test Data/Unit Testing /Integrated testing and generated various test cases.
- Involved in different Team review meetings.
- Used the Citrix scheduling tool to schedule Informatica jobs, including overriding dependencies to run particular jobs.
- Communicated daily with the project manager, team members, and the client on the latest status.
Environment: Informatica Power Center 9.6.1, Informatica IDQ 9.6.1, Informatica MDM (Formerly Siperian) 9.1, Oracle 11g, Toad 11.5.0, UNIX Shell Scripting, Flat Files, Microsoft Office, Citrix.
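The monthly tie-out validation above can be sketched as a small shell check. Here `get_count` is a stub standing in for what would normally be a `sqlplus` row-count call; the table names and figures are invented for illustration.

```shell
#!/bin/sh
# Sketch of a monthly tie-out check. In practice get_count would wrap a
# sqlplus call (e.g. "sqlplus -s ... @row_count.sql TABLE"); it is stubbed
# here with sample figures so the flow can be shown end to end.
get_count() {
  case "$1" in
    STG_CLAIMS) echo 120450 ;;   # placeholder source count
    DW_CLAIMS)  echo 120450 ;;   # placeholder target count
  esac
}
SRC=$(get_count STG_CLAIMS)
TGT=$(get_count DW_CLAIMS)
if [ "$SRC" -eq "$TGT" ]; then
  echo "TIE-OUT OK: $SRC rows in source and target"
else
  echo "TIE-OUT FAILED: source=$SRC target=$TGT" >&2
  exit 1
fi
```

Exiting non-zero on a mismatch lets a scheduler such as Tidal flag the run as failed automatically.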
Confidential, Westborough, MA
Sr. ETL Informatica Developer
- Involved in gathering of business requirements, interacting with business users and translation of the requirements to ETL High level and Low-level Design.
- Documented both High level and Low-level design documents, Involved in the ETL design and development of Data Model.
- Worked in importing and cleansing of data from various sources like Teradata, Oracle, flat files, SQL Server 2008 with high volume data.
- Developed T-SQL Stored Procedures and Triggers and User defined Data Types and Functions to enforce the business rules.
- Due to memory space issues, replaced look up transformations with stored procedures and bridge tables.
- Worked on improving data warehouse solutions by translating business requirements into robust, scalable, and supportable solutions that work well within the overall system architecture.
- Used Calculations, Variables, Breakpoints, Drill Down, Slice and Dice, and Alerts for creating Business Objects reports.
- Developed complex ETL mappings and worked on the transformations like Source qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and Stored Procedure transformation.
- Extracted, loaded, and transformed data from Oracle to Greenplum using Greenplum utilities such as gpload, external tables, and gpfdist, along with DataStage.
- Created Queries, Query Groups and packages in MDM Hub Console
- Wrote T-SQL statements for retrieval of data.
- Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.
- Worked on loading the data from different sources like Oracle, DB2, EBCDIC files (Created Copy book layouts for the source files), ASCII delimited flat files to Oracle targets and flat files.
- Created Netezza views to be used in MicroStrategy for the semantic layer on data marts.
- Used Netezza Bulk writer and Netezza Bulk reader in Informatica to load and read data from Netezza.
- Responsible for Data mapping testing by writing complex T-SQL Queries using Query Studio
- Developed queries, functions, and stored procedures in Greenplum to make data accessible to Business Analysts and Managers.
- Created Informatica mappings with PL/SQL Procedures/Functions and triggers in T-SQL to build business rules to load data.
- Involved in migration of data from Oracle to Netezza.
- Experience in working with Mapping variables, Mapping parameters, Workflow variables, implementing SQL scripts and Shell scripts in Post-Session, Pre-Session commands in sessions.
- Experience in writing SQL*Loader scripts for preparing the test data in Development, TEST environment and while fixing production bugs.
- Converted Teradata-syntax tables, views, functions, and stored procedures to Greenplum/Postgres syntax using SQL and PL/pgSQL.
- Experience in using the debugger to identify the processing bottlenecks, and performance tuning of Informatica to increase the performance of the workflows.
- Experience in creating ETL deployment groups and ETL Packages for promoting up to higher environments.
Environment: Informatica Power Center 9.5.1/9.1, Oracle 11g/10g/9i, DB2, MS Access, TOAD 9.0, Netezza TwinFin 6.0, UNIX, Teradata
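The SQL*Loader scripting for test-data preparation described above can be sketched as follows. The `stg_claims` table and its columns are hypothetical, and the `sqlldr` invocation is left commented for hosts where the Oracle client is installed.

```shell
#!/bin/sh
# Sketch of preparing a test-data load with SQL*Loader. The control file
# below is illustrative: stg_claims and its columns are placeholders,
# not the project's real schema.
cat > claims_test.ctl <<'EOF'
LOAD DATA
INFILE 'claims_test.dat'
BADFILE 'claims_test.bad'
APPEND
INTO TABLE stg_claims
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( claim_id, member_id, claim_dt DATE 'YYYY-MM-DD', claim_amt )
EOF
echo "control file ready: claims_test.ctl"
# On a host with the Oracle client installed, the load itself would be:
# sqlldr userid=$ORA_USER/$ORA_PWD control=claims_test.ctl log=claims_test.log
```

The BADFILE and log outputs are what get reviewed when a development or TEST load rejects rows.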
Confidential - Piscataway, NJ
ETL Informatica Developer
- Developed a standard framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
- Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
- Analyzed the logical model of the databases and normalizing it when necessary.
- Involved in identification of the fact and dimension tables.
- Extensively used Informatica Power Center for extracting, transforming and loading into different databases.
- Extracted data from various heterogeneous sources like Oracle, Teradata and Flat Files.
- Implemented both Type 1 and Type 2 ('keep history') Slowly Changing Dimensions (SCD) for the data mart.
- Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
- Created UNIX shell scripts using Korn shell, Perl, sed, and awk to automate the loading of data from data files into Oracle tables.
- Created Source and Target Definitions in the repository using Informatica Designer - Source Analyzer and Warehouse Designer.
- Worked extensively with different types of transformations, such as Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, and Sequence Generator.
- Used Mapping Designer to create mappings.
- Replicated operational tables into staging tables, to transform and load data into the enterprise data warehouse using Informatica.
- Created and scheduled worklets and configured email notifications. Set up workflows to schedule the loads at the required frequency using Power Center Workflow Manager, and generated completion messages and status reports using Workflow Manager.
- Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files.
- Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse
- Documented Data Mappings/ Transformations as per the business requirement.
- Performed testing, knowledge transfer and mentored other team members.
Environment: Informatica Power Center 8.6.1, Oracle 10g/9i/8i, Flat Files, DB2, SQL Server 2000, Mainframe DB2, Teradata, TOAD, Business Objects, Quest Central for DB2, Windows 2000, UNIX AIX 5.1, SQL, PL/SQL.
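The Korn/sed/awk load automation above can be sketched with a minimal awk pre-processing step that drops the header row and trims whitespace before a load. The file layout (`ORDER_ID|CUST_ID|AMOUNT`) is invented for illustration.

```shell
#!/bin/sh
# Sketch of awk pre-processing used before loading data files into Oracle:
# drop the header row and trim whitespace around each pipe-delimited field.
cat > orders_raw.dat <<'EOF'
ORDER_ID|CUST_ID|AMOUNT
1001 | C01 | 250.00
1002 | C02 | 17.50
EOF
awk -F'|' 'NR > 1 {
  for (i = 1; i <= NF; i++) gsub(/^ +| +$/, "", $i)
  print $1 "|" $2 "|" $3
}' orders_raw.dat > orders_clean.dat
cat orders_clean.dat
```

The result is a header-free, trimmed `orders_clean.dat` that a loader such as SQL*Loader can consume directly.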
Informatica Developer/Database Developer
- Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
- Identified business rules for data migration and performed data administration through data models and metadata.
- Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
- Responsible for Documentation for ETL development process, Installation and troubleshooting guide.
- Worked with Source Analyzer, warehouse designer, Mapping Designer, Mapplets, and Reusable Transformations.
- Using Informatica Designer, designed mappings that populated data from the source to the staging area to the Sales & Marketing data mart.
- Maintained Development, Test and Production Repositories using Repository Manager Folder copy and repository backup.
- Developed Mapplets using corresponding Sources, Targets and Transformations.
- Executed sessions, sequential and concurrent batches for proper execution of mappings and sent e-mail using server manager.
- Used session partitions, dynamic cache memory, and index cache to improve the performance of Informatica server.
- Assisted in Migrating Repository.
- Created database triggers for Data Security.
- Wrote SQL, PL/SQL codes, stored procedures and packages.
- Optimized SQL queries for better performance.
- Designed Informatica transformations for loading data from various sources like flat files/ OCI/ODBC sources.
- Worked closely with Software Developers to isolate, track, and troubleshoot defects
- Extensively worked on UNIX and shell scripts. Developed UNIX shell scripts to run the pmcmd functionality to start and stop sessions and batches.
- Migrated large volumes of data from legacy systems to Oracle database.
- Extensively used SQL*Loader for data loading.
- Worked on enhancements and functional specifications.
- Performed consolidation, cleansing, integration, and customization of data.
- Optimized Query Performance, Session Performance and Reliability.
- Created complex procedures.
- Database connectivity was done using ODBC.
- Preparation of Unit Test Plans.
- Verification of functional specifications and review of deliverables.
- Complex SQL queries were used for data retrieval.
- Involved in Data Modeling using Erwin.
- Developed Packages, Procedures and function to maintain the Business logic using PL/SQL.
- Created database objects including tables, indexes, views, sequences, and synonyms, and granted roles and privileges to system users.
- Created SQL scripts for deployment of database objects to production.
- Involved in fine-tuning SQL queries to achieve good performance.
Environment: Informatica 8.6/Power Center 9.0/8.6, Power Exchange, HP-UX, Windows NT, Oracle 9i/10g, DB2, Erwin 4.0, SQL, PL/SQL, T-SQL, TOAD, Business Objects, CARS, XML, MDM.
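The pmcmd scripting described above (starting sessions and workflows from UNIX) can be sketched as a small wrapper. The service, domain, folder, and workflow names are placeholders, credentials are expected in the environment, and the actual invocation is left commented since pmcmd is only available on Informatica hosts.

```shell
#!/bin/sh
# Sketch of a shell wrapper around pmcmd for starting a workflow.
# INT_SVC, DOM_DEV, SALES_ETL, and wf_load_sales are placeholder names;
# the command is built as a string so it can be logged before execution.
FOLDER=SALES_ETL          # placeholder repository folder
WORKFLOW=wf_load_sales    # placeholder workflow name
CMD="pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
-u \$INFA_USER -p \$INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "would run: $CMD"
# eval "$CMD"   # uncomment on a host where pmcmd is on PATH
```

The `-wait` flag makes the script block until the workflow finishes, so the wrapper's exit status can drive downstream scheduler dependencies.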