- Over seven years of professional experience as a software developer across all phases of the Software Development Life Cycle (SDLC), including analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design and ETL with Informatica PowerCenter 9.x/8.x/7.x.
- Domain experience spanning financial systems, banking, healthcare information technology, insurance, and retail.
- Provided production support and handled tickets of varying severity.
- Strong data warehousing experience specializing in RDBMS and ETL concepts; performed ETL procedures to load data from different sources into the data warehouse using Informatica PowerCenter Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Extensive experience with ETL tool Informatica in designing and developing complex Mappings, Mapplets, Transformations, Workflows, Worklets, and scheduling the Workflows and sessions.
- Experience in integration of various data sources such as Oracle, SQL Server, MS Excel, and flat files into the staging area.
- Worked with Dimensional Data warehouses in Star and Snowflake Schemas. Designed Fact and Dimension tables.
- Experience implementing Change Data Capture (CDC) techniques such as Slowly Changing Dimensions; created slowly growing target mappings and Type 1/Type 2 dimension mappings.
- Developed complex mappings using a wide range of transformations, including Unconnected/Connected Lookup, Router, XML, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Proficient with Workflow Manager tools such as Task Developer, Workflow Designer, and Worklet Designer; experienced with Informatica command-line utilities such as pmcmd for executing workflows in non-Windows environments.
- Developed several complex mappings, mapplets, and reusable transformations to facilitate one-time, daily, monthly, and yearly loading of data.
- Experience in developing Shell scripts on UNIX and experience with UNIX command line.
- Experienced with coordinating cross-functional teams, project management and presenting technical ideas to diverse groups.
- Worked extensively with Trillium, performing data cleansing in previous projects.
- Strong analytical, problem-solving, organizational, communication, learning and team skills.
Informatica PowerCenter 7.1.3/8.5.1/8.6.0/9.x
Oracle 9i/10g/11g, DB2, SQL Server 2008, Teradata
Star-Schema and Snowflake-Schema Modeling
SQL, PL/SQL, Unix Shell Scripting
Erwin 3.x/4.x, MS Visio.
Windows 98/2000, XP, 2007, UNIX.
Toad 7.6/8.0, SQL Loader, MS Office.
Role: Informatica Developer
Confidential is a financial corporation that provides federal and private student loans. The major part of the project involved the split of Sallie Mae into two companies. The new company, Confidential, would service existing federal student loans, while Sallie Mae, going forward, would no longer provide federal loans and would instead focus on private student loans, credit cards, insurance products, and banking. The project required receiving loans through different systems such as FDR, Upromise, and CLASS. The data is sourced from Oracle and flat files through an ETL process.
- Responsible for gathering requirements through business meetings and engaging with business analyst teams.
- Responsible for on-call activities in the production environment once a month as the secondary contact.
- Coordinated with source systems such as FDR, CLASS, and Upromise to receive files containing loan information.
- Prepared detailed design documents prior to presenting them to upper management for approval.
- Designed ETL processes using Informatica to load data from flat files and Excel files into the target Oracle data warehouse.
- Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier.
- Using offshore resources effectively as a part of the Bandbox project for implementation and testing processes.
- Extensively involved in parameterization of the workflow objects.
- Engaging with testing departments for defect fixes for Unit and System Testing.
- Responsible for tracking and maintaining change requests, defects, and risks.
- Using Workflow Manager to create sessions and schedule them to run at specified times with the required frequency.
- Monitoring and configuring the sessions that are running, scheduled, completed and failed.
Environment: Informatica 9.x, Informatica Power Exchange, Oracle 11g, Toad, Flat Files, UNIX.
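The workflow scheduling and on-call monitoring described above are commonly wrapped in a small shell script around Informatica's pmcmd utility. The sketch below assumes hypothetical service, domain, folder, and workflow names, and PMCMD defaults to a dry-run echo so the script runs even without an Informatica client installed:

```shell
#!/bin/sh
# Hedged sketch: launch an Informatica workflow via pmcmd and propagate
# its exit status for on-call alerting. All names are placeholders.
# Point PMCMD at the real pmcmd binary in production.
PMCMD="${PMCMD:-echo pmcmd}"

start_workflow() {
    # $1 = repository folder, $2 = workflow name
    # -wait blocks until the workflow finishes, so the exit code
    # reflects workflow success/failure.
    $PMCMD startworkflow \
        -sv IS_DEV -d Domain_DEV \
        -u etl_user -p "$INFA_PASSWORD" \
        -f "$1" -wait "$2"
}

start_workflow LOANS wf_load_loans || echo "wf_load_loans failed" >&2
```

In production the password would come from a secured environment variable or pmpasswd-encrypted value rather than plain text.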
Role: ETL Lead
Confidential is a multinational corporation, headquartered in Confidential, that designs, manufactures, and sells networking equipment. The project mainly involved the Confidential Product and Services team, which sought a centralized, web-based Integrated Forecast and Planning (IFP) application that would allow them to prepare their monthly forecast in Hyperion Planning. The monthly forecast incorporates actual data for closed months and forecast data for a rolling five quarters. Actuals data is sourced from the EDW (Teradata), Oracle, and flat files through an ETL process and is loaded into an Essbase cube via flat files for forecasting purposes.
- Responsible for analysis, design, development, testing and implementation.
- Used Informatica V9 and V8.
- Involved in providing technical solutions to business requirements and managing them through to the implementation phase.
- Extensive usage of Informatica Power center for extracting, transforming and loading data from relational sources and non-relational sources.
- Coordinated effectively with the offshore team, helping them understand the system better by providing input from functional and technical perspectives.
- Created and managed daily, weekly and monthly data operations, workflows and scheduling processes.
- Worked extensively on SQL, PL/SQL and UNIX Shell Scripts.
- Extensively involved in parameterization of the workflow objects.
- Identified and resolved performance bottlenecks at various levels, including sources, targets, mappings, and sessions.
- Automated job scheduling using Dollar Universe, which runs on the forecast calendar while maintaining data validations.
- Prepared deployment documents, user guides, coding standards, and other technical documents.
- Supported during QA/UAT/PROD deployments and bug fixes.
- Performed data profiling and data cleansing to remove inconsistent data and improve performance.
- Configured and used the Debugger to troubleshoot mappings.
- Performed performance tuning to increase throughput at both the mapping and session levels.
- Responsible for tracking and maintaining change requests, defects, and risks.
- Used the Dollar Universe ($U) job scheduler to call Informatica workflows, and worked on Trillium.
- Improved mapping performance by identifying source, target, and mapping bottlenecks.
- Produced thorough documentation for all phases: analysis, design, development, testing, and maintenance.
Environment: Informatica 9.x/8.x, Oracle 10g, SQL Server 2005, SQL*Plus, Toad, Flat Files, UNIX, Dollar Universe
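The parameterization of workflow objects mentioned above is typically driven by a parameter file regenerated per run, so mappings pick up run-specific values through $$ mapping parameters. A minimal sketch; the folder, workflow, session, connection, and parameter names are hypothetical:

```shell
#!/bin/sh
# Sketch: write an Informatica parameter file for one scheduled run.
# Section header format is [Folder.WF:workflow.ST:session]; $$-prefixed
# entries are mapping parameters, $-prefixed entries are session
# parameters (e.g. a relational connection). Names are placeholders.
PARAM_FILE="${PARAM_FILE:-/tmp/wf_ifp_forecast.param}"
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[IFP.WF:wf_ifp_forecast.ST:s_m_load_actuals]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SCHEMA=EDW_STG
\$DBConnection_Target=ORA_DWH_DEV
EOF

echo "Wrote parameter file $PARAM_FILE"
# The scheduler then launches the workflow with:
#   pmcmd startworkflow ... -paramfile "$PARAM_FILE" wf_ifp_forecast
```

Regenerating the file each run keeps the workflow objects themselves unchanged between environments.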
Sr. Informatica Developer
Confidential is a large telecommunications carrier. Qwest Communications provides long-distance services and broadband data, as well as voice and video communications, globally. This project involved developing a data warehouse from different data feeds and other operational data sources, building a central database fed from sources such as Oracle, SQL Server, and flat files. Actively involved as an analyst in preparing design documents, and interacted with the data modelers to understand the data model and design the ETL logic.
- Responsible for Business Analysis and Requirements Collection.
- Worked with Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Worked on different tasks in workflows, such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer, and workflow scheduling.
- Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of the repository and folders.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
Environment: Informatica Power Center 8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect / Power Exchange, Data Analyzer 8.1, PL/SQL Dev V7, Oracle 10g/9i, Erwin, SQL Server 2005, UNIX AIX, Toad 9.0.
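Shell scripts for pulling files from a remote server, like those described above, usually feed a command batch to the ftp client. A sketch with placeholder host, credentials, and paths; FTP_CMD defaults to cat so the batch is printed (a dry run) rather than sent to a real server:

```shell
#!/bin/sh
# Sketch: build an ftp command batch to pull flat files into a local
# staging directory. Host, directories, and file pattern are
# placeholders; in production set FTP_CMD='ftp -n remote.host'.
FTP_CMD="${FTP_CMD:-cat}"
REMOTE_DIR="/outbound/loans"
LOCAL_DIR="${LOCAL_DIR:-/tmp/inbound}"
mkdir -p "$LOCAL_DIR"

# The password placeholder is left unexpanded on purpose; a real run
# would substitute a secured credential.
batch=$($FTP_CMD <<EOF
user etl_user \$FTP_PASSWORD
binary
cd $REMOTE_DIR
lcd $LOCAL_DIR
mget loans_*.dat
bye
EOF
)
printf '%s\n' "$batch"
```

A workflow command task would then verify the files landed before the load session starts.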
Sr. Informatica Developer
Confidential is part of a larger Enterprise Data Warehouse (EDW) strategy focused on implementing and improving loan-related business processes and systems. The main objective of the loan performance file extract automated process is to bring all loan-related information into one centralized data warehouse. The intent is to develop an automated load process for loading the flat files using Informatica and batch scripts. Based on documents provided by the users, records were segregated using batch scripts and loaded.
- Extensively Worked on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer.
- Extensively used transformations such as Source Qualifier, Aggregator, Filter, Joiner, Sorter, Lookup, Update Strategy, Router, and Sequence Generator.
- Used transformation-language features such as transformation expressions, constants, system variables, and data-format strings.
- Responsible for data modeling and for populating the business rules, via mappings, into the repository for metadata management. Involved in running loads to the data warehouse and data mart across different environments.
- Created various SQL packages, procedures, functions, and triggers per migration requirements to transform the data for data migration.
- Extensively worked with Workflow Manager and Workflow Monitor to create, schedule, and monitor workflows, worklets, sessions, and tasks.
- Configured incremental aggregation with Aggregator transformations to improve data-loading performance. Performed database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
- Developed mappings to extract data from SQL Server, Oracle, Flat files and load into Data Mart using the Power Center.
- Performed database deployments for various applications and developed the data models.
- Troubleshot databases, workflows, mappings, sources, and targets to identify bottlenecks and improve performance.
- Wrote PL/SQL stored procedures to load data into time dimension table. Wrote Triggers, PL/SQL Procedures, Packages and Shell Scripts to apply and maintain the Business Rules.
- Configured the Informatica server to generate control and data files for loading data into the target database using the SQL*Loader utility.
- Involved in troubleshooting and performance tuning of data loading process.
Environment: Informatica Power Center 8.1.1, XML, Oracle, SQL Server, UNIX, SQL Plus, PL/SQL, Erwin Data Modeling.
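The control and data files generated for SQL*Loader, as described above, follow a standard layout. A sketch with a hypothetical table and columns; SQLLDR defaults to a dry-run echo so no database is needed to exercise it:

```shell
#!/bin/sh
# Sketch: write a SQL*Loader control file for a comma-delimited flat
# file and invoke sqlldr. Table, columns, and file names are
# placeholders; credentials come from the environment in production.
SQLLDR="${SQLLDR:-echo sqlldr}"
CTL="${CTL:-/tmp/load_time_dim.ctl}"

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE 'time_dim.dat'
APPEND INTO TABLE time_dim
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( date_key, calendar_date DATE 'YYYY-MM-DD', fiscal_quarter )
EOF

$SQLLDR userid="$ORA_USER/$ORA_PASS" control="$CTL" log=/tmp/load_time_dim.log
```

The log file produced by sqlldr would then be scanned by the load wrapper for rejected rows.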
Confidential is significantly expanding its Risk capability; consequently, Confidential is required to provide a suitable enterprise-wide risk information environment capable of meeting its future reporting and information requirements. The RIS is focused on providing a platform for risk information collection from various source systems, performing complex calculations (e.g., calculating Risk-Weighted Assets), and information delivery (reporting). A primary focus is a repository of all information required to support compliance.
- Designed ETL processes using Informatica to load data from flat files and Excel files into the target Oracle data warehouse.
- Performed data manipulations using various transformations, such as Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Sequence Generator, and Stored Procedure.
- Used SQL overrides in the Source Qualifier to meet business requirements.
- Wrote pre-session and post-session scripts in mappings. Created sessions and workflows for the designed mappings. Redesigned some existing mappings in the system to meet new functionality.
- Created and used different tasks, such as command and email tasks, for session status notifications.
- Used Workflow Manager to create sessions and scheduled them to run at specified times with the required frequency.
- Monitored and configured running, scheduled, completed, and failed sessions.
- Wrote UNIX shell scripts for the Informatica ETL tool to fire off services and sessions.
Environment: Informatica 8.5, Oracle 9i, Flat files, Excel, SQL Server, PL/SQL, Toad, UNIX Shell Scripting, Windows XP, Cognos
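Pre-session scripts like those mentioned above often simply validate the inbound flat file before the session runs, failing the command task (nonzero exit) so the workflow stops cleanly. A minimal sketch; the file path is a placeholder, and a small sample file is created for demonstration:

```shell
#!/bin/sh
# Sketch of a pre-session command-task script: verify the source flat
# file exists and is non-empty before the session starts. The path is
# hypothetical.
SRC_FILE="${SRC_FILE:-/tmp/risk_positions.dat}"

check_source_file() {
    if [ ! -s "$1" ]; then
        echo "Pre-session check failed: $1 missing or empty" >&2
        return 1
    fi
    rows=$(wc -l < "$1")
    echo "Pre-session check passed: $1 has $rows rows"
}

# Demo run against a small sample file.
printf 'ACCT1,100\nACCT2,250\n' > "$SRC_FILE"
check_source_file "$SRC_FILE"
```

A post-session counterpart would typically archive the file and compare row counts against the session load statistics.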
Confidential is a private-sector Indian bank, headquartered in Confidential. Its network is spread across 700 offices nationwide. The Enterprise Asset Data Mart was designed to help plan for future markets based on analysis of historical data and the operational data held by the bank. It helped with budgeting and variance analysis of assets/liabilities and income/expenses across branches located throughout the globe.
- Played a major role in understanding the business requirements and in designing and loading the data into the data warehouse (ETL).
- Created new mappings and updated old mappings according to changes in business logic.
- Used Informatica ETL to load data from sources such as Oracle databases and flat files.
- Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the flow of data from source systems to the data warehouse.
- Implemented Slowly Changing Dimensions (Type I and Type II) in different mappings as per the requirements.
- Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
- Used the Debugger to troubleshoot mappings.
- Developed mappings in Informatica to load data from various sources into the data warehouse, using transformations such as Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
- Created various Mapplets in designer using Informatica PowerCenter Mapplet Designer.
- Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
- Involved in Unit and System testing of developed mapping.
- Optimizing Query Performance, Session Performance.
- Defined Target Load Order Plan for loading data into Target Tables.
- Involved in creating Desktop Intelligence and Web Intelligence reports.
Environment: Informatica PowerCenter 6.2, Windows NT, PL/SQL, Excel, SQL Server 7.0, Erwin, Business Objects 6.5.1
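The Type II slowly-changing-dimension pattern above expires the current row before inserting a new version. A sketch of the expiry step as SQL that a post-load script might submit; the table and column names are hypothetical, and SQLPLUS defaults to cat so the statement is printed rather than executed:

```shell
#!/bin/sh
# Sketch: SCD Type II expiry - close out the current dimension row for
# any branch whose attributes changed in staging. Names are
# placeholders; in production e.g. SQLPLUS='sqlplus -s etl_user/...'.
SQLPLUS="${SQLPLUS:-cat}"

sql=$($SQLPLUS <<'EOF'
UPDATE dim_branch d
   SET d.effective_end_date = SYSDATE,
       d.current_flag       = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_branch s
                WHERE s.branch_id = d.branch_id
                  AND s.branch_name <> d.branch_name);
-- a following INSERT adds the new current ('Y') row per changed branch
EOF
)
printf '%s\n' "$sql"
```

In the Informatica mappings themselves, the same logic is expressed with a Lookup against the dimension plus an Update Strategy deciding insert versus update.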
Informatica ETL Developer
- Analyzed the business requirements and framed the business logic for the ETL process.
- Extensively used ETL to load data from both fixed-width and delimited flat files.
- Validated data and generated error reports with error messages.
- Worked extensively on different types of transformations, such as Normalizer, Expression, Union, Filter, Aggregator, Update Strategy, Lookup, Stored Procedure, Sequence Generator, and Joiner.
- Designed and Developed complex mappings, reusable Transformations for ETL using Informatica Power Center 7.1/6.2
- Developed and tested all Informatica mappings involving complex Router, Lookup, and Update Strategy logic.
- Created workflows and sessions to run the mappings.
- Implemented variables and parameters in the mappings.
- Set up batches and sessions to schedule loads at the required frequency using PowerCenter Workflow Manager.
- Generated completion messages and status reports using Workflow Manager. Worked on dimension as well as fact tables; developed mappings and loaded data into the relational database.
- Extensively worked in performance tuning of programs, ETL procedures and processes. Also used debugger to troubleshoot logical errors.
- Involved in writing ETL specifications and unit test plans for the mappings. Performed Developer testing, Functional testing, Unit testing for the Informatica mappings.
Environment: Informatica PowerCenter 7.1/6.2, Oracle, SQL, SQL Plus, PL/SQL, DB2, Windows NT, UNIX.
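Fixed-width flat files like those handled above can also be pre-processed into a delimited layout for staging. A sketch assuming a hypothetical record layout (columns 1-5 id, 6-25 name, 26-33 amount); a sample input is generated so the script is self-contained:

```shell
#!/bin/sh
# Sketch: cut a fixed-width flat file into pipe-delimited records that
# a delimited source definition can read. The record layout is
# hypothetical.
IN="${IN:-/tmp/fixed.dat}"
OUT="${OUT:-/tmp/delim.dat}"

# Build a two-record sample in the assumed layout (5 + 20 + 8 chars).
printf '%-5s%-20s%8s\n' 00001 "Alice Smith" 00012500 >  "$IN"
printf '%-5s%-20s%8s\n' 00002 "Bob Jones"   00009900 >> "$IN"

awk '{
    id   = substr($0, 1, 5)
    name = substr($0, 6, 20)
    amt  = substr($0, 26, 8)
    gsub(/ +$/, "", name)           # trim trailing pad spaces
    printf "%s|%s|%s\n", id, name, amt
}' "$IN" > "$OUT"
```

In practice the same layout would instead be declared in the fixed-width source definition; a cut like this is useful when one delimited definition must serve both file styles.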