- Over 7 years of experience in the IT industry with a strong background in software development, including 6+ years developing and testing Business Intelligence solutions for data warehousing and decision support systems using the ETL tool Informatica PowerCenter 9.0.1/8.6.1/7.1.4/6.2.
- Excellent understanding of the full software development life cycle (SDLC) of the ETL process, including requirement analysis, design, development, testing support, and migration to production.
- Experience in various domains such as Healthcare, Finance, Insurance, Government, and Banking.
- Extensively worked with large Databases in Development, Testing and Production environments.
- Experience with advanced ETL techniques including staging, reusability, data validation, change data capture (CDC), and real-time processing.
- Experience with code migration, data migration, and extraction/transformation/loading using Informatica PowerCenter and PowerExchange with Oracle, SQL Server, Teradata, XML, flat files, and COBOL on UNIX and Windows NT/2000/9x.
- Strong in data warehousing methodologies of Star/Snowflake schemas per Ralph Kimball and Bill Inmon.
- Experience in developing Informatica Mappings, Sessions, Worklets and Tasks.
- Experience working with deployment groups.
- Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and XML Source Qualifier.
- Expertise in implementing complex business rules using different transformations, mapplets, and worklets.
- Experience with code migration between repositories and folders.
- Hands-on experience in identifying and resolving performance bottlenecks at various levels, such as sources, targets, mappings, and sessions.
- Expertise in developing SQL and PL/SQL Scripts through various procedures, functions, and packages to implement the business logic.
- Well versed with different SQL Development Tools like TOAD & PL/SQL Developer.
- Good experience writing shell scripts for Informatica pre- and post-session operations.
- Experience in logical/physical data models and Forward/Reverse Engineering using ERwin.
- Experience in team management; a good team player.
- Good experience in ETL technical documentation.
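The change data capture (CDC) technique listed above can be sketched outside Informatica as a key-based comparison of source rows against the target. This is a minimal illustration only; the row data and key names are made up, not any client schema.

```python
# Minimal CDC sketch: classify incoming rows as inserts or updates
# by comparing them against the current target rows on a key column.
# The rows and the "id"/"name" fields are hypothetical examples.

def detect_changes(source_rows, target_rows, key="id"):
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)       # new key: insert
        elif existing != row:
            updates.append(row)       # key exists but data changed: update
    return inserts, updates

source = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"},
          {"id": 3, "name": "Cam"}]
target = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Rob"}]
ins, upd = detect_changes(source, target)
```

In a real mapping this comparison is done by a Lookup transformation plus an Update Strategy; the sketch only shows the underlying idea.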
ETL Tools: Informatica PowerCenter 9.0.1/8.x/7.x
Databases: SQL Server 2005/2008
Reporting Tools: Business Objects 6.5/XI R2, OBIEE
Operating Systems: Windows 95/98/2000/2003 Server/NT Server Workstation 4.0, UNIX
Programming: Visual Basic 6.0/5.0, PowerBuilder 6.0, C, SQL, PL/SQL, Java, HTML, XML, DHTML
Other Tools: SQL*Loader, SQL*Plus, TOAD 8.0, MS Visio
Scripting Languages: SQL, PL/SQL, UNIX Shell Scripting
Methodologies: E-R Modeling, Star Schema, Snowflake Schema
Data Modeling Tool: Erwin 3.5/4.1
Confidential, FL SEP 2011 - Present
Description: Adecco USA is a leader in recruiting and workforce solutions, with more than 900 offices in North America serving a range of clients through an integrated suite of workforce solutions. Adecco USA is made up of several specialty divisions that align with the unique needs of its clients. The main job was to migrate line-of-business data into Adecco's centralized front-office systems.
- Gathered requirements from end users and was involved in analysis of source systems, business requirements, and identification of business rules.
- Designed and implemented Informatica mappings to migrate data from various legacy applications/acquisition offices to a centralized application.
- Responsible for analysis, design, and implementation of various data marts for back office (Financial, Payroll, Benefits, and HR modules) using data modeling techniques and Informatica 9.0.1.
- Used ConnectWise software to open and track tickets.
- Used Elite software to maintain the change control approval process.
- Deployed Informatica objects and Business Objects universes to TEST, UAT, and Production environments.
- Involved in development of Informatica Mappings, Mapplets and Workflows for complex Business rules.
- Used various transformations such as Expression, Lookup, Filter, Sequence Generator, Joiner, and Sorter to handle situations depending on the requirement.
- Called stored procedures to perform database operations in pre-session and post-session commands.
- Implemented Type 2 slowly changing dimensions to keep track of historical data.
- Created parameter files for session parameters and referenced them in the sessions.
- Created shortcuts to reuse objects across folders without creating multiple objects in the repository.
- Performed Unit Testing and verified the data using Informatica Debugger break points.
- Tuned mappings and sessions for better performance on the data loads.
- Performed Error handling on sessions in Workflow Manager.
Environment: Informatica PowerCenter (Designer 9.0.1, Repository Manager 9.0.1, Workflow Manager 9.0.1), Business Objects XI/6.x, Oracle 11g/10g, PL/SQL, SQL*Plus, SQL Server 2008/2005, Flat files, XML, TOAD, UNIX, Erwin 4.0, Shell Scripting.
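The Type 2 slowly changing dimension handling mentioned above (expire the current row, insert a new current version) can be sketched in plain Python. The column names (natural_key, eff_date, end_date, is_current) and the sample data are illustrative assumptions, not the actual dimension schema.

```python
from datetime import date

# Type 2 SCD sketch: when a tracked attribute changes, close out the
# current dimension row and append a new row flagged as current.
# Column names and sample rows are hypothetical illustrations.

def apply_scd2(dimension, incoming, load_date):
    for new_row in incoming:
        current = next(
            (r for r in dimension
             if r["natural_key"] == new_row["natural_key"] and r["is_current"]),
            None,
        )
        if current is None:
            dimension.append({**new_row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
        elif current["attrs"] != new_row["attrs"]:
            current["end_date"] = load_date     # expire the old version
            current["is_current"] = False
            dimension.append({**new_row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
    return dimension

dim = [{"natural_key": "C100", "attrs": {"city": "Tampa"},
        "eff_date": date(2010, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim, [{"natural_key": "C100", "attrs": {"city": "Miami"}}],
           date(2011, 9, 1))
```

In PowerCenter the same outcome is produced by a Lookup on the dimension plus an Update Strategy routing DD_UPDATE/DD_INSERT rows.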
Confidential, Wayne, NJ JAN 2010 - Aug 2011
Description: The healthcare business of Bayer Pharmaceuticals is the leading provider of decision support solutions that help organizations across the healthcare industry improve clinical and business performance. The Bayer enterprise data warehouse consists of more than 5 years of data, derived from different data marts with different formats. The data warehouse was built on an Oracle database. The sources for this warehouse were SQL Server, Oracle, XML, and flat files, loaded into the Oracle data warehouse with Informatica.
- Worked with business analysts and the DBA on requirements gathering, business analysis, and design, and participated in document review meetings.
- Worked with the team through the entire ETL process and the development of data marts using Informatica PowerCenter 8.6.1.
- Implemented Slowly Changing Dimensions methodology to keep track of historical data.
- Designed and developed Star Schema and created Fact and Dimension Tables for the Warehouse using Erwin and Business Intelligence Reporting Applications using OBIEE.
- Extracted data from Oracle 11g/10g, XML, and flat files and loaded it into the Oracle data warehouse.
- Worked on Informatica PowerCenter tools - Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, Mapplet Designer, and Transformation Developer.
- Used Informatica Designer to create mappings using different transformations to move data to a Data Warehouse. Developed complex mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, and Rank.
- Responsible for creating different sessions and workflows to load the data to Data Warehouse using Informatica Workflow Manager.
- Created reusable transformations and mapplets based on the business rules to ease the development process, and was responsible for documenting the changes.
- Maintained and modified mappings as per changing reporting requirements.
- Extensively worked on the performance tuning of the Mappings as well as the sessions.
- Checked session and error logs to troubleshoot problems and used the Debugger for troubleshooting complex problems.
- Developed Informatica mappings/sessions/workflows for ETL processing and used UNIX shell scripts for smooth application interfacing.
- Involved in identifying the bottlenecks in Sources, Targets & Mappings and accordingly optimized them.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
- Used stored procedures, functions, and triggers at the database level and imported them into Informatica for ETL.
- Extensively used SQL*Loader for bulk loading from flat files into Oracle tables.
- Developed Segment Value Detail, Work Order Detail, and OGA repositories (RPD) by importing metadata into the Physical Layer, applied business rules in the Business Model and Mapping layer, and created customized user views in the Presentation layer.
- Involved in Unit and System testing of developed mappings.
- Responsible for migrating the folders or mappings and sessions from development to production environment.
- Wrote detailed design documentation describing program development, logic, coding, testing, changes, and corrections.
Environment: Informatica PowerCenter (Designer 8.6.1, Repository Manager 8.6.1, Workflow Manager 8.6.1), OBIEE, Oracle 11g/10g, PL/SQL, SQL*Plus, SQL*Loader, SQL Server 2008, Flat files, XML, TOAD, UNIX, Erwin 4.0, Shell Scripting.
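A SQL*Loader bulk load like the one above is driven by a control file plus an sqlldr command line. The sketch below only assembles that text; the table, file names, and credentials are hypothetical, and it does not actually invoke sqlldr.

```python
# Sketch: assemble a minimal SQL*Loader control file and command line.
# STG_CLAIMS, the column list, file names, and the userid are all
# hypothetical illustrations; nothing here connects to a database.

def build_ctl(table, columns, datafile):
    cols = ", ".join(columns)
    return (
        "LOAD DATA\n"
        f"INFILE '{datafile}'\n"
        f"APPEND INTO TABLE {table}\n"
        "FIELDS TERMINATED BY ','\n"
        f"({cols})\n"
    )

def build_cmd(ctlfile, logfile, userid="scott/tiger"):
    # A real script would pass this list to the shell / subprocess.
    return ["sqlldr", f"userid={userid}",
            f"control={ctlfile}", f"log={logfile}"]

ctl = build_ctl("STG_CLAIMS", ["claim_id", "member_id", "paid_amt"],
                "claims.dat")
cmd = build_cmd("claims.ctl", "claims.log")
```

Generating control files this way keeps flat-file loads repeatable when the same staging pattern is reused across feeds.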
Confidential, NYC, NY SEP 08 - DEC 09
Sr. Informatica Developer
Description: Confidential is one of the top financial banks in the USA. The work was in a team environment with 3 Oracle developers, 1 Informatica admin, and a team lead. As an Informatica Developer, maintained the existing projects (scheduling, monitoring, and tuning them), assisted other teams in their work, and configured Email tasks for sending success or failure mails. Effectively utilized programming skills to adhere to coding standards, procedures, and techniques while contributing to the technical code base, including any required documentation. The primary source was the EDW.
- Gathered requirements from end users and was involved in analysis of source systems, business requirements, and identification of business rules.
- Involved in the Data Warehouse Data modeling based on the client requirement.
- Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
- Extracted source data from flat files, Oracle, and SQL Server and loaded it into Oracle.
- Responsible for creating system design and detail design documents based on the requirement document provided by the business users.
- Provided strategic thinking and leadership on new ways of leveraging information to improve business processes.
- Performed database design, data analysis, development, SQL performance tuning, data warehousing ETL processes, and data conversions.
- Based on the logic, developed various mappings and mapplets to load data from various sources using different transformations like Source Qualifier, Expression, Filter, Normalizer, Router, Update Strategy, Sorter, Lookup, Aggregator, and Joiner in the mapping. Also developed error processing to capture error records and load them into a Message Log table.
- Worked with Stored Procedure Transformation for time zone conversions.
- Created UNIX scripts to automate the activities like start, stop, and abort the Informatica workflows by using PMCMD command in it.
- Fine-tuned the mappings by analyzing data flow and worked with memory cache (static and dynamic) for better throughput of sessions containing Lookup, Joiner, and Aggregator transformations.
- Provided production support including error handling and validation of mappings.
- Addressed and tracked requests for system enhancements and improvements from end users/customers, and resolved production issues.
- Extensively used the Debugger to modify data and apply breakpoints while a session is running.
- Used various Informatica Error handling techniques to debug failed session.
- Created Test cases for Unit Test, System Integration Test and UAT to check the data.
- Responsible for migrating the folders or mappings and sessions from development to test environment and Created Migration Documents to move the code from one Environment to other Environment.
- Proactive team player with the demonstrated ability to multi-task and prioritize in a fast-paced professional environment.
- Maintained effective communication with non-technical client personnel and handled change requests.
Environment: Informatica Power Center 8.1.1, Oracle 10g , SQL Server 2000, Flat Files, CSV files, PL/SQL(Stored Procedure, Trigger, Packages), Erwin, Tidal, MS Visio, SQL Developer, SQL*Plus, TOAD, Windows 2003/2007, UNIX AIX 5.3.
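The UNIX automation described above (start/stop/abort workflows via pmcmd) amounts to composing a pmcmd command line per action. This sketch only builds the argument list; the domain, service, folder, workflow, and user names are hypothetical, and a real script would hand the list to the shell.

```python
# Sketch: compose pmcmd command lines for workflow start/stop/abort.
# Service, domain, user, folder, and workflow names are hypothetical.
# A real wrapper would run the result with subprocess.run(...).

ACTIONS = {"start": "startworkflow",
           "stop": "stopworkflow",
           "abort": "abortworkflow"}

def pmcmd_args(action, folder, workflow,
               service="Intg_Svc", domain="Domain_ETL", user="etl_user"):
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    return ["pmcmd", ACTIONS[action],
            "-sv", service, "-d", domain, "-u", user,
            "-f", folder, "-wait", workflow]

args = pmcmd_args("start", "FIN_DM", "wf_load_gl")
```

Keeping the argument construction in one place makes the nightly scripts uniform and lets the scheduler treat start/stop/abort identically.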
Confidential, NJ JULY 07 - AUG 08
Sr. Informatica Developer
Description: Confidential is one of the largest insurance giants in the US and provides property, casualty, and specialty insurance to individuals and businesses around the world. Claim PATH is a reporting business intelligence application for managing claim information. The application holds data from multiple US claim sources, with additional data sources planned for the same application. This process included design, development, testing, and deployment. ETL was used to load source data into the data marts. OLAP was aimed at the customized selection of providers for clients and electronic monitoring of performance such as insurance status, available benefits, and claim amounts paid. In this process we developed two data marts: policy sales and policy claims. The data warehouse was to give the managers of Chubb the slice-and-dice capability to analyze their data.
- Involved in translating business requirements to integrate into existing Data mart design.
- Developed ETL jobs to extract information from Enterprise Data Warehouse.
- Extensively used ETL to load data from different relational databases, XML and flatfiles.
- Used Informatica Repository Manager to create repositories and users and to grant permissions to users.
- Debugged the mappings extensively, hard-coding test data IDs to test the logic instance by instance.
- Applied various transformations (Lookup, Joiner, Expression, Router, Update Strategy) and aggregation and ranking routines to the data to be stored in the application reporting mart.
- Handled the migration process across the Development, Test, and Production environments.
- Implemented Type 2 slowly changing dimensions to maintain dimension history and Tuned the Mappings for Optimum Performance.
- Used Informatica Designer to design mappings and coded them using reusable mapplets.
- Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
- Involved in unit testing and documenting the jobs and workflows. Set Standards for Naming Conventions and Best Practices for Informatica Mapping Development.
- Used database objects like Sequence generators and Stored Procedures for accomplishing the Complex logical situations.
- Created various UNIX shellscripts for Job automation of data loads.
- Worked on all phases of SDLC from requirement, design, and development and testing.
- Created mappings that implement error-handling logic to generate error/OK flags and an error message depending on the source data and the lookup outputs.
- Extensively involved in the analysis and tuning of the application code (SQL).
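The error-handling mappings above derive a flag and a message per row from the source data and lookup results. A plain-Python equivalent of that row-level rule might look like the following; the field names and validation rules are illustrative, not the actual claim schema.

```python
# Row-level error-handling sketch: set an ERR/OK flag and a message
# from source values and a lookup result, as an Expression
# transformation would after a Lookup. Field names are illustrative.

def validate_row(row, policy_lookup):
    errors = []
    if not row.get("policy_id"):
        errors.append("missing policy_id")
    elif row["policy_id"] not in policy_lookup:
        errors.append("policy_id not found in lookup")
    if row.get("claim_amt", 0) < 0:
        errors.append("negative claim amount")
    flag = "ERR" if errors else "OK"
    return flag, "; ".join(errors)

lookup = {"P1", "P2"}                      # keys present in the dimension
flag, msg = validate_row({"policy_id": "P9", "claim_amt": -5}, lookup)
```

Rows flagged ERR would be routed by a Router transformation to the error/message-log table rather than the reporting mart.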
Confidential, Jacksonville, FL MAY06 - JUNE 07
Sr. Informatica Developer
Description: Confidential of Florida (BCBSF) is a leader in Florida's health industry. It offers a broad choice of affordable, health-related products and services, such as health care insurance (BlueOptions), Preferred Provider Organization (PPO) products, Health Maintenance Organization (HMO) products, commercial Medicare products, and health savings and related accounts. This project involved managing healthcare and life insurance products and services. The system gives full information regarding benefits and plans offered by the company. The project included developing a data warehouse from different data feeds and other operational data, building a central database fed from different sources like Oracle, SQL Server, and flat files. Actively involved as an analyst in preparing design documents and interacted with the data modelers to understand the data model and design the ETL logic. Reports were generated using OBIEE.
- Responsible for the design, development, and documentation of the ETL (Extract, Transformation & Load) strategy to populate the data warehouse from the various source systems.
- Extensively used Informatica PowerCenter for extracting, transforming, and loading data from relational and non-relational sources.
- Developed Informatica mappings, reusable transformations, reusable mappings, and mapplets for data loads to the data warehouse.
- Developed mappings using Designer to extract and transform data according to the requirements and loaded it into the database, with large volumes of data in terabytes.
- Worked on ANSI accredited NCPDP standard formats and drug insurance claims for business processing.
- Developed various mapping and tuning using Oracle/PL SQL and SQL*Plus in the ETL process.
- Extensively used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner in migrating data from various heterogeneous sources like Oracle, OWB, DB2, XML, and flat files to Oracle.
- Handled Type 2 slowly changing dimensions to populate current and historical data to dimension and fact tables in the data warehouse. Based on the logic, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Stored Procedure in the mapping.
- Created sessions, database connections, and batches using Informatica Workflow Manager.
- Developed mappings to load data in slowly changing dimension.
- Created connected and unconnected Lookup transformations to look up data from source and target tables.
- Scheduled sessions and batches on the Informatica server using Informatica Workflow Manager.
- Migrated Informatica objects and database objects to the Integration environment and scheduled them using Control-M.
- Created complex PL/SQL stored procedures and functions.
- Monitored the ETL jobs/schedules and fixed bugs.
- Worked along with the DBA to resolve performance and tuning issues.
- Provided reliable, timely support of integration, performance and user acceptance testing processes.
- Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
- Involved in unit testing, integration testing, and system testing.
- Identified and created various classes and objects for report development.
- Set up batches for large volumes of data and created sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
- Designed complex UNIX scripts and automated them to run the workflows daily, weekly, and monthly.
- Provided production support to the system in a 24/7 environment.
Environment: Informatica PowerCenter 8.0, Oracle 10g/9i, DB2, XML, Flat files, SQL, PL/SQL, TOAD, Perl, SQL*Plus, Control-M, Windows, UNIX.
Confidential, TAMPA, FL JUN 05 - APR 06
Description: Confidential Wireless operates the nation's most reliable and largest wireless voice and data network, including the largest 3G broadband network. The scope of the project was to develop operational and analytical reports for each module, with the ETL part handled using Informatica PowerCenter. Information capture is the process of acquiring various types of information, including digitized images of hard copy documents and XML data. It also involves extracting data from these images and XML documents, converting data to different types, and exporting images.
- Worked with Power Center Designer tools in developing mappings and mapplets to extract and load the data from flat files, XML files and Oracle (source) to Oracle (target).
- Prepared software requirement specifications through interaction with business representatives and designed Star schema, logical and physical database designs.
- Created reusable mapplets transformations to load data from operational data source to Data Warehouse and involved in Capacity Planning and Storage of data.
- Developed complex mappings such as Slowly Changing Dimensions Type II (time stamping) in the Mapping Designer.
- Used Informatica Workflow Manager to create Workflows, database connections, sessions, and batches to run the mappings.
- Used transformations such as Expression, Filter, Joiner, Aggregator, Lookup, Update Strategy, Sequence Generator, Router, and XML to load consistent data into the database.
- Monitoring the workflow performance and the status with Workflow Monitor.
- Worked on Repository manager to create and manage user profiles.
- Built mapping variables/parameters and created parameter files to allow flexible runs of sessions/mappings based on changing variable values.
- Developed various command tasks to automate pre-session jobs. Did performance tuning to improve the load. Wrote complex SQL queries involving multiple tables with joins.
- Involved in writing of Triggers, Functions, and Packages.
- Setting up sessions to schedule the loads at required frequency using Power Center Workflow manager, PMCMD.
- Converted SQL procedures and SQL*Loader scripts to Informatica mappings.
- Used target load ordering with stored procedures to update the database.
- Used variables and parameters in the mappings to pass values between sessions.
- Performed Unit Testing and Involved in tuning the Session and Workflows for better Performance.
Environment: Informatica 7.1.4, Informatica Workflow Manager 7.1.4, Flat Files, XML files, Oracle 10g, Windows XP.
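The mapping variables and parameter files used above follow a simple sectioned format ([Folder.WF:workflow.ST:session] headings over name=value lines). This sketch parses such a file into a dict; the folder, workflow, session, and parameter names are made up for illustration.

```python
# Sketch: parse an Informatica-style parameter file into a dict of
# {section: {parameter: value}}. The section and parameter names in
# the sample are hypothetical, not a real repository's contents.

def parse_param_file(text):
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # new [Folder.WF:...] heading
            params[section] = {}
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params

sample = """\
[FIN_DM.WF:wf_load_gl.ST:s_m_load_gl]
$$LOAD_DATE=2011-09-30
$DBConnection_Src=Ora_Stage
"""
parsed = parse_param_file(sample)
```

Parsing the file this way is handy in wrapper scripts that need to validate or log session parameters before the workflow runs.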
Confidential, India AUG 04 - APR 05
Description: Developed a Content Release Management System to stage and manage the release of content and create revision packages based on changes from various authoring input source systems to the Content Delivery Manager system. The application was built on Model-View-Controller architecture using the Struts framework in a J2EE environment. Content management happens via an XML-based content management tool. Bi-directional data from various source systems is integrated into the application with the IBM WBI Application Integration tool using web services and SOAP messages for 2-way communication.
- Developed, tested stored procedures, functions and packages in PL/SQL for Data ETL.
- Used ETL tool Informatica Power Center 6.2 to create maps to transform data.
- Created various active and passive transformations like Source Qualifier, Lookup, Router, Stored Procedure, Aggregator, Filter, Joiner, and Expression, and standard and reusable mappings using Informatica.
- Loaded operational data from heterogeneous sources into various data marts.
- Unit and integration tested Informatica sessions, batches, and workflows.
- Documented the mappings and the transformations involved in ETL process
- Involved in gathering business requirements, data sourcing and data transformation, data loading, SQL and performance tuning
- Involved in writing UNIX scripts and used them to automate the scheduling process
- Developed scripts to load the data from source to staging and from the staging area to target tables using different load utilities like BTEQ and MultiLoad.
- Used mapping Wizards in creating slowly growing dimensions and slowly changing dimensions
- Enhanced session performance, and improved response times, through extensive performance tuning of the mappings, ETL Procedures and processes
- Responsible for querying data from different database tables as per the requirement.
Environment: Informatica Power Center V6.2, Windows NT, Oracle 9i, UNIX scripts, PL/SQL.