Experience Summary
- Overall 5 years of experience providing Business Intelligence solutions in data warehousing / decision support systems, with experience in the Financial, Retail, Healthcare, and Telecommunications industries, using ETL (Extraction, Transformation and Loading) with Informatica Power Center (8.6/8.5/8.1/7.1.2/6.1).
- Experienced in star-schema data modeling, normalization, data profiling, data cleansing, and the logical and physical data model design process.
- Broad experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
- Extensive work experience in ETL analysis, design, development, testing, and implementation in Informatica 7 and 8, including performance tuning.
- Proficient in the development of ETL processes, with a good understanding of source-to-target data mapping and the ability to define and capture metadata and business rules.
- Experienced in performance tuning and debugging of existing ETL processes.
- Experience in developing UNIX shell scripts for automation of ETL processes, and in writing test plans, test strategies, and test cases for data warehousing projects to ensure the data meets business requirements.
- Sound knowledge of RDBMS concepts, with hands-on exposure to development in relational database environments using SQL, PL/SQL, stored procedures, and triggers.
- Experience in UNIX shell scripting and in job scheduling on multiple platforms, including Windows NT/2000 and Linux.
- Extensively developed real-time and near-real-time BI systems using the Informatica ETL tool, Teradata utilities, and OLAP reporting tools.
- Developed scheduling charts and scheduled shell scripts and Informatica ETL mappings.
- Provided integration and post-production support for ETL and reporting systems.
Operating Systems : Windows NT/XP/2000, UNIX (HP-UX, Linux).
ETL Tools : Informatica Power Center 8.6/8.5/7.1.2/6.1.
Databases : Oracle 9i/10g/11g, Teradata (load utilities).
Languages : PL/SQL, SQL and UNIX Shell Scripting.
B.I Tools : COGNOS 8.x/7.x series, PowerPlay and Transformer
Software/Tools : TOAD, Erwin, ROBOT Scheduler
Bachelor's in Computer Science, Sri Venkateswara University, India, 2007
Confidential, Midland, TX Jan 2012 – Present
Confidential is one of the world's leading integrated energy companies. They explore for, produce and transport crude oil and natural gas; refine, market and distribute transportation fuels and lubricants; manufacture and sell petrochemical products; generate power and produce geothermal energy; provide energy efficiency solutions; and develop the energy resources of the future.
Responsibilities:
- Extracted data from flat files and loaded it into staging tables.
- Developed many Informatica mappings using transformations such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router, and Joiner.
- Developed Informatica mappings using Slowly Changing Dimensions.
- Extensively worked on performance tuning of mappings and sessions.
- Provided production support, scheduling jobs with the ROBOT scheduler.
- Performed manual file transfers with WinSCP, moving flat files between locations over FTP.
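The flat-file staging step described above can be sketched as a small shell script; the directory and file names here are hypothetical stand-ins for the actual inbound and Informatica source-file locations.

```shell
#!/bin/sh
# Sketch of staging delivered flat files for an Informatica load.
# INBOUND and SRCDIR are hypothetical paths, not the real environment.

INBOUND=./inbound          # hypothetical landing directory for delivered extracts
SRCDIR=./infa_srcfiles     # hypothetical Informatica source-file directory

mkdir -p "$INBOUND" "$SRCDIR"

# Simulate a delivered extract so the sketch is self-contained.
printf 'ORDER_ID|AMOUNT\n101|250.00\n' > "$INBOUND/orders_20120115.dat"

for f in "$INBOUND"/*.dat; do
    # Skip zero-byte files rather than staging an empty extract.
    [ -s "$f" ] || { echo "skipping empty file: $f"; continue; }
    mv "$f" "$SRCDIR/"
    echo "staged: $(basename "$f")"
done
```

In practice the delivery itself was done over FTP (via WinSCP); this sketch only covers the local validate-and-stage step before the ETL run starts.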
Environment: Informatica Power Center 8.6.1 (Designer, Workflow Manager, Workflow Monitor), Oracle 11g, Toad 8.6, ROBOT Scheduler, DAC, WinSCP
Confidential, Arlington, VA July 2011 – Dec 2011
Confidential is a mortgage company. It guarantees only securities backed by single-family and multifamily loans insured by government agencies; Ginnie Mae does not buy or sell loans or issue mortgage-backed securities (MBS).
Responsibilities:
- Reviewed code, designs, and test plans as appropriate throughout the project lifecycle.
- Extracted data from Excel files and loaded it into staging tables.
- Used various Informatica transformations, such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router, and Joiner, to develop mappings.
- Involved in the analysis phase of the business requirements and in the design of the Informatica mappings using low-level documents.
- Developed Informatica mappings using Type 2 Slowly Changing Dimensions.
- Created mappings with connected, unconnected, and dynamic lookups, using caches such as the persistent cache.
- Extensively worked on performance tuning of mappings and sessions.
- Analyzed session log files when a session failed, to resolve errors in the mapping or session configuration.
- Involved in designing the ETL unit testing.
- Used Bugzilla to track and resolve bugs.
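The Type 2 Slowly Changing Dimension pattern mentioned above (expire the current row, insert a new current version) can be illustrated with a minimal, self-contained sketch. Here sqlite3 stands in for the Oracle target, and the table and column names are hypothetical, not the project's actual schema.

```shell
#!/bin/sh
# Illustrative SCD Type 2 update: on a changed attribute, close out the
# current dimension row and insert a new row flagged as current.
# sqlite3 is used only so the sketch runs without an Oracle instance.

DB=./scd_demo.db
rm -f "$DB"

sqlite3 "$DB" <<'EOF'
CREATE TABLE dim_customer (
    cust_key   INTEGER PRIMARY KEY,   -- surrogate key
    cust_id    TEXT,                  -- natural key
    city       TEXT,                  -- tracked attribute
    eff_date   TEXT,
    end_date   TEXT,
    curr_flag  TEXT
);

-- Existing current row for customer C100.
INSERT INTO dim_customer VALUES (1,'C100','Arlington','2011-01-01','9999-12-31','Y');

-- Incoming change: C100 moved to Richmond. Expire the current row...
UPDATE dim_customer
   SET end_date = '2011-09-30', curr_flag = 'N'
 WHERE cust_id = 'C100' AND curr_flag = 'Y';

-- ...and insert the new current version, preserving history.
INSERT INTO dim_customer VALUES (2,'C100','Richmond','2011-10-01','9999-12-31','Y');

SELECT cust_id, city, curr_flag FROM dim_customer ORDER BY cust_key;
EOF
```

In the actual mappings this compare/expire/insert logic was built with lookup and Update Strategy transformations rather than hand-written SQL; the sketch only shows the target-side effect of a Type 2 change.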
Environment: Informatica Power Center 9.1(Designer, Workflow Manager, Workflow Monitor), Oracle 11g, SQL, Toad 8.6, Flat files, Bugzilla.
Confidential, Richmond, VA May 2009 – July 2011
Confidential sets monetary policy, supervises and regulates member financial institutions, and provides an array of financial services. It covers a broad spectrum, including monetary policy, macroeconomics, banking, financial institutions and markets, payments systems, national and regional economic conditions, and supporting functions.
Responsibilities:
- Analyzed business requirements, performed source system analysis, and prepared the technical design document and the source-to-target data mapping document.
- Worked with the data modeler on logical and physical model designs.
- Worked with the DBA to set up development, test, stage, and production environments.
- Performed impact analysis for systems and database modifications.
- Coordinated development efforts with the offshore team.
- Developed ETL processes to populate the Enterprise Product Orders data mart dimensions and facts using Informatica, Oracle SQL, and UNIX shell scripts.
- Developed complex mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
- Optimized performance by tuning the Informatica ETL code as well as the SQL.
- Performed unit testing and documented the results.
- Worked closely with the QA team during the testing phase and fixed bugs that were reported.
- Developed and scheduled jobs for the daily load of the Orders data mart that invoked the Informatica workflows and sessions associated with the mappings, SQL scripts to drop and recreate the indexes on source and target tables, and a UNIX script to create/update the parameter file, check source/target database connectivity, delete tag files, etc.
- Reviewed code, designs, and test plans as appropriate throughout the project lifecycle.
- Involved in Informatica 9.1 upgrade testing.
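The parameter-file refresh that preceded the daily load can be sketched as below. The folder, workflow, and parameter names are hypothetical, not the project's actual ones, and the real script additionally checked database connectivity and deleted tag files.

```shell
#!/bin/sh
# Sketch of regenerating an Informatica parameter file before a daily run.
# Section header and $$ parameter names are hypothetical examples.

PARAM_FILE=./wf_orders_load.param    # hypothetical parameter file path
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[ORDERS_MART.WF:wf_orders_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SCHEMA=STG
\$\$TGT_SCHEMA=DM
EOF

echo "parameter file written for run date $RUN_DATE"
```

Informatica mapping parameters are conventionally prefixed with `$$`, which is why the heredoc escapes them; only `$RUN_DATE` is expanded by the shell.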
Environment: Informatica Power Center 8.6/9.1 (Designer, Workflow Manager, Workflow Monitor), Oracle 11g, SQL, Windows XP, Toad 8.6, Erwin.
Confidential, Cleveland, OH May 2008 – Apr 2009
The Patient Data Analytics Solution provides business intelligence analysis services to the billing department through interactive client tools. Data from various online transaction processing (OLTP) applications and other sources is selectively extracted, related, transformed, and loaded into the Oracle data warehouse using the Informatica Power Center ETL tool. The transformed data is then loaded from the data warehouse into an OLAP server to provide Business Intelligence Analysis Services.
Responsibilities:
- Developed mappings and mapplets using the Mapping Designer and Mapplet Designer in Informatica Power Center Designer.
- Used the Debugger to test the data flow and fix the mappings.
- Developed reusable mapplets and transformations for reusable business calculations.
- Developed workflows and tasks using the Informatica Power Center Workflow Manager.
- Developed shell scripts that invoke the Informatica Power Center workflows, passing all the variables the job needs to execute, such as user ID and password.
- Developed, executed, and documented test cases for integration and system testing.
- Responsible for pre- and post-session planning to optimize data load performance, capacity planning, and user support.
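A workflow-launcher script of the kind described above can be sketched with Informatica's pmcmd utility. The integration service, folder, and workflow names here are hypothetical, credentials come in as arguments rather than being hard-coded, and the command is logged and echoed instead of executed so the sketch runs without an Informatica server.

```shell
#!/bin/sh
# Sketch of a pmcmd workflow launcher. All names below are hypothetical.

INFA_USER=${1:-etl_user}      # hypothetical default user id
INFA_PWD=${2:-changeme}       # password supplied by the scheduler
INTEG_SVC=IS_DEV              # hypothetical integration service
FOLDER=PATIENT_DA             # hypothetical repository folder
WORKFLOW=wf_daily_load        # hypothetical workflow name

CMD="pmcmd startworkflow -sv $INTEG_SVC -u $INFA_USER -p $INFA_PWD -f $FOLDER -wait $WORKFLOW"
echo "$CMD" > ./launch_cmd.log   # keep a record of what would run
echo "would run: $CMD"
```

The `-wait` flag makes pmcmd block until the workflow finishes, which lets a scheduler key follow-on jobs off the script's exit status; in a production script the command would be executed rather than echoed.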
Environment: Informatica Power Center 8.1 (Designer, Workflow Manager, Workflow Monitor), Oracle 9i, SQL Server 2008, UNIX Shell Scripting, TOAD (Tool for Oracle Application Developer), Windows 2000 Professional, Linux.
Confidential, Kansas City May 2007 – Mar 2008
ETL Informatica Developer
Involved in the analysis, design, and development of a data warehousing project. Using DSS System Architect, the metadata structure was designed to populate OLAP data. This data warehouse application provides detailed analysis and decision-making support to management, enabling them to access, analyze, and share information and to generate head-count, financial, and HR reports for their distributors and resellers under the client/server configurations and installations.
Responsibilities:
- Performed project requirements gathering, requirements analysis, design, development, and testing for the ETL, data warehousing, and reporting modules of the project.
- Developed mappings and transformations (using Filter, Joiner, Lookup, Update Strategy, Expression, and Aggregator) on the extracted data per the business requirements.
- Used Teradata TPump, a highly parallel utility, to continuously move data from data sources into Teradata tables without locking the affected tables.
- Designed, developed, tested, and tuned Informatica mappings.
- Analyzed and modified existing ETL objects to incorporate new changes according to the project requirements.
- Translated critical business requirements into technical functionality for the ETL and reporting teams.
- Developed UNIX scripts to perform various data-processing and automation tasks.
Environment: Informatica Power Center (Designer, Workflow Manager, Workflow Monitor), Oracle 9i, Teradata, UNIX Shell Scripting, TOAD, Windows 2000 Professional, Linux