- 7+ years of IT experience and technical proficiency in building Data Warehouses, Data Marts, Data Integration, Operational Data Stores, and ETL processes for clients in the Financial (Equities, Futures, Options, Commodities, Spots, Swaps, Bonds, Credit Risk, Market Risk, Operational Risk) and Healthcare (Providers, Customers, Organizations, Plans, Claims, and Extracts) domains.
- 5+ years of strong experience working with large-scale Data Warehouse implementations using Informatica PowerCenter 8.x/7.x/6.x, Oracle, DB2, and SQL Server on UNIX and Windows platforms.
- Strong knowledge of OLAP systems, Kimball and Inmon methodologies and models, and dimensional modeling using Star and Snowflake schemas.
- Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Informatica Administration Console).
- Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica PowerCenter.
- Experience in performance tuning of Informatica mappings and sessions to improve the performance of large-volume projects.
- Experience in Migration, Configuration and Administration of Informatica PowerCenter.
- Experience in integrating various data sources such as Oracle, DB2, SQL Server, flat files, Mainframes, and XML files into the Data Warehouse; also experienced in Data Cleansing and Data Analysis.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views, and Indexes in distributed environment.
- Excellent expertise with different types of data load strategies and scenarios, such as Historical Dimensions, Surrogate Keys, and Summary Facts.
- Worked extensively in all stages of SDLC, from gathering requirements to testing, implementation and support.
- Experience in preparing documentation such as High-Level Design, System Requirement, and Technical Specification documents.
- Strong experience in writing UNIX Shell scripts, SQL Scripts for development, automation of ETL process, error handling, and auditing purposes. Experience in using UC4, Autosys, and Control-M scheduling tools to organize and schedule jobs.
- Good knowledge on generating various complex reports using OBIEE, MicroStrategy, and Business Objects.
- Experience in using IBM ClearQuest to track defects and document test cases.
- Good knowledge on TIBCO Rendezvous and IBM MQSeries.
- Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and Production server.
- Experience in project management, estimations, and resource management activities.
- Excellent analytical and problem-solving skills with a strong technical background and interpersonal skills.
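The shell-based error handling and job automation summarized above can be sketched as a small retry wrapper; the function name, retry limit, and the wrapped command are illustrative only, not taken from any specific engagement:

```shell
#!/bin/sh
# Hypothetical sketch of a retry wrapper used to harden scheduled
# ETL steps (file transfers, load kickoffs). The command to run is
# passed as arguments after the retry limit.
run_with_retry() {
  max=$1; shift
  attempt=1
  until "$@"; do
    rc=$?
    if [ "$attempt" -ge "$max" ]; then
      echo "FAILED after $attempt attempts: $*" >&2
      return "$rc"
    fi
    echo "attempt $attempt failed (rc=$rc), retrying" >&2
    attempt=$((attempt + 1))
  done
  echo "OK: $*"
}

# A scheduler (UC4, Autosys, Control-M) would invoke a job script
# built around a wrapper like this, e.g. around an sftp batch pull:
run_with_retry 3 true
```

In production the wrapped command would be the actual transfer or load step, with the failure branch feeding the audit/error-handling log rather than stderr.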
TECHNICAL SKILL SET:
Operating Systems: UNIX, Linux, Windows XP/2000
Databases: Oracle 11g/10g/9i, DB2 V8.01, SQL Server 2008/2005, MS Access
Database Tools: TOAD, SQL Navigator
Load Utilities: SQL*Loader
ETL Tool: Informatica PowerCenter 8.6.1/8.5.1/8.1.1/7.x/6.x
BI Reporting Tools: Business Objects, OBIEE, MicroStrategy 8.2
Programming Languages: C, C++, HTML, XML, COBOL, PL/SQL, Java, J2EE, JSP
Scripting Languages: Shell Scripting, Perl Scripting
Scheduling Tools: UC4, Cron, Control-M, Autosys
Application Servers: WebLogic 10.x/9.x, Tomcat
Middleware: TIBCO Rendezvous, IBM MQSeries
Test Management Tools: IBM ClearQuest, Quality Center, JIRA
Confidential, Chicago IL Jul ’09 – Present
Data Warehouse Consultant
Project: Clearing Positions
Confidential is the world's leading and most diverse derivatives marketplace. The main objective of the project is to build a distributed environment that serves as the primary source for trade processing, position management, performance bond, settlement, asset management, banking, and deliverables information.
Roles & Responsibilities:
- Interacting with business owners to gather both functional and technical requirements.
- Documenting the business requirements and framing the business logic for the ETL process.
- Developing technical specifications and other helpful ETL documents following CME Group’s standards.
- Involved in creating logical and physical data models using CA ERwin data modeler. Generating the DDL scripts for the physical data model.
- Using Agile methodology for the SDLC, with daily scrum meetings to keep work on track.
- Design and develop PL/SQL packages, stored procedures, tables, views, indexes, and functions; implement best practices to maintain optimal performance.
- Design, develop, and test Informatica mappings, workflows, worklets, reusable objects, SQL queries, and Shell scripts to implement complex business rules.
- Load historical and intraday trades, settlements, positions, and product data into the Oracle data warehouse to enable business analysts to better understand, monitor, and analyze the liquidity-generating performance of Market Maker firms trading CME Group's products.
- Migrating historical data from DB2 to the Oracle data warehouse.
- Transferring the data from various sources like XML, flat files, DB2 into Oracle data warehouse.
- Extensively worked on SCD Type 2 using the Lookup transformation.
- Identifying bottlenecks/issues and fine-tuning them for optimal performance.
- Oversaw unit and system tests and assisted users with acceptance testing.
- Upgraded Informatica repository to 8.6.1 within the timeframe.
- Responsible for capturing, reporting, and correcting error data.
- Used Business Objects XI R2 to programmatically generate reports and gather necessary information about report instances.
- Performed/automated many ETL related tasks including data cleansing, conversion, and transformations to load Oracle 10G based Data Warehouse.
- Work with DBAs and systems support personnel in elevating and automating successful code to production. Used UC4 for job scheduling, workload automation, and report generation.
- Developed Shell/Perl scripts to transfer files using FTP/SFTP and to automate ETL jobs.
- Experience using WebLogic for hosting the servers.
- Provide on-call support to production system to resolve any issues.
- Conducting code walkthroughs and review peer code and documentation.
- Playing role in design of scalable, reusable, and low maintenance ETL templates.
Environment: Informatica PowerCenter 8.6.1/8.5.1, Oracle 11g/10g RAC, DB2 v8.01, UC4, RHEL 5.4, Windows XP, SQL, PL/SQL, Shell/Perl Scripting, BO XI, ERwin, TIBCO, WebLogic 10.3.4, IBM ClearCase & ClearQuest 7.0.1, SQL Developer, TOAD 9.0.
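The SCD Type 2 pattern used in this project can be illustrated outside Informatica with a flat-file sketch; the file layout, key values, and dates below are entirely hypothetical, and the real loads used PowerCenter Lookup and Update Strategy transformations rather than awk:

```shell
#!/bin/sh
# Hedged flat-file sketch of an SCD Type 2 load: expire the current
# version of any changed row, append a new current version, and
# insert brand-new keys. Hypothetical layout:
#   key,attr,eff_date,end_date,current_flag

cat > stage.csv <<'EOF'
100,PLATINUM
300,BRONZE
EOF

cat > dim.csv <<'EOF'
100,GOLD,2008-01-01,9999-12-31,Y
200,SILVER,2008-01-01,9999-12-31,Y
EOF

awk -F, -v OFS=, -v today=2009-01-15 '
  NR==FNR { stage[$1]=$2; next }          # pass 1: staged extract
  {
    dimkey[$1]=1
    if ($5=="Y" && ($1 in stage) && stage[$1] != $2) {
      $4=today; $5="N"; changed[$1]=1     # expire the changed row
    }
    print
  }
  END {                                   # new versions + new keys
    for (k in stage)
      if (changed[k] || !(k in dimkey))
        print k, stage[k], today, "9999-12-31", "Y"
  }
' stage.csv dim.csv > dim_new.csv

cat dim_new.csv
```

Key 100 ends up with an expired row and a new current row, key 200 passes through unchanged, and key 300 is inserted as a new current row, which is exactly the history-preserving behavior a Type 2 dimension requires.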
Confidential, Wilmington DE Mar ’08 – Jun’09
The objective of the project is to provide timely, accurate, and consistent sales information. This involves developing and implementing a solution that addresses the technology challenges and the business and organizational issues, providing a more complete picture of sales: the ability to measure accurate sales results and identify growth potential. The proposed technology solution is to implement a data warehouse that serves as the primary source for all business reporting.
Data is extracted from several source systems and consolidated into an enterprise data store known as the Operational Data Store (ODS). Data is then extracted from the ODS and transformed into various data marts, such as the Spending Report (SPR) data mart, which is used to track total spending of different groups/departments.
Roles & Responsibilities:
- Interacted with Business Users to gather business requirements and designed user friendly templates to communicate any further enhancements needs to be implemented.
- Coordinated with Data Modelers for designing the dimensional model.
- Extensively worked in Credit Cards billing and payments subject area.
- Involved in documenting Functional Specifications, Design Specifications documents and created ETL Specifications documents and updated them as and when needed.
- Created System Interface Agreements (SIA) between source and target systems, defining escalation procedures and SLAs in case of issues.
- Designed ETL specifications with transformation rules, following ETL best practices for performance, maintainability of the code, and efficient restartability.
- Designed reusable objects like mapplets & re-usable transformations in Informatica.
- Experienced in developing mappings using transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Expression, Router, Update Strategy, Rank, XML SQ/Parser/Generator, and Normalizer to load data from different sources such as Oracle, flat files, Excel spreadsheets, XML, and COBOL files into the target Data Warehouse.
- Experience in implementing Type II changes in Slowly Changing Dimension Tables.
- Designed and developed the UNIX shell scripts for the automation of ETL jobs.
- Performed data validation in the target tables using complex SQLs to make sure all the modules are integrated properly.
- Involved in cleansing raw data in staging area using stored procedures in pre and post-session routines.
- Tested and tuned the SQL queries for better performance. Identified the bottlenecks in mapping logic and resolved performance issues.
- Worked closely with Business Intelligence (BI) team and assisted them to develop reports using Business Objects reporting tool.
- Conducted code reviews to make sure the business requirements are met and the coding standards are followed.
- Experience working with third-party data cleansing tools such as Trillium.
- Coordinated with System support team to setup the system test environment for code migration and code execution process in QA environment.
Environment: Informatica PowerCenter 8.1/7.1, ERwin, BO XI, SQL Server 2008/2005, PL/SQL, Shell Scripting, COBOL, IBM MQSeries, Trillium, Autosys, Tomcat, TOAD, Sun Solaris 2.7, Windows XP.
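The post-load data validation described above often reduces to a row-count reconciliation between source and target. The sketch below is a hypothetical illustration: the file names, the "rows loaded" log format, and the sample data are all invented here, since in practice the target count would come from the database or the session log:

```shell
#!/bin/sh
# Hypothetical reconciliation check between a source extract and the
# target load. Sample files simulate the extract and the load log.

printf '%s\n' 'row1' 'row2' 'row3' > billing_extract.dat
echo '3 rows loaded into STG_BILLING' > load.log

# Arithmetic expansion strips any whitespace wc may emit.
src_count=$(( $(wc -l < billing_extract.dat) ))
tgt_count=$(awk '/rows loaded/ {print $1}' load.log)

if [ "$src_count" -ne "$tgt_count" ]; then
  echo "MISMATCH: source=$src_count target=$tgt_count" >&2
  exit 1
fi
echo "counts match: source=$src_count target=$tgt_count"
```

A mismatch exits non-zero so the scheduler can halt the downstream data mart loads and raise an alert instead of propagating bad data.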
Confidential, Maryland Mar’06 – Feb’08
COMSORTer Application: Comsort Inc. is a wholly owned subsidiary of Merck Inc., one of the world's largest pharmaceutical companies. Comsort creates and sends surveys in which physicians nominate specialists in their respective fields; Merck's sales and marketing team uses these nominations to target specialists for drug promotion and to invite them as speakers at physician conferences. The project involved creating an online application that Comsort could use to enter the survey data and to generate lists of physicians based on the surveys and nominations provided.
- The COMSORTer application was Oracle based and the existing data was stored on SQL Server and DB2.
- Migrated the data from SQL Server and DB2 to Oracle.
- Created a mapping document that outlines the sources mapped to the targets.
- Created a document outlining the plan of action for the entire process.
- Created views to select data from the existing SQL Server databases.
- Created DTS packages to generate flat files from the views created.
- Designed mappings to load first the Staging tables and then the destination tables.
- Designing mappings using transformations such as Source Qualifier, Joiner, Expression, Lookup, Filter, Router etc.
- Created different transformations using Informatica for loading the data into SQL Server database.
- Transferred the data from a combination of different input files like XML, Flat files to Oracle.
- Created, optimized, reviewed, and executed Complex SQL queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
- Created Functional Spec & Technical Spec documentation & also documented the issues found in the end to end testing.
- Extensively worked with DBA’s during the performance testing phase for our database.
- Generated SQL Loader scripts and Shell scripts for automated daily load processes.
- Developed triggers and stored procedures for data verification and processing.
- Extensively worked on database performance tuning techniques and modifying the complex join statements.
- Identified bottlenecks, optimized SQL, and reduced unnecessary caches.
- Creating workflows with the Event Wait task to specify when the workflow should load the tables. The Event Wait task would wait for the indicator file which was being dropped onto the Informatica server by the Cold Fusion (Front End), and then transfer the control to the rest of the workflow to load the data.
- Used existing UNIX scripts and modified them to load the Oracle tables.
- Designed mappings to load the Surveys, Questions, Projects and other tables related to Surveys.
- Involved in the smooth transition from Informatica 7.1 to Informatica 8.0. Worked as an Informatica Administrator to migrate the mappings, sessions, workflows, repositories into the new environment.
- Configured and Administered Informatica Servers.
- Designed and developed scripts for administrative tasks like backups, tuning, and periodically refreshing the test databases from the production databases.
- Created views and designed mappings to load test data for UAT.
- Extensively used Debugger Process to test data & applied Break Points.
- Provided production support for Business Users and documented problems and solutions for running the workflow.
- Created various geographical and time dimension reports.
- Moving the mappings and workflows from Dev to QA and QA to Production environment and unit testing the process at every level.
- Documented detailed steps for migrating the code.
- Supporting the application in Production environment by monitoring the ETL process everyday during the nightly loads.
- Conducted knowledge transfer (KT) of the entire process to the production support team.
Environment: Informatica PowerCenter 8.1/8.0/7.1, SQL Server 2000, MicroStrategy 8, TOAD 7.6, Oracle 10g/9i, SQL*Loader, DB2 UDB v8.1, PL/SQL, T-SQL, ERwin, Control-M, UNIX AIX 4.2, Shell Scripts, Windows XP/2000.
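The Event Wait pattern described above (waiting for the indicator file dropped by the Cold Fusion front end before loading) can be sketched as a polling loop in shell; the flag path, the timeout, and the background `touch` that stands in for the front end are all illustrative:

```shell
#!/bin/sh
# Hedged sketch of the indicator-file wait that gated the load: poll
# (with a timeout) for the flag file the front end drops, then hand
# control to the load step. The background touch simulates the drop.

FLAG=./survey_load.ready
TIMEOUT=10
rm -f "$FLAG"
( sleep 1; touch "$FLAG" ) &    # stand-in for the front end

waited=0
while [ ! -f "$FLAG" ]; do
  if [ "$waited" -ge "$TIMEOUT" ]; then
    echo "timed out waiting for $FLAG" >&2
    exit 1
  fi
  sleep 1
  waited=$((waited + 1))
done
echo "indicator found after ${waited}s, starting load"
echo READY > etl_status.log     # status handoff for the load step
rm -f "$FLAG"
wait
```

In the Informatica workflow the same gating was done declaratively with an Event Wait task; a shell version like this is useful when a scheduler-level pre-check is needed before the workflow is even started.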
Confidential, Hyderabad India May’04 – Feb’06
Ceeyes is a leading provider of intellectual property software cores in areas such as Layer 2/7 Networking, Wireless, and Embedded real-time software, with expertise in product design and development for embedded systems software and system integration.
Project: Annual Maintenance Contract (AMC)
Roles & Responsibilities:
- Identified functional requirements, analyzed the system, provided suggestions, designed as per requirements, and tested the design.
- Coordinated with team members in analyzing the business requirements.
- Used Agile methodology in the design and development of the application.
- Developed conceptual design document with prototyping of UI, involved in estimation and detailed scheduling of various modules.
- Identified database requirements and was involved in designing the database for various modules.
- Created stored procedures, functions, scripts, and packages for applying the business rules.
- Performance tuning and optimization achieved through the management of indices, table partitioning, and optimizing the SQL scripts.
- Created generic packages useful for other team members.
Environment: PL/SQL, SQL, J2EE (JSP, Servlets), XML, Oracle 9i, UNIX, Windows 2000.
Bachelor's Degree in Computer Science & Information Technology