
Data Warehouse Resume

Columbus, OH


  • Over 8 years of experience in the IT industry as a Senior QA Analyst and Developer on various Data Warehouse / Business Intelligence applications.
  • Extensive experience in Manual Testing of ETL and BI applications in Integration Testing, System Testing, Performance Testing, Functionality Testing, Post-Implementation Testing and Regression Testing.
  • Exposure to the Finance and Insurance, Health Care, TV Rating, Telecom, and Sales domains with regard to Data Warehouse Testing.
  • Good experience in XP, Agile, V, Scrum and Waterfall Methodologies.
  • Expertise in design and development of test scenarios and Test cases to validate all the requirements.
  • Have good skills in all phases of Testing including test estimation and team management.
  • Good understanding of data warehouse development and testing methodologies and ETL concepts.
  • Proficient in ETL testing methodologies such as control mechanisms for batch loads, generation of error logs, loading and processing of rejected records, testing of SCD Type 2 implementations, ensuring surrogate key integrity, and checking process dependencies.
  • Strong experience with various versions of ETL tools like Informatica, DataStage and Sunopsis, and a good understanding of Ab Initio.
  • Proficient in various databases like Oracle, Netezza, DB2, SQL Server, MS Access, Sybase ASE, Sybase IQ and Teradata.
  • Extensive experience writing complex SQL queries, analytical SQL, PL/SQL and T-SQL statements, including DDL and DML.
  • Strong experience with various operating systems, including UNIX (with shell scripting) and Windows XP/NT/2003/2000.
  • Excellent debugging, data analysis and problem-solving skills, along with strong verbal and written communication, decision-making and organizational skills.
  • Experienced in developing custom tools in java for automating the load process and validating target tables for regression tests.
  • Expertise in managing the ETL test process with tools like Quality Center, Test Director, ClearQuest and Bugzilla.
  • Extensive experience in coordinating with off-shore Team and onsite projects.
  • Sound experience in Knowledge Transfer and Mentoring whenever required.
  • Good sense of teamwork, leadership quality, integrity, aptitude to learn and ability to put in sustained quality effort consistently.
  • Excellent Team player with good communication skills and coordination in team meetings.
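The SCD Type 2 and surrogate key checks mentioned above can be sketched as follows. This is a minimal illustration using SQLite; the table, column names and sample rows are hypothetical, not from any actual project.

```python
import sqlite3

# Hypothetical SCD Type 2 dimension; names and data are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (
    customer_sk   INTEGER,   -- surrogate key
    customer_id   TEXT,      -- natural (business) key
    name          TEXT,
    eff_start     TEXT,
    eff_end       TEXT,
    current_flag  TEXT       -- 'Y' marks the active version of a row
);
INSERT INTO dim_customer VALUES
    (1, 'C100', 'Acme Corp',  '2008-01-01', '2008-06-30', 'N'),
    (2, 'C100', 'Acme Corp.', '2008-07-01', '9999-12-31', 'Y'),
    (3, 'C200', 'Globex',     '2008-01-01', '9999-12-31', 'Y');
""")

# Check 1: surrogate keys must be unique across the dimension.
dup_sk = con.execute("""
    SELECT customer_sk FROM dim_customer
    GROUP BY customer_sk HAVING COUNT(*) > 1
""").fetchall()

# Check 2: each natural key must have exactly one current row.
bad_current = con.execute("""
    SELECT customer_id FROM dim_customer
    WHERE current_flag = 'Y'
    GROUP BY customer_id HAVING COUNT(*) <> 1
""").fetchall()

print("duplicate surrogate keys:", dup_sk)        # [] when integrity holds
print("natural keys with != 1 current row:", bad_current)
```

Both queries return empty result sets when the SCD Type 2 load preserved surrogate key integrity.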

Technical Skills


Databases:

Oracle, Sybase, Sybase IQ, DB2, Netezza and SQL Server

Testing Tools:

Win-Runner 7.0, Quick Test Pro 9.0, Fitnesse, Quality Center 8.2 and Bugzilla

ETL/BI Tools:

Informatica PowerCenter 6.x/7.x/8.x, Sunopsis 3.x, DataStage 7.x, Business Objects 5.0, WebFOCUS and Cognos.



Operating systems:

Windows 2000/XP/NT, Sun Solaris, UNIX AIX

Scripting Language:

JavaScript, VBScript, XML, UNIX shell

Development Tools:

DbVisualizer 4.3, WinSQL, Toad, Erwin, Power Designer,
MS-Word, WS-FTP Pro, MS-Excel, MS-Access, PVCS, WRQ Reflection, Hummingbird V8.0, Putty, SharePoint and SAS/BASE, SAS/STAT, SAS/GRAPH, SAS/ACCESS, SAS/CONNECT


Confidential, Columbus, OH Nov ’08 – Till Date
Sr. Data Warehouse Tester
Nationwide Financial is made up of different business segments such as Individual Investments, Nationwide Bank, Retirement Plans, Individual Protection, and Nationwide Funds Group.
Nationwide Financial IT (NFIT) provides IT solutions and day-to-day operations support for these business segments and distribution capabilities through the following Business Solution Areas (BSA) and supporting teams.
SPIA: SPIA is a new generic commission based product to be implemented and maintained on the iCube system. The product is being built for sales with partner firms or direct sales. SPIA is a Nationwide Single Premium Immediate annuity product. It is part of the Income Promise Select program.
CLS: Performed research and analysis on problem logs and resolved production incidents related to Balancing, Data Discrepancy, Segmentation, etc.
Pensions e-Delivery: As a cost-saving effort for Nationwide, the e-delivery process is used by Nationwide PPA (Pension Plan Administrators) to provide electronic delivery of statements (via email) to participants. Using this option, participants can opt in or out of the e-delivery process for receiving statements.
Data Quality: This project continues the 2009 NFDW Data Quality "Additional Data Quality Checks" effort, which proactively identifies and alerts on data anomalies for our Life, Pensions, IIG, IPAS, Retail, and Agency customers. It includes full automation of the current manual point-to-point balancing process, reporting of LOB balancing details on a Customer Dashboard, and 'Data Profiling Analysis' to ensure proper checks are implemented.
Reports SMS: SMS (Sales Management System) is a Nationwide Financial internal application built using the WebFOCUS reporting tool, which contains sensitive sales and producer information intended for reviewing sales activity. Some of the things SMS does are:

  • Show detailed policy information for the past 3 years within the territory
  • Show the top-generating producers within a particular territory
  • Show the top firms within a particular territory
  • Show the top 200 producers with production within a particular territory
Responsibilities:

    • Reviewed System Requirements documents and Design Documents (DD).
    • Prepared Test Plans, Test Methodology and Test Cases per the System Requirements documents and Design Documents for Functional, Integration, and Regression test specifications, and executed them.
    • Tested the application for Functionality and all failed cases were documented.
    • Documented test cases and prepared pass/fail reports and comparisons.
    • Used Quality Center to print custom reports of passed and failed test cases.
    • Performed back-end database testing.
    • Executed complex and nested queries using WinSQL.
    • Involved in preparing the status report every week which includes Revision History, test case status, metrics, QA Issues and Defects.
    • Manage the complete testing process in Quality Center. Involved in bug tracking and reporting.
    • Test and validate the functional Test Cases based on mapping document.
    • Prepared the Measured Mapping documents to validate System Requirements test conditions.
    • Prepared test data based on Business requirements (positive and negative).
    • Test and validate the System Requirements and Technical test Conditions.
    • Prepare the Test Data for the Initial, Incremental (Daily Inserts and Updates) and historical Load testing and validate.
    • Tested data efficiently at each level of the ETL process by writing complex SQL queries against the Source and Target.
    • Test the data based on the Dimensional Model Diagram.
    • Validate the data from OLTP (multiple source systems) to OLAP (Data Warehouse/Data Marts).
    • Wrote T-SQL DDL/DML queries.
    • Run the Informatica Workflows through UNIX command prompt or through Maestro scheduling tool and validate the results.
    • Wrote macros in Excel.
    • Test and validate Business Objects and WebFOCUS reports.
    • Test and validate Production Defects/Incidents (Break fix/CLS tickets).
    • Thoroughly reviewed the transformation rules applied and tested the data loads.
    • Validate different levels of data Extraction, Transforming and loading.
    • Test and validate various data sources in ETL like CSV flat files and different RDBMS.
    • Worked on UNIX shell scripts to test the data in the Source tables and the Data warehouse tables.
    • Debugged and analyzed problems efficiently and raised defects through Quality Center.
    • Test the control loads by validating the UNIX scripts.
    • Test and maintained Data Integration Job control tables to check the status of control loads.
    • Test and validate the rejected records and the log files.
    • Work closely with Developers and other team members in resolving the issues and understanding the functionality for future reference.
    • Test and validate the table alerts, log file alerts with Dash board alerts.
    • Test and validate the database upgrade project (e.g., tables/views, counts and constraints).
    • Test and Validate the Informatica upgrade project.
    • Test and validate the referential Integrity between the tables.
    • Validated numeric variables, checked for missing values, looked for "n" observations per subject, verified double entry, and used SQL for data cleaning.
    • Extracting Data from Oracle database using SAS/ACCESS, E-review, PROC SQL.
    • Validate Informatica Mappings, update processes, Control Loads, and scheduling mechanism.
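The source-to-target validation described above is commonly done with set-difference queries between source and target. A minimal sketch using SQLite (table names and sample data are hypothetical):

```python
import sqlite3

# Illustrative source-to-target reconciliation; names and rows are made up.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_policy (policy_no TEXT, premium REAL);
CREATE TABLE tgt_policy (policy_no TEXT, premium REAL);
INSERT INTO src_policy VALUES ('P1', 100.0), ('P2', 250.5);
INSERT INTO tgt_policy VALUES ('P1', 100.0), ('P2', 250.5);
""")

# Rows present in the source but missing (or transformed differently)
# in the target ...
missing_in_tgt = con.execute("""
    SELECT policy_no, premium FROM src_policy
    EXCEPT
    SELECT policy_no, premium FROM tgt_policy
""").fetchall()

# ... and rows in the target with no matching source row.
extra_in_tgt = con.execute("""
    SELECT policy_no, premium FROM tgt_policy
    EXCEPT
    SELECT policy_no, premium FROM src_policy
""").fetchall()

print(missing_in_tgt, extra_in_tgt)  # both empty when the load reconciles
```

On Oracle the same check would use MINUS instead of EXCEPT; transformation rules from the mapping document would be applied inside the source-side SELECT.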

    Environment: Informatica 8.x/9.x, Oracle 10g, Netezza 4.5.4, Windows XP, Sun Solaris 2.7, Business Objects, WebFOCUS, WS-FTP Pro, MS Excel 2003/2007, MS Word 2003/2007, SQL Navigator, Toad, WinSQL, SQL/T-SQL, PL/SQL, Quality Center 8.2

    Confidential, Rochester, MN Oct ’07 – Nov ‘08
    Data Warehouse Tester

    Confidential is a not-for-profit medical practice dedicated to the diagnosis and treatment of virtually every type of complex illness. The organizational breadth of this EDT includes Mayo Clinic Arizona, Mayo Clinic Jacksonville and Mayo Clinic Rochester, including Mayo Health System. The scope includes Practice, Education, Research and Administration. Mayo maintains all of its services information in a centralized data trust called the EDT (Enterprise Data Trust), in which there are three different warehouses: CORE, Referral Optimization (RO) and Cancer Center (CC). Data flows from several source systems into the EDT, and from the EDT to several Data Marts.


    • Reviewed Functional Requirements documents and Detailed Design Documents (DDD).

    • Test and validate the functional Test Cases based on S2T mapping document.
    • Prepare the Test Data for the Initial, Incremental (Daily Inserts and Updates) and historical Load testing and validate.
    • Thoroughly reviewed the transformation rules applied and tested the data loads.
    • Mentored Junior ETL QA Team members.
    • Test and validate various data sources in ETL like CSV flat files and different RDBMS (17 Sources).
    • Wrote the complex SQL queries against the Source and Target databases.
    • Coordinated with team members working in three different locations.
    • Test and maintained Data Integration Global tables to check the status of control loads.
    • Test and validate the rejected records and the log files.
    • Validate Data Stage Jobs, update processes, Control Loads, and Scheduling mechanism.
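Checking control loads through a job-control (global) table, as described above, typically means querying load status and rejected-record counts. A minimal sketch; the table layout, names and threshold are illustrative assumptions:

```python
import sqlite3

# Hypothetical job-control table used to verify batch load status and
# rejected-record counts; all names and values are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE etl_job_control (
    job_name      TEXT,
    load_date     TEXT,
    status        TEXT,     -- 'SUCCESS' / 'FAILED' / 'RUNNING'
    rows_loaded   INTEGER,
    rows_rejected INTEGER
);
INSERT INTO etl_job_control VALUES
    ('load_claims',  '2008-11-01', 'SUCCESS', 10000, 3),
    ('load_members', '2008-11-01', 'SUCCESS',  5000, 0);
""")

REJECT_THRESHOLD = 10  # illustrative tolerance for rejected records

# Any job that did not finish successfully ...
failed = con.execute(
    "SELECT job_name FROM etl_job_control WHERE status <> 'SUCCESS'"
).fetchall()

# ... or that rejected more rows than the agreed tolerance.
over_threshold = con.execute(
    "SELECT job_name FROM etl_job_control WHERE rows_rejected > ?",
    (REJECT_THRESHOLD,),
).fetchall()

print("failed jobs:", failed)
print("jobs over reject threshold:", over_threshold)
```

Rejected records flagged here would then be traced back to the reject files and log files to confirm they were rejected for valid reasons.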

    Environment: Data Stage 7.X, Sybase, Sybase IQ, DB2, SQL Server, Oracle, Windows XP, Sun Solaris 2.7, FTP Explorer, MS Excel, MS Word, Rapid SQL, Toad, Quest Central, Quality Center 8.2

    Confidential, Tampa, FL Jun ’05 – Sep ‘07
    ETL Tester
    Confidential is the leading provider of television audience measurement and advertising information services worldwide. Audience measurement and viewing information reports are generated on a daily basis for all markets and sent to clients. The ODW is a Data Warehouse built to support Overnight Data Delivery and many other products for Nielsen’s clients. The EDW is an Enterprise Data Warehouse meant to be a common data source at the enterprise level in NMR. As a Quality Analyst at Nielsen Media Research, I worked on end-to-end testing of the ODW and EDW; my roles and responsibilities were as follows:

    • Developed and designed Test Scenarios and Test Cases by interacting with Business Analysts and Developers.
    • Automated the ETL test process, i.e. test execution and reporting, using the FitNesse framework.
    • Prepared the expected results based on business rules.
    • Wrote complex SQL/T-SQL to incorporate ETL logic and to validate Test Cases.
    • Created expected results and checked and validated data integrity.
    • Tested various Informatica mappings with tuning techniques to improve mapping/session performance.
    • Used the Informatica debugger and session logs to debug defects. Tested reusable Mappings and Transformations.
    • Test and validate various data sources in ETL like XML file formats, fixed width flat files, CSV flat files and RDBMS.
    • Coordinated work between off-shore and on-site teams.
    • Test and maintained process tables to check the status of control loads.
    • Work closely with Developers in resolving the issues and understanding the functionality for future reference.
    • Automated the validation process for target tables using Java classes coupled with the FitNesse UAT framework.
    • Validate Informatica mappings, update processes, Control Loads, and Scheduling mechanism.
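The automated target-table regression check described above boils down to comparing actual rows fetched from the target against a prepared expected-results set. A minimal, language-neutral sketch of that comparison (the original harness used Java classes with FitNesse; all names and rows here are illustrative):

```python
# Compare an expected-results set against rows fetched from the target,
# ignoring row order, and report what is missing or unexpected.

def diff_rows(expected, actual):
    """Return (missing, unexpected) rows as sorted lists."""
    exp, act = set(expected), set(actual)
    return sorted(exp - act), sorted(act - exp)

# Hypothetical expected results prepared from the business rules,
# and actual rows as a cursor fetch would return them.
expected = [("P1", 100.0), ("P2", 250.5)]
actual   = [("P2", 250.5), ("P1", 100.0)]  # order differs, content matches

missing, unexpected = diff_rows(expected, actual)
print(missing, unexpected)  # [] [] when the target matches expectations
```

In a regression suite, any non-empty diff would be logged as a defect with the offending rows attached.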

    Environment: Informatica Power Center 7.1, Sybase, Sybase IQ, SQL Server, Netezza 2.5, Sunopsis, XML, Windows XP, Sun Solaris 2.7, FTP Explorer, Share Point, Java 1.5, MS Excel, MS Word, Fitnesse, Bugzilla, DBVisualizer 4.x, Quality Center 8.2

    Confidential, Tampa, FL Apr ’04 – May ’05

    Collected data coming from catalog orders at 30-minute intervals on a daily basis using Power Exchange, appended the data to a master flat file, and extracted, transformed and loaded it into a DB2 database once a day.

    • Used Informatica Power Center to create mappings, sessions and workflows for populating the data into the dimension, facts, and lookup tables constantly from different source systems (DB2, Flat files).
    • Designed the Source - Target mappings and Involved in designing the Selection Criteria document.
    • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
    • Implemented Slowly Changing Dimensions.
    • Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
    • Used pmcmd commands of Informatica in UNIX scripts.
    • Involved in Unit Testing of the mappings.
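A UNIX wrapper script around pmcmd typically just assembles a startworkflow call and checks the exit code. A sketch of assembling such a call; the service, folder and workflow names are hypothetical, and the exact flags should be verified against your Informatica version's pmcmd reference:

```python
# Build a pmcmd startworkflow command as a UNIX wrapper script would.
# All names below are hypothetical; flag syntax follows the common
# pmcmd convention (verify against your Informatica documentation).

def pmcmd_startworkflow(service, folder, workflow, user, pwd_var):
    # -pv reads the password from an environment variable rather than
    # passing it in clear text on the command line.
    return [
        "pmcmd", "startworkflow",
        "-sv", service,
        "-u", user, "-pv", pwd_var,
        "-f", folder,
        "-wait", workflow,
    ]

cmd = pmcmd_startworkflow("INT_SVC", "SALES_DW", "wf_daily_load",
                          "etl_user", "PM_PASSWORD")
print(" ".join(cmd))
# A wrapper would then run it and test the exit status, e.g.:
#   rc = subprocess.call(cmd)   # 0 means the workflow succeeded
```

The -wait flag makes pmcmd block until the workflow finishes, so the script's exit-code check reflects the actual load outcome.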

    Environment: Power Center 7.1, DB2, AS/400, Windows XP, VPN Connectivity, Sun Solaris 2.7, FTP Explorer, MS Excel
    Confidential, Client: Confidential, India Aug ’03 – Mar ’04
    This application is used to store information on products outsourced through other pharmaceutical companies via profit sharing, supply agreements, or co-development arrangements. The application also keeps track of clients and vendors for NATCO. It has been developed using VB and Oracle, running on the Windows NT platform.

    • Involved in writing backend procedures, using PL/SQL, that would load and retrieve data from the database.

    • Involved in writing and implementation of the various test cases and test scripts.
    • Performed Manual Testing.
    • Analyzing and documenting detailed program information, working and interacting with software developers and members of various other dependent departments.

    • Created UNIX shell scripts to perform repeated tests.
    • Used ClearQuest for bug tracking and reporting; also followed up with the development team to verify bug fixes and update bug status.

    • Documented bugs found during testing.
    • Wrote SQL & PL/SQL queries for data validation.

    Environment: VB 5.0, HTML, JavaScript, Oracle 7.0, SQL Server 6.5, UNIX

    Education: Master of Science in Computer Science
