
Business Intelligence - ETL/Data Warehousing Tester Resume

Austin, TX


  • Over seven years of IT experience across the SDLC and STLC. Involved in analysis, design, and Quality Assurance of Data Warehousing, Web, and Client-server (J2EE & .NET) applications for the Banking, Financial, Brokerage, and Healthcare domains.
  • Two years of experience in ETL using Informatica Power Center 8.6.1.
  • Experience in creation and execution of Test Plans, Test Scripts and Test Cases using both Manual and Automated testing techniques.
  • Well Versed with System testing, Integration testing, Performance testing, Functional testing, Regression testing and User Acceptance Testing.
  • Experienced in analyzing Business Requirements and Specifications. Worked with Development team and Business Analysts to analyze the test scenarios and ensure that test requirements are correct and complete.
  • Extensively used Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer.
  • Expert in Waterfall Lifecycle, AGILE, RUP, XP & Iterative project testing methodologies
  • Expert in Manual and Automation testing for Client/Server, Object Oriented and web based multi-tier architecture applications
  • Strong database experience, including Sybase, Oracle 10g/9i/8i, Teradata, SQL Server 2008, and DB2
  • Expertise in Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and Star and Snowflake schema design.
  • Experience in analyzing the data generated by the business process, defining the granularity, source to target mapping of the data elements, creating Indexes and Aggregate tables for the data warehouse design and development.
  • Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Performance Tuning, and System Testing.
  • Efficient in incorporating various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML, and flat files into the staging area.
  • Tested several Jobs in DataStage to populate test data into tables in Data warehouse and Data marts.
  • Excellent knowledge in Dimensional modeling including Conceptual, Logical and Physical Data Models.
  • Tested both Relational and dimensional data.
  • Strong in Data warehousing concepts, dimensional Star Schema and Snowflakes Schema methodologies.
  • Expert in unit testing, system integration testing, implementation, maintenance and performance tuning.
  • Experience with scheduling tools such as AutoSys for automating and scheduling job runs.
  • Excellent with PL/SQL, T-SQL, Stored Procedures, Database Triggers, and SQL*Loader.
  • Experience in UNIX Shell Scripting.
  • Detail-oriented with strong problem-solving, organizational, and analytical skills; highly motivated and adaptive, with the ability to grasp new concepts quickly.
  • Ability to work effectively and efficiently in a team and individually with excellent interpersonal, technical and communication skills.


Operating System: Windows NT/98/2000, Windows XP, UNIX (Sun, AIX, HP), and MS DOS

Servers: IBM WebSphere 2.0, BEA WebLogic Server 8.1, Java Web Server, Actuate Server

Languages: C, C++, Perl, HTML, XML, Visual Basic, VBScript, Java 2.0, SQL and PL/SQL

Database: Oracle9i, Oracle10g, SQL Server 2000, IBM DB2, Sybase, Teradata and MS-Access

Internet: JDK 1.3, Servlets 1.4, Java Beans, Applets, JFC/Swing, HTML, JavaScript 1.2, CSS

Other Software: Web connector, EDI, PEGA, FACETS, FLEXX and TCP/IP

Internet Tech: HTML, XML, JavaScript, VBScript, ASP, JSP, Java Beans, EJB, RMI, MQ-Series, FTP, TCP/IP, J2EE.

Testing Tools: Quick Test Pro and Quality Center.

Utilities: Toad, Citrix Server, Oracle SQL Developer, PuTTY, VPN, NetMeeting and CuteFTP.

Management: VSS, Win CVS and PVCS.

Packages: MS-Office, MS - Visio and MS-Project.

ETL: DataStage 8, Informatica 8.6.1

Health Care: EPIC Suite, HL-7, FACETS 4.5.

Reporting Tools: Business Objects Xi R2, Cognos 8, Verisk Health Systems, Microstrategy for Recon Reports

Scheduling: AutoSys, Cron


Confidential, Austin, TX

Business Intelligence- ETL/Data Warehousing Tester

Roles & Responsibilities:

  • Created Test Strategy for Enterprise Data Warehouse in ETL framework, including Business Intelligence.
  • Involved in Business analysis and requirements gathering.
  • Extensively used Informatica power center for extraction, transformation and loading process
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Responsible for coordinating testing across both the ODS and EDW systems, which were owned by two different teams.
  • Tested the Informatica Mappings to load data from Source systems to ODS and then to Data Mart.
  • Developed data quality test plans and manually executed ETL and BI test cases.
  • Designed and kept track of Requirement Traceability Matrix
  • Updated Quality Center, loaded and wrote test cases, wrote the Test Plan, executed test cases, and produced status reports for team meetings.
  • Tested whether data was correctly formatted before being stored in the Operational Data Store (ODS) tables.
  • Verified that the tables were loaded into the Operational Data Store (ODS) for the data staging area.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Performed data quality analysis using advanced SQL skills.
  • Reviewed data-flow and process-flow diagrams created in PowerPoint and Microsoft Visio
  • Developed advanced SQL queries to extract, manipulate, and calculate information to meet data and reporting requirements, including identifying the tables and columns from which data is extracted
  • Tested the different sources such as Flat files, Main Frame Legacy Flat Files and Oracle to load into the Teradata data warehouse
  • Extracted data from Oracle and uploaded it to Teradata tables using the Teradata FastLoad utility.
  • Worked with Teradata utilities such as Teradata SQL Assistant for Querying Teradata.
  • Extensively used the Report Inspector tool to change database schemas across multiple reports, and to schedule, deploy, and publish reports.
  • Tested and maintained daily, monthly, and yearly reports.
  • Involved in Writing Detailed Level Test Documentation for reports and Universe testing.
  • Involved in Integrating and Functional System Testing for the entire Data Warehousing Application.
  • Involved in data warehouse testing by checking ETL procedures/mappings.
  • Implemented and maintained tools required to support data warehouse testing.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing
  • Automated and scheduled the Informatica jobs using UNIX Shell Scripting
  • Worked on test data and completed unit testing to check all business rules and requirements are met. Also tested for negative data to check that the job fails on any critical error.
  • Tested several data migration applications for security, data protection, and data corruption during transfer
  • Helped the testing team create test cases to verify that data originating from the source reached the target in the correct format.
  • Analyzed the data and applied relevant transformations and verified that the source data types are correct.
  • Applied appropriate validation rules and created graph parameters so that the graph stops when validations fail.
  • Created lookup files where necessary for validation of critical enterprise codes, lender identifiers and service numbers.
  • Participated in testing and Deploying the Cognos Framework Manager Models, Packages, Reports and Powerplay Models.
  • Created, ran, and scheduled reports and jobs using Cognos Connection
  • Tested all security changes using Cognos Upfront and verified that the changes were deployed correctly
  • Performed Unit testing for all developed Business Intelligence Reports in Cognos
  • Worked with business team to test the reports developed in Cognos
  • Tested whether the reports developed in Cognos are as per company standards.
  • Prepared reusable function in QTP to use across the automation scripts
  • Performed database testing with SQL queries to verify data integrity using QTP.
  • Prepared and supported the QA and UAT test environments.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Created different user defined functions for applying appropriate business rules
  • Performed tests in the FIT, QA, and contingency/backup environments
  • Used Quality Center for defect tracking.
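Much of the ETL testing described above reduces to source-to-target SQL comparisons. A minimal sketch of a row-count reconciliation and a minus-style query, using Python with an in-memory SQLite database as a stand-in for the Oracle/Teradata environment (all table and column names here are illustrative, not from an actual project):

```python
import sqlite3

# Hypothetical source (e.g. ODS) and target (e.g. data mart) tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_customer (cust_id INTEGER, name TEXT, state TEXT);
CREATE TABLE tgt_customer (cust_id INTEGER, name TEXT, state TEXT);
INSERT INTO src_customer VALUES (1, 'Ann', 'TX'), (2, 'Bob', 'CO');
INSERT INTO tgt_customer VALUES (1, 'Ann', 'TX'), (2, 'Bob', 'CO');
""")
cur = conn.cursor()

def count_check(cur, src, tgt):
    """Source-to-target row-count reconciliation."""
    (src_n,) = cur.execute(f"SELECT COUNT(*) FROM {src}").fetchone()
    (tgt_n,) = cur.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()
    return src_n == tgt_n

def minus_check(cur, src, tgt):
    """Rows present in source but missing or altered in target (MINUS-style)."""
    return cur.execute(
        f"SELECT * FROM {src} EXCEPT SELECT * FROM {tgt}"
    ).fetchall()

assert count_check(cur, "src_customer", "tgt_customer")
assert minus_check(cur, "src_customer", "tgt_customer") == []
```

In practice the same two queries would run against the actual source and target schemas after each Informatica workflow, with any non-empty minus result logged as a defect.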

Environment: Oracle 10G, DB2, SQL Server 2005, SSIS, T-SQL, Teradata 12.0, SQL Assistant 12.0, TOAD 9.7, Quality Center 10.0, SQL, PL/SQL, QTP 9.0, Cognos 8 BI series, Informatica Power Center 8.6.1, MainFrame Flat Files, COBOL II, UNIX, Korn Shell Scripting

Confidential, Denver, CO

Sr. ETL/ QA Analyst


  • Reviewed the business requirements for the Data Warehouse ETL process and worked with the business and requirements teams on gaps found during the review.
  • Developed a detailed Test Plan, Test strategy, Test Data Management Plan, Test Summary Report based on Business requirements specifications.
  • Wrote extensive SQL and PL/SQL scripts to test the ETL flow, Data Reconciliation, Initialization, and Change Data Capture, Delta Processing.
  • Prepared a UAT Plan and set up UAT environment.
  • Prepared Execution procedure document format to prepare the Test cases based on mapping document.
  • Defined and ran jobs using the AutoSys Graphical User Interface (GUI) and Job Information Language (JIL), monitoring and managing AutoSys jobs and alarms.
  • Ran jobs using the AutoSys to load the data from Source to Target.
  • Executed DataStage jobs to test the ETL processes.
  • Developed Strategies for Data Analysis and Data Validation.
  • Used the DataStage Designer to develop various job processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database
  • Used the DataStage Director to schedule jobs, test and debug their components, and monitor results
  • Worked with DataStage Manager to import metadata from the repository, create new job categories, and create new data elements.
  • Validated the Data in the Warehouse using SQL queries
  • Tested the Reports generated by the BI tools (Microstrategy, Hyperion) and validated the data on the Reports
  • Validated test cases for Source to Target mappings (STM).
  • Validated data flow from source through to FACT and Dimension tables using complex queries (left outer joins, subqueries, etc.)
  • Validated the data model for landing zone, staging zone, and warehouse tables for column length and data type consistency
  • Validated FK failures and data integrity checks for all the tables.
  • Validated Business codes, Error process and Purge process.
  • Extensively used UNIX shell scripts and compared the files in SAS to make sure the extracted data is correct.
  • Involved in developing scenarios of testing and maintaining testing standards
  • Involved in Business functionality review meetings and Use-Case Analysis
  • Participation in requirement / Use Case analysis, risk analysis and configuration management.
  • Developed the templates for User/Customer Training and documentation.
  • Participated in developing and implementing End-End testing manually.
  • Used Share point for Document Management and Dimensions for version control.
  • Involved in setting up QA environment for testing the applications.
  • Used Quality Center and Test Director for defect tracking and reporting
  • Assisted Team members in knowledge transfer
  • Involved in Regression, UAT and Integration testing
  • Coordinated with Different project teams to set up common test environment and common Integration for different applications
  • Conducted defect review meetings with the development team members
  • Involved in peer review of test cases, Test Pool meetings and Impact analysis
  • Involved in testing the applications in Target host in IBM Mainframe environment
  • Used QMF for writing and executing DB2 queries in the host
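The FK-failure and data-integrity checks listed above are typically left-outer-join queries from a fact table to its dimensions, flagging orphan keys. A small illustrative sketch, with SQLite standing in for the Oracle/DB2 databases and an entirely hypothetical schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER, product_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (10, 'Widget');
INSERT INTO fact_sales  VALUES (1, 10, 9.99), (2, 99, 5.00);  -- 99 is an orphan key
""")

# Left outer join from fact to dimension; a NULL dimension key means an FK failure.
orphans = conn.execute("""
    SELECT f.sale_id
    FROM fact_sales f
    LEFT OUTER JOIN dim_product d ON f.product_key = d.product_key
    WHERE d.product_key IS NULL
""").fetchall()

print(orphans)  # sale 2 references a product_key with no dimension row
```

Run per fact/dimension pair, an empty result set is a pass; any rows returned are referential-integrity defects to raise against the load.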

Environment: DataStage 8 (Designer, Director, Manager), Windows XP, Oracle 10G, TOAD for Oracle, SQL Developer, Embarcadero Rapid SQL 7.5, IBM DB2 8.1, Sybase, DART (Desktop Asset Request & Tracking), Rational ClearCase v7, Telelogic DOORS 8.3, UNIX, Reflections, Telnet, Mercury Quality Center, PVCS (Dimensions), MS Project, Hyperion Performance Suite 8.5, AutoSys Jobs, SAS, Mainframe, Microstrategy.

Confidential, Dallas, TX

ETL/ QA Analyst


  • Wrote extensive SQL and PL/SQL scripts to test the ETL flow, Data Reconciliation, Initialization, and Change Data Capture, Delta Processing, Incremental process of Policy and Claims systems.
  • Developed a detailed Test Plan, Test strategy, Test Data Management Plan, Test Summary Report based on Business requirements specifications.
  • Prepared a UAT Plan and set up UAT environment.
  • Prepared Execution procedure document format to prepare the Test cases based on mapping document.
  • Analyzed the Data Dictionary developed by the Systems Analyst for developing test scripts to test the Policy and Claims systems.
  • Implemented logical and physical data modeling with Star schema using Erwin for Data Marts.
  • Extensively used Informatica PowerCenter 8.1.1 to load data sourcing from Source to Target databases.
  • Executed Informatica ETL workflows to update test databases from the production servers.
  • Worked closely with developers to create test scripts for the Informatica ETL.
  • Tested HRA claims data for medical and pharmacy from pharmacy ODS and flat files.
  • Tested workflow tasks such as session, command and decision and email.
  • Tested Informatica Mappings and worked on Staging area to validate the data with SQL Queries.
  • Tested several reports developed in MS Reporting Services which were migrated from Business Objects XI R2.
  • Tested various jobs and performed data loads and transformations using different Informatica stages and pre-built routines, functions, and macros.
  • Wrote complex SQL queries to validate EDW data versus EDM source data including identification of duplicate records and quality of data based on Mapping/Transformation rules.
  • A given test case was marked 'Pass' only if the EDW results (actual values) exactly matched the EDM results (expected values).
  • Initiated and developed new techniques to improve the testing process and improve the performance of the test scripts.
  • Ran SQL queries to verify the number of records from Source to Target and validated the referential integrity, Time variance, Missing records, Nulls/Defaults/Trim spaces rules as per the design specifications.
  • Worked with XML feeds from multiple source systems and loaded them into the Enterprise Data Warehouse.
  • Verified correctness of data after the transformation rules were applied on source data.
  • Coordinated execution of User Acceptance Testing, regression and integration testing with multiple departments.
  • Employed data modeling and table normalization (de-normalization for ad hoc reporting) to establish data and application integrity.
  • Identified appropriate test data in relevant source systems and incorporated this data into test scripts.
  • Regularly updated the QA team on testing status and reported completed tasks for assigned work to the Project Management team.
  • Performed Regression testing of the fixed issues in the new build until no new issues are identified.
  • Submitted weekly bug or issue report updates to the Project Manager in the form of the QA Error Log.
  • Submitted Final Test Report and required documentation for the entire project within the assigned time frame.
  • Prepared documentation on how to approach and validate the data in Data-warehouse.
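Duplicate-record identification of the kind described above is usually a GROUP BY / HAVING query on the business key. A hedged sketch with a made-up policy table (SQLite in place of the EDW database; names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE edw_policy (policy_no TEXT, holder TEXT);
INSERT INTO edw_policy VALUES ('P-1', 'Ann'), ('P-2', 'Bob'), ('P-1', 'Ann');
""")

# Any business key appearing more than once indicates a duplicate-load defect.
dupes = conn.execute("""
    SELECT policy_no, COUNT(*) AS n
    FROM edw_policy
    GROUP BY policy_no
    HAVING COUNT(*) > 1
""").fetchall()

assert dupes == [('P-1', 2)]
```

The same shape of query, keyed on whatever the mapping document names as the natural key, covers the "identification of duplicate records" checks called out in the bullets above.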

Environment: Informatica 8.1.1, Business Objects XI R2, Windows XP, Oracle 9i, Sybase, IBM DB2, TOAD for Oracle, UNIX, Reflections, Mercury Quality Center, PVCS (Dimensions), MS Project, FACETS 4.3, FACETS 4.5, upgraded to 8, NCA Mainframe for CA7 Job Scheduler, Hyperion Performance Suite 8.5, Verisk Health Systems.

Confidential, Omaha, NE

ETL/QA Analyst


  • Wrote test plans and test cases, executed test cases for SQL/backend testing, and tracked defects in Quality Center based on the Business Requirements, Functional Requirements, Business Workflows, A&D documents, and ICD documents.
  • Participated in BREQ meetings and FREQ meetings to keep track of new requirements from the project.
  • Reviewed the A&D documents with Tech lead, Database Developers and Test team for better understanding of the requirements.
  • Performed System Testing, Integration System Testing (IST), End-to-End (E2E) and D2D testing, Environment Shakeout testing, Implementation Shakeout testing, regression testing, UAT, and Production testing per the needs of the application, and recorded and tracked issues/defects in Quality Center.
  • Participated in End to End testing flowing orders from Order entry to Billing
  • Validated outgoing XML messages
  • Validated flat files coming from downstream systems and mocked up feed files by using vi editor.
  • Performed backend testing by writing SQL statements (joins, inner joins, outer joins, and self joins) using TOAD and SQL Developer
  • Performed Backend/SQL Test for Insert, Update, and Functions.
  • Ran mainframe JCL jobs to collect data from mainframe systems and load it into Oracle tables.
  • Participated in business requirements reviews and software system (application) designs for testability.
  • Provided immediate notification and escalation of issues to management.
  • Coordinated the Onsite and Offshore teams to resolve issues.
  • Tracked and reported on testing activities, including test case execution status, the status of any defects opened during execution, and test results.
  • Ensured the content and structure of all testing artifacts were documented in SharePoint.
  • Tested the data extraction procedures designed to extract data into flat feed files.
  • Connected remotely to UNIX servers using PuTTY and transferred files via FTP across different test environments using Hummingbird and the command prompt.
  • Executed the UNIX shell scripts that invoked SQL loader to load data into tables.
  • Generated Reports in Front end and validated Reports in Backend to ensure data integrity and validate Business rules.
  • Maintained Traceability Matrix to track the requirements to the test cases to ensure complete test coverage in the Mercury Quality Center.
  • Clearly communicated defects with developers and updated comments in Mercury Quality center.
  • Executed the regression test cases along with testing new enhancements by using QTP and analyzed results.
  • Reviewed Error log files in UNIX box when order fails to load into SQL tables.

Environment: Informatica 7.1, Windows 2000/XP, Oracle 9.i, Outlook, UNIX, Humming Bird, Putty, TOAD, SQL Developer, XML, Quality Center 9.2 IE 6.0, 7.x, JAVA, Web Services, JCL, COBOL, DB2.

Confidential, Overland Park, KS

ETL/ QA Analyst


  • Wrote the Quality Test Plan based on the Business Plan; the Qualified Idea, Concept, Definition, and Development phases; and the Plan, Test Strategy, Business Requirements, Solution Design, and Interface Design documents.
  • Created detailed test cases for each test phase to ensure complete coverage; test cases incorporated both positive and negative test conditions. Executed test cases from Quality Center
  • Maintained Traceability Matrix to track the requirements to the test cases to ensure complete test coverage in the Mercury Quality Center.
  • Extracted Patient data from Server into flat file and imported to Oracle DB.
  • Loaded data into Oracle using SQL loader and validated data
  • Written SQL Queries and executed them using TOAD and SQL Editor.
  • Performed Shakedown testing, System testing, Workflow Integration testing, User Acceptance Testing, and Regression testing activities in the Test and Production support environments
  • Facilitated UAT (Business partners) in the areas of test planning and defect mitigation.
  • Attended application walkthroughs and reviewed requirements to understand the scope of the project; attended project meetings as needed and coordinated with the developers to resolve issues/defects.
  • Wrote shell scripts to invoke SQL*Loader to load data into the Oracle DB from flat files using control files.
  • Tested and validated HIPAA compliance requirements in the application.
  • Developed automated Test Scripts in QTP using VBScript for Regression Testing.
  • Uploaded Requirements and Excel Test Cases into Mercury Quality Center. Executed Test scripts from Test Lab manually.
  • Wrote and executed test cases, and documented defects in Test Director/Quality Center 8.x.
  • Worked as Quality Center project administrator, customized the project to meet testing needs.
  • Used Mercury Quality Center for test planning, defect report and executing the test cases.
  • Tested the application when the database was upgraded to Oracle 9i from an earlier Oracle release
  • Performed Interface testing for KMATE data (TMS, RIS and MPM) in the MRS via Integration Broker (IB) and validated data provided by all three systems was stored in the MRS SQL database.
  • Performed Back end testing by writing SQL statements using SQL Plus and TOAD.
  • Ran batch jobs using the Ascential DataStage Director client component to validate, schedule, run, and monitor jobs.
  • Verified logs after running ETL jobs.
  • Involved in System Testing, Regression Testing and Integration Testing.
  • Participated in requirements walkthroughs with users to get better understanding.
  • Used Rational Clear Quest for bug tracking and followed up with development team to verify bug fixes, and update bug status for third party application.
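Backend checks after a SQL*Loader run, such as trim-space and null validations on loaded columns, reduce to simple SQL predicates. An illustrative sketch (SQLite in place of Oracle; the patient table and its contents are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (patient_id INTEGER, last_name TEXT);
INSERT INTO patient VALUES (1, 'Smith'), (2, ' Jones '), (3, NULL);
""")

# Values loaded with stray leading/trailing whitespace.
untrimmed = conn.execute(
    "SELECT patient_id FROM patient WHERE last_name <> TRIM(last_name)"
).fetchall()

# Required fields that arrived null from the flat file.
nulls = conn.execute(
    "SELECT patient_id FROM patient WHERE last_name IS NULL"
).fetchall()

assert untrimmed == [(2,)]
assert nulls == [(3,)]
```

Each non-empty result maps to a data-quality defect to log against the load, exactly the kind of finding tracked in ClearQuest/Quality Center above.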

Environment: Windows 2000 Pro, MS Office, Crystal Reports, IE, Attachmate for access to non-Imagecast applications (Mainframe), MS IIS, DBMS platform MS SQL 2000, Wintel Server, Citrix Server, ConnectR, Integration Broker (IB), GE Centricity RIC-IC, ASP, XML, Java, C++, UNIX, Ascential DataStage, HTML, KMATE Cache, KPHC, Zebra label printer, RightFax system, MRS, MS Project, Quality Center, Test Manager, ClearQuest.
