
Sr. Data Warehouse Tester (ETL and BI) Resume

Phoenix, AZ

OBJECTIVE:

  • Multifaceted, seasoned, and highly motivated IT professional offering more than 9 years of extensive experience in software testing and quality assurance.
  • Knowledgeable about software life cycle methodologies and the software delivery life cycle process. Expert in software testing process improvement and quality assurance, focused on achieving project goals and objectives.

SUMMARY:

  • Over 9 years of extensive experience in the information technology field as a Software Quality Assurance Analyst, Software Tester, and Scrum Master
  • Hands-on experience in testing ETL applications built using Web services based on Service Oriented Architecture and Web applications (using J2EE and Microsoft .NET n-tier architecture)
  • Extensive experience in and good understanding of SDLC, STLC, Software Quality Assurance, and Software Configuration and Release Management
  • Extensive experience in analyzing Business Requirements, Functional and Technical Specifications
  • Hands-on experience in analyzing Business Requirements, Functional Requirements and Design Specifications, and assisting in developing Test Plans and Test Scenarios
  • Hands-on experience in Functional testing, Integration testing, System testing, Regression testing, Conversion testing, Interface testing, Parallel testing and End-to-End testing
  • Expertise in developing Test Strategies, Test Plans and Test Scenarios using standard and modular script development approaches in MS Office and Quality Center
  • Hands-on experience with Agile/Kanban methodologies, using VersionOne and TFS as Agile lifecycle management tools
  • Hands-on experience in ETL testing using ODI (Oracle Data Integrator), Informatica and DataStage 11.3, and reporting and analysis tools such as Business Objects, Oracle OBIEE, Hyperion and EPM 11
  • Extensively used Informatica Workflow Manager to run the workflows/mappings and monitored the session logs in Informatica Workflow Monitor
  • Experience in testing database applications on RDBMS platforms including Oracle, SQL Server, DB2 and MS Access
  • Hands-on experience in Data warehouse testing, including ETL testing using the ETL tools Oracle Data Integrator (ODI), Informatica PowerCenter and DataStage 11.3
  • Hands-on experience in BI report testing using OBIEE, Hyperion, Enterprise Performance Management (EPM 11) and Business Objects
  • Good understanding of the OBIEE repository (Physical, Business Model and Mapping, and Presentation layers) for both stand-alone and integrated analytics implementations
  • Hands-on experience in writing complex SQL (DDL, DML and DCL) queries for back-end data validation testing
  • Extensive knowledge of ETL processes, OLAP, OLTP, N-tier data architecture and RDBMS
  • Expertise in writing simple and complex SQL queries using TOAD, SQL Developer
  • Hands-on experience in Test Planning, Test Execution and Reporting using Microsoft Test Manager and Lab Management
  • Expertise in Defect Tracking, Defect Management and Reporting using tools like Quality Center, BugTracker and JIRA
  • Hands-on experience with the configuration and version management tools Subversion (SVN) and JIRA
  • Experience in testing and validation of standard business and ad-hoc reports
  • Ability to work independently and as a team member in challenging, cross-platform environments
  • Highly motivated team player with excellent analytical, problem solving, interpersonal and communication skills

TECHNICAL SKILLS:

Languages: VBScript | ASP | HTML | SQL and PL/SQL | Java | .NET Framework | Visual Basic

Technologies: MS .NET Web Technology | Client/Server | PC | Citrix | Active Directory | LDAP | Oracle Identity Manager | SailPoint IdentityIQ Connector Manager

Operating Systems: UNIX | Windows | Linux

Tools: Microsoft Team Foundation Server (TFS) | Microsoft Test Manager (MTM) | Application Lifecycle Management | Quality Center | Rally | JIRA | SharePoint | Subversion | VersionOne

Data warehouse: Informatica Power Center | Oracle Data Integrator (ODI) | SSIS | AutoSys | DataStage 11.3

Reporting tools: OBIEE | Business Objects | Hyperion | Essbase | EPM 11 | Crystal

Databases: Oracle | MS Access | DBMS | DB2 | Informix | SQL Server

Methodologies: Waterfall | V-Model | Agile | Validation & Verification | Kanban

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

Sr. Data Warehouse Tester (ETL and BI)

Responsibilities:

  • Defined requirements, designed and documented standards, procedures, and implementation plans for Software Configuration Management
  • Managed requirements gathering and technical design for Data warehouse and reporting
  • Managed resource allocation and team members to design and execute test procedures
  • Assisted team as needed to help remove impediments and distractions that interfere with the ability of the team to deliver on the Sprint goal.
  • Facilitated daily stand-ups, backlog grooming/refinement sessions, sprint/story planning sessions and closely worked with product owner to maintain stories/tasks/tests and defects on VersionOne.
  • Closely worked with product owners, data architect and OBIEE architect to assist the team with any impediments, technical clarifications or requirement understanding
  • Responsible for providing weekly team progress reports to management and sending notifications with weekly release review information
  • Facilitated monthly retrospectives to gather feedback from each team member on what went well, what did not go well, and what could be improved so that the team could deliver user stories more efficiently
  • Customized and maintained Test management tool MTM
  • Responsible for test case development, execution, defect management and tracking
  • Responsible for ETL validation, data validation between source and target, and report validation (a representative SQL sketch follows this list)
  • Responsible for creating test data and a traceability matrix to test end-to-end validation of complex DataStage jobs
  • Tested and verified that the source data is extracted correctly, transformed and loaded into the target tables (ODS to EDW) by writing and executing SQL queries
  • Facilitated defect review meetings, improved the existing defect management process, and drove the team to work faster by reporting defect status
  • Worked closely with the business users and developers to make sure that business requirements are correctly translated into database schemas and ultimately into the application
  • Requirement validation of the reports and drill downs with business users
  • Validated the customization of the OBIEE Repository (Physical, Business and Presentation layer)
  • Created dashboards with global prompts, column selectors, navigation, and automatic drill-down
  • Tested the user level security for intelligence dashboards based on Business Requirement
  • Validated the Dashboards with Dashboard prompts, Charts, Pivot tables
  • Performed Data Validation of data in the reports and dashboards for OBIEE
  • Responsible for Defect life cycle process, maintain and track Defects using Quality Center.
  • Generated defect reports using VersionOne and SQL statements on a daily and weekly basis in the form of tables and graphs using Microsoft PowerPoint, prepared associated action plans, and documented and distributed them across the various project teams
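
A minimal sketch of the kind of source-to-target SQL validation referenced above, assuming hypothetical ODS and EDW table names (ods_customer, edw_customer) and columns that are illustrative only, not taken from the actual project:

    -- Hypothetical ODS-to-EDW validation; table and column names are illustrative
    -- 1. Row-count reconciliation between the ODS source and the EDW target
    SELECT (SELECT COUNT(*) FROM ods_customer) AS ods_count,
           (SELECT COUNT(*) FROM edw_customer) AS edw_count
      FROM dual;

    -- 2. Data-difference check: rows present in the source but missing or changed in the target
    SELECT customer_id, first_name, last_name, status
      FROM ods_customer
    MINUS
    SELECT customer_id, first_name, last_name, status
      FROM edw_customer;

In this kind of testing, a count mismatch or a non-empty MINUS result is typically logged as a defect against the corresponding ETL job.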

Environment: DataStage 11.3, OBIEE 11g, Exadata, SQL, Microsoft Visual Studio, MTM, MS Office, SQL Server, Oracle Data Staging, Version One, Agile, Kanban

Confidential, MEMPHIS, TN

QA Lead/ Security Tester

Responsibilities:

  • Participated in requirements and functional specification reviews and walkthroughs.
  • Reviewed and analyzed the requirements documents and design documents.
  • Developed and maintained Functional and UAT Test Plans.
  • Developed Test Scenarios and Test cases for Functional, Integration, Security and UAT testing.
  • Developed and executed the test scenario to validate the security with single sign on, Roles, Responsibility and Entitlements for individuals and groups.
  • Responsible for validating Roles and Responsibility given to the user by checking against new hire data coming from HR department.
  • Reviewed and analyzed password sync-up documents. Created and executed test cases to meet the functional requirement for password matching.
  • Developed Test scenario and test cases for provisioning and de-provisioning of the Mail box and Lync Accounts.
  • Tested Integration of provisioning solutions with Active Directory, LDAP on UNIX platform.
  • Developed process documents, user’s manual and defect logging process to support UAT testing.
  • Conducted test execution/status meetings with project stakeholders.
  • Supported UAT testing and created defect guideline documents for users using HP ALM.
  • Conducted meetings and walkthroughs with users and junior team members to discuss the defect management process in JIRA.
  • Created integration test scenarios and tested application areas developed on SOA architecture and Web Services.
  • Generated and distributed test progress and defect status reports to the management team.
  • Facilitated defect review meetings, improved the existing defect management process, and drove the team to work faster by reporting defect status.
  • Wrote and used SQL queries to set up and validate test data (a hedged example follows this list).
  • Performed UAT testing with business users.
  • Created and reported testing progress and status from JIRA
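
As a rough illustration of the test-data setup and validation queries mentioned above, the sketch below assumes hypothetical tables (hr_new_hire and idm_provisioned_account) standing in for the HR feed and the provisioning store; the names, columns and values are illustrative only:

    -- Hypothetical test-data setup: seed a new-hire record to drive provisioning (illustrative names)
    INSERT INTO hr_new_hire (employee_id, first_name, last_name, department, hire_date)
    VALUES (90001, 'Test', 'User', 'FINANCE', CURRENT_TIMESTAMP);

    -- Validation: confirm the account was provisioned with the expected status and role
    SELECT a.employee_id, a.account_status, a.assigned_role
      FROM idm_provisioned_account a
     WHERE a.employee_id = 90001;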

Environment: SailPoint IdentityIQ, Sun Identity Manager, Sun Access Manager, Active Directory, LDAP, UNIX .NET, Podio, Microsoft SQL Server 2008, Management Studio, MS office 2007, Share Point, Subversion, JIRA, J2EE, HTML, XML.

Confidential, OXFORD, MS

Sr. QA Analyst/BI Tester

Responsibilities:

  • Responsible for implementing QA from the initial stage of the SDLC; involved in requirements gathering and analysis along with the Business Analyst.
  • Participated in gap analysis for the given requirements and mocked up reports and dashboards for review by the business users
  • Analyzed and reviewed Product Backlogs and Task Boards in TFS
  • Analyzed technical design specification documents, business rules and data validation specifications to identify deficiencies in test plans and procedures, and enhanced test coverage accordingly
  • Responsible for conducting QA entrance and exit criteria meetings with project stakeholders
  • Created integration test scenarios and tested application areas developed on SOA architecture and web services
  • Worked in an Agile environment, using sprints to develop plans, test suites and test cases, and executed test cases using Microsoft Test Manager (MTM) 2012
  • Managed and updated test plans, test cases and defects using MTM 2012
  • Participated in requirements definition reviews and functional specification reviews to ensure that critical information was included and testability requirements were met
  • Developed Test Approach and Test Plan documents for testing different Data Marts
  • Participated in production issue review meetings with the business and development teams
  • Responsible for developing the test cases to validate the ETL process, business rules, data transformation rules, report layouts and data
  • Created requirements and test matrices for ETL and Reporting test cases in Quality Center
  • Developed test cases using a standard test case template in Excel and uploaded them to Quality Center
  • Validated access to the system: security and navigation to particular screens, dashboards, pages and reports
  • Validated access to the right data, screens, dashboards, pages and reports; response time to access the application and particular dashboards, pages and reports; capability and ease of using prompts to select data on a report with specific attributes; and use of drill-down and navigation on OBIEE reports
  • Reported defects found during test cycles, tracked the defects and retested fixed programs; developed and customized Excel reports and graphs using the dashboard for reporting in MTM
  • Defined and developed a standard testing approach and timelines based on the given testing time frame
  • Wrote and executed SQL queries to get testing data to validate LOS Orders
  • Tracked, reported and followed up on defects using Microsoft Test Manager

Environment: OBIEE 11.0, EPM 11, ODI 11.0, Microsoft Test Manager 2012, JIRA 7.0, TOAD 8.0, SQL Server 2010, ASP, Microsoft Visual Studio 11.0, VBScript, Windows NT, SQL, .NET, Web Services, XML, XML Schema, DB2, UNIX, Confluence, SharePoint.

Confidential, Kansas City, MO

Sr. Data warehouse Tester

Responsibilities:

  • Participated in gap analysis for the given requirements and mocked up reports and dashboards for review by the business users
  • Developed Test Approach and Test Plan documents for testing different Data Marts
  • Participated in production issue review meetings with the business and development teams
  • Analyzed the technical design specification documents, business rules and data validation specifications to identify deficiencies in test plans and procedures, and enhanced test coverage accordingly
  • Responsible for conducting QA entrance and exit criteria meetings with project stakeholders
  • Responsible for developing the test cases to validate the ETL process, business rules, data transformation rules, report layouts and data
  • Responsible for developing and executing test cases to validate table/view structures, data conversion, and data mapping between source and staging, staging and EDW, and EDW and Data Mart
  • Responsible for writing and executing test cases to validate transformation rules, record counts, logical and physical deletes, exceptions and errors at the different stages of the ETL process (see the SQL sketch after this list)
  • Responsible for validating the schema, fact tables and dimension tables
  • Created requirements and test matrices for ETL and Reporting test cases in Quality Center
  • Developed test cases using a standard test case template in Excel and uploaded them to Quality Center
  • Validated access to the system: security and navigation to particular screens, dashboards, pages and reports
  • Validated access to the right data, screens, dashboards, pages and reports; response time to access the application and particular dashboards, pages and reports; capability and ease of using prompts to select data on a report with specific attributes; and use of drill-down and navigation on OBIEE reports
  • Documented software defects in the defect tracking system Quality Center and kept defects up to date
  • Wrote and executed SQL queries using SQL Developer, TOAD and Microsoft SQL Server
  • Requirement validation of the reports and drill downs with business users
  • Validated the customization of the OBIEE Repository (Physical, Business and Presentation layer)
  • Created dashboards with global prompts, column selectors, navigation, and automatic drill-down
  • Tested the user level security for intelligence dashboards based on Business Requirement
  • Validated the Dashboards with Dashboard prompts, Charts, Pivot tables
  • Performed Data Validation of data in the reports and dashboards for OBIEE
  • Used SharePoint to manage documents, schedules, pre-production meetings, issues and risks
  • Worked closely with national, state and county users; prepared the Information Bulletin for releases and prepared the user's manual for field use
  • Responsible for validating the files availability, format and data on UNIX server
  • Utilize defect tracking tool HP Quality Center to trace, assign, verify and close defects
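
A minimal sketch of the record-count and logical-delete checks referenced above, using hypothetical staging and warehouse tables (stg_orders, edw_order_fact) whose names and columns are illustrative only:

    -- Hypothetical ETL stage validation; table and column names are illustrative
    -- 1. Record-count check per load batch between staging and the EDW fact table
    SELECT s.batch_id, s.stg_count, f.fact_count
      FROM (SELECT batch_id, COUNT(*) AS stg_count FROM stg_orders GROUP BY batch_id) s
      JOIN (SELECT batch_id, COUNT(*) AS fact_count FROM edw_order_fact GROUP BY batch_id) f
        ON s.batch_id = f.batch_id
     WHERE s.stg_count <> f.fact_count;

    -- 2. Logical-delete check: soft-deleted source rows should not remain active in the target
    SELECT f.order_id
      FROM edw_order_fact f
      JOIN stg_orders s ON s.order_id = f.order_id
     WHERE s.delete_flag = 'Y'
       AND f.active_flag = 'Y';

Any rows returned by either query would typically be recorded as defects in Quality Center against the corresponding ETL mapping.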

Environment: Oracle Data Integrator 11g/10g, Oracle 11i, OBIEE 10.1.3, EPM 11, Hyperion 8.5, Erwin 3.1, TOAD, SQL Assistant, .NET, Microsoft SQL Server 2008, SQL Server Management Studio, MS Office 2007, SharePoint, Subversion, Quality Center 11.0, Quick Test Professional (QTP)

Confidential, Philadelphia, PA

Sr. Data Warehouse/BI QA Analyst

Responsibilities:

  • Analyzed the Requirements from the client and developed Test cases based on functional requirements, general requirements and system specifications
  • Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan
  • Prepared Test Cases and Test Plans for the mappings developed through the ETL tool from the requirements
  • Extensively used Informatica Workflow Manager to run the workflows/mappings and monitored the session logs in Informatica Workflow Monitor
  • Verified session logs to identify errors that occurred during the ETL execution
  • Created test cases and a traceability matrix based on the mapping document and requirements
  • Performed complex data validation using SQL queries
  • Verified the logs to identify errors that occurred during the ETL execution
  • Wrote several complex SQL queries for data verification and data quality checks (see the sketch after this list)
  • Reviewed the test cases written based on the Change Request document; testing was done based on Change Requests and Defect Requests
  • Prepared System Test Results after test case execution
  • Tested the ETL Informatica mappings and other ETL Processes (DW Testing)
  • Effectively coordinated with the development team for closing a defect
  • Prepared Test Scenarios by creating Mock data based on the different test cases
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process
  • Debugged and scheduled ETL jobs/mappings and monitored error logs
  • Tested reconcile and incremental ETL loads for the project
  • Tested data migration to ensure that integrity of data was not compromised
  • Ran Data Stage jobs by using Data Stage Director and by running UNIX scripts as well
  • Wrote complex SQL queries for extracting data from multiple tables and multiple databases
  • Used Rational Clear Quest as defect tracking tool
  • Performed smoke/shake-out tests before a load was accepted for testing
  • Used Oracle database to test the Data Validity and Integrity for Data Updates, Deletes & Inserts
  • Involved in Smoke and System Testing
  • Performed data validation for the Cognos reports
  • Tested reports in Cognos using Analysis Studio, Report Studio and Query Studio
  • Worked in the role of data analyst in mapping and scrubbing sensitive data within the application
  • Performed analysis and data digging using SQL to find the root cause of defects
  • Worked effectively to meet the deadlines for each release
  • Involved in testing Informatica mappings, by validating whether the mapping adheres to development standards and naming conventions; whether the mapping does what the design says it should do
  • Developed UNIX Shell Scripts for scheduling various data cleansing scripts and loading process
  • Provided the management with weekly QA documents like test metrics, reports, and schedules
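
A minimal sketch of the kind of data quality checks referenced above, using hypothetical dimension and fact tables (dim_customer, fact_sales) whose names and columns are illustrative only:

    -- Hypothetical data quality checks; table and column names are illustrative
    -- 1. Duplicate natural keys in a dimension table
    SELECT customer_key, COUNT(*) AS dup_count
      FROM dim_customer
     GROUP BY customer_key
    HAVING COUNT(*) > 1;

    -- 2. Orphaned fact rows whose dimension key has no matching dimension record
    SELECT f.sales_id, f.customer_key
      FROM fact_sales f
      LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
     WHERE d.customer_key IS NULL;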

Environment: Oracle 11g, Informatica PowerCenter 8.6, OBIEE 10.0, SQL, PL/SQL, UNIX, PuTTY, Session Log files, Flat files, XML files, DB2, Sybase, WinSCP, CompareIT, HP Quality Center, AutoSys, SQL Server 2008, and TOAD

Confidential

Sr. Data warehouse/BI Tester

Responsibilities:

  • Defined requirements, designed and documented standards, procedures, and implementation plans for Software Configuration Management
  • Managed requirements gathering and technical design for Data warehouse and reporting
  • Administered testing ID setup and maintenance using identity management tools.
  • Managed resource allocation and team members to design and execute test procedures
  • Customized and maintained Test management tool Quality Center
  • Led requirements definition and dimensional modeling for the Oracle 11g ODI and OBIEE implementation; utilized Oracle Warehouse Builder and an early ODI ETL implementation. Managed test lab resources and processes to allocate ETL testing and to simulate customer production environments.
  • Responsible for Test cases development, Execution, defect management and tracking
  • Facilitated defect review meetings, improved the existing defect management process, and drove the team to work faster by reporting defect status.
  • Coordinated and executed the ESP jobs and ODI packages in the test environment
  • Responsible for writing and executing SQL queries using TOAD
  • Responsible for ETL validation, data validation between source and target, and report validation
  • Worked closely with the business users and developers to make sure that business requirements are correctly translated into database schemas and ultimately into the application
  • Requirement validation of the reports and drill downs with business users
  • Validated the customization of the OBIEE Repository (Physical, Business and Presentation layer)
  • Created dashboards with global prompts, column selectors, navigation, and automatic drill-down
  • Tested the user level security for intelligence dashboards based on Business Requirement
  • Validated the Dashboards with Dashboard prompts, Charts, Pivot tables
  • Performed Data Validation of data in the reports and dashboards for OBIEE
  • Responsible for Defect life cycle process, maintain and track Defects using Quality Center.
  • Generated defect reports using the Quality Center API and SQL statements on a daily and weekly basis in the form of tables and graphs using Microsoft PowerPoint, prepared associated action plans, and documented and distributed them across the various project teams.

Environment: SQL Developer, ODI Release 10.1.3, Business Objects, Hyperion 8.5, Informix Client, Microsoft SQL Server 2008, SQL Server Management Studio, Windows Vista, Excel 2007, Word 2007
