ETL Tester Resume
SUMMARY:
- Over twelve (12+) years of IT experience in testing Business Intelligence solutions using Data Warehouse/Data Mart, ETL, OLAP, and Client/Server applications.
- Expertise in creating Test Plan documents, developing Test Strategy documents, and preparing Traceability Matrices.
- Experience testing ETL systems (DataStage and Informatica).
- Strong knowledge of different development lifecycles and methodologies, including SDLC and the Agile development approach.
- Strong experience with test management tools such as Test Director, Quality Center, HP ALM, JIRA, and Rational Quality Manager.
- Experience with Integration testing of applications using various sources such as flat files, Sybase, and Oracle databases.
- Strong working experience in the Data Analysis, Design, Development, Implementation, and Testing of Data Warehousing using Data Conversion, Data Extraction, Data Transformation, and Data Loading (ETL).
- Experience in debugging issues using Informatica Workflow Manager, Informatica Workflow Monitor, Autosys, and database queries.
- Experience in Functional Testing, Integration Testing, Regression Testing, GUI Testing, Smoke Testing, System Testing.
TECHNICAL SKILLS
Operating Systems: UNIX and WINDOWS
Programming Languages: SQL, PL/SQL, JAVA, Unix Shell Scripting
Database Technologies: Oracle 10g/9i/8i, DB2, SQL Server 2005/2008/2012, Teradata 14
GUI Tools: Toad, WinSQL, SAS Enterprise Manager, SQL Developer, Teradata SQL Assistant
Version Control Tools: VSS, SVN
Tools & Utilities: Informatica, DataStage, Ab Initio, SAS, OBIEE, Cognos, Crystal Reports, Business Objects, SQL*Loader, SQL Server DTS, BCP, Import/Export, Autosys, TWS
Others: SharePoint, MS Office, MS Visio, MS Project, Visual Source Safe
CASE Tools: ERWIN
Testing Tools: Test Director 7.0/8.0, Quality Center 10.0/11.0, RQM
PROFESSIONAL EXPERIENCE:
Confidential
ETL Tester
Responsibilities:
- Ensure that quality processes are carried out through all phases of the Software Development Lifecycle
- Work with business partners, systems analysts, designers, and programmers to create/analyze required project documents and ensure that quality assurance processes are incorporated
- Analyze and dissect system requirements and technical specifications to create and execute test cases for large business initiatives
- Write test cases and execute testing within a UNIX/database environment
- Develop processes for Data Analysis and Data Flow testing.
- Apply solid understanding of the Software Development Life Cycle, Data Warehouse ETL and QA process.
- Establish and maintain test cases and test data
- Create and maintain test cases in Quality Center
- Actively participate in walkthroughs, inspections, reviews, and user group meetings for quality assurance
- Work with business users, system analysts, designers and programmers to create and analyze various required project documents
- Participate in production implementation verification and being accountable for validating system quality
- Plan, document, evaluate and track testing results to ensure system applications are free from defects
- Communicate and interact with appropriate areas on problems, changes and enhancements that may impact data, workflow and /or functionality within Information Technology software
- Comply with standards and strategies of the software development life cycle and follow plans and procedures within Information Technology software
- Participate in Requirements and Design reviews; plan and estimate the QA effort.
- Plan test strategies in accordance with project scope and develop schedules to meet delivery dates.
- Design test plans, scenarios, scripts, and procedures.
- Develop testing programs that address areas such as database impacts, software scenarios, regression testing, negative testing, error or defect retests, or usability
Environment: Datastage, DB2, TWS, Teradata, UNIX
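A typical source-to-target check behind the Data Flow testing described above can be sketched as follows. This is a minimal illustration: the table and column names are hypothetical, and an in-memory sqlite3 database stands in for the actual DB2/Teradata sources.

```python
import sqlite3

# Hypothetical source and target tables; sqlite3 stands in for the
# DB2/Teradata databases an ETL tester would actually query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_customers (id INTEGER, name TEXT);
CREATE TABLE tgt_customers (id INTEGER, name TEXT);
INSERT INTO src_customers VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cho');
INSERT INTO tgt_customers VALUES (1, 'Ann'), (2, 'Bob');
""")

# Check 1: row counts must reconcile between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]

# Check 2: a minus-style query surfaces rows dropped by the load.
missing = cur.execute("""
SELECT id, name FROM src_customers
EXCEPT
SELECT id, name FROM tgt_customers
""").fetchall()

print(src_count, tgt_count, missing)  # row 3 was not loaded
```

In Oracle the second check would use `MINUS` rather than `EXCEPT`; the reconciliation pattern is otherwise identical.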
Confidential
LEAD - ETL Tester
Responsibilities:
- Developed Test Plans, Test Strategy, and Test Cases, and decided on automation when required.
- Run jobs from Informatica and Autosys.
- Created test data to validate business logic/requirements.
- Developed Test Plans, Test Cases, and Test Scripts for UAT tests.
- Tested Reports in the Dashboards for the Oracle Financial Analytics modules such as GL, AP, and AR, and created Ad-hoc reports according to client needs.
- Configured Interactive Dashboard with drill-down capabilities using global and local Filters, Metadata Objects and Web Catalog Objects.
- Tested Cascading Prompts and Multi Level Prompts on Dashboards.
- Identified issues, performance bottlenecks, and optimized the Business Intelligence Dashboards and Reports.
- Tested different kinds of reports (pivots, charts, tabular) using global and local Filters.
- Wrote Custom SQL Codes for the reports.
- Applied Optimization Techniques for the better performance of the reports.
- Involved in monitoring and scheduling different process flows using Oracle Workflow Manager.
- Used external tables to Transform and Load data from Flat files into target tables.
- Used Debugger to validate Mappings by creating break points to analyze and monitor Data flow.
- Involved in troubleshooting the load failure cases, including database problems.
- Created Test Cases, Test Scripts, and defects in RQM (Rational Quality Manager).
- Managed and conducted System testing, Integration testing, Functional testing, and Regression testing.
- Executed batch processing and verified job status and data in database tables.
- Tracked the defects using RQM and generated defect summary reports.
- Prepared status summary reports with details of executed, passed and failed test cases.
- Interacted with developers, Business & Management Teams and End Users.
- Validated that requirements were correctly implemented using the database, INFA Mapping Designer, and Autosys.
- Deployed Autosys files (.txt and .jil) to the Autosys database using the Autosys GUI.
- Created and changed the database connections in INFA workflow manager.
- Changed SQL queries in INFA Mapping Designer and INFA Workflow Manager
Environment: Informatica, OBIEE, Autosys
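One way to validate a dashboard measure with custom SQL, as in the report testing above, is to recompute the aggregate from detail rows and compare it to what the report displays. The GL-style table and figures below are hypothetical stand-ins, not the client's actual model:

```python
import sqlite3

# Hypothetical GL detail table; sqlite3 stands in for the warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE gl_detail (account TEXT, amount REAL);
INSERT INTO gl_detail VALUES
  ('AP', 100.0), ('AP', 250.0), ('AR', 75.0);
""")

# Recompute the per-account totals the dashboard should display.
totals = dict(cur.execute(
    "SELECT account, SUM(amount) FROM gl_detail GROUP BY account"
).fetchall())

# Compare against the values rendered on the report (stubbed here).
report_values = {"AP": 350.0, "AR": 75.0}
mismatches = {k: (totals.get(k), v)
              for k, v in report_values.items() if totals.get(k) != v}
print(mismatches)  # an empty dict means the report ties out
```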
Confidential
ETL Tester (QA Lead)
Responsibilities:
- Developed Test Plans, Test Strategy, and Test Cases, and decided on automation when required.
- Ran SQL queries in Apache Hive to validate data.
- Created test data to validate business logic/requirements.
- Involved in the entire project life cycle from analysis, installation, development, testing, production and end user support
- Developed Test Plans, Test Cases, and Test Scripts for UAT tests.
- Created Test Cases in Quality Center and mapped them to requirements through a Traceability Matrix and Test Coverage reports.
- Managed and conducted System testing, Integration testing, Functional testing, and UAT and Regression testing.
- Executed batch processing and verified job status and data in database tables.
- Tracked the defects using Quality Center and generated defect summary reports.
- Prepared status summary reports with details of executed, passed and failed test cases.
- Used Autosys jobs to run Informatica Workflows.
Environment: Informatica, SAS, Island Pacific (AS/400), OBIEE, Autosys
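The Hive validation queries mentioned above commonly include duplicate-key and mandatory-column null checks on the loaded tables. The query pattern below is plain ANSI SQL that also runs in HiveQL; it is demonstrated here against sqlite3 with a made-up orders table:

```python
import sqlite3

# Made-up target table; sqlite3 stands in for the Hive warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (order_id INTEGER, status TEXT);
INSERT INTO orders VALUES (1, 'NEW'), (2, 'SHIPPED'), (2, 'SHIPPED'), (3, NULL);
""")

# Duplicate-key check: any key appearing more than once fails the load.
dupes = cur.execute("""
SELECT order_id, COUNT(*) FROM orders
GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()

# Null check on a column the mapping spec marks as mandatory.
null_rows = cur.execute(
    "SELECT COUNT(*) FROM orders WHERE status IS NULL"
).fetchone()[0]

print(dupes, null_rows)  # both results should be empty/zero on a clean load
```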
Confidential
Responsibilities:
- Involved in Creating and Administering the Physical Layer and the Business Model & Mapping Layer
- Involved in Design and Data Modeling using Star schema.
- Developed several mappings to load data from multiple sources to data warehouse.
- Developed Test Plans, Test Strategy, and Test Cases, and decided on automation when required.
- Involved in the entire project life cycle from analysis, installation, development, testing, production and end user support
- Developed Test Plans, Test Cases, and Test Scripts for UAT tests.
- Developed shell scripts to run Ab Initio jobs
- Executed Informatica workflows from Informatica workflow manager and Autosys.
- Created and administered the Presentation Layer using the Oracle Business Intelligence Admin tool.
- Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
- Created Dimensional Hierarchy, Level based Measures and Aggregate navigation in BMM layer.
- Managed security privileges for each subject area and dashboards according to user requirements.
- Created groups in the repository and added users to the groups and granted privileges explicitly and through group inheritance.
- Developed custom reports/Ad-hoc queries using Oracle Answers and assigned them to application specific dashboards.
- Developed different kinds of Reports (pivots, charts, tabular) using global and local Filters.
- Handled Full load and refresh load via staging tables in the ETL Layer.
- Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.
- Managed and conducted System testing, Integration testing, Functional testing, and UAT and Regression testing.
- Loaded data to different databases using SQL scripts to create required test data.
- Executed shell scripts to run PL/SQL programs and Cognos reports from Autosys.
Environment: Oracle, Business Objects, Autosys, Informatica
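Testing the transformation logic in the ETL layer above typically means re-deriving an expected value from the staging data and diffing it against what was actually loaded. The status-code mapping below is a hypothetical business rule, and sqlite3 stands in for the Oracle staging and target tables:

```python
import sqlite3

# Hypothetical staging and target tables with a made-up code mapping.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_accounts (acct_id INTEGER, status_code TEXT);
CREATE TABLE tgt_accounts (acct_id INTEGER, status_desc TEXT);
INSERT INTO stg_accounts VALUES (1, 'A'), (2, 'C'), (3, 'A');
INSERT INTO tgt_accounts VALUES (1, 'Active'), (2, 'Closed'), (3, 'Closed');
""")

# Re-derive the expected description from the source code value and
# flag any target row that disagrees; account 3 should surface.
bad_rows = cur.execute("""
SELECT s.acct_id, t.status_desc
FROM stg_accounts s
JOIN tgt_accounts t ON s.acct_id = t.acct_id
WHERE t.status_desc <> CASE s.status_code
                         WHEN 'A' THEN 'Active'
                         WHEN 'C' THEN 'Closed'
                       END
""").fetchall()
print(bad_rows)  # each row here is a transformation defect to log
```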
Confidential
LEAD - QA Test Analyst
Responsibilities:
- Developed Test Plans, Test Strategy, and Test Cases, and decided on automation when required.
- Involved in the entire project life cycle from analysis, installation, development, testing, production and end user support
- Developed Test Plans, Test Cases, and Test Scripts for UAT tests.
- Used Informatica as an ETL Tool for testing the Data Warehouse.
- Created Test Cases in Quality Center and mapped Test Cases to Requirements in ReqPro.
- Developed Traceability Matrix and Test Coverage reports.
- Managed and conducted System testing, Integration testing, Functional testing, and UAT and Regression testing.
- Loaded data to different databases using SQL scripts to create required test data.
- Used Shell scripts extensively for automation of file manipulation and data loading procedures.
- Execute batch jobs and verify status and data in database tables.
- Tracked the defects using Clear Quest and Quality Center and generated defect summary reports.
- Executed shell scripts to run PL/SQL programs and jobs.
- Executed Informatica sessions and tasks from Informatica workflow manager and validated results from database and Informatica workflow monitor.
- Prepared status summary reports with details of executed, passed and failed test cases.
- Validated report layout and data against requirements.
Environment: Informatica, Crystal Reports, Teradata
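A common check behind the file-manipulation and data-loading scripts above is reconciling a flat-file feed against the table it was loaded into. The pipe-delimited layout and table name below are illustrative, with an in-memory buffer and sqlite3 standing in for the real UNIX file and Teradata table:

```python
import csv
import io
import sqlite3

# Hypothetical pipe-delimited feed; io.StringIO stands in for the
# real flat file delivered to the UNIX host.
feed = io.StringIO("emp_id|name\n101|Ann\n102|Bob\n103|Cho\n")
rows = list(csv.reader(feed, delimiter="|"))[1:]  # skip the header record

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (emp_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO employees VALUES (?, ?)", rows)

# Reconcile: file record count must equal the loaded table count.
loaded = cur.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(len(rows), loaded)  # counts must match or the load is rejected
```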
Confidential, Saint Paul, MN
Senior ETL Tester
Responsibilities:
- Developed Test Plans, Test Strategy, and Test Cases, and decided on automation when required.
- Developed Test Cases for Functional, Integration and Regression Testing.
- Collected the test data from the central system to validate the test cases.
- Analyzed use case requirements and developed test cases.
- Ran SQL queries against the database using TOAD to check data integrity.
- Participated in testing activities such as planning, execution, bug tracking, and analysis.
- Created and executed SQL queries to fetch data from the database to validate and compare expected results with those actually obtained.
- Verified bug fixes delivered by developers during each phase of testing, including Black Box Testing.
- Analyzed, documented and maintained Test Results and Test Logs.
- Ran UNIX jobs via PuTTY to validate input and output files.
- Performed manual regression testing on each new build under test.
- Used Mercury’s Test Director to log defects and bugs.
- Ran Ab Initio jobs to validate mappings.
- Managed the Defect Tracking process, including prioritizing bugs, assigning bugs, and verifying bug fixes, using Quality Center
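The data-integrity checks above often include an orphan-row query between a fact table and its dimension. The star-schema names below are illustrative, and sqlite3 stands in for the database queried through TOAD:

```python
import sqlite3

# Illustrative fact/dimension pair for a referential-integrity check.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER, name TEXT);
CREATE TABLE fact_sales (sale_id INTEGER, product_id INTEGER);
INSERT INTO dim_product VALUES (10, 'Widget'), (20, 'Gadget');
INSERT INTO fact_sales VALUES (1, 10), (2, 20), (3, 30);
""")

# A fact row whose key has no dimension match is an integrity defect.
orphans = cur.execute("""
SELECT f.sale_id, f.product_id
FROM fact_sales f
LEFT JOIN dim_product d ON f.product_id = d.product_id
WHERE d.product_id IS NULL
""").fetchall()
print(orphans)  # sale 3 references a product missing from the dimension
```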
Confidential
ETL Test Analyst Leader
Responsibilities:
- Involved in analyzing the business process through Use Cases, Work Flows and Functional specifications.
- Analyzed the business and functional requirements of the application and developed detailed test plans, test cases in Test Director.
- Defined the test criteria and project schedules, and baselined the Test Plan with the help of project meetings and walkthroughs.
- Developed the Test plans for quality assurance based on functional requirements.
- Involved in automating the testing process using QTP as the testing tool.
- Performed database testing using SQL.
- Ran the DataStage jobs.
- Monitored memory leaks, network bottlenecks, and response times for each scenario.
- Involved in the documentation of the complete testing process.
- Interacted with the development and testing teams to improve overall software quality.
- Involved in creating periodic status reports.
- Followed Rational Unified Change Management Process.
- Created GUI, Bitmap, Database and Synchronization Checkpoints to compare the behavior of a new version of the application with the previous version
- Developed Test Strategy, prepared Track Sheets to keep track of the tasks assigned to the Jr. Testers, and resolved issues.
- Actively participated in BUG meetings to resolve the defects in efficient and timely manner.
- Assured that all QA Artifacts are in compliance with corporate QA Policies and guidelines.
- Used Test Director to log defects, communicate them to the developers, and track them to closure.
- Coordinated and communicated the whole QA effort by effectively working with different teams.
Environment: Datastage, Oracle 10g, Cognos
Confidential
Project Leader
Responsibilities:
- Reviewed the System Requirement Specs (SRS).
- Analyzed the business requirements document and was involved in developing the test plan, test objectives, test strategies, test priorities, etc.
- Managed requirements and developed Test Scripts and Test Cases using Test Director.
- Mapped requirements to business scenarios to assure that all requirements were covered.
- Implemented and automated regression test scripts based on business requirements using WinRunner. Enhanced the scripts by adding control and conditional statements using TSL
- Used GUI, Text, Bitmap, and Database checkpoints and synchronization statements extensively to customize the WinRunner scripts. Parameterized and data-drove the scripts to improve test flexibility.
- Involved in the performance testing and analyzed the response times under various loads.
- Used Performance Monitor and LoadRunner graphs to analyze the results.
- Manually performed Back-End testing by writing SQL queries.
- Performed testing in Mainframe using Attachmate
- Worked with users to develop user acceptance plan and test cases.
- Tracked bugs using Test Director and performed regression testing of the entire application once the bugs were fixed.
- Attended various meetings with the developers, clients, and the management team to discuss major defects found during testing, enhancement issues, and future design modifications.
- Developed detailed test conditions and documented test scripts and test procedures.
- Developed various reports and metrics to measure and track testing effort.
- Used WinRunner for regression testing and LoadRunner for server performance testing.
- Planning and estimating for the testing efforts, keeping the plan up to date.
- Coordinating with development team at on-site.
- Coordinating with testing team at offshore.
Environment: Informatica, Oracle 9i, Microstrategy
Confidential
Test Analyst
Responsibilities:
- Created Quality Assurance test strategy. Coordinated the entire test effort with team members and developers.
- Developed test plan based on Business Specifications.
- Prepared test cases using Mercury Interactive Test Director based on the business requirements and the Test Plan.
- Reviewed the test cases written by the team members and suggested any updates if necessary.
- Update the test cases according to the change requests approved.
- Keep QA Network Drive Folders/Contents “Clean” and Up-to-Date.
- Provide daily and weekly updates to Project Manager regarding testing progress and notify issues if any.
- Follow up on issues and make sure they get resolved ASAP so that they do not impact testing plan.
- Organize and conduct weekly Defect Review meetings.
- Communicate with Offshore team regarding project updates and work assignment on a daily basis.
- Coordinate with the development team to have a quick turnaround of defects.
- Mentored team members on using the tools like Test Director and Clear Quest efficiently.
- Verify data in Mainframes and validate it against the UI.
- Enhanced existing Test Scripts based on changes to the requirements.
- Used Test Director for writing and executing Test Cases.
- Used Clear Quest for defect tracking.
- Involved in discussions with the Business Team and the developers regarding any changes in the requirements.
Environment: Informatica, Oracle 9i
Confidential
Senior Tester
Responsibilities:
- Reviewed the System Requirement Specs (SRS).
- Created Use Cases, Process Flows, Data Flows, transitions, and decision trees by conducting interviews, requirement workshops, brainstorming sessions, and questionnaires with the actuarial, stats, and finance teams to gather system requirements and needed changes.
- Analyzed the business requirements document and was involved in developing the test plan, test objectives, test strategies, test priorities, etc.
- Managed requirements and developed Test Scripts and Test Cases using Test Director.
- Mapped requirements to business scenarios to assure that all requirements were covered.
- Involved in the performance testing and analyzed the response times under various loads.
- Used Performance Monitor and LoadRunner graphs to analyze the results.
- Manually performed Back-End testing by writing SQL queries.
- Worked with users to develop user acceptance plan and test cases.
- Tracked bugs using Test Director and performed regression testing of the entire application once the bugs were fixed.
- Attended various meetings with the developers, clients, and the management team to discuss major defects found during testing, enhancement issues, and future design modifications.
- Developed detailed test conditions and documented test scripts and test procedures.
- Developed various reports and metrics to measure and track testing effort.
- Used WinRunner for regression testing and LoadRunner for server performance testing.
- Planning and estimating for the testing efforts, keeping the plan up to date.
- Preparation of system test plan & performance test plan.
- Coordinating with development team at on-site.
- Coordinating with testing team at offshore.
Environment: ASP.NET, SQL SERVER 2000, JavaScript
