
Sr. ETL - QA Tester Resume


  • 6+ years of extensive experience in Quality Assurance testing of data warehousing and Business Intelligence processes, Web Based and Client/Server applications.
  • Excellent analytical skills for understanding the business requirements, business rules/processes, and detailed design of the application.
  • Experience in creating functional/technical specifications and data design documents based on the requirements.
  • Expert in writing Test Plans, defining Test Cases, developing and maintaining Test Scripts, Test Case Execution, Analyzing Bugs and interacting with team members in fixing the errors as per specifications and requirements.
  • Experienced with Full Life Cycle and Methodology for implementing Data warehouse and Business Intelligence Reporting System.
  • Extensive experience in quality assurance using manual testing and automated testing tools such as QuickTest Pro and Quality Center.
  • Expertise in working in Agile (Scrum), Waterfall, Spiral methodologies.
  • Strong database skills, including Oracle 11g/10g/9i/8i, MS SQL Server 2012/2008 R2, IBM DB2, and Teradata.
  • Experienced with both manual and automated testing tools, including HP Quality Center, Test Director, and QTP with VBScript.
  • Knowledge of Business Intelligence tools such as SSRS, MicroStrategy 9.0, Business Objects, Cognos, and OBIEE.
  • Extensive experience with Rational tools: Rational Quality Manager (RQM), Rational DOORS Next Generation (RDNG), Rational ClearQuest (CQ), and RTC for defect tracking.
  • Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies
  • Implemented Optimization techniques for better performance on the ETL side and on the database side
  • Excellent knowledge on Agile Methodology.
  • Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.
  • Experience in complete Software Testing life cycle: Unit Testing, Functional Testing, Regression, Integration, System Testing, Performance testing and Data driven testing.
  • Expertise in Defect Reporting and Tracking using Test Director/ HP Quality Center.
  • Extensive experience in testing applications on UNIX environment.
  • Excellent time-management and presentation skills; reliable in meeting deadlines.
  • Excellent team player with problem-solving and troubleshooting experience.
  • Worked on different platforms such as Windows XP and 8, UNIX, Sun Solaris, AIX, and HP.
  • Excellent Communication, interpersonal, analytical skills and strong ability to perform in a team as well as individually.
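The backend validation work described above can be sketched as follows; this is a minimal, self-contained example with hypothetical table and column names, using SQLite only so it runs standalone (the actual work targeted Oracle, SQL Server, DB2, and Teradata through their own clients):

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live in
# Oracle/Teradata and the query would run through the appropriate driver.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customers (id INTEGER, name TEXT, state TEXT);
    CREATE TABLE tgt_customers (id INTEGER, name TEXT, state TEXT);
    INSERT INTO src_customers VALUES (1, 'Acme', 'OH'), (2, 'Brix', 'GA');
    INSERT INTO tgt_customers VALUES (1, 'Acme', 'OH');
""")

# Rows present in source but missing from target -- a non-empty result
# means the ETL load dropped or altered records.
missing = cur.execute("""
    SELECT id, name, state FROM src_customers
    EXCEPT
    SELECT id, name, state FROM tgt_customers
""").fetchall()
print(missing)  # -> [(2, 'Brix', 'GA')]
```

The same EXCEPT/MINUS pattern, run in both directions, covers both dropped and spurious rows.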


Quality Assurance Tools /Test Management: HP ALM 11.5/HP Quality Center 10, Test Director 7.5, IBM Clear Quest

Environment: Windows 98/2000/NT/XP, Unix, Linux.

Languages: SQL, PL/SQL

RDBMS: Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, Teradata, Sybase, DB2

GUI/RDBMS Query Tools: SQL*Plus, Oracle SQL Developer, SQL Server Management Studio, Teradata SQL Assistant 13.0, TOAD, WinSQL

ETL Tools: Informatica PowerCenter, Ab Initio, SSIS, DataStage

Bug Tracking/CI Tools: JIRA, Jenkins, GitHub

Other Tools: PuTTY, WinSCP, FileZilla, MS Office



Sr. ETL - QA Tester


  • Reviewed mapping documents, formulas, and rule-engine logic.
  • Analyzed change requests for existing mappings and created impact analyses for the data changes.
  • Wrote new mapping specifications according to the business data requirements.
  • Developed Test Plan, Test Cases, Test Data and Test Summary Reports and followed Agile/Scrum process.
  • Involved in the Agile Scrum Process.
  • Involved in transactions claims analysis design, implementations and documentation.
  • Suggested and reviewed Materialized views to store aggregate values.
  • Prepared the test-data and data-masking strategies for the shared test environment.
  • Wrote test cases for ETL to compare source and target database systems.
  • Gathered requirements for transaction claims and processed and reviewed the claims.
  • Worked with stakeholders from different departments in Scrum sessions to identify the business and financial requirements to be processed; identified and created test cases and scenarios.
  • Modified and executed Python scripts to run tests on big data.
  • Interacted with senior peers and subject matter experts to learn more about the data migration; completed data testing before UAT.
  • Coordinated with the development and QA teams on handling data-related defects.
  • Tested several Informatica Mappings to validate the business conditions.
  • Provide support during parallel and reverse parallel runs.
  • Identifying duplicate records in the staging area before data gets processed
  • Extensively written test scripts for back-end validations.
  • Automated and scheduled UNIX shell scripts with the help of the DBA team.
  • Created data rules to handle SCD Type II data using PL/SQL.
  • Helped sequence the batch workflows based on dependencies to reduce end-to-end load time.
  • Expertise in Rational Quality Management tools (RQM and RTC)
  • Tested and validated the BO reports by running similar SQL queries against the source system(s) as well as the target systems.
  • Worked in all areas of Jenkins: setting up CI for new branches, build automation, plugin management, securing Jenkins, and configuring master/slave setups.
  • Used JIRA with Jenkins and GitHub for real-time bug tracking and issue management.
  • Troubleshot Jenkins build and performance issues and generated metrics on master performance and job usage.

Environment: HP ALM, SQL, PL/SQL, TOAD, Oracle 11g, Agile/Scrum, MS Office, Hive, Hadoop, SAP HANA Studio, UNIX, Shell Scripting, Informatica, Jenkins, GitHub, JIRA.

Confidential, Columbus, OH

Sr. ETL - QA Tester


  • Migrated data from the database by generating SQL from pivot tables.
  • Performed dynamic pivoting by building the pivot query at run time with dynamic SQL.
  • Created custom SQL queries against the database to test them, then embedded the queries in Excel pivot tables to display the data.
  • Performed integration testing and validated the SIT and CT environments for UAT and carrier testing.
  • Tested the reports generated by OBIEE and verified and validated the reports using SQL.
  • Manipulating, cleansing & processing data using Excel, Access and SQL
  • Writing SQL scripts to manipulate data for data loads and extracts.
  • Writing test cases to compare the data between source and target databases
  • Writing complex SQL queries to check the Views built on the source tables and to compare data between source and target
  • Worked with SQL Queries using joins to facilitate the use of huge tables for processing several records in the database.
  • Identified business rules for data migration and performed data validations.
  • Provided business users with all required details, tools, user manuals, guides, and related information to conduct their UAT.
  • Performed front-end testing on OBIEE Executive Dashboard portal.
  • Developed stored procedures to validate the data obtained.
  • Testing the source data for data completeness and data correctness.
  • Participate in the creation of Test Scenarios & Test Cases with the UAT Team and the Business Analysts.
  • Played a functional data SME role in the development of the Marketing Measurement and Metrics reporting dashboard implemented in Siebel Analytics /OBIEE environment.
  • Checking the PL/SQL procedures that load data into the target database from standard tables.
  • Testing flat file data in Unix environment by using complex Unix commands
  • Coordinating with offshore team for testing purposes.
  • Assigning tasks to all testing team members.
  • Tested several data migration applications for security, data protection, and data corruption during transfer.
  • Involved in the Agile Scrum Process.
  • Testing the various ETL processes that were developed
  • Verify the ETL process by running the Informatica workflows (both from Informatica Workflow Manager and through UNIX scripting), monitor the status in Informatica Monitor and verify logs
  • Worked with both offshore and onsite teams and coordinated with the business and technology teams.
  • Tested all OBIEE Dashboards according to the requirement
  • Testing objects in the universe, to ensure the correct mapping of the objects.
  • Testing and resolving loops and contexts to ensure the correct results from the query.
  • Testing the universe structure to ensure the tables are properly updated.
  • Performing end to end testing once the individual processes were tested.
  • Performing Regression testing
  • Participating in the requirements gathering meetings, sprint planning meetings and defect review meetings.
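The dynamic pivoting above can be sketched as follows; the sales table and column names are hypothetical, and SQLite stands in for the production database (which lacks a native PIVOT keyword too, so a CASE-based aggregation is built from the distinct pivot values discovered at run time):

```python
import sqlite3

# Hypothetical fact table to pivot: one row per (region, quarter).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE sales (region TEXT, quarter TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('East', 'Q1', 10), ('East', 'Q2', 20),
        ('West', 'Q1', 5),  ('West', 'Q2', 15);
""")

# Discover the pivot columns at run time, then build the query dynamically.
quarters = [q for (q,) in cur.execute(
    "SELECT DISTINCT quarter FROM sales ORDER BY quarter")]
cols = ", ".join(
    f"SUM(CASE WHEN quarter = '{q}' THEN amount ELSE 0 END) AS {q}"
    for q in quarters)
pivot_sql = f"SELECT region, {cols} FROM sales GROUP BY region ORDER BY region"

print(cur.execute(pivot_sql).fetchall())  # -> [('East', 10, 20), ('West', 5, 15)]
```

The generated query is exactly what would be pasted into an Excel pivot-table data source or run directly for validation.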

Environment: Agile, Informatica, Oracle 11g, Unix Shell Scripting, OBIEE, SQL, PL/SQL, Toad for Oracle, JIRA, Unix, Windows XP, MS Office, SoapUI, Data Profiling, Excel.

Confidential, Atlanta, GA

ETL/Big data Tester


  • Developed and conducted a wide range of tests and analysis to ensure that software, systems, and services meet minimum company standards and defined end-user and system requirements.
  • Involved in high and detail level design reviews to ensure requirement traceability and to determine application/component functional readiness requirements.
  • Being part of the test team, responsibilities involved writing complex queries using SQL and PL/SQL to generate data based on the complex derivations for each attribute.
  • Worked with systems engineering team to deploy and test new Hadoop environments and expand existing Hadoop clusters.
  • Developed Pig UDFs for preprocessing the data for analysis and handle any kinds of additional functionalities needed.
  • Used Spark Streaming APIs to perform transformations and actions on the fly for the common learner data model, which receives data from Kafka in near real time and persists it into Cassandra.
  • Complex SQL queries were written to compare data generated by the application against the expected results generated based on mapping requirements for each interface.
  • Extensively involved in the execution of Autosys jobs and PL/SQL batch programs; responsible for reporting defects to the development team.
  • Ran UNIX shell scripts to count the records for EDW source to staging tables.
  • Worked on Autosys, Unix, Hadoop, Hive, Impala, and shell scripting for big data testing; led the team for the same.
  • Expert in Waterfall, Agile, and iterative project testing methodologies.
  • Performed manual testing to conduct backend testing using UNIX shell scripts and SQL Queries
  • Validating the data passed to downstream systems.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs and Pig UDFs in Java for data cleaning and preprocessing.
  • Involved in maintaining the test environments, including requesting data loads, database backups, server restarts, and deployments, and troubleshooting issues.
  • Tracked and executed the User Acceptance Test Cases with respect to the requirements to determine the feature coverage.
  • Involved in backend testing of the application front end using SQL queries against the Teradata database.
  • Responsible for testing Initial and daily loads of ETL jobs.
  • Involved in testing the batch scheduling programs by using the Autosys tool.
  • Responsible for preparing test cases and scenarios for System testing and UAT testing.
  • Involved in creating various data scenarios (Excel Sheets having expected data) and validating them before and after running the respective Autosys batches.
  • Worked with non-transactional data entities for multiple feeds of reference data using Master Data Management.
  • Developed UNIX scripts for file archiving, file transfers and file management.
  • Involved in Database Validations, Writing Test scripts (Including the related SQL Statements by joining various tables) depending on Requirements for both Positive and Negative Scenarios.
  • Automated data validation across builds using QTP and Quality Center.
  • Attended and participated in Agile Scrum standup meetings.
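The source-to-staging record count checks above ran as UNIX shell scripts (wc -l against the EDW feed files); a self-contained sketch of the same reconciliation, with a hypothetical delimited feed and staging table, looks like this:

```python
import sqlite3

# Hypothetical EDW feed: pipe-delimited extract records destined for staging.
feed_lines = [
    "1|Acme|OH",
    "2|Brix|GA",
    "3|Crux|TX",
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, state TEXT)")
cur.executemany(
    "INSERT INTO stg_customers VALUES (?, ?, ?)",
    [line.split("|") for line in feed_lines])

# Reconcile: the staging row count must match the feed record count.
(staged,) = cur.execute("SELECT COUNT(*) FROM stg_customers").fetchone()
assert staged == len(feed_lines), f"load dropped {len(feed_lines) - staged} rows"
print(staged)  # -> 3
```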

Environment: Agile, Informatica, Big Data/Hadoop, SQL, PL/SQL, Toad, Unix, Shell Script, Business Objects XI 3 Reports, DOORS, Oracle SQL Developer, HP ALM/Quality Center, QTP.
