
Sr. Big Data/ Hadoop tester Resume


San Francisco, CA

SUMMARY:

  • 9 years of IT experience as a Quality Assurance (QA) Analyst in Hadoop, Big Data technologies, Java, ETL and BI tools.
  • In-depth knowledge of testing methodologies, concepts, phases and types of testing; experienced in developing Test Plans, Test Scenarios, Test Cases, Test Procedures and Test Reports and documenting test results after analyzing Business Requirements Documents (BRD) and Functional Requirement Specifications (FRS).
  • Well versed in the Hadoop ecosystem - MapReduce, Pig, Hive, HBase, Oozie and Sqoop.
  • Experience in installing, configuring, debugging and troubleshooting Hadoop clusters.
  • Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch. 
  • Experience in importing and exporting data between relational databases and the Hadoop cluster using Sqoop.
  • Experienced in creating Hive and Pig scripts and custom MapReduce programs for analyzing data.
  • Experience in validating and analyzing Hadoop log files.
  • Experience in loading large datasets into HDFS and processing them using Hive and Pig.
  • Experience in validating tables with partitions and bucketing and in loading data into Hive tables (a representative validation sketch follows this summary).
  • Experience in validating MapReduce jobs that support distributed processing using Java, Hive and Pig.
  • Experience writing Hive queries for analyzing data in the Hive warehouse using Hive Query Language (HQL).
  • Experience with the MapReduce programming model for analyzing data stored in HDFS.
  • Experience in validating MapReduce code against business requirements.
  • Experience in validating connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
  • Skilled in using ZooKeeper and the Apache Oozie Workflow Engine for cluster coordination and for automating and scheduling workflow jobs.
  • Extensive experience in all aspects of the Software Test Life Cycle, including System Analysis, Design, Development, Execution, Reporting and Closure Documentation.
  • Experience in writing and executing Test Plans and Test Cases from Requirements and Design documents.
  • Excellent experience validating web services using SoapUI.
  • Used ETL methodologies to support data extraction, transformation and loading in a corporate-wide ETL solution using Informatica PowerCenter.
  • Experience in data validations and cosmetic validations of reports developed in Cognos, MicroStrategy and Business Objects.
  • Expert in writing complex SQL Queries to check the integrity of data to perform database testing.
  • Solid experience with database query tools such as TOAD, SQL Navigator, Teradata SQL Assistant and SQL*Plus.
  • Prepared and executed Test Cases and Test Plans for Enterprise Data Warehouses and Web Applications.
  • Strong in testing Stored Procedures, Functions, and packages utilizing PL/SQL. 
  • Experience working in Data Integration projects.
  • Experience with UNIX commands and shell scripting.
  • Experienced in Functional or System testing, Unit Testing, Integration testing, Regression testing, UAT, GUI or Web-based Testing.
  • Very good understanding of Data Warehousing concepts, Data Analysis, Data Warehouse Architecture and Designing. 
  • Excellent interpersonal, analytical and organizational skills.
  • Extensive experience working with Tableau Desktop and Tableau Server, and in using Tableau functionality to create requests, filters, charts and interactive dashboards with page and dashboard prompts.
  • Understanding of Data Models, Data Schemas and ETL; created extensive stored procedures and SQL queries to perform back-end data warehouse testing.
  • Experienced in bug reporting and defect tracking using tools like Rational ClearQuest, HP Quality Center and Test Director for defect logging and management.
  • Strong analytical, dynamic trouble-shooting and requirement traceability skills.
  • Experienced in interacting with Clients, Business Analysts, leads, and UAT Users.
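
A minimal sketch of the kind of Hive validation referenced in this summary, run from the shell. The table name, partition column and control-file layout are hypothetical placeholders rather than project artifacts:

    #!/bin/sh
    # Compare the row count loaded into one Hive partition against the
    # count recorded in the source extract's control file (assumed to be
    # field 2 of a pipe-delimited line). All names/paths are illustrative.

    SRC_COUNT=$(cut -d'|' -f2 /data/landing/claims_ctl_20150101.txt)

    HIVE_COUNT=$(hive -S -e "SELECT COUNT(*) FROM claims_db.claims
                             WHERE load_dt = '2015-01-01';")

    if [ "$SRC_COUNT" -eq "$HIVE_COUNT" ]; then
        echo "PASS: partition load_dt=2015-01-01 reconciles ($HIVE_COUNT rows)"
    else
        echo "FAIL: source=$SRC_COUNT hive=$HIVE_COUNT" >&2
        exit 1
    fi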

TECHNICAL SKILLS:

Big Data Technology: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Oozie, ZooKeeper, Cassandra, Flume

Databases: Oracle 11g/10g/9i, Teradata, DB2, SQL Server

Methodologies: Waterfall, Agile/Scrum

ETL Tools: Informatica PowerCenter, Ab Initio, DataStage, SSIS

Reporting Tools: Tableau, Cognos, Business Objects BOXI, MicroStrategy, SSRS

Testing Tools: Test Director, HP Quality Center 10.x/9.x, SoapUI

DB Query Tools: TOAD, SQL Navigator, Oracle SQL Developer

Programming Languages: JAVA, Python, SQL, PL/SQL

Requirement Tools: Rational Requisite Pro, Doors

Version Control Tool: Rational Clear Case, Win CVS, SVN

XML Tools: XML Spy

Operating Systems: MS DOS, Win 98/2000/XP/Win-7, UNIX, Linux

Defect Tracking Tools: Rational Clear Quest

Scheduling Tools: Autosys, crontab

Other Tools: Putty, Text Pad, CompareIT, WinSCP, Data Flux, GS Data Generator

PROFESSIONAL EXPERIENCE:

Sr. Big Data/ Hadoop Tester

Confidential, San Francisco, CA

Responsibilities: 

  • Involved in CIT, SIT, UAT and production parallel-run testing based on the requirement surveys completed by the development team.
  • Conducted meetings as needed in SIT and daily test meetings in UAT with downstream users. 
  • Involved in the creation of tables (build-out of the test region) in the respective environments by forklifting (loading and unloading) the data from production using various Teradata utilities like BTEQ, TPump, FastLoad and FastExport/Mainframe JCLs.
  • Tested MapReduce programs that parse the raw data, populate staging tables and store the data in HDFS.
  • Created Hive queries which helped analysts spot emerging trends by comparing fresh data with historical claim metrics. 
  • Used Flume-ng to move data from individual data sources to the Hadoop system.
  • Used the MRUnit framework to test MapReduce code.
  • Involved in setting up testing environments and preparing test data to validate and prove positive and negative cases.
  • Validated the MapReduce, Pig and Hive scripts by pulling data from Hadoop and validating it against the data in the files and reports.
  • Involved in all quality controls (Hadoop & Teradata), data validations, key business checks and RTAS implementation for the Hadoop ETL flow and the Teradata landing zone/target tables.
  • Coordinated various project-specific access requests related to Hadoop components like edge nodes and HDFS directories, Hive tables, Autosys instances and Mainframe LPARs.
  • Coordinated various database requests like access category, DB questionnaire, initial environment setup and DDL notepad through Nexus and other ticketing portals.
  • Performed various Hadoop controls, such as date checks, record checks, balance checks and threshold limits for records and balances, for various SORs (systems of record) like data and stat files on a checkpoint basis (a simplified control sketch follows this list).
  • Performed data validations between summary files and extract files after ETL flow. 
  • Performed various Teradata controls like comparisons between Landing zone tables and tracing tables generated during the ETL process. 
  • Created the data extract by connecting to Teradata to place the files on Tableau server. 
  • Scheduled the Oozie workflow jobs with Pig scripts and Sqoop (import/export) processes, and scheduled workflow jobs with Autosys for its calendar flexibility across all environments.
  • Prepared the Linux shell scripts with profile files, maintaining all global environment/application-level variables.
  • Performed Defect Reporting, Analyzing, Tracking and Report Generation using HP Quality Center 11.52 ALM. 
  • Attended the daily Bug review meetings, weekly status meetings and walkthroughs and interacted with Business Analysts and Developers for resolving Defects.
  • Captured performance testing metrics for all test phases before migrating components such as validation scripts, Autosys JILs and DMX ETL jobs to Production.
  • Consulted with certification teams regarding the associated KBEs tested for the project and created reusable test scripts to execute and provide results.
  • Coordinated with all the teams, representatives, and downstream users involved in testing for sign off approvals.
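
A simplified sketch of one of the checkpoint controls described above: an HDFS record check of a SOR data file against its stat file, with a tolerance threshold. The paths, stat-file layout and threshold are assumptions for illustration; balance checks followed the same pattern against a balance field:

    #!/bin/sh
    # Record check: actual rows in the data file vs. the expected count
    # carried in the stat file (assumed to be field 1, pipe-delimited).

    DATA=/sor/claims/20150101/data.txt
    STAT=/sor/claims/20150101/stat.txt
    THRESHOLD=0                       # allowed record-count variance

    ACTUAL=$(hdfs dfs -cat "$DATA" | wc -l)
    EXPECTED=$(hdfs dfs -cat "$STAT" | awk -F'|' '{print $1}')

    DIFF=$((ACTUAL - EXPECTED))
    [ "$DIFF" -lt 0 ] && DIFF=$((0 - DIFF))

    if [ "$DIFF" -le "$THRESHOLD" ]; then
        echo "PASS: record check ($ACTUAL actual vs $EXPECTED expected)"
    else
        echo "FAIL: record check off by $DIFF records" >&2
        exit 1
    fi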

Environment/Tools: Hadoop, Sqoop, Flume-ng, HiveQL, Pig Latin, HP ALM, UNIX, Informatica Power Center 9.1, Teradata, DB2, Tableau, VSAM files, Flat files, XML, Shell Scripting, Super Putty, WinSCP, SVN, TOAD, CA-7, Autosys, Agile

Sr. Hadoop/ ETL Tester

Confidential, Charlotte, NC

Responsibilities:

  • Experienced in validating the source data against the target data in the data warehousing application, as well as in reports, in client/server and web-based environments.
  • Involved in developing Test Cases, Test Plans, Test Execution, Defect Tracking, and Report Generation using Quality Center / HP ALM based on functional specifications.
  • Involved in end-to-end defect management of assigned projects. Identified defects, assessed root causes and prepared detailed information for developers and business stakeholders.
  • Experienced in Data Validation and Backend testing of databases to check the integrity of data.
  • Used HQL queries extensively to analyze the HDFS data.
  • Used HP ALM for Test Management and Defect Management and to save/manage the automation scripts created using QTP.
  • Experience in testing Data Warehouse/ETL applications developed in Informatica and Ab Initio using SQL Server, Oracle, Hadoop, DB2 and UNIX, with the ability to evaluate ETL/BI specifications and processes.
  • Experience in UNIX, RDBMS, Hadoop, HIVE (HQL), Oracle (PL/SQL), MS Access.
  • Experienced in Black Box, White Box, Integration, Regression, Functional, Front-End and Back-End Testing.
  • Involved in establishing an automated Hadoop integration testing system and implementing Oozie workflows.
  • Responsible for Analysis and Defect Tracking using HP Quality Center/ALM, Test Director, JIRA, IBM ClearQuest.
  • Implemented optimization techniques for better performance on both the ETL side and the database side.
  • Experience with different file systems/databases like Oracle, HDFS, Teradata and MS SQL Server, extracting and loading data using Sqoop (see the import sketch after this list).
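
A hypothetical example of the Sqoop extract described above; the JDBC URL, credentials and table name are placeholders, and a table with a primary key is assumed so the import can be split across mappers:

    #!/bin/sh
    # Stage source rows in HDFS for validation against the warehouse.

    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username qa_user -P \
        --table POLICY \
        --target-dir /qa/staging/policy \
        --num-mappers 4 \
        --fields-terminated-by '|'

    # Sanity check that the import landed files before row-level checks.
    hdfs dfs -count /qa/staging/policy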

Environment/Tools: HDFS, Map Reduce, Pig, Hive, Oracle 10g/11g, HQL, Data Analysis, Informatica Power Center 9.1, Rational ClearCase, ClearQuest, PL/SQL, Autosys, XML,TOAD, HP ALM, PuTTY, UNIX, Agile

Sr. Hadoop QA Analyst

Confidential, Columbia, MD

Responsibilities:

  • Prepared test cases and scripts based upon the business requirements documentation (BRD), use case documentation and functional requirements specification (FRS).
  • Performed Data Analysis for all incoming feeds to ETL; worked with Business Unit Managers on developing the Mapping Document after Data Analysis.
  • Created database objects such as stored procedures and trigger scripts to populate data into different tables according to the different parameters specified.
  • Created Hadoop jobs for processing and analyzing millions of records of data.
  • Developed shell scripts to validate Hadoop daemon services and report any warning or failure conditions (a simplified health-check sketch follows this list).
  • Validated Pig UDFs used to pre-process the data for analysis.
  • Validated the data load process for Hadoop using HiveQL queries.
  • Performed error checking and testing of the ETL procedures and programs using Informatica session logs.
  • Focused on Data Quality issues/problems that include completeness, conformity, consistency, accuracy, duplicates, and integrity.
  • Used Workflow Manager for Workflow and Session Management, Database Connection Management and Scheduling of jobs.
  • Tested the ETL code and was also involved in unit testing, system testing and integration testing of the project.
  • Involved in validating the XML files coming from third-party vendors as EDI transactions.
  • Validated the quality of data coming through the EDI transactions.
  • Validated the responses of web services by passing various requests using SoapUI.
  • Extensively involved in the execution of Autosys jobs and PL/SQL batch programs, and responsible for reporting defects to the development team.
  • Performed parallel testing on EDIs with the third-party vendors until both parties were satisfied with the results.
  • Used HP Quality Center as Test Management Tool.
  • Performed regression testing using QTP.
  • Used TFS to store and schedule the test cases and report defects.
  • Involved in preparing the Mock test data for both positive and negative scenarios.
  • Involved in running SSIS ETL packages and tested them according to requirements.
  • Performed Data Analysis on SSIS ETL packages and worked on Data Validation as well; involved in the data integration of the project.
  • Extensively performed backend testing on databases by writing complex SQL queries.
  • Tested the DW for row counts and errors after each transaction load.
  • Coordinated with source system owners, monitored day-to-day ETL progress and maintained the Informatica batch schedule run nightly.
  • Identified defects and provided documentation to the development team for debugging.
  • Tested standard and ad hoc reports and performed data validation for the Cognos reports.
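
A simplified sketch of the daemon health check mentioned above; the daemon list and the reporting action are illustrative assumptions:

    #!/bin/sh
    # Check that expected Hadoop daemons appear in the JVM process list.

    for DAEMON in NameNode DataNode ResourceManager NodeManager; do
        if jps | grep -qw "$DAEMON"; then
            echo "OK:   $DAEMON is running"
        else
            # The real script reported warning/failure conditions, e.g.
            # by appending to a monitoring log or mailing an on-call alias.
            echo "WARN: $DAEMON not found" >&2
        fi
    done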

Environment/Tools: HDFS, MapReduce, HBase, Cassandra, Pig, Hive, Oracle 10g/11g, SQL, Data Analysis, Informatica Power Center 9.1, Rational ClearCase, ClearQuest, TFS, Cognos, SoapUI, EDI, Web Services, SOA, QTP, SQL Server 2012, SSIS, SSRS, SSAS, Manual Testing, PL/SQL, Autosys, XML, XSLT, TOAD, Quality Center, PuTTY, UNIX, Agile, DOORS

Sr. DWH/BI QA Analyst /UAT/Data Analyst

Confidential, Birmingham, AL

Responsibilities:

  • Participated in requirements gathering, project definition, analysis and scoping activities associated with the SLD project.
  • Led and reviewed the assigned project and served as the POC for all QA-related work on the project.
  • Prepared test cases by understanding the business requirements, Data Mapping documents and technical specifications.
  • Prepared Test Strategy document, System Test Plan, Performance Test Plan, Unit Test plan, Traceability matrix and Test Summary Reports.
  • Involved in creating the UAT test scenarios and worked with the business users to execute the UAT test scenarios.
  • Validated the most granular data items against OLTP systems and identified the source tables from which the data warehouse extracts data.
  • Extensively worked on backend activities for testing several reports developed by BI tools; wrote several complex SQL queries.
  • Involved in testing the XML files and checking whether data was parsed and loaded to staging tables.
  • Validated the envelope, encoding rules and communication styles in SoapUI.
  • Tracked defects through Quality Center.
  • Involved in preparing the Mock test data for both positive and negative scenarios.
  • Extensively tested several BOXI reports for data quality.
  • Manually verified data in the backend using SQL.
  • Maintained the Requirement Traceability Matrices (RTM) to track the coverage of Requirements v/s designed test cases.
  • Involved in maintaining the test environments, with activities like requesting data loads, database backups, server restarts, deployments and troubleshooting issues.
  • Executed QTP scripts for regression analysis.
  • Validated data migration from different databases into HDFS using Sqoop (see the reconciliation sketch after this list).
  • Validated the MapReduce programs and Hive UDFs by using HQL queries.
  • Validated the responses of web services by passing various requests using SoapUI.
  • Wrote UNIX shell scripts for file management and batch scheduling.
  • Involved in verifying the initial and daily load Ab Initio graph ETL jobs.
  • Debugged and scheduled ETL jobs/mappings and monitored error logs.
  • Tested reconcile and incremental ETL loads for the project.
  • Tested data migration to ensure that the integrity of data was not compromised.
  • Worked in the production support team for EDW ETL execution support in 24x7 shifts.
  • Involved in System Integration Testing and User Acceptance Testing. 
  • Tested standard and ad hoc reports and performed data validation for the BOXI reports.
  • Involved in status meetings with the business, testing and core technology groups on a weekly basis.
  • Worked with the QA and Project Managers to set and evaluate milestone criteria to ensure released products were on schedule with high quality.
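
An illustrative version of the Sqoop migration check noted above: the source row count (via sqoop eval) is reconciled with the row count of the Sqoop-loaded Hive table. Connection details, schema and table names are hypothetical, and the parsing of sqoop eval's tabular output is deliberately simplified:

    #!/bin/sh
    # Compare source and target row counts for one migrated table.

    SRC=$(sqoop eval \
        --connect jdbc:db2://dbhost:50000/EDW \
        --username qa_user -P \
        --query "SELECT COUNT(*) FROM EDW.MEMBER" \
        | grep -oE '[0-9]+' | tail -1)

    TGT=$(hive -S -e "SELECT COUNT(*) FROM stg.member;")

    echo "source=$SRC target=$TGT"
    [ "$SRC" = "$TGT" ] || { echo "FAIL: row counts differ" >&2; exit 1; }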

Environment/Tools: HDFS, MapReduce, Hive, Oracle 9i/10g, DB2, SQL, UNIX, Use Cases, Business Objects BOXI, Data Warehousing, Ab Initio 2.13/2.14, Teradata, Teradata SQL Assistant, TOAD, QTP 9, Rally, Quality Center, Web Services, SoapUI, XML Spy, XML Files, XSLT, MS Visio, Agile, RUP, COTS, Rational ClearCase, Rational ClearQuest, Rational Requisite Pro

ETL/DWH/Reports Tester / UAT

Confidential, Schaumburg, IL

Responsibilities:

  • Participated in business requirement and design walkthroughs, analyzed business requirements, and coordinated with the business analysts and developers to discuss issues in interpreting the requirements.
  • Analyzed the user requirements, functional specifications and Use Case documents and created the Test Plans and Test Cases for functional testing.
  • Worked closely with the ETL team in each phase of the project (data requirement analysis, data field analysis, data mapping and ETL testing).
  • Built external mappings to transfer data from the SQL Server database to the Oracle Data Warehouse; this included writing SQL statements (a representative source-to-target check follows this list).
  • Coordinated testing of DataStage ETL using complex stored procedures.
  • Acted as first point of contact for all critical issues related to data mapping and data sourcing.
  • Developed test plan with testing scenarios from the end user perspective for User Acceptance Testing (UAT).
  • Reviewed and tested Informatica ETL code.
  • Involved in Integration Testing, and Data Testing including boundary testing.
  • Developed procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements.
  • Actively participated in Functional testing, System testing, and Ad-hoc testing.
  • Analyzed bugs, worked with development team and ETL team members in fixing defects and validating the solutions.
  • Worked with Autosys for batch-processing ETL and PL/SQL subprograms, and performed backend testing.
  • Tested many ETL packages for the project according to data mapping requirements.
  • Interacted with the development and product management team to find out the end user actions and scenarios.
  • Tested ad hoc and standard reports generated using MicroStrategy.
  • Involved in providing the Testing effort estimates and provided the timely feedback on progress of the testing activity. 
  • Updated QA Manager and Lead weekly with the testing status, which included Test Task Plan, Defect Management, and Test Metrics.
  • Successfully Coordinated User Acceptance Testing (UAT) on each release of the project with the help of end user requirements.
  • Involved in doing Business analysis and Requirements gathering for new Informatica mappings.
  • Coordinated all aspects of the solution delivery, including design, development, testing and deployment.
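
A representative backend check of the kind written for this source-to-target work: rows present in the staged source copy but missing from the warehouse target. The login, schema, table and column names are placeholders:

    #!/bin/sh
    # Report staged rows that never reached the target table; an empty
    # result means the load reconciles for these columns.

    sqlplus -S qa_user/qa_pwd@ORCL <<'EOF'
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
    SELECT policy_id, policy_amt FROM stg_policy
    MINUS
    SELECT policy_id, policy_amt FROM dw_policy;
    EXIT;
    EOF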

Environment/Tools: Oracle 9i, MS SQL Server 2005, SSIS, SSRS, Informatica PowerCenter 8.1/7.1, DataStage, MicroStrategy, Manual Testing, QTP, PL/SQL, Autosys, Rational Rose, Rational ClearCase, Rational ClearQuest, Cognos, MS Office Suite.

DW/BI/ETL Quality Analyst

Confidential, Charlotte, NC

Responsibilities:

  • Involved in QA Analysis, Design, and Testing of the application.
  • Created the Test Strategy and Test Plans; reviewed requirements to ensure they were clear and testable.
  • Executed Test Scripts and Test Cases using manual testing procedures.
  • Prepared test design documents and QA project development design documents.
  • Coordinated test activities with all testing resources.
  • Performed Regression testing on corporate and personal documents and fixed the errors.
  • Tested Data Stage ETL jobs according to data mapping requirements.
  • Wrote SQL queries to verify that data moved correctly from the source database to the destination database.
  • Analyzed records loaded into staging table that are extracted from several tables in FACETS core database.
  • Worked on value-added routines in FACETS and on the provider and subscriber modules.
  • Involved in validating the claims and invoices coming in as Electronic Data Interchange (EDI) transactions.
  • Involved in validating the communication, syntax and compatibility of information between EDIs.
  • Involved in testing the XML files and checking whether data was parsed and loaded to staging tables (see the sketch after this list).
  • Worked on improving stored procedure and trigger performance.
  • Conducted Black Box (Functionality Testing), User Acceptance, and Regression testing.
  • Created a Traceability Matrix to ensure that all the requirements were covered by the test cases.
  • Involved in testing Cognos reports for data quality and cosmetics according to requirements.
  • Conducted status report meetings with the internal team on a weekly basis, and documented and tracked status meeting minutes and activities.
  • Provided clear, concise feedback to the development team on recurring errors, at both the individual and team level, with the aim of a long-term reduction in defects found in final releases.
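
A simplified sketch of the XML-to-staging check described above; the feed path, element name, login and table are illustrative, and one repeating element per line is assumed:

    #!/bin/sh
    # Compare repeating <Claim> elements in the feed with staged rows.

    FILE_COUNT=$(grep -c '<Claim>' /feeds/claims_20060101.xml)

    STG_COUNT=$(sqlplus -S qa_user/qa_pwd@ORCL <<'EOF'
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
    SELECT COUNT(*) FROM stg_claims WHERE TRUNC(load_dt) = DATE '2006-01-01';
    EXIT;
    EOF
    )

    echo "file=$FILE_COUNT staged=$STG_COUNT"
    [ "$FILE_COUNT" -eq "$STG_COUNT" ] || exit 1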

Environment/Tools: SQL, Oracle 9i, DB2, Data Stage, Cognos, Crystal Reports, Manual Testing, SQL Server 2003, UNIX log files, XML, Flat Files, MS Office, MS Visio, MS Excel.

Manual Back-end Tester

Confidential, St Paul, MN

Responsibilities:

  • Responsible for writing the Test Plan and Test Scenarios from design requirements, and executed test cases and test scripts as per the Test Plan.
  • Prepared test data for positive and negative testing, used in data-driven testing to exercise the application dynamically.
  • Applied relational database concepts like tables, primary and foreign keys, views and referential integrity.
  • Performed integration testing to ensure data processing, interface validity and proper communication among the components of each application.
  • Performed regression testing, including black-box/white-box and positive/negative testing.
  • Involved in tracking defects and generating defects reports.
  • Actively attended meetings with fellow testers and other groups to evaluate the performance of the application and to discuss the issues arising out of testing.
  • Held periodic meetings with the development team to gather information on progress and to resolve project issues.
  • Acted as a liaison between system users who had business requirements and developers who could create automated solutions for those problems.

Environment/Tools: Oracle 8i, SQL, TOAD, MS-Access, MS-Visio, Test Director, XML, Java, MS Project, MS Office, UNIX
