Big Data Tester/Pega Resume
Irvine, CA
EXECUTIVE SUMMARY
- 8 years of IT experience as a QA engineer in Big Data environments with ETL and Business Intelligence tools.
- Excellent interpersonal and analytical skills with in-depth knowledge of testing methodologies, concepts, phases, and types of testing; experienced in developing Test Plans, Test Scenarios, Test Cases, Test Procedures, and Test Reports, and in documenting tests based on BRD and FRS.
- Well versed in the Hadoop ecosystem - HDFS, Hive, and Sqoop.
- Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Good understanding of Cloud Services like Amazon Web Services (AWS).
- Proficient in web UI automation testing using Selenium, QTP, Cucumber, and SoapUI.
- Experience writing Hive queries to analyze data in the Hive warehouse using Hive Query Language (HQL).
- Experienced in bug reporting and defect tracking using tools such as HP ALM, HP Quality Center, and Test Director for defect logging and management.
- Strong analytical, troubleshooting, and requirement-traceability skills.
- Good knowledge of Pega applications, testing JSON files, and querying JSON via SQL.
- Expertise in developing ad-hoc and analytical reports to improve business revenue.
- Experienced in interacting with Clients, Business Analysts, leads, and UAT Users.
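By way of illustration of the JSON-validation and JSON-via-SQL skills listed above, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module (it assumes a SQLite build with the JSON1 functions `json_valid`/`json_extract`; the table and field names are invented):

```python
import json
import sqlite3

# Hypothetical claims data used only to illustrate querying JSON via SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, payload TEXT)")
rows = [
    (1, json.dumps({"state": "CA", "amount": 1200.0})),
    (2, json.dumps({"state": "AZ", "amount": 800.0})),
    (3, json.dumps({"state": "CA", "amount": 300.0})),
]
conn.executemany("INSERT INTO claims (id, payload) VALUES (?, ?)", rows)

# Validate the JSON format and extract fields with SQL functions.
result = conn.execute(
    """
    SELECT json_extract(payload, '$.state') AS state,
           SUM(json_extract(payload, '$.amount')) AS total
    FROM claims
    WHERE json_valid(payload)
    GROUP BY state
    ORDER BY state
    """
).fetchall()
print(result)  # [('AZ', 800.0), ('CA', 1500.0)]
```

The `json_valid` filter mirrors the format check a tester would run before trusting any aggregation over the payload.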
TECHNICAL SKILLS
Skills include: Quantitative Methods, Data Warehousing (Advanced), Data Mining, Business Intelligence (BI), Data Structures, Regression Analysis, Data Visualization, Data Technologies, Data Science Research Methods, Research Data Management, Statistical Computing Methods, Experimental Design & Analysis.
Operating Systems: Windows, LINUX, UNIX
Big Data Technology: HDFS, MapReduce, Hive, PySpark
Languages: SQL, PL/SQL, Pega
Test Management: HP ALM 11.0/12.0
ETL Tools: Informatica 10.2/10.0/9.5
RDBMS/ Databases: Oracle 11g/10g, SQL Server 2008/2005
BI Tools: Tableau, BIX.
Automation: Selenium WebDriver (Cucumber, Gherkin, JUnit, Maven, BDD)
PROFESSIONAL EXPERIENCE
Confidential, Irvine CA
Big Data Tester/Pega
Responsibilities:
- Worked on Pega workflows; tested the flows against rules based on insurance-industry standards.
- Created Hive queries which helped analysts spot emerging trends by comparing fresh data with historical claim metrics.
- Involved in setting up the testing environments and preparing test data for testing flows to validate and prove positive and negative cases.
- Validated the MapReduce, Pig, and Hive scripts by pulling the data from Hadoop and validating it against the data in the files and reports.
- Created scripts using Selenium WebDriver (Maven, BDD with Cucumber, Gherkin, JUnit) for the IDV UI (internal website/UI).
- Wrote the Java code for the Selenium scripts and executed them.
- Validated JSON input format, constructed valid JSON, and queried JSON via SQL.
- Performed data validations between summary files and extract files after ETL flow.
- Performed various validations like comparisons between Landing zone tables and tracing tables generated during the ETL process.
- Created the data extract by connecting to Oracle, verified stored procedures, and worked with JSON files.
- Performed Defect Reporting, Analyzing, Tracking and Report Generation using HP ALM 11.52.
- Expertise in testing REST APIs using Robot Framework and SoapUI.
- Strong knowledge of web services testing using SoapUI and RESTful web services.
- Attended the daily Bug review meetings, weekly status meetings and walkthroughs and interacted with Business Analysts and Developers for resolving Defects.
- Created reusable test scripts for the project and executed them to provide results.
- Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
- Coordinated with the offshore test team to get daily test status updates during offshore hours.
- Created effective ETL data warehouse test cases and scenarios based on business and user requirements.
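The summary-vs-extract validation described above can be sketched in Python. This is a minimal, hypothetical example: the file contents, column names, and states are invented, and a real check would read the actual ETL output files rather than inline strings.

```python
import csv
import io

# Hypothetical stand-ins for an ETL detail extract and its summary file.
extract_csv = """policy_id,state,amount
P1,CA,100.50
P2,CA,200.00
P3,AZ,50.25
"""
summary_csv = """state,record_count,total_amount
AZ,1,50.25
CA,2,300.50
"""

def aggregate_extract(text):
    """Roll the detail extract up to per-state record counts and totals."""
    totals = {}
    for row in csv.DictReader(io.StringIO(text)):
        count, amount = totals.get(row["state"], (0, 0.0))
        totals[row["state"]] = (count + 1, amount + float(row["amount"]))
    return totals

def load_summary(text):
    """Read the summary file into the same (count, total) shape."""
    return {
        row["state"]: (int(row["record_count"]), float(row["total_amount"]))
        for row in csv.DictReader(io.StringIO(text))
    }

expected = load_summary(summary_csv)
actual = aggregate_extract(extract_csv)
mismatches = {
    state: (expected[state], actual.get(state))
    for state in expected
    if expected[state] != actual.get(state)
}
print(mismatches)  # {} means the summary file agrees with the extract
```

Any non-empty `mismatches` dict would be logged as a defect against the ETL flow, with the expected and actual figures side by side.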
Big Data Tester/Automation
Confidential, Wilmington, DE
Responsibilities:
- Involved in CIT, SIT, and UAT parallel-run testing based on the requirement surveys completed by the development team.
- Conducted meetings as needed in SIT and daily test meetings in UAT with downstream users.
- Tested MapReduce programs that parse the raw data, populate staging tables, and store the data in HDFS.
- Created detailed functional test cases based on the user and business requirements.
- Used Selenium WebDriver to create and execute test scripts for functionality testing of the web application.
- Created Hive queries which helped analysts spot emerging trends by comparing fresh data with historical claim metrics.
- Involved in setting up the testing environments and preparing test data for testing flows to validate and prove positive and negative cases.
- Validated the MapReduce and Hive scripts by pulling the data from Hadoop and validating it against the data in the files and reports.
- Performed various Hadoop controls such as date checks, record checks, balance checks, and threshold limits for records and balances.
- Performed data validations between summary files and extract files after ETL flow.
- Performed various validations like comparisons between Landing zone tables and tracing tables generated during the ETL process.
- Created the data extract by connecting to Oracle.
- Strong knowledge of web services testing using SoapUI and RESTful web services.
- Performed Defect Reporting, Analyzing, Tracking and Report Generation using HP ALM 11.52.
- Attended the daily Bug review meetings, weekly status meetings and walkthroughs and interacted with Business Analysts and Developers for resolving Defects.
- Created reusable test scripts for the project and executed them to provide results.
- Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
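The Hadoop controls listed above (date checks, record checks, balance checks, threshold limits) can be sketched as a small Python routine. The thresholds, batch contents, and field names here are invented for illustration; in practice they would come from the control/audit tables that accompany each load.

```python
# Invented control thresholds; real values come from the load's control tables.
MIN_RECORDS, MAX_RECORDS = 1, 10_000
MAX_TOTAL_BALANCE = 1_000_000.0

# Hypothetical batch of loaded records.
batch = [
    {"load_date": "2023-01-15", "balance": 250.0},
    {"load_date": "2023-01-15", "balance": 499.5},
]

def run_controls(rows, business_date="2023-01-15"):
    """Date check, record-count check, and balance threshold check."""
    failures = []
    # Date check: every record must carry the expected business date.
    if any(r["load_date"] != business_date for r in rows):
        failures.append("date check failed")
    # Record check: the batch size must fall inside the agreed thresholds.
    if not (MIN_RECORDS <= len(rows) <= MAX_RECORDS):
        failures.append("record count outside threshold")
    # Balance check: the summed balance must not exceed its limit.
    if sum(r["balance"] for r in rows) > MAX_TOTAL_BALANCE:
        failures.append("total balance over threshold")
    return failures

print(run_controls(batch))  # [] means every control passed
```

Each failure string maps naturally to one defect ticket, which keeps the control results traceable in ALM.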
ETL TESTER/Hadoop
Confidential, AZ
Responsibilities:
- Experienced in validating source data against target data in data warehousing applications and in reports in client/server and web-based environments.
- Involved in developing Test Cases, Test Plans, Test Execution, Defect Tracking, and Report Generation using Quality Center / HP ALM based on functional specifications.
- Involved in end-to-end defect management of assigned projects. Identified defects, assess root cause, and prepared detailed information for developers and business stakeholders.
- Experienced in Data Validation and Backend testing of databases to check the integrity of data.
- Used HQL queries extensively to analyze the HDFS data.
- Automated functional testing for all modules; automated smoke testing and regression testing using Selenium on multiple browsers.
- Involved in walkthroughs/reviews of the business and functional specifications.
- Developed Data driven framework with Selenium WebDriver, TestNG and C#
- Used HP ALM for Test Management, Defect Management
- Experience in testing Data Warehouse/ETL applications developed in Informatica and Ab Initio using SQL Server, Oracle, Hadoop, DB2, and UNIX, with the ability to evaluate ETL/BI specifications and processes.
- Experience in UNIX, RDBMS, Hadoop, HIVE (HQL), Oracle (SQL).
- Experienced in black-box, integration, regression, functional, front-end, and back-end testing.
- Responsible for Analysis and Defect Tracking using HP Quality Center/ALM, Test Director, JIRA
- Implemented Optimization techniques for better performance on the ETL side and on the database side
- Experience with different file systems/databases such as Oracle, HDFS, and MS SQL Server, extracting and loading data using Sqoop.
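A typical check after a Sqoop-style extract-and-load is to reconcile the target against the source. Below is a minimal, hypothetical sketch using Python's sqlite3 module, where two in-memory tables stand in for an Oracle source and the loaded target; the schema and data are invented.

```python
import sqlite3

# In-memory stand-ins for a source table and its loaded target copy.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    """
)

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals between source and target."""
    query = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_stats = conn.execute(query.format(src)).fetchone()
    tgt_stats = conn.execute(query.format(tgt)).fetchone()
    return src_stats == tgt_stats

print(reconcile(conn, "src_orders", "tgt_orders"))  # True when counts and totals match
```

Count-and-sum reconciliation is a cheap first gate; a failed reconcile would trigger deeper row-level comparison before logging a defect.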
DATA/QA ANALYST
Confidential, Chennai, India
Responsibilities:
- Developed Test Scenarios, Test Cases, Traceability Matrix, Test Summary Reports, and Test Execution Metrics.
- Developed and executed test cases and scripts for Functional, System, Regression, Integration, Performance, and UAT testing.
- Involved in writing queries using SQL for data correctness and data completeness checks.
- Participated in requirement/use-case analysis, risk analysis, and configuration management.
- Validated the data in various stages of data movement from Source to Staging and from Staging to Data Warehouse tables.
- Created Test Cases using the SDLC procedures and reviewed them with the Test Manager.
- Executed all the Test Cases in the Test Environment, maintained them, and documented the test queries and results for future reference.
- Validated the ETL load process to make sure the target tables are populated according to the data mapping provided, satisfying the transformation rules.
- Validated the archive process to purge the data that meets the defined business rules.
- Wrote complex SQL queries using CASE logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle.
- Involved in defect review meetings with the Business Core Team and developed use-case analysis.
- Actively participating in project specification reviews, writing and maintaining QA technical documentation.
- Identified, debugged, troubleshot, modified, documented, and tested production issues, and created the production support schedule.
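The set-based SQL validation patterns mentioned above (CASE logic, MINUS, UNION) can be sketched with Python's sqlite3 module. This is an illustrative example only: the staging/warehouse tables are invented, and SQLite spells Oracle's MINUS as EXCEPT.

```python
import sqlite3

# Invented staging and warehouse tables for a completeness check.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE stage (id INTEGER, amount REAL);
    CREATE TABLE dw    (id INTEGER, amount REAL);
    INSERT INTO stage VALUES (1, 100.0), (2, 0.0), (3, 75.0);
    INSERT INTO dw    VALUES (1, 100.0), (2, 0.0);
    """
)

# Rows present in staging but missing from the warehouse (Oracle: MINUS).
missing = conn.execute(
    "SELECT id FROM stage EXCEPT SELECT id FROM dw"
).fetchall()

# CASE logic to bucket amounts for a data-quality report.
buckets = conn.execute(
    """
    SELECT id,
           CASE WHEN amount = 0 THEN 'zero' ELSE 'non-zero' END AS bucket
    FROM stage
    ORDER BY id
    """
).fetchall()

print(missing)  # [(3,)] -> id 3 never reached the warehouse
print(buckets)  # [(1, 'non-zero'), (2, 'zero'), (3, 'non-zero')]
```

The MINUS/EXCEPT pattern is the core of most load-completeness checks: an empty result means every staged row made it to the target.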