Big Data Tester & ETL Tester Resume
OBJECTIVE
- To excel in Big Data Testing, ETL Testing, Automation Testing, Web-Based Testing, Functional Testing, Reports Testing, ETL job automation, and data visualization within a progressive organization where I can apply my IT and testing skills toward achieving company goals.
SUMMARY
- Total 7.5 years of IT experience in testing, working mainly as an ETL Tester, Big Data Tester & Automation Tester at Confidential Consultancy Services for different clients.
- IKM Big Data Hadoop Foundation, ISTQB Foundation, ITIL V3 Foundation, and Six Sigma Yellow Belt certified.
- Ample working experience with Hadoop and Big Data analytics.
- Knowledge of database concepts and hands-on experience preparing test automation scripts in Java and SQL to reduce manual effort (see the sketch after this list).
- Comprehensive knowledge of and experience with ETL tools (Talend, Teradata) and the MicroStrategy reporting tool.
- Well versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- Good exposure to requirements analysis and management.
- Exposure to automation testing using Java concepts.
- Hands-on experience in preparing test plans, test cases, and test data, and executing them.
- Experience in Defect Management/Defect Life Cycle.
- Good experience in attending business calls and understanding business expectations.
- Working knowledge of Java, Unix, SQL, Hadoop testing, the Hadoop ecosystem, testing processes, the Talend tool, functional testing, big data, ETL processes & methodologies, and MicroStrategy report testing.
- Executed jobs via the Control-M and Autosys schedulers in the SIT environment and performed validations per the FSD and mapping sheets.
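
A minimal sketch of the kind of source-to-target reconciliation those Java/SQL automation scripts drive, assuming hypothetical `src_customer` and `tgt_customer` tables (in practice the table and column names come from the mapping sheets):

```sql
-- Row-count reconciliation between source and target
-- (table names are hypothetical).
SELECT 'row_count' AS check_name, COUNT(*) AS total FROM src_customer
UNION ALL
SELECT 'row_count', COUNT(*) FROM tgt_customer;

-- Aggregate checksum on a numeric column to catch value-level drift.
SELECT 'balance_sum' AS check_name, SUM(balance) AS total FROM src_customer
UNION ALL
SELECT 'balance_sum', SUM(balance) FROM tgt_customer;
```

Mismatched totals between the two halves of either UNION point at a load defect worth tracing back through the ETL mapping.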
TECHNICAL SKILLS
Big Data Technologies: Hadoop Ecosystem (Cloudera & Hortonworks), Hive, HDFS
Testing Knowledge: Manual Testing, Unit Testing, Black-Box Testing, Functional Testing, ETL Testing, Silk Central
Methodologies: Agile, SDLC
Database knowledge: SQL, Teradata, Oracle, DBFit
Languages & Platforms: Java, Linux, Unix
Business Domain Knowledge: Banking and Financial services
Defect Ticketing Tools: JIRA, LeanKit
Other knowledge: MS Excel, IBM Data Replication tool, MicroStrategy (MSTR) report testing
Schedulers: Autosys and Control-M
QA knowledge: UAT, System Integration Testing, Test Management, Regression Testing, ETL/BI Testing
BI & ETL tool knowledge: Tableau, Talend
PROFESSIONAL EXPERIENCE
Confidential
Big Data Tester & ETL Tester
Responsibilities:
- Followed Agile methodology with a two-week sprint cadence; actively participated in creating user stories and in sprint planning sessions, using JIRA & LeanKit boards for sprint planning.
- Created HQL scripts to compare source table columns with target table columns (see the sketch after this list).
- Attended regular Scrum meetings for feedback, optimizations, and design changes.
- Actively involved in creating and maintaining the Requirements Traceability Matrix.
- Performed data analysis, prepared the test data required to validate test cases, and created test cases for all functional requirements and scenarios.
- Created and maintained SQL scripts to perform back-end validation in TOAD for DB2.
- Monitored defects logged at the requirements stage and escalated high-priority defects to management to ensure quick bug fixes.
- Conducted regression testing whenever a code module changed.
- Worked closely with the build integration team, business analysts, and developers to perform testing activities on schedule.
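
A minimal HiveQL sketch of the source-to-target column comparison referenced above, assuming hypothetical `src_db.accounts` and `tgt_db.accounts` tables keyed on `account_id`:

```sql
-- Flag rows missing on either side, or rows whose compared columns differ
-- (database, table, and column names are hypothetical).
SELECT COALESCE(s.account_id, t.account_id) AS account_id,
       s.status  AS src_status,  t.status  AS tgt_status,
       s.balance AS src_balance, t.balance AS tgt_balance
FROM src_db.accounts s
FULL OUTER JOIN tgt_db.accounts t
  ON s.account_id = t.account_id
WHERE s.account_id IS NULL              -- row present only in target
   OR t.account_id IS NULL              -- row present only in source
   OR s.status  <> t.status
   OR s.balance <> t.balance;
```

A clean run returns zero rows; any result is a candidate defect, with the side-by-side source and target values already in hand for triage.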
Confidential
Big Data Tester/ ETL Tester/ Talend Job automator/ MicroStrategy Report Tester
Responsibilities:
- Reviewed and analyzed user stories to build a thorough understanding of the system and identify the flows and business processes.
- Performed ETL testing on Teradata and wrote SQL scripts.
- Used Teradata FSLDM model.
- Sound knowledge of preparing test plans, test strategies, test data, test summaries, status reports, test cases, the Requirements Traceability Matrix, and other test deliverables and metrics.
- Experience in back-end data validation and reconciliation techniques in Hive.
- Validated delta, insert, update, and incremental loads (see the sketch after this list).
- Performed regression testing, checking that data moved correctly from one platform to another.
- Validated loads of source files containing invalid records and checked logs for reject reasons.
- Identified test cases for regression testing based on functional requirements.
- Prepared module test plans, test logs, and test scripts, and executed them.
- Executed test cases with the given parameters and verified/validated the data.
- Found issues/defects and tracked them until resolution, performing root cause analysis and proposing solutions for the errors encountered.
- Sent daily status updates to the team lead on the work completed each day.
- Involved in loading source data into the Hadoop environment.
- Performed root cause analysis and defect management.
- Managed various test cycles for different validation scenarios such as inserts and updates.
- Provided sign-off emails upon successful SIT completion, with defect statistics and an execution summary.
- Prepared test case documents with the help of the BRD, FRS, and other supporting documents.
- Attended walkthrough meetings with the business analysts and the development team to gain a clear understanding of the business and the applications in use.
- Created test cases and queries to verify business data loads from source to stage and from stage to target.
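
A minimal HiveQL sketch of the incremental-load validation mentioned above, assuming hypothetical stage and target tables with an `updated_ts` audit column; any row changed in stage since the last successful load should appear, at least as fresh, in the target:

```sql
-- Incremental-load check: every stage row changed since the last load
-- must have landed in the target (names and the cutoff are hypothetical).
SELECT s.txn_id
FROM stg_db.transactions s
LEFT JOIN tgt_db.transactions t
  ON s.txn_id = t.txn_id
WHERE s.updated_ts > '2017-01-01 00:00:00'   -- last successful load timestamp
  AND (t.txn_id IS NULL                      -- missed insert
       OR t.updated_ts < s.updated_ts);      -- missed update
```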
Confidential
Functional Tester
Responsibilities:
- Interacted with business analysts on business requirements, testing scope reviews, inspections, and test planning.
- Participated in review meetings to create the test plan and test results summary.
- Estimated effort for the project and prepared the staffing plan.
- Reviewed test cases and the test scenario matrix.
- Performed functional testing of web application.
- Managed defect review meetings for applications, ensuring timely closure of defects.
- Tested source data for data completeness, data correctness and data integrity.
- Analyzed the data by performing Hive queries.
- Performed header, trailer, empty-file, and delimiter checks on flat files (see the sketch after this list).
- Performed audit log validations.
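
A minimal HiveQL sketch of the flat-file checks above, assuming the raw feed is exposed as a hypothetical single-column external table `raw_stage.feed_lines(line STRING)`, with `HDR`/`TRL` record prefixes, a pipe delimiter, and 12 expected fields (all of these conventions are assumptions):

```sql
-- Delimiter check: flag data records whose field count is not the expected 12.
SELECT line
FROM raw_stage.feed_lines
WHERE line NOT LIKE 'HDR%'
  AND line NOT LIKE 'TRL%'
  AND size(split(line, '\\|')) <> 12;

-- Trailer check: the record count carried in the trailer should match the
-- actual number of data rows (count assumed to follow the 'TRL' prefix).
SELECT MAX(CASE WHEN line LIKE 'TRL%'
                THEN CAST(substr(line, 4) AS BIGINT) END) AS trailer_count,
       SUM(CASE WHEN line NOT LIKE 'HDR%'
                 AND line NOT LIKE 'TRL%' THEN 1 ELSE 0 END) AS actual_count
FROM raw_stage.feed_lines;
```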