
Senior Big Data Tester Resume


SUMMARY:

  • 8.6 years of experience as a Senior Software Quality Engineer in Big Data, ETL, Functional, API & Interface testing using Ascential Data Stage 7.5.2 Server edition, IBM Info Sphere Data Stage 8.7 Server & Parallel edition, Hadoop, Spark, UNIX, SQL, and Python.
  • Experience in all phases of Software Testing Life Cycle (STLC) and good exposure to Software Development Life Cycle (SDLC).
  • Experienced in testing ETL jobs and mappings in Server and Parallel jobs using Data Stage to populate tables in Data Warehouses and Data Marts.
  • Experience in testing Big Data Hadoop (HDFS, MapReduce, YARN, Lambda, Spark, Python, AWS, SQOOP, Impala).
  • Very good Knowledge on Telecom billing domain especially Amdocs Enabler Billing System.
  • Worked on web services based on SOAP and REST using SOAPUI.
  • Strong interpersonal, analytical, and verbal communication skills; accustomed to working in a dynamic culture and a quick learner.
  • Experienced in different software testing methodologies like Agile Methodology & Waterfall model.
  • Experience in different Scheduling tools like Crontab and ControlM for automating and scheduling jobs run.
  • Efficient in incorporating various data sources such as Oracle and flat files into the staging area.
  • Experience in Data Warehousing concepts, Metadata, and dimensional Star Schema and Snowflake Schema methodologies.
  • Spent significant time working in the Telecom domain, with deep understanding and experience of Order Workflow business processes as well as OSS/BSS.
  • Worked on requirements gathering and functional analysis, creating test plans, high-level test scenarios, and detailed test cases, and on test execution across Functional, System, Regression, UAT, and PVT phases.
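A typical source-to-target reconciliation check from the ETL testing described above can be sketched in Python. This is a minimal illustration, not production code; the record layout and key name (`cust_id`) are hypothetical examples.

```python
# Minimal sketch of source-to-target reconciliation, a core ETL test:
# compare row counts and key sets between a source extract and the
# loaded target. The record layout here is a hypothetical example.

def reconcile(source_rows, target_rows, key):
    """Return keys missing from the target, keys unexpectedly added,
    and whether row counts match."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "extra_in_target": sorted(tgt_keys - src_keys),
        "count_match": len(source_rows) == len(target_rows),
    }

source = [{"cust_id": 1}, {"cust_id": 2}, {"cust_id": 3}]
target = [{"cust_id": 1}, {"cust_id": 3}, {"cust_id": 4}]
result = reconcile(source, target, "cust_id")
print(result)
```

In practice the key sets would be fetched with SQL from the source and target databases rather than built from in-memory lists.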

TECHNICAL SKILLS:

Testing Tools: Hadoop 2.2, HDFS, MapReduce, Spark, IBM Info Sphere 8.7 (Data Stage, Quality Stage), Ascential Data Stage 7.5.2, Oracle 10g, TOAD, Amdocs Enabler v7.5/v9.1, AMC, CRM, OMS 7.5.

Languages: SQL, UNIX Shell Scripting, Python.

Management Tools: HP ALM Quality Center 10.0/11.0, WebTrax, I-Track, TOAD, TDP, PRISM, Rally, USH, JIRA.

Operating Systems: UNIX, Linux, Windows 98/2000/2005.

Version Control: Subversion (SVN).

Job Scheduling: Data Stage Internal Scheduler, Crontab, ControlM.

PROFESSIONAL EXPERIENCE:

Confidential

Senior Big Data Tester

Application & Tools: UNIX, Putty, Hadoop, AWS S3, Spark, Lambda, Agile Project Methodology, HP ALM Quality Center 11.0, QA, MOCK, PROD, Oracle 10g, PRISM, WebTrax, I-Track, TOAD, TDP.

My Responsibilities:

  • Worked with all stakeholders (Technical Architects, Solution Designers, interfacing team owners, and Business) to analyze requirements and change requests.
  • Participated in the requirements review meetings to define QA deliverables.
  • Prepared QA estimations based on the requirements and participated in team-building activities.
  • Performed schema and structural validation based on the low-level design document.
  • Created the Test Scenarios, Test Cases and Test Plan for review and approval.
  • Responsible for tracking the resolutions for QA clarifications and Test Readiness Review.
  • Validated data from source to target at each individual phase, ensuring that each external data source is mapped to the correct internal data source.
  • Validated duplicates between the data warehouse and HDFS, and assessed whether there is any real business benefit in duplicating the data.
  • Validated that data is extracted correctly as per business rules.
  • Performed Hadoop ecosystem behavior validation by changing parameters in the Hadoop profile file.
  • Evaluated in advance the number of nodes required in each test environment.
  • Prepared realistic test data by dumping masked production data.
  • Performed negative testing by changing the input data, replicating the data files, and validating that MapReduce handled all possible variations of input data.
  • Performed failover testing with additional clusters added and removed dynamically.
  • Validated that all system access remained intact after a node or cluster was added or removed.
  • Validated alternate data paths after one or two nodes were removed from the system, and noted the change in total execution time.
  • Validated that the Hadoop ecosystem works as expected after a node or network failure.
  • Responsible for metrics submission and team-leading activities.
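The duplicate check between the warehouse and HDFS described above can be sketched as record fingerprinting: hash each normalized record and flag records present in both stores. The pipe-delimited record format and field values below are assumptions for illustration.

```python
# Sketch of a warehouse-vs-HDFS duplicate check: fingerprint each
# record (normalizing whitespace) and report HDFS rows whose hash
# already exists in the warehouse extract. Formats are assumed.
import hashlib

def record_fingerprint(record):
    """Stable hash of a pipe-delimited record, ignoring field padding."""
    normalized = "|".join(field.strip() for field in record.split("|"))
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def duplicated_records(warehouse_rows, hdfs_rows):
    warehouse_hashes = {record_fingerprint(r) for r in warehouse_rows}
    return [r for r in hdfs_rows if record_fingerprint(r) in warehouse_hashes]

dwh = ["1001|ACME|2020-01-01", "1002|Widget|2020-01-02"]
hdfs = ["1001 | ACME | 2020-01-01", "1003|Gadget|2020-01-03"]
print(duplicated_records(dwh, hdfs))  # the ACME row matches after normalization
```

At scale the same idea would run as a distributed job rather than in-memory Python, but the fingerprinting logic is identical.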

Confidential

Big Data Tester

Application & Tools: Hadoop, Spark, SQOOP, Impala, Agile Project Methodology, HP ALM Quality Center 11.0, QA, MOCK, Oracle 10g, PRISM, WebTrax, TOAD.

My Responsibilities:

  • Involved in understanding the functional requirements of the application.
  • Wrote test automation scripts in Python.
  • Configured the MapReduce framework to execute Python scripts.
  • Provided technical support to the team as the onsite test engineer; addressed best practices and productivity-enhancing issues.
  • Participated in the requirements review meetings to define QA deliverables.
  • Prepared QA estimations based on the requirements and participated in team-building activities.
  • Performed schema and structural validation based on the low-level design document.
  • Created the Test Scenarios, Test Cases and Test Plan for review and approval.
  • Responsible for tracking the resolutions for QA clarifications and Test Readiness Review.
  • Validated data from source to target at each individual phase, ensuring that each external data source is mapped to the correct internal data source.
  • Validated duplicates between the data warehouse and HDFS, and assessed whether there is any real business benefit in duplicating the data.
  • Validated that data is extracted correctly as per business rules.
  • Performed Hadoop ecosystem behavior validation by changing parameters in the Hadoop profile file.
  • Evaluated in advance the number of nodes required in each test environment.
  • Prepared realistic test data by dumping masked production data.
  • Performed negative testing by changing the input data, replicating the data files, and validating that MapReduce handled all possible variations of input data.
  • Performed failover testing with additional clusters added and removed dynamically.
  • Validated that all system access remained intact after a node or cluster was added or removed.
  • Validated alternate data paths after one or two nodes were removed from the system, and noted the change in total execution time.
  • Validated that the Hadoop ecosystem works as expected after a node or network failure.
  • Responsible for metrics submission and team-leading activities.
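The Python test automation for MapReduce mentioned above can be sketched as a Hadoop-Streaming-style mapper/reducer pair with the malformed-input handling that the negative testing exercises. The pipe-delimited call-record format is a hypothetical example.

```python
# Sketch of a Hadoop-Streaming-style mapper/reducer in pure Python,
# of the kind the MapReduce test automation above would exercise.
# Input format (customer_id|duration) is a hypothetical example.

def mapper(lines):
    """Emit (customer_id, duration) pairs, skipping malformed lines —
    the negative-testing cases described above."""
    for line in lines:
        parts = line.strip().split("|")
        if len(parts) != 2:
            continue  # malformed record: wrong field count
        cust, duration = parts
        try:
            yield cust, int(duration)
        except ValueError:
            continue  # malformed record: non-numeric duration

def reducer(pairs):
    """Sum durations per customer, as the reduce phase would."""
    totals = {}
    for cust, duration in pairs:
        totals[cust] = totals.get(cust, 0) + duration
    return totals

records = ["A|10", "B|5", "A|7", "garbage", "B|oops"]
print(reducer(mapper(records)))  # {'A': 17, 'B': 5}
```

In an actual streaming job the mapper and reducer would read stdin and write stdout, but testing the functions directly like this keeps the negative cases fast to verify.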

Confidential

ETL Tester

Application & Tools: Ascential Data Stage 7.5.2 Server edition, IBM Info Sphere Data Stage 8.7 Server & Parallel edition, Oracle 9i/10g, UNIX, SQL, PRISM, HP QC 10.0, TOAD, Putty. Client: Confidential & Confidential.

My Responsibilities:

  • Analyzed requirements and created complete test cases, test plans, and test data, and reported status to ensure accurate coverage of requirements and business processes.
  • Worked with Business Analysts to define testing requirements that satisfy the business objectives.
  • Involved in ETL process testing using the Data Stage ETL tool.
  • Wrote SQL queries and database checkpoints to verify data quality and calculations during reviews.
  • Performed data validation testing by writing SQL queries.
  • Performed Integration, End-to-End, System, Functional, and Regression testing.
  • Supported the extract, transform, and load (ETL) process for a Data Warehouse from legacy systems using Data Stage.
  • Provided technical support to the team as the ETL tech lead; addressed best practices and productivity-enhancing issues.
  • Worked in Quality Center to create and document test plans and test cases and to register the expected results.
  • Used HP Quality Center for storing and maintaining the test repository, bug tracking, and reporting.
  • Assessed testing processes and created and implemented testing strategies in the UAT phase.
  • Collaborated with the SE team on high-level design documents for the extract, transform, validate, and load (ETL) process, including data dictionaries, metadata descriptions, file layouts, and flow diagrams.
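The SQL-based data validation described above typically boils down to a handful of checks run against source and target. A minimal sketch, using an in-memory SQLite database as a stand-in for Oracle; the table names and checks (row count, mandatory-column NULL check) are hypothetical examples.

```python
# Illustration of SQL data validation checks: row-count reconciliation
# and a NULL check on a mandatory column, run against an in-memory
# SQLite database standing in for the real source/target schemas.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "b"), (3, None)])

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print("counts match:", src_count == tgt_count)

# Check 2: mandatory columns must not contain NULLs after the load.
null_names = cur.execute(
    "SELECT COUNT(*) FROM tgt WHERE name IS NULL"
).fetchone()[0]
print("null names in target:", null_names)  # a nonzero count is a defect
```

The same queries, pointed at the real databases, are what get recorded as database checkpoints in the test cases.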

Confidential

Test Engineer

Application & Tools: AMDOCS ENABLER 7.5, AMDOCS ENABLER 9.1 Turbo Charging, AMC, OMS 7.5, OMS 8.1, SOAP UI, Oracle 10g, PRISM, UNIX, HP Quality Center 10.0, WebTrax, I-Track, TOAD, TDP, Putty.

My Responsibilities:

  • Involved in project planning and coordination, and implemented QA methodology.
  • Involved in understanding the functional requirements of the application.
  • Worked with all stakeholders (Technical Architects, Solution Designers, interfacing team owners, and Business) to analyze requirements and change requests.
  • Allocated technical tasks within the team and monitored testing progress.
  • Prepared high- and detail-level scenarios for execution.
  • Documented UAT test scenarios and test cases.
  • Enabler Turbo Charging: maintained event servers, event processing, instant rating, rerate, error flows, and high availability.
  • Enabler Invoicing: performed customer invoicing and rebill.
  • Enabler AR Module: worked on payment, back-out, write-off, and journaling.
  • Ensured timely deliveries and managed compliance with agreed SLAs.
  • Good hands-on experience in API testing using the MI tool and SOAP UI across all Enabler modules.
  • Prepared test logs and test summary reports during test runs and distributed status reports to external stakeholders such as clients, customers, and project teams.
  • Experienced in interface testing of the 95+ applications connected with Enabler, such as AMS, EBMIS, TDMS, BIBA, CFAS, UDAS, CRM, OMS, ODR, EMS, etc.
  • Handled defect reporting, issue resolution, and risk and improvement plans, and provided updates to technical documentation.
  • Performed root cause analysis for production defects.
  • Worked with Development Leads and Confidential & Confidential Test Management Leads to resolve and escalate issues in a timely manner in order to meet the test schedule.
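The API testing described above revolves around composing and sending SOAP requests. A minimal sketch of building such an envelope with Python's standard library; the service namespace, operation name (`GetBalance`), and field name are hypothetical, not the actual Enabler API.

```python
# Sketch of building a SOAP request envelope like those driven through
# SOAP UI in the API testing above. The service namespace and operation
# (GetBalance/AccountId) are invented for illustration only.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/enabler/billing"  # assumed namespace

def build_get_balance_request(account_id):
    """Return the serialized SOAP envelope for a balance query."""
    ET.register_namespace("soapenv", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    ET.SubElement(envelope, f"{{{SOAP_NS}}}Header")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}GetBalance")
    ET.SubElement(op, f"{{{SVC_NS}}}AccountId").text = account_id
    return ET.tostring(envelope, encoding="unicode")

request_xml = build_get_balance_request("ACC-1001")
print(request_xml)
```

Generating the envelope in code rather than by hand makes it easy to loop the same request over many accounts or feed in negative-test payloads.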
