
QA Lead Resume

Richardson, TX

PROFESSIONAL SUMMARY:

  • 6+ years of combined experience in Software Quality Assurance and Development on web-based and client-server applications.
  • Extensive experience in QA methodologies such as Waterfall, V-model and Agile with Scrum.
  • Experience in the full Software Development Life Cycle (SDLC) and Software Test Life Cycle (STLC), from requirements gathering through deployment in production.
  • Experience in analyzing Business Requirements documents, Functional Requirements documents and Use Cases.
  • Proficient in testing types including Functional, Integration, System, Regression, Database validation, GUI and User Acceptance Testing.
  • Conversant in writing Test Plan, Test Strategy, Test Scenario, Test Case and Test Results documents.
  • Experience with estimation of small to medium scale projects.
  • Expertise in manual testing of AS400 and web applications.
  • Good knowledge of PBM and pharmacy claims adjudication systems.
  • Strong experience writing complex database queries for data validation.
  • Good knowledge of Linux/Unix shell scripting and shell commands.
  • Well versed in developing solutions on the Hadoop ecosystem (Pig, Hive, Flume, Sqoop, Oozie).
  • Excellent understanding of distributed storage (HDFS), batch processing (MapReduce) and resource management (YARN).
  • Knowledge of NoSQL databases including HBase, and of data access using Hive.
  • Experienced in using Flume to transfer log data files to the Hadoop Distributed File System (HDFS).
  • Experience in developing custom UDFs in Java to extend Hive and Pig functionality.
  • Good experience troubleshooting, performance-tuning and optimizing large-scale Hadoop clusters and their data ingestion pipelines.
  • Strong background in Object-Oriented Analysis and Design (OOAD).
  • Development experience in Java/J2EE using Spring and MongoDB.
  • Ability to blend technical expertise with strong conceptual, business and analytical skills to provide quality solutions; results-oriented problem solving and leadership.
  • Excellent analytical, interpersonal, leadership and problem-solving skills; strong project and people management capability.

TECHNICAL SKILLS:

Big Data Ecosystems: Hive, Pig, Sqoop, Oozie, Flume, HDFS, MapReduce, YARN, HBase

Java Technologies: Java/J2EE, Spring MVC, REST web service, Spring Integration

Database: MongoDB, Oracle, MySQL

Operating Systems: Linux Red Hat/Ubuntu, Windows 10/8.1/7/XP

Tools: HP ALM, Selenium

Test Approaches: Waterfall, Agile/Scrum, SDLC, STLC

QA Artifacts: Test Plan, Test Case, RTM, Test Summary Report, Defect Report, User Stories

PROFESSIONAL EXPERIENCE:

QA Lead

Confidential, Richardson, TX

Responsibilities:

  • Participate in all stages of the Software Development Life Cycle (SDLC), from planning through deployment, to deliver a high-quality product.
  • Perform in-depth analysis of claim adjudication for paid and rejected claims.
  • Emphasis on complex claim adjudication such as Prior Authorization, Transition Fill, Drug Coverage, etc.
  • Understand the overall business model of the PBM claim adjudication system and translate concepts into practice.
  • Review Business Requirements and Functional Requirements, analyze gaps and provide feedback to the clients.
  • Responsible for creating Test Plan, Test Scenario, Test Case and Test Report documents.
  • Implemented an automation framework using Selenium.
  • Executed Selenium test cases and reported defects; experienced in Data-Driven Testing, Cross-Browser Testing and parallel test execution using Selenium WebDriver.
  • Perform extensive Functional testing on the RxClaim platform.
  • Experienced in the Agile process; actively support the team in process improvement.
  • Maintained test cases, test execution and defect life cycle using HP Quality Center.
  • Prepared Requirement Traceability Matrix (RTM) of various modules and functional changes involved in the release.
  • Proven ability to work in a fast-paced environment on tight-deadline projects.
  • Self-motivated, with the ability to work individually and as part of a team.
  • Excellent written and verbal communication skills.

Environment: IBM AS400, Quality Center, HP ALM, SharePoint, MS Office

QA Analyst

Confidential, Allen, TX

Responsibilities:

  • Participate in all stages of the Software Development Life Cycle (SDLC), from planning through deployment, to deliver a high-quality product.
  • Review Business Requirements and Functional Requirements, analyze gaps and provide feedback to the clients.
  • Responsible for creating Test Plan, Test Scenarios, Test Cases and Test report.
  • Review Test Plan, Test cases and Test results documents with Business and IT teams.
  • Prepare test cases for positive and negative scenarios as described in the user stories, keeping Boundary Value limitations in mind.
  • Thoroughly understand technical design documents & requirements to ensure comprehensive testing coverage.
  • Perform the UI Testing, Functional Testing, System Testing, Regression Testing, User Acceptance Testing and End to End testing to ensure the developed functionality meets the Business-user requirements.
  • Participate in daily bug scrum meetings with the IT team to discuss the severity of issues logged during sprint execution.
  • Attend daily scrum meetings to report progress and key issues.
  • Maintain test cases, test execution and the Defect Management Life Cycle through HP Application Lifecycle Management (ALM).
  • Lead the offshore QA team to finish assigned projects in a timely manner.
  • Provide training, guidance and direction to junior staff, balancing leadership and teamwork values.
  • Identify opportunities for process improvement and provide feedback to QA management.
  • Excellent experience testing web applications and AS400.
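The positive/negative and boundary-value test design described above can be sketched as a small shell check. The quantity field and its 1-99 range are hypothetical examples, not taken from the actual application under test:

```shell
#!/bin/sh
# Hypothetical rule under test: an order quantity must be between 1 and 99.
# Boundary Value Analysis picks test values at and just around each boundary,
# plus a negative (non-numeric) case.
validate_quantity() {
  case "$1" in
    ''|*[!0-9]*) echo invalid; return ;;   # empty or non-numeric: negative case
  esac
  if [ "$1" -ge 1 ] && [ "$1" -le 99 ]; then
    echo valid
  else
    echo invalid
  fi
}

# Boundary cases: below min, min, min+1, max-1, max, above max.
for q in 0 1 2 98 99 100; do
  echo "qty=$q -> $(validate_quantity "$q")"
done
```

The same pattern extends to date ranges, field lengths and any other bounded input called out in a user story.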

Environment: IBM AS400, DB2, MySQL, Cybersource, Quality Center, HP ALM, SharePoint, Eclipse, Java, JavaScript, XML, MS Office, HTML

QA Analyst

Confidential

Responsibilities:

  • Analyzed requirements thoroughly, identified gaps and raised them with the clients before requirements sign-off.
  • Prepared Test Plan, Test Case, Test Data and Test Results documents to ensure the actual results met the end clients' needs.
  • Worked closely with the business team and other teams to understand business requirements and ensure successful implementation.
  • Involved in Software Development Life Cycle of the project from requirements gathering to deployment in production.
  • Compared the source data to the pushed data to ensure the correct data was extracted and loaded into the HDFS location.
  • Loaded and transformed structured and semi-structured data into HDFS using Flume agents.
  • Tested and validated the data at all stages of the Flume-based ETL process.
  • Verified business logic on a single node, then across multiple nodes, for data aggregation and segregation rules.
  • Performed output validation for transformation rules, data integrity and successful data load into the target system.
  • Involved in performance testing for data storage, timeout, commit logs, message queues.
  • Developed medium to complex HiveQL queries for validation, involving joining multiple tables.
  • Developed shell scripts to automate test case preparation and test result validation across multiple zones in the Hadoop data lake.
  • Identified and assessed risks and issues and proposed mitigation strategies.
  • Actively involved in Data conversion testing, Integration testing and functional testing.
  • Maintained test cases, test execution and defect life cycle using HP Quality Center.
  • Involved in creating hive tables, loading and analyzing data using Hive queries.
  • Setup test environments and prepared test data for testing flows to validate and provide positive and negative cases.
  • Worked with systems team to deploy and test new Hadoop environments and expand Hadoop clusters.
  • Used Apache Flume to collect, aggregate and move large amount of Web server logs into HDFS.
  • Log data is generated by web servers that have Flume agents running on them; these agents receive the data from the data generators, collectors gather it from the agents, and the aggregated data from all collectors is pushed to a centralized store such as HDFS.
  • Involved in creating Hive external tables partitioned on date, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Wrote Hive jobs in HiveQL to analyze large volumes of web logs and generate reports, which help monitor user activity and support business decisions.
  • Improved HiveQL performance by splitting larger queries into smaller ones with temporary tables in between, and created 30 buckets per Hive table (clustered by IDs) for better performance when updating the tables.
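The source-vs-target validation above can be sketched as a shell reconciliation step. The file paths and table name are illustrative; in the real pipeline the target count would come from Hive rather than a local file, e.g. `hive -e "SELECT COUNT(*) FROM web_logs WHERE dt='...'"`:

```shell
#!/bin/sh
# Sketch: reconcile record counts between a source feed file and the
# data loaded into the target zone. Paths below are hypothetical.
count_records() {
  # Count non-empty records in a delimited feed file.
  grep -c . "$1"
}

reconcile() {
  src_count=$(count_records "$1")
  tgt_count=$(count_records "$2")
  if [ "$src_count" -eq "$tgt_count" ]; then
    echo "PASS: $src_count records in both source and target"
  else
    echo "FAIL: source=$src_count target=$tgt_count"
  fi
}
```

A wrapper script can loop this over every zone of the data lake and collect the PASS/FAIL lines into a test summary report.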

Environment: Hadoop 2.x, HDFS, MapReduce, Hive, Flume, YARN, Cloudera, SQL scripting, Linux shell scripting, Gradle, Git, Eclipse, Java, JavaScript, HP QC

Java/Hadoop Developer

Confidential

Responsibilities:

  • Understand user requirements coming from the BAs or other stakeholders and drive them from brainstorming to implementation.
  • To remove old documents (expired, irrelevant data) from MongoDB, implemented a TTL index, and wrote a Java class using Spring Task Execution and Scheduling for cases where a TTL index could not be used.
  • Implemented feeds using the Hadoop ecosystem, capturing user activity in the apps and sending it to business units for analysis, thus giving a better user experience.
  • Developed an end-to-end process to load feed data into MongoDB using HDFS, Pig and an Oozie workflow: a third party FTPs a "|"-delimited feed file to our ingest server; a scheduled monitoring task polls the ingest server every 5 minutes; the ingested file is moved onto HDFS using the Hadoop Java API; and finally an Oozie workflow is triggered via the Oozie client to run a Pig script, which loads the feed file and stores the relevant data in MongoDB.
  • Learned concepts like YARN, HDFS Federation and High Availability introduced in Hadoop 2.X.
  • Involved in addressing issues related to slowStart, CPU resource utilization in YARN, Capacity vs Fair scheduling and speculative task execution during Hadoop upgrade from Cloudera CDH 3 (Hadoop 0.20.2) to CDH 4 (Hadoop 2.0).
  • Bug fixing and maintenance of existing code for older versions of the apps running on Web platforms.
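The polling step of the feed pipeline described above can be sketched in shell. The directory names, the `.psv` suffix and the stubbed `load_feed` body are all hypothetical; the real task ran every 5 minutes and used the Hadoop Java API and the Oozie client rather than the CLI commands shown in the comment:

```shell
#!/bin/sh
# Sketch: watch an ingest directory for new pipe-delimited feed files
# and hand each one off to HDFS, marking it so it is not re-ingested.
INGEST_DIR=${INGEST_DIR:-/tmp/ingest}   # hypothetical ingest server path
HDFS_DIR=/data/feeds/incoming           # hypothetical HDFS landing zone

load_feed() {
  # In production this step would resemble:
  #   hdfs dfs -put "$1" "$HDFS_DIR" && oozie job -run ...
  echo "loading $1 -> $HDFS_DIR"
}

poll_once() {
  for f in "$INGEST_DIR"/*.psv; do
    [ -e "$f" ] || continue             # glob matched nothing: no new feeds
    load_feed "$f"
    mv "$f" "$f.done"                   # rename so the next poll skips it
  done
}
```

Scheduling `poll_once` every 5 minutes (for example via cron) reproduces the monitoring behaviour of the original task.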

Environment: Hadoop 2.x, HDFS, MapReduce, Pig, YARN, SQL scripting, Linux shell scripting, Eclipse, MongoDB, Oozie, Maven, Git

Java/J2EE Developer

Confidential

Responsibilities:

  • Understand user requirements coming from the BAs or other stakeholders, such as consumers of our REST APIs, and drive them from brainstorming to implementation.
  • Designed and developed the presentation layer, including standards-compliant, browser-compatible, interactive web pages using CSS, Spring MVC and JavaScript, with client-side validation and unit testing.
  • Designed and developed the business layer, including the creation of action classes and beans to handle user interactions through forms.
  • Developing REST APIs for the clients of the application comprising Android, Web, iOS and Windows platforms.
  • Developed JUnit test cases to test business components.
  • Extensively used the Java Collections API to improve application quality and performance.
  • Made heavy use of Java 5 features such as Generics, the enhanced for loop and type safety.
  • Provide production support and design enhancements for the existing product.

Environment: Core Java, Spring MVC, REST Web service, JUnit, Maven, SVN, JavaScript, CSS
