Automation Engineer Resume
Atlanta, GA
SUMMARY:
- More than 8 years of experience in Software Testing, Quality Assurance, and Quality Control of various business applications in Client/Server environments and Data Warehousing solutions
- Strong knowledge of the Software Development Life Cycle (SDLC) and Software Test Life Cycle (STLC); highly proficient in Software Quality Assurance methodologies under Waterfall and Agile models
- Extensive knowledge of testing web-based applications and GUIs using manual and automated testing procedures
- Experience in testing various Service Oriented Architectures (SOAs) spanning multiple departments using SOAP and RESTful services
- Expertise in writing, executing, and maintaining Test Plans, Test Strategies, Test Procedures, Use Cases, and Test Cases
- Good understanding of Object Oriented Programming (OOP)
- Experienced in Test Automation using Selenium WebDriver with Java
- Experienced with Web services and APIs, with an understanding of automated testing of REST/SOAP services using SoapUI
- Experience in Automotive, CRM, Healthcare, and Banking domains
- Experienced in Functional Testing, Systems Integration Testing, Regression Testing
- Proficient in Defect Management, including defect creation, modification, tracking, and reporting using industry-standard tools such as Quality Center
- Expert in writing SQL queries for back-end testing
- Experience in interacting with Clients, Business Analysts, UAT Users, and Developers
- Involved in preparation of Defect Reports, Daily Status Reports, and Weekly Status Reports
- Excellent interpersonal, communication, documentation, and presentation skills
- Experience in facilitating and leading a Quality Assurance team
TECHNICAL SKILLS:
QA Methodologies: Agile, Waterfall, Iterative
QA tools: HP ALM/QC, Rally, Jira
Automation tools: Selenium WebDriver, Jenkins, TestNG, JUnit, SoapUI, REST, Cucumber, Appium
Languages: Java, Python, SQL, UNIX shell scripting, HTML, CSS, JavaScript
ETL tools: Informatica Power Center
BI/Reporting tools: SAP BusinessObjects, OBIEE
DBMS tools: SQL Developer, TOAD
Operating Systems: Microsoft Windows, Mac OS, Linux
Databases: Oracle, SQL Server, DB2, MySQL
Version Control tools: SharePoint, Git
PROFESSIONAL EXPERIENCE:
Confidential, Atlanta, GA
Automation Engineer
Responsibilities:
- Involved in automating the testing of company website using Java and Selenium WebDriver in Hybrid Automation Framework
- Reviewed and analyzed business requirements and technical specifications to define the testing scope
- Escalated unresolved bugs to the concerned developers and module leaders
- Prepared test cases and procedures; tracked, logged, and reported bugs using Quality Center
- Analyzed the manual test cases for the feasibility of automation in regression phase
- Identified test data and organized it scenario-wise in Excel files for test input at run time
- Executed automated test cases for regression and analyzed test failure for defects
- Prepared the review reports (code reviews, execution reviews) for the automation scripts
- Prepared status reports such as the daily status report and weekly status report
- Involved in automating the testing of in-house e-commerce website using Java and Selenium WebDriver
- Performed Input Validations, User Interface Validations, Browser Compatibility testing and Navigation testing
- Used a Data-Driven Framework
- Prepared Test cases, procedures, Bug Tracking, Logging, and reporting bugs using Jira
- Executed scripts for regression test and analyzed failed test cases
- Debugged failed test scripts in Eclipse IDE
- Escalated unresolved bugs to the developer and project manager
Environment: Java, Selenium WebDriver, JUnit, TestNG, SQL, Jenkins, SoapUI, REST, HP ALM, HTML, CSS, JavaScript, MySQL, Firebug, FirePath
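The data-driven approach described above (test data organized scenario-wise for lookup at run time) can be sketched as follows. This is a minimal illustration, not the original framework: real suites typically read the Excel files with Apache POI, so a comma-separated string stands in here to keep the example self-contained, and all names are illustrative.

```java
import java.util.*;

// Minimal sketch of scenario-wise test data lookup for a data-driven
// framework. Each row starts with a scenario name; the remaining columns
// are the inputs fed to the test at run time.
public class TestDataStore {
    private final Map<String, List<String[]>> byScenario = new LinkedHashMap<>();

    public TestDataStore(String csv) {
        for (String line : csv.split("\n")) {
            String[] cols = line.split(",");
            // Group rows under the scenario named in the first column.
            byScenario.computeIfAbsent(cols[0].trim(), k -> new ArrayList<>())
                      .add(Arrays.copyOfRange(cols, 1, cols.length));
        }
    }

    // All input rows recorded for one scenario (empty if none exist).
    public List<String[]> rowsFor(String scenario) {
        return byScenario.getOrDefault(scenario, Collections.emptyList());
    }
}
```

A test for the "login" scenario would call `rowsFor("login")` and iterate over the returned rows, so new data combinations can be added without touching test code.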
Confidential, Atlanta, GA
Automation Engineer
Responsibilities:
- Understood the application; analyzed business requirements provided by BAs, converted them into test scenarios, and reviewed them
- Created automated test cases for major functionalities - Checkout, Shopping Cart, etc.
- Captured Test Scenarios from Requirement documents
- Created test cases and test data in support of test plans and investigated data integrity issues, adhering to all standards and guidelines provided by the client
- Created the TestNG XML file and ran builds using the Continuous Integration tool Jenkins
- Developed API tests for the web service
- Optimized Selenium scripts for Regression testing of the application with various data sources and data types
- Reported and tracked the bugs using Jira
- Used TestNG framework
- Supported and updated test libraries, applications, scripts and data files
- Performed Functional, Regression, and System Testing for various modules
- Tested the functionality of individual modules (Unit Testing) and tested the interfaces between linked modules (Integration Testing)
Environment: Selenium WebDriver, Python, Java, Jira, SQL, TestNG, Jenkins
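A TestNG XML suite file of the kind mentioned above, picked up by the Jenkins build, generally looks like this. The suite, test, and class names are illustrative placeholders, not taken from the original project:

```xml
<!-- testng.xml: suite definition run by the Jenkins CI job.
     Class names below are hypothetical examples. -->
<suite name="RegressionSuite">
  <test name="CheckoutFlow">
    <classes>
      <class name="com.example.tests.CheckoutTest"/>
      <class name="com.example.tests.ShoppingCartTest"/>
    </classes>
  </test>
</suite>
```

Jenkins then invokes the suite (for example via a Maven Surefire configuration pointing at this file), so the same regression set runs on every build.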
Confidential, Dallas, TX
Automation Engineer
Responsibilities:
- Designed and Developed automation script using Java and Selenium WebDriver
- Automated the functional testing framework for all modules using Selenium WebDriver
- Created the Master Test Plan, Test Strategy, critical scenarios, Test Scripts, and testing schedule
- Verified requirements coverage by conducting walkthrough meetings on the test plan and scenarios with business analysts, the project manager, and test supervisors
- Created scripts for Regression, Security, GUI, and Integration testing
- Created traceability matrix and mapped requirements to Test Cases
- Performed Smoke Testing to make sure all test channels and the test environment were working as desired
- Ensured that UAT results were signed off by the right people in the user community; worked with users at various levels during UAT
- Executed test cases manually to verify the expected results and worked with technical designers and architects to understand the requirements for a test environment setup
- Developed Integration and System test cases using Quality Center
- Involved in all phases of Test Life Cycle from test planning to defect tracking and managing defect lifecycle
- Tested the application in highly dynamic environment with sprint team using Agile Methodology
- Performed Input Validations, User Interface Validations, Browser Compatibility testing and Navigation testing
- Interacted with Developers and management to identify and resolve technical issues
- Conducted GUI, functional, front-end, and back-end testing; reviewed pages for content problems, graphics problems, and link verification
- Tracked and reported defects in Quality Center and notified management with details; wrote and executed test cases and documented defects in Quality Center
- Performed back end testing using SQL queries in Oracle database
- Ran SQL queries to perform database validation per the business logic
Environment: Selenium WebDriver, Java, Eclipse, JUnit, Jenkins, Quality Center, Firebug, FirePath, Oracle, SQL
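The back-end validation described above (SQL query results checked against the application) reduces to comparing two record sets. Below is a minimal, self-contained sketch of that comparison; in practice the lists would come from a JDBC result set and from values captured in the UI, and the class name is illustrative:

```java
import java.util.*;

// Sketch of a back-end validation step: compare records returned by a SQL
// query against records observed in the application, reporting anything
// present on one side but not the other.
public class BackEndValidator {
    public static Set<String> mismatches(List<String> dbRows, List<String> uiRows) {
        Set<String> diff = new HashSet<>(dbRows);
        diff.removeAll(uiRows);              // in the database, missing from the UI
        Set<String> extra = new HashSet<>(uiRows);
        extra.removeAll(dbRows);             // in the UI, missing from the database
        diff.addAll(extra);
        return diff;                         // empty set means the sides agree
    }
}
```

A regression check then asserts that `mismatches(...)` is empty, and logs the offending records to the defect report otherwise.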
Confidential, Atlanta, GA
QA Analyst
Responsibilities:
- Analyzed the requirements from the client and developed test cases based on functional requirements, general requirements, and system specifications
- Tested a new ETL architecture that populated a data warehouse to give customers a single view of transactions for company-wide reporting, leading to increased customer satisfaction
- Tested new fact and dimension tables as they were added to the data warehouse
- Performed Systems Integration Testing that included creating test data on the mainframe (IBM AS/400), making sure the data ended up in the data warehouse, generating Business Objects reports, and validating that the test data was reflected correctly in the reports
- Performed Systems Integration Testing that included creating test data at the front end (Oracle E-Business Suite - Order Management) and making sure payment and invoice activities ended up in the data warehouse
- Performed Systems Integration Testing: validated the data integrated from various databases during multiple acquisitions of new companies and their data
- Extensively tested Business Objects reports: validated the report formatting and validated the data in the reports against the data in the data warehouse using SQL
- Validated a new ETL architecture, new workflows, mappings, and sessions when new sources were added to the data warehouse
- Performed Production Validation after every release for all user stories assigned
- Analyzed the Production Defects and reported to the developers and concerned parties
- Involved in the Three Amigos process to ensure that I understood the business requirements and that the test cases covered the overall scope of the user stories
- Worked on multiple projects simultaneously
- Used HP ALM as defect tracking tool
- Tested Business Objects reports for data quality
- Served as the “go-to” person among QA engineers
- Actively participated in every retrospective meeting and acted as the voice of QA team
- Thoroughly tested the integration of Salesforce with the data warehouse in a sandbox testing environment
- Verified destination system data requirements such as field names, field type, mandatory fields, and other field-level validation checks
- Verified data transformations, data conversion when needed
- Verified that the filters are applied appropriately in some cases
- Validated that all the objects and fields have been correctly setup for the data loading to begin
- Validated the data to detect duplicates, truncation, and record count mismatches between the data warehouse and the cloud after loading the data into Salesforce using the Apex Data Loader
- Generated Salesforce reports from salesforce.com to validate the data as part of end-to-end testing
Environment: Oracle 10g, DB2, SQL Server 2008, Informatica Power Center 9.5, SQL, TOAD, SharePoint, HP ALM 11, SAP BusinessObjects, Salesforce
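The post-load checks described above (record counts, duplicates, and truncation between the warehouse and Salesforce) can be sketched as small pure functions. This is an illustrative reduction, with plain lists standing in for the warehouse and Salesforce query results and a hypothetical class name:

```java
import java.util.*;

// Sketch of post-load data validation between a source (warehouse) and a
// target (Salesforce) extract: count reconciliation, duplicate detection,
// and truncation-risk checks against the destination field length.
public class LoadValidator {
    // Basic count reconciliation between source and target extracts.
    public static boolean countsMatch(List<String> source, List<String> target) {
        return source.size() == target.size();
    }

    // Values that appear more than once in a loaded extract.
    public static Set<String> duplicates(List<String> rows) {
        Set<String> seen = new HashSet<>(), dups = new HashSet<>();
        for (String r : rows) if (!seen.add(r)) dups.add(r);
        return dups;
    }

    // Values longer than the destination field, i.e. at risk of truncation.
    public static List<String> truncated(List<String> values, int fieldLength) {
        List<String> bad = new ArrayList<>();
        for (String v : values) if (v.length() > fieldLength) bad.add(v);
        return bad;
    }
}
```

Each check maps to one of the defects listed in the bullet: a failing `countsMatch` flags dropped records, `duplicates` flags double loads, and `truncated` flags field-length mismatches before they reach Salesforce.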
Confidential, Birmingham, AL
ETL QA Analyst
Responsibilities:
- Analyzed the requirements from the client and developed test cases based on functional requirements, general requirements, and system specifications
- Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan
- Prepared Test Cases and Test Plans for the mappings developed through the ETL tool from the requirements
- Extensively used Informatica Workflow Manager to run the workflows/mappings and monitored session logs in Informatica Workflow Monitor
- Verified session logs to identify errors that occurred during the ETL execution
- Created Test Cases, traceability matrix based on mapping document and requirements
- Wrote several complex SQL queries for data verification and data quality checks
- Validated custom ETL Packages using PL/SQL
- Wrote several control files for SQL*Loader to load data into tables
- Reviewed test cases written against the Change Request document; testing was performed based on Change Requests and Defect Requests
- Tested the ETL Informatica mappings and other ETL Processes (DW Testing)
- Effectively coordinated with the development team for closing a defect
- Prepared Test Scenarios by creating Mock data based on the different test cases
- Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process
- Debugged and scheduled ETL jobs/mappings and monitored error logs
- Wrote complex SQL queries for extracting data from multiple tables and multiple databases
- Used Rational ClearQuest as the defect tracking tool
- Extensively used Oracle database to test the Data Validity and Integrity for Data Updates, Deletes & Inserts
- Used Rational ClearCase and SharePoint as version control tools
- Provided the management with weekly QA documents like test metrics, reports, and schedules
Environment: DB2, Informatica Power Center 9.5, ClearCase, ClearQuest, SQL, PL/SQL, TOAD, UNIX, PuTTY, SharePoint, Flat files, XML files
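A SQL*Loader control file of the kind mentioned above generally takes the following shape. The file, table, and column names are illustrative placeholders, not taken from the original project:

```sql
-- SQL*Loader control file sketch: loads comma-separated flat-file rows
-- into a staging table. All names here are hypothetical examples.
LOAD DATA
INFILE 'orders.dat'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(order_id, customer_id, order_date DATE 'YYYY-MM-DD', amount)
```

The loader is then invoked with `sqlldr userid=... control=orders.ctl`, and the resulting log file is checked for rejected rows as part of the load verification.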