Software Engineer, Test Automation Resume
Eagan, MN
SUMMARY:
- Seeking a challenging role in information technology that draws on skills in programming, systems analysis, test automation, quality assurance, analytical problem solving, test framework design, and database development and testing.
- Experienced across multiple domains, including Products & Publishing Services, Financial, Transportation, E-Commerce, Postal Services, Health Insurance, and B2B Manufacturing, in both national and multinational organizations.
- Experienced with the software development life cycle, including development, testing, and automation within an agile process.
- Experienced in developing test automation (a combination of development and test), automating build processes and application deployments, and working with continuous integration systems such as Jenkins (pipeline as code), Maven, and Gradle.
- Proficient in object-oriented and scripting languages and related technologies, including Java, Selenium WebDriver, HTTP/REST, JUnit, HtmlUnit, JMeter, Spring Integration, Cucumber-JVM/SpecFlow/Gherkin, Spock, Groovy, Geb, Git, C#, MSTest, NUnit, JSON, XML, JavaScript, and jQuery.
- Experienced in designing database tables and creating views; proficient in writing complex SQL queries and PL/SQL (as needed) and in validating data integrity against databases and data warehouses in Oracle, SQL Server, and MySQL.
- Experienced with additional quality tools such as Jira, VSTS/TFS, JMeter, Silk Performer, Quality Center, SoapUI, STS/Eclipse, IntelliJ, SQL clients, JMS, and Visual Studio.
- Good experience with Perl, Python, and shell scripting.
- Experienced working in an onshore/offshore model.
- Knowledge of applicable data privacy practices and laws.
- Proficient at managing multiple projects and at testing AWS cloud-based microservices.
- Exposure to applications developed with client-side scripting technologies such as AngularJS, AJAX, JavaScript, and jQuery.
- Exposure to NoSQL technologies such as MongoDB and Cassandra, as well as PostgreSQL.
- Strong skills in requirements management (RQM), configuration management (CM), bug management (BM), and development and testing methodology practices.
TECHNICAL SKILLS:
Languages: Java, PL/SQL, VB/C#.NET
Relational/NoSQL Databases: Oracle 8i/9i/10g/11g, MS SQL Server 2005/08/R2, Postgres
Java Technology: Java, Spring MVC Framework, REST, Design Patterns, JMS, Log4j
.NET Technology: C#, MSTest, NUnit, SpecFlow, Coded UI, Visual Studio 2017, VSTS/TFS
Scripting Technology: JSP, JavaScript, jQuery, JSON, XML, ASP.NET, VBScript, HTML, CSS, Python
Other Tools and Technologies: Selenium, NUnit/JUnit, HtmlUnit, Spock/Groovy, Cucumber-JVM/SpecFlow, Gherkin, SQL, JMS/RabbitMQ, Toad/Oracle SQL client, Git, JMeter, Borland Silk Performer, STS/Eclipse, IntelliJ, MS Visual Studio 2017
Configuration/Build/Release: Jenkins (CI/CD), Git, Gradle, Maven, Jira, MS VSTS/TFS
Directory Management: Apache Directory Studio, IBM Directory Management tool
Web Server/Protocols: Tomcat, IIS, HTTP, SOAP/REST, FTP
Operating System: Windows, Macintosh, Unix/Linux, and Solaris
Knowledge/Exposure to: Coded UI, MongoDB, Cassandra, AngularJS, Geb
Methodologies: Agile, OOAD, SDLC
PROFESSIONAL EXPERIENCE:
Confidential, Eagan, MN
Software Engineer, Test Automation
Responsibilities:
- Working together with three other Sogeti automation engineers and the client, designed an automation framework to support the existing development technology at Blue Cross and Blue Shield as well as behavior-driven development (BDD).
- The automation framework was designed using object-oriented principles. It included base classes and various helper classes to support the Page Object Model, BDD, web services, assertion management, screen capture, reporting, test configuration, and logging, making the framework extensible and flexible (a simplified page-object sketch follows this list).
- The framework was designed to be extended to support test automation of multiple applications. It was built using C#, SpecFlow, Selenium, and Coded UI, and was integrated into Microsoft Visual Studio Team Services.
- With the application test automation framework in place, we began to design and implement tests as a proof of concept for the new framework. With successful results from the proof of concept and positive feedback from management at Blue Cross & Blue Shield, we began to design and implement regression tests for various in-house and customer-facing web applications.
- As the new application test automation framework matured, we began to mentor other teams on its use. We mentored junior developers in object-oriented design patterns, the use of BDD, SpecFlow, automation, and software engineering best practices. Collaborated with business and development to enable communication and leverage technical resources.
- The software development process followed the Agile methodology. Helped to lead sprint planning, write stories, collaborate with other teams, and prepare deliverables for the client. Software was developed and tested using Microsoft Visual Studio, Git (an open-source distributed version control system), and Microsoft Visual Studio Team Services.
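The framework itself was written in C# with SpecFlow and Coded UI; the following is only a hedged Java/Selenium sketch of the page-object pattern it implemented, with a hypothetical page, route, and locators rather than the real application's.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;
    import java.time.Duration;

    // Hypothetical page object: wraps one screen's locators and actions
    // so BDD step definitions never touch raw WebDriver calls directly.
    public class MemberSearchPage {
        private final WebDriver driver;
        private final WebDriverWait wait;

        // Illustrative placeholder locators, not the real application's IDs.
        private final By searchBox = By.id("memberId");
        private final By searchButton = By.id("searchBtn");
        private final By resultsTable = By.id("resultsTable");

        public MemberSearchPage(WebDriver driver) {
            this.driver = driver;
            this.wait = new WebDriverWait(driver, Duration.ofSeconds(10));
        }

        public MemberSearchPage open(String baseUrl) {
            driver.get(baseUrl + "/member-search"); // hypothetical route
            return this;
        }

        public MemberSearchPage searchFor(String memberId) {
            driver.findElement(searchBox).sendKeys(memberId);
            driver.findElement(searchButton).click();
            return this;
        }

        public boolean resultsAreDisplayed() {
            return wait.until(ExpectedConditions.visibilityOfElementLocated(resultsTable)).isDisplayed();
        }
    }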
Confidential, Eagan, MN
Contract Software Development/Automation Engineer in Test
Responsibilities:
- Designed, developed, and executed Acceptance Test Driven Development (ATDD) tests for various AWS cloud-based microservices using Java, REST, JUnit, and JSON, continuously integrated with Jenkins to build and run fully automated regressions nightly via a scheduled cron job (see the REST test sketch following this list).
- Designed, developed, and implemented REST services and Selenium WebDriver-based automated tests.
- Communicated clearly and regularly on iteration stories, backlogs, and bug updates, status, and results to project stakeholders and other teams for iteration and end-to-end testing in preparation for each feature's production release.
- Worked closely with product owners and the development team on a weekly basis to identify issues and areas for improved efficiency.
- Monitored and maintained automated nightly test runs, analyzed test results, and quickly targeted and fixed test failures.
- Played a key role in the investigation, troubleshooting, and resolution of customer issues in development, test, and production environments, and provided support to the partner and development teams on these issues.
- Debugged software and performed reviews of automation and application code.
- Continually evaluated the test automation strategy and approach to identify areas of improvement, e.g., test automation frameworks, dynamic data collection methodologies, and coding standards.
- Supported an agile development team in designing, building and testing quality software products.
- Developed test strategies, practices, and artifacts that minimize testing effort and increase efficiency and scalability.
- Performed story acceptance testing as part of an agile scrum process.
- Refactored and maintained existing automation test framework.
- Following a successful proof of concept, designed, developed, and implemented an automated scale and performance test framework using Java, JUnit, JMeter, the JMeter plugin for Gradle, and Jenkins to measure the scalability and performance of microservice APIs.
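A minimal sketch of the kind of JUnit REST acceptance test described above, using the JDK HTTP client; the base URL property, endpoint path, and expected response body are assumptions, not the actual service contract.

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    import org.junit.Test;

    // Hedged sketch: exercises a hypothetical microservice endpoint and asserts
    // on the status code and JSON body, the way a Jenkins nightly job would.
    public class OrderServiceAcceptanceTest {

        private static final String BASE_URL =
                System.getProperty("service.baseUrl", "http://localhost:8080"); // assumed property

        private final HttpClient client = HttpClient.newHttpClient();

        @Test
        public void healthEndpointReportsUp() throws Exception {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(BASE_URL + "/actuator/health")) // hypothetical path
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            assertEquals(200, response.statusCode());
            assertTrue("body should report UP", response.body().contains("\"UP\""));
        }
    }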
Confidential, Eagan, MN
Contract Software Engineer in Test
Responsibilities:
- Worked together with the Confidential Test Automation Architects.
- Researched and reviewed various existing test automation frameworks, both internal and external to Confidential, and selected the qa-base test automation framework for its support of Cucumber, extensibility, and flexibility. qa-base is a Confidential-designed software component.
- Refactored the existing base test automation framework, qa-base, to provide a common and consistent foundation for building test automation across multiple business units. The framework was designed to be extended to support test automation of multiple applications. It was built using Java, Spring Integration, Selenium WebDriver, and Cucumber-JVM and is provided as a JAR file.
- Designed, developed, and implemented a new test automation framework that extends qa-base. The framework provided implementations of the primary base classes used to support test automation: AbstractWebdriver, AbstractPage, AbstractStepDefinition, PageElement, and AbstractRESTService (a simplified step-definition sketch follows this list).
- Refactored existing tests as a proof of concept for the new framework. With successful results from the proof of concept and positive feedback from management on the new framework, refactored and migrated the existing tests from the old framework successfully.
- As the new application test automation framework matured, we began to mentor two other teams on its use. We mentored junior developers in object-oriented design patterns, the use of Spring Integration, Cucumber-JVM, automation, and software engineering best practices. Collaborated with business and development to enable communication and leverage technical resources.
- Implemented the software development process using the Agile methodology.
- Helped to lead sprint planning, write stories, collaborate with other teams and prepare deliverables for client.
- Software was developed and tested in a continuous integration environment using Jenkins, Maven, and GitHub, with automated regression tests run on a nightly basis.
- Authored and edited user and technical documentation in README files.
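A hedged sketch of a Cucumber-JVM step-definition class of the kind this framework supports. In the real framework these classes extend AbstractStepDefinition and reuse its WebDriver, page, and REST helpers; since those signatures are not reproduced here, the sketch below uses plain Cucumber-JVM and Selenium with a hypothetical URL and locators.

    import static org.junit.Assert.assertTrue;

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.Then;
    import io.cucumber.java.en.When;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Hedged sketch of Cucumber-JVM step definitions; the real framework manages
    // the WebDriver and page objects through its base classes rather than
    // constructing a driver inside the steps.
    public class LoginSteps {

        private WebDriver driver;

        @Given("the user is on the login page")
        public void theUserIsOnTheLoginPage() {
            driver = new ChromeDriver();
            driver.get("https://example.internal/login"); // hypothetical URL
        }

        @When("the user signs in as {string}")
        public void theUserSignsInAs(String username) {
            driver.findElement(By.id("username")).sendKeys(username);   // hypothetical locator
            driver.findElement(By.id("password")).sendKeys("changeit"); // placeholder test credential
            driver.findElement(By.id("signIn")).click();
        }

        @Then("the dashboard is displayed")
        public void theDashboardIsDisplayed() {
            assertTrue(driver.findElement(By.id("dashboard")).isDisplayed()); // hypothetical locator
            driver.quit();
        }
    }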
Confidential, Richfield, MN
Contract Software QA/Automation Engineer
Responsibilities:
- Analyzed business requirements, designed test specifications accordingly, and developed and implemented test automation for the front end (POLAR UI) and back end (Phoenix Shipping and Cargo Delivery Services) using Java REST APIs, JUnit, and Selenium WebDriver, continuously integrated with Jenkins to run tests on a nightly basis (a minimal UI test sketch follows this list).
- Implemented test automation using the existing framework and technologies and, as needed, researched, explored, and adopted better technologies to improve productivity.
- Developed test plans, reviewed test cases, tracked defects, identified quality risks, and communicated status to ensure that the product meets business requirements.
- Ensured compliance with testing and automation development methodology and policies.
- Responsible for establishing and maintaining testing infrastructure and tools.
- Reported test progress, issues, and problem resolution to management on a regular basis.
- Prepared and managed test data by coordinating with the test data management team for end-to-end testing.
- Collaborated with the product owner and development team members during the development phase to ensure stories were complete and had testable acceptance criteria.
- Collaborated with the software scrum team to develop risk-based test plans that incorporate defect prevention practices, unit testing, and test-driven development, and used testing strategies to assure quality software products.
- Assisted with DevOps support as needed.
- Led, facilitated, and represented the testing team as a key member of an agile group using agile practices.
- Created SQL queries against databases and a data warehouse in Oracle.
- Performed story acceptance testing as part of an agile scrum process.
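A minimal JUnit/Selenium sketch of a nightly UI regression check along the lines described above; the URL, test data, and locators are hypothetical placeholders.

    import static org.junit.Assert.assertTrue;

    import java.time.Duration;

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;

    // Hedged sketch of a nightly UI regression test triggered from Jenkins.
    public class ShipmentSearchRegressionTest {

        private WebDriver driver;
        private WebDriverWait wait;

        @Before
        public void setUp() {
            driver = new ChromeDriver();
            wait = new WebDriverWait(driver, Duration.ofSeconds(15));
        }

        @Test
        public void shipmentSearchReturnsResults() {
            driver.get("https://polar.example.internal/search");               // hypothetical URL
            driver.findElement(By.id("trackingNumber")).sendKeys("TEST-0001"); // hypothetical data
            driver.findElement(By.id("searchBtn")).click();

            boolean resultsVisible = wait.until(
                    ExpectedConditions.visibilityOfElementLocated(By.cssSelector(".results-row"))
            ).isDisplayed();

            assertTrue("expected at least one shipment result row", resultsVisible);
        }

        @After
        public void tearDown() {
            if (driver != null) {
                driver.quit();
            }
        }
    }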
Confidential, Eagan, MN
Contract Software Automation Engineer
Responsibilities:
- Member of the Platform Quality Assurance Automation Group (QED), responsible for the design, development, testing, and implementation of software components to support test automation for the Optimus Learning and Video platform software products: West Legal EdCenter/Redesign, AccelusLMS/Redesign, and Optimus Viewer (which further leverages the Kaltura video platform for streaming both on-demand video and live events).
- Designed, developed, and tested applications within a Java-based Selenium WebDriver test automation framework.
- Designed, developed, and tested various web services using Java, JUnit, and REST.
- Converted existing Selenium WebDriver automated test scripts written in C#.NET into the Java-based Selenium WebDriver automation framework.
- The automation test components were integrated into the TEX (TesTEXecuter) tool to support a continuous integration build model, and test reports were integrated into the Apollo reporting platform.
- Wrote helper (utility) classes for test data handling, data parsing (JSON/HTML/XML), Data Access Objects (DAO), and REST service support (see the helper sketch following this list).
- Created SQL queries as needed.
- Designed, developed and implemented test cases within an agile model.
- Executed automation/manual test scripts for Functional, Non-functional, Backend, GUI, Integration and End-to-end System testing.
- Documented, logged and tracked stories, tasks, & defects in Jira.
- Worked closely with business clients, business analysts, and development and quality teams.
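A hedged sketch of the sort of JSON-parsing helper class mentioned above, using Jackson; the slash-separated path convention is an assumption, and the real helpers also covered HTML/XML parsing, DAOs, and REST support.

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    // Hedged sketch of a small test utility for pulling values out of JSON
    // service responses, e.g. textAt(responseBody, "course/title").
    public final class JsonResponseHelper {

        private static final ObjectMapper MAPPER = new ObjectMapper();

        private JsonResponseHelper() {
        }

        // Returns the text value at a slash-separated path, or null if absent.
        public static String textAt(String json, String path) throws Exception {
            JsonNode node = MAPPER.readTree(json);
            for (String segment : path.split("/")) {
                node = node.path(segment);
            }
            return node.isMissingNode() ? null : node.asText();
        }

        // Returns the integer value at a slash-separated path (0 if absent).
        public static int intAt(String json, String path) throws Exception {
            JsonNode node = MAPPER.readTree(json);
            for (String segment : path.split("/")) {
                node = node.path(segment);
            }
            return node.asInt();
        }
    }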
Confidential, Eagan, MN
Contract Software Automation Engineer
Responsibilities:
- Worked with the Judicial Workbench team, responsible for the design, development, testing, and implementation of software automation components supporting the CPP components on the BPMS workflow: Product Builder & Delivery Services, Explicit Relationship Service, Judicial Summarization Service, Judicial Metadata Service, Load Traffic Control (LTC), PACE/History Service, Classification Hierarchy Service, Content Service, Bermuda, WestlawNext Site, and mainframe applications (Feeds, CHD back-feed, IOP Composition).
- Involved in and successfully launched complex software projects requiring skills across multiple technical disciplines.
- Worked closely with business client, business analyst and development to review requirements and to define detailed acceptance criteria for the features being developed to achieve the expected quality of the product before delivering to customers within an agile process.
- Developed test plans and test cases, and performed test case and automation code/script reviews within an agile model.
- Designed, developed, and tested Java REST APIs and Selenium WebDriver automation frameworks for various front-end and back-end applications.
- Wrote complex SQL queries to validate back-end data integrity within a complex data model over its entire life cycle.
- Effectively contributed and implemented new ideas for improving the test framework development and testing process, and stayed creative and flexible in dealing with changing product requirements.
- Implemented and actively participated in building test automation using Java, web services (REST), Spring, JMS, JUnit, Selenium WebDriver, SQL, XML, and other technologies; executed automated tests and debugged failed tests.
- Designed, developed, executed and maintained automation/manual test scripts for Functional, Non-functional, Backend, GUI, Integration and System End-to-end tests to ensure code meets requirements and also addressed different types of testing needs.
- Coded mock clients and mock services to test server-based software with REST, Spring MVC, XML, and other technologies as needed (a mock-service sketch follows this list).
- Prepared, extracted, and modified test artifacts (judicial content, metadata) in XML format conforming to DTDs/schemas.
- Configured and maintained test environments; maintained the continuous build integration project using AnthillPro, investigated build issues, and analyzed regression test result discrepancies, including cloud regression results (run against every development build) as needed.
- Prepared and provided testing metrics and reports on test progress, issues (if any), and problem resolution to the development team and management on a regular basis.
- Investigated, troubleshot, and resolved or recommended solutions for customer issues in a diverse range of complex software development, test, and production environments; provided support to business, client, and development teams on these issues, using judgment within defined policies and practices.
- Worked in an onshore/offshore model; provided mentoring and guidance on product domain knowledge and business rules, supported peers on tools, and promoted knowledge sharing.
- Managed multiple tasks as the primary point of contact/lead for projects, features, and iteration, regression, and integration testing efforts, including careful planning, prioritization, and readjusting timelines as needed.
- Exposure to JMeter, Logstash, Elasticsearch, Kibana, Jenkins, and MongoDB.
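A hedged sketch of a mock REST service like those described above, written here as a Spring Boot/Spring MVC controller so it is self-contained; the endpoint path and canned payload are hypothetical stand-ins for the real downstream service contract.

    import java.util.Map;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    // Hedged sketch: a stand-in for a downstream metadata service, returning
    // canned responses so the system under test can run in isolation.
    @SpringBootApplication
    @RestController
    public class MockJudicialMetadataService {

        // Hypothetical endpoint; the real service contract is not reproduced here.
        @GetMapping("/mock/metadata/{documentId}")
        public Map<String, Object> metadataFor(@PathVariable String documentId) {
            return Map.of(
                    "documentId", documentId,
                    "jurisdiction", "TEST-JURISDICTION",
                    "status", "LOADED");
        }

        public static void main(String[] args) {
            SpringApplication.run(MockJudicialMetadataService.class, args);
        }
    }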
Confidential, Minneapolis, MN
Systems QA/Backend Software Engineer in TEST
Responsibilities:
- Performed data analysis tasks as part of an agile development team, attending daily stand-up status meetings, communicating frequently and clearly with team members on product issues, and testing products to a high standard of quality.
- Participated in agile sprint planning meetings to discuss goals, backlog, roles, time estimation, and lessons learned.
- Participated in requirements (user stories), design, code and test case review meetings.
- Involved in all phases of Software Development, Testing and Defect management Life cycle.
- Worked closely with business analyst, application/database development and other team members to ensure that all requirements/stories have been met and tested.
- Designed, developed, and updated tests and test scripts utilizing complex SQL data comparisons, assuring complete coverage of business requirements by creating a traceability matrix.
- Documented and maintained test plans, test cases, test scripts, defects, and reports in the test case management systems (TestTrack Pro, SharePoint, Quality Center/MS Test Manager) following the standard guidelines.
- Analyzed and tracked all test results, including discrepancies/defects and progress; performed root cause analysis on discrepancies and defects by reviewing the .NET application/GUI and SQL stored procedure/view/function code, and made recommendations on options for handling each issue or discrepancy.
- Worked to identify data-related problems and resolved discrepancies and data integrity issues by ensuring that high-quality data was collected and that data integrity was maintained.
- Interacted with developers to develop and execute test strategies and to understand data needs and issues/risks while testing highly complex, high-volume data warehousing initiatives.
- Executed and maintained appropriate test operating methodologies/processes related to functional, integration, regression, smoke, end-to-end, and back-end testing of all application services targeted to production, via manual as well as automated scripts.
- Prepared XML-based statements (buyer payments, seller fee payments), moved funds between MSA and SSA accounts using the agent bank processing system by validating the schema/mapping document, and worked with various data feeds such as XML, text, CSV, and EDI on a daily basis.
- Built and enhanced data loads and extracts (freight, goods, and services transactions) from Transportation/Payment Manager into the Settlement database, moved the transactions through different application services, and finally settled the transactions by creating and executing SQL Server Agent jobs and events.
- Maintained, tested, and debugged batch scripts and SQL stored procedures, functions, and views to make sure the business logic being implemented was correct.
- Performed web services testing to verify pricing requests involving different client transactions, and validated the XML messages (requests and responses) for conformance to business rules.
- Supported and advised the offshore team on new feature testing, preparing test cases, preparing XML messages for web services testing, and creating automation scripts for regression.
- Developed appropriate SQL queries (basic to complex) and batch scripts, and used application utilities to validate and analyze test data and results for SSIS packages.
- Tested ETLs to ensure that source and target were mapped per the user stories, and helped the team with data conversion testing (from the baseline legacy system to the new Renaissance system) within Network Settlement for Citibank (a minimal validation sketch follows this list).
- Performed user interface compatibility testing against various browsers.
- Performed price verification web application performance testing against the Syncada pricing engine in different browsers.
- Worked in a fast paced environment with time driven releases and successfully drove the issues to closure.
- Supported Software verifications, Client and Production releases and troubleshooting of system or customer issues.
- Conducted meetings by inviting end-users and demonstrated software changes before client/production releases.
- Worked with other colleagues to perform all test related activities in support of demanding and aggressive project schedules.
- In collaboration with others, developed and maintained QA test databases and data systems necessary for projects and day-to-day functions.
- Provided technical leadership for testing enterprise applications developed using new and emerging technologies and in implementing new, innovative ways of providing quality testing services.
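A hedged sketch of the simplest kind of source-versus-target check used in the ETL and data-conversion testing described above, via JDBC; the connection string, credentials, table names, and date columns are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Hedged sketch: compares row counts between a staging feed table and the
    // settlement target table after a load, a basic SQL data-integrity check.
    public class SettlementLoadCountCheck {

        public static void main(String[] args) throws Exception {
            // Hypothetical connection string and credentials, normally injected from config.
            String url = "jdbc:sqlserver://test-db.example.internal;databaseName=Settlement";

            try (Connection conn = DriverManager.getConnection(url, "qa_user", "changeit");
                 Statement stmt = conn.createStatement()) {

                long sourceCount = singleLong(stmt,
                        "SELECT COUNT(*) FROM staging.payment_feed WHERE feed_date = CAST(GETDATE() AS DATE)");
                long targetCount = singleLong(stmt,
                        "SELECT COUNT(*) FROM dbo.settlement_txn WHERE load_date = CAST(GETDATE() AS DATE)");

                if (sourceCount != targetCount) {
                    throw new AssertionError(
                            "Row count mismatch: source=" + sourceCount + " target=" + targetCount);
                }
                System.out.println("Counts match: " + sourceCount);
            }
        }

        // Runs a single-value query and returns the first column of the first row.
        private static long singleLong(Statement stmt, String sql) throws Exception {
            try (ResultSet rs = stmt.executeQuery(sql)) {
                rs.next();
                return rs.getLong(1);
            }
        }
    }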
Confidential, Eagan, MN
Software Test Automation Engineer/Data Analyst
Responsibilities:
- Involved in and successfully implemented mission-critical national and international software development projects: Westlaw UK/Japan/China/Germany/Spain-Aranzadi, Carswell, Saegis, TTA, Dialog, Thomson West Migration, Thomson Financial, Parallel Doc Feature, Dup Doc Detection, Character/Word-based Search, Suffix Array Searching, Search Summarization, Unmerged Locator, and Auto/Manual Reclaim Rewrite.
- Gained an in-depth and thorough understanding of the Novus suite, a highly complex publishing, search, and retrieval environment used by national and international partners around the world.
- Participated in and analyzed Software Business Requirements (BRS), Software Requirements Specifications (SRS), Software Design Specifications (SDS), and code reviews to understand changes impacting the Novus suite environment, and based on these reviews and analyses designed and developed test plans and test cases and implemented them using JUnit/XML scripts.
- Performed and analyzed unit, functional, integration, end-to-end system, and performance testing against new features and regressions. Before client releases, performed versioning regression by pointing the new controller to old engines. Managed QA releases end to end, coordinating all activities, providing leadership, and mentoring teams.
- Actively participated in planning, estimation, and prioritization of design, development, and test processes to ensure successful execution of software development activities and project management.
- Analyzed, administered, documented, and submitted status reports and tracked application defects discovered during the testing cycle. Verified, updated, and closed issues as they were resolved. Mentored and coached test resources on the test methodology throughout the life cycle of projects.
- Planned, analyzed, designed, developed, integrated, and executed proprietary Java- and XML-based white box tests for services such as search, retrieval, index, load, and wheel using the in-house testing framework (XMLTester/NovusTester), conforming to the various Novus wrapper DTDs, against Product and Publishing APIs and Java Message Service/MQ within the Novus architecture layer.
- Created and validated XML test data and scripts conforming to the Novus load wrapper DTDs/schemas; interpreted the rules, parameters, methods, and symbols used in the DTDs to build XML API call test scripts and data (an XML validation sketch appears at the end of this section).
- Designed, developed, integrated, and executed unit and functional tests in Java/JUnit using the Novus Regression Testing harness, product and publishing APIs, JMS, and databases, and built appropriate test drivers and regression suites. Continuously added to, integrated, and maintained the JUnit regression suites.
- Continuously maintained the Novus Regression Testing harness, written in Java, in line with new changes and test requirements.
- After code coverage builds, analyzed the coverage results, prepared gap analysis reports, and based on those reports designed, developed, and executed additional JUnit tests to cover the gaps.
- Maintained test requirements, test cases, and JUnit code in Quality Center; automated regression suites and executed them via Quality Center; created a requirement traceability matrix and generated requirement coverage reports in Quality Center after each test run against a new build.
- Wrote simple to complex SQL queries and PL/SQL programs to perform back-end data integrity testing in multiple RDBMSs (Oracle, DB2), and created, deleted, and updated various publishing features (collections/sets, index element sets, dictionary sets, etc.) directly via SQL scripts on a daily basis.
- Created and modified Perl, Python, and shell scripts for daily work: managing and organizing regression launch scripts, pulling and maintaining server logs, checking processes, JVMs, and disk space, extracting test data, and generating test results.
- Configured LDAP to update parameters based on the design spec for testing new features in the QC environment, as needed, using the IBM directory management tool.
- Maintained Ant and Anthill build scripts for the JUnit-based regression suites and configured properties files regularly as needed. Configured and installed JBoss, Jetty, and Tomcat servers.
- Set up testing platforms for XMLTester/NovusTester and configured collections and index data domains for highly distributed database servers (Oracle/DB2) at various architectural layers within Novus on different Linux boxes.
- Worked independently as well as in a team to understand features and issues, explain and reproduce defects, and anticipate design and coding weaknesses and vulnerabilities. Effectively contributed and implemented new ideas for improving the software development and testing process, remaining creative and flexible in dealing with changing product requirements.
- Investigated old regression suite performance issues and submitted an investigation report to management; restructured and redesigned the Index regression suites running in different boxes, improving overall regression execution performance by 35% to 90% and reducing execution time from 12 hours to 4 hours. Performed test script and data migration from the old suites to the new suites using shell scripting.
- Worked closely with Performance testing team to plan and execute Saegis performance testing for Suffix Array Based Searching using Java Test Harness and Silk Performer. Independently designed and developed testing requirements and test cases for Performance testing based on Business requirements.
- Created and executed Silk BDL scripts for the Java framework to measure the speed and scalability of the Novus search engines.
- Regularly communicated with the development team, release manager, CM, and QA manager on assigned features.
- Attended daily stand-up meetings to report status and issues found during the testing cycle. Performed a lead role in educating QA test engineers about the testing process, systems, and features being tested.
- Kept abreast of current methodologies (agile, iterative), technologies (Java/C/.NET platforms), and tools (Eclipse, XML Spy, JEdit, XMLTester, and NovusTester) and proactively sought ways to incorporate them into the testing effort.
- Converted existing XML script based tests into JUnit tests.
- Performed a lead role in re-engineering the new load regression suite and successfully performed the following responsibilities:
- Actively involved in Novus Regression Testing Application/Framework analysis, design, development, testing, support and proof of concept.
- Wrote testing requirements for 33 load features within DOC/NORM/CSLOC/METADOC/NIMS/TOC services.
- Designed & developed unit and functional JUnit test cases and implemented, reviewed and executed JUnit tests.
- Designed, developed and integrated test suites and monitored nightly ant build and analyzed testing report generated by JUnit in Anthill. Responsible for bug/error tracking as well as test status and results reporting, working within version controlled test cycles.
- Developed requirement traceability matrix for managing requirement coverage using HP Quality Center. Created requirement, testing and bug tasks using TFS.
- Responsible for supporting testing teams by answering, analyzing, investigating, and explaining their questions, issues and concerns related to NRT framework, requirements, test cases, and configurations.
- Supervised 4 interns and 4 offshore QAs.
- Provided NRT framework and testing support to the QA team in India and attended TelePresence conference meetings with the Eagan and India teams during transition discussions.
- Conducted JUnit Code review meeting.
- Worked actively and closely with the performance and development teams and business analysts to perform the following responsibilities:
- Actively involved in and successfully performed load and stress testing for the following services/applications: Saegis Suffix Array Based Searching, Persist Batch Delete Redesign, Reduce CSLoc DB Utilization, Doc Retrieval, DocLoc, MetaDoc, NORM, NIMS, V/NV-TOC, CSLoc, Search, Index, and Load Services.
- Extensively performed load and scalability testing for all Novus services/applications during Oracle 11g upgrade.
- Designed and developed performance testing requirements, test plan and test cases based on business requirements.
- Installed and configured Silk Performer Testing Environment.
- Configured performance reporting tool ‘RUSH’ in VitalSigns Site to generate performance metrics/charts regularly and tracked the results over time and across builds.
- Prepared, extracted, and loaded large sets of test data.
- Developed Java Based Test Harness using Novus Product APIs and integrated with Silk Performer tool.
- Created Silk BDL scripts to implement performance test cases.
- Generated performance report after each test run using Silk Performer data repository.
- Maintained performance regression suite regularly.
- Maintained Performance source code version control.
- Conducted performance code review meetings to maintain coding standards and bug control prior to test execution.
- Analyzed, discussed, recorded, and tracked bugs identified during test result verification.
- Worked closely with DEV, DB and Performance teams for planning, executing and monitoring performance testing.
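A hedged sketch of the XML test-data validation described above (conforming test documents to a load wrapper DTD); the file path is hypothetical, and the proprietary XMLTester/NovusTester harness and Novus DTDs are not reproduced here.

    import java.io.File;

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;

    import org.junit.Test;
    import org.xml.sax.ErrorHandler;
    import org.xml.sax.SAXParseException;

    // Hedged sketch: parses an XML test document with DTD validation turned on
    // and fails the JUnit test on the first validation error.
    public class LoadWrapperXmlValidationTest {

        @Test
        public void sampleLoadDocumentConformsToDtd() throws Exception {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setValidating(true); // validate against the DOCTYPE-declared DTD

            DocumentBuilder builder = factory.newDocumentBuilder();
            builder.setErrorHandler(new ErrorHandler() {
                @Override public void warning(SAXParseException e) { /* ignore warnings */ }
                @Override public void error(SAXParseException e) throws SAXParseException { throw e; }
                @Override public void fatalError(SAXParseException e) throws SAXParseException { throw e; }
            });

            // Hypothetical test artifact path; real wrapper documents are proprietary.
            builder.parse(new File("src/test/resources/sample-load-wrapper.xml"));
        }
    }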
Confidential, Wheeling, IL
Software Quality Engineer
Responsibilities:
- Successfully implemented the Risk-Based Postal Verification Pilot System by researching, designing, developing, installing, configuring, testing, integrating, and managing the software applications PostalOne, MMOD (J2EE application server), and MERLIN (.NET client), using the technologies VB/C#.NET, Java/JSP/Struts, Hibernate, JMS, web services, SOAP, Oracle 9i, MS SQL Server, and JDBC/Hibernate and the tools Eclipse, Visual Studio .NET, XMLSpy, Hermes JMS, RAD for WebSphere MQ 6.0, JMeter, Test Director, JEdit, and Toad SQL client in UNIX/Linux and Windows XP environments.
- Involved in all product development life cycle phases. Participated in SRD + RTM, SDD, and code reviews, and based on these reviews analyzed, designed, and developed the Software Master Test Plan (SMTP), use cases, test cases, and technical documents for both MERLIN and MMOD; implemented QA processes abiding by USPS and IEEE standards (CMM Level 4 quality) throughout the software development life cycle.
- Worked closely with clients, project managers, architects, and developers to understand needs, priorities, preferences, workflows, and schedules, and executed test strategies built on quality testing best practices. Conducted reviews of test artifacts and deliverables and articulated the quality-related trade-offs when faced with difficult decisions.
- Set up test platforms (test beds) for application/product testing and performed unit, functional, integration, and end-to-end system testing (PostalOne, MMOD, MERLIN) in a highly distributed postal network; measured the performance of the n-tiered MMOD/MERLIN applications by simulating continuous heavy load on the MMOD server using the JMeter and Hermes JMS tools. After successful completion of engineering testing, integrated the MERLIN/MMOD system with the PostalOne system.
- Created real-time postage statement data in XML format, populated XML data elements conforming to the data interchange standard (XML schemas/DTD), and wrote SQL queries to retrieve data from the various databases using the Toad for Oracle database client during back-end testing.
- Configured Hermes JMS to interactively process JMS topics and queues and monitored WebSphere MQ. Simulated the PostalOne system by processing postage statement messages in the incoming queue via SOAP while simultaneously operating the MERLIN machine, and verified the accuracy of mail verification jobs by analyzing and validating outbound and inbound JMS/PSJI messages against data interchange standards (a JMS sketch follows this list).
- Based on business requirements, developed specifications and a search test client page, and performed all aspects of verification through automated and manual tests, including unit, functional, integration, regression, load, scalability, and performance testing on the MERLIN/MMOD products across platforms to ensure the products were free of defects.
- Analyzed and reported bugs in Test Director, documented test results in the test documents, and generated test reports through Test Director.
- Provided visibility and management control over the quality process through clear and concise weekly metrics and test reports, and escalated problems accurately and in a timely manner after each test run.
- Mentored and coached test resources on the test methodology throughout the project life cycle, and presented new features, enhancements, defect fixes, and tools and technology to clients in detail at the end of project delivery.
- Knowledge of networking and protocols, especially TCP/IP, FTP, SOAP and HTTP.
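A hedged sketch of putting a test message on a JMS queue, along the lines of the postage statement simulation described above; the JNDI names and XML payload are hypothetical, and the WebSphere MQ provider configuration (jndi.properties) is omitted.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    // Hedged sketch: looks up a JMS connection factory and queue via JNDI and
    // sends one XML text message, simulating an inbound postage statement.
    public class PostageStatementSender {

        public static void main(String[] args) throws Exception {
            // Provider settings come from jndi.properties (not shown here).
            InitialContext jndi = new InitialContext();
            ConnectionFactory factory = (ConnectionFactory) jndi.lookup("jms/TestConnectionFactory"); // hypothetical name
            Queue queue = (Queue) jndi.lookup("jms/PostageStatementQueue");                           // hypothetical name

            Connection connection = factory.createConnection();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);

                // Placeholder payload; real statements conform to the published schema/DTD.
                TextMessage message = session.createTextMessage(
                        "<postageStatement><jobId>TEST-123</jobId></postageStatement>");
                producer.send(message);
            } finally {
                connection.close();
            }
        }
    }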
Confidential, Omaha, NE
Software Data and Quality Assurance Analyst
Responsibilities:
- Involved in the complete life cycle of various software application development and enhancement projects, such as Movement Tracker System, UOP, and GBOL, from business requirement preparation through user acceptance testing to project delivery.
- Investigated, defined, and developed software requirements specifications based on business requirements and enhancement request documents.
- Involved in Business Requirement Specification, Software Requirements Specification, Software Design Specification, database schema design, and code reviews to understand changes impacting Union Pacific's main operating system and the delivery of shipment events to trading partners.
- Analyzed the Business Requirement (BR), software requirements, High Level Design (HLD), High Level Application Design (HLAD), and Technical Requirement (TR) documents to design and develop testing strategy documents for discussion with the development and business teams.
- Involved in the design, creation, and publishing of several test documents that formalized test procedures to the IEEE standard.
- Designed and developed test plans, use case scenarios, testing methodologies, process flow diagrams, test cases, and test schedules per functional requirements and design specifications to assist in the execution of JUnit, functional, integration, performance, user acceptance, and regression testing using UOP APIs.
- Contributed to and coordinated QA activities such as reviewing functional specifications, extracting testing requirements, and documenting and executing manual test cases. Maintained and enhanced the existing test automation framework and data-driven test cases, and reviewed/updated regression test cases for the UOP and MTS applications.
- Responsible for performing all phases of Quality Assurance software testing during development and prior to the application being launched and continuing during the product life cycle.
- Gained knowledge of the EDI communication specifications for the 004010 and 005020 versions of the ANSI X12 standard for exchanging EDI data. Processed EDI messages using UP's value-added network (VAN), Transentric, which has direct connections to the commercial VANs (GEIS, AT&T, Railinc, Kleinschmidt, and 20+ other third-party exchanges), and used FTP options during system performance testing.
- Performed EDI translations and validations by analyzing and verifying outbound and inbound EDI messages (PXML format) against data interchange standards.
- Created the UPDS root cause analyzer tool, used to produce the daily vendor communication performance report.
- Collected and analyzed data specific to each partner (i.e., Chrysler, Ford, TNT, Schneider) and, based on that collection and analysis, created daily/weekly/monthly performance reporting metrics, used to monitor and manage the business, reporting directly to the Assistant Vice President, Director, and development managers.
- Analyzed the root causes of performance failures, recommended solutions to management, and improved the overall performance level from 30% to 95%.
- Wrote simple to complex SQL queries and PL/SQL stored procedures and functions, applying PL/SQL programming techniques, and extensively used SQL queries to perform data integrity testing and all back-end transaction testing, evaluating data stored within several databases.
- Wrote UNIX shell scripts for file and data validation and FTP from different servers, and generated UNIX scripts for extracting test data, loading historical test data into Oracle tables using SQL*Loader, and loading test data into Oracle tables using various SQL statements.
- Designed, documented, and implemented reliable QA solutions (tools, scripts, methodologies) to resolve redundant tasks and blocking issues.
- Tested trouble tickets, change requests, and maintenance requests; used Test Director for incident reports and for communicating with teams regarding issues and recommended solutions to improve business logic.
- Identified defects in software, analyzed the root cause of defects at the component level, and used PVCS Tracker to report and track defects; escalated issues depending on risk and criticality, communicated with the development team regarding issues, recommended solutions, helped the development team recreate issues, and revalidated and closed defects after they were fixed.
- Assisted the development team in testing the performance and correctness of UPDS accounting software, i.e., Agresso and GBOL (General Bill of Lading).
- Worked closely with the configuration management team, participated in the build and deployment of the MTS and UOP applications, and performed end-to-end system and integration testing during pre-build releases in the QC environment.
- Monitored partner systems (Chrysler's Shipment Tracking and Analysis Systems (STAR), TNT, and Ford's Penske Logistics systems) to confirm the correct flow and performance of EDI messages containing invoice, container arrival/departure event, shipment cancellation, and train ETA information.
- Designed and developed business reports and software modules with Siebel Analytics (UPDS' central data repository system), Teradata, XML, XSL, HTML, MS Excel (macros), PL/SQL, and Toad for SQL.
- Experienced in handling EDI systems, Siebel Analytics, and Union Pacific logistics.
- Conducted white box testing at the unit level, including data flow, condition, and loop testing, and developed process flow and data flow diagrams at the unit and component levels.
- Provided MTS, Penske, Transentric, UOP, and STARS web support and training to other team members.