
Consultant Business System Analyst Resume


Buffalo Grove, IL

SUMMARY:

  • Over 7 years of experience in Information Technology with emphasis on Software Testing/Quality Assurance.
  • QA Lead for sprints’ planning, estimations, daily scrum calls, monitoring and tracking of defects, communication plans, and application demos using Agile/Scrum methodology
  • Extensive multi-tasking testing experience in the Financial, Insurance, Healthcare, and Pharmaceutical industries.
  • Skilled in developing Test Estimations, Test Plans, Test Cases/Scenarios, and Test Procedures, analyzing Test Results, and preparing Test documentation and Test Metrics.
  • Well versed in Software Development Lifecycle Management (SDLC) phases/QA processes using Waterfall, Object-Oriented, and Agile methodologies
  • Sound knowledge of the end-to-end testing process: analyzing requirements for testability, identifying testing risks, developing effective risk mitigation and contingency plans, developing and executing complex test strategies, plans, and test cases to deliver quality information systems and software that conform to organization standards, and tracking defects through resolution.
  • Experience leading and implementing quality assurance activities for Stand-alone, Multi-Tier Client-Server/Web-based, and Mainframe applications.
  • Tested application modules through all phases of testing, both manually and with automation tools: Unit, Integration, System, Regression, Product, Smoke/Sanity, and Interface testing.
  • Experienced with automation/testing tools such as Selenium, SilkTest, QTP, WinRunner, and Quality Center.
  • Hands-on Browser compatibility and web applications testing using various Mobile devices/operating systems.
  • Strong knowledge of Business Intelligence tools like Business Objects 6.0, Business Objects SDK, Cognos Series 7.0 and other reporting tools like Crystal Reports and Jasper Reports.
  • Experience in creating Requirements Traceability Matrices to ensure comprehensive test coverage of requirements
  • Experience in design and implementation of stored procedures, triggers and functions at database level using PL/SQL.
  • Experience in ETL testing on Data Extraction, Data Transformation, Data Loading, Data conversions and Data Analysis.
  • Ability to monitor/measure the testing progress, generate and present detailed and summarized status reports to higher management
  • Excellent skills in identifying defects and managing them through the testing cycle.
  • Assisted in creating User Acceptance Testing (UAT) plans, training key users, and executing UAT based on the test data provided.
  • Excellent analytical ability and communication skills; adapts quickly to fast-paced environments; quick learner with attention to detail.

TECHNICAL SKILLS:

Operating Systems: Windows 98/2000/XP Professional, UNIX.

Methodologies: Agile, WIP (Work Implementation Process), RUP, Waterfall.

Middleware: Servlets, JDBC, JSP, RMI, JMS, Struts, Tibco, JCL, Vitria.

Designing Tools: Rational Rose 2000, MS Visio, Magic Draw, and Enterprise Architect.

Requirement Tools: Rational Requisite Pro, Borland Caliber, Jira.

Testing Tools: TestDirector, Quality Center, QTP and ClearQuest.

Databases: MS Access, SQL Server 2005, Oracle 10g, Sybase, DB2.

Editors: EditPlus, Eclipse, MS Word, VI.

Languages: C/C++, VBScript, XML, Java, HTML, SQL, PL/SQL.

Hardware Environment: Mainframe, mid-range and client/server environment.

Rules Engine: Pega, Ilog.

Reporting Tools: Business Objects, Cognos-8, Crystal Report, Jasper Reports, MS Excel.

Database/ETL Tools: Magic Draw, Erwin, Informatica PowerCenter, IBM DataStage.

Version Control: Rational Clear-Case, Subversion, Microsoft SharePoint.

PROFESSIONAL EXPERIENCE:

Confidential, Buffalo Grove, IL

Consultant Business System Analyst

Responsibilities:
  • Participated in development of sprints’ activities: feature complexity analysis, resource estimation with scrum masters, PM, BSA, and Dev teams for User stories including backlog planning in Jira.
  • Validated requirements in daily scrum benefit calls with the SMEs, fixing plan issues and then re-executing the plan in Salesforce to validate data consistency.
  • Analyzed assigned project’s environment, architecture, and interdependencies to determine and apply appropriate test methodologies and cases relative to applications being tested.
  • Participated in the gap analysis/ legacy data mapping of functional and technical specifications.
  • Tested the Extraction, Transformation and Load (ETL) process for a Data Warehouse flowing from Salesforce through the Pre-processor and Benefit Builder systems (Java architecture) into the target Rx claim system (AS 400 application), with disparate data sources ranging from flat files and XML files to Oracle, SQL Server, and DB2 databases loaded into the heterogeneous AS 400 target.
  • Conducted the Metadata testing, Data Completeness testing and Incremental ETL testing.
  • Validated integrated data results with intensive SQL queries and compiled results for daily stand-up business-user validation checkpoints.
  • Read Java code to locate bugs in JUnit tests and checked code in for nightly builds.
  • Involved in the peer review of Test Plans and Test Cases with the QA team to verify implementation of new features and enhancements.
  • Worked with Data Analyst for ETL Design review meeting to get approvals from cross-functional teams.
  • Applied knowledge of automation procedures to implement a Java program for the BPL team, reducing execution time by 1 hour per run.
  • Facilitate implementation of new functionality through UAT testing, training sessions, demos, and the development of appropriate documentation. Provide relevant test scenarios for the testing team.
  • Generated test data through the Salesforce tool: uploading plans, checking plan status, and triaging defects that required plan re-generation.
  • Communicated all test-phase results, defect tracking, risks, lessons learned, and areas for improvement to stakeholders and multiple levels of management.

Environment: Agile, Jira, AS400 Mainframe, J2EE, IBM DB2, Microsoft Word Suite.
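The ETL completeness checks described above can be sketched roughly as follows; sqlite stands in for the actual Salesforce/Oracle/DB2/AS 400 stores, and the table names and values are invented for illustration:

```python
import sqlite3

# Hypothetical source/target tables standing in for the Salesforce-to-Rx-claim
# pipeline; the schema and data are illustrative, not from the project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_plans (plan_id INTEGER PRIMARY KEY, copay REAL)")
cur.execute("CREATE TABLE tgt_plans (plan_id INTEGER PRIMARY KEY, copay REAL)")
cur.executemany("INSERT INTO src_plans VALUES (?, ?)", [(1, 10.0), (2, 25.0), (3, 5.0)])
cur.executemany("INSERT INTO tgt_plans VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

# Completeness: every source row must land in the target.
src_count = cur.execute("SELECT COUNT(*) FROM src_plans").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_plans").fetchone()[0]

# Rows present in the source but missing from the target indicate a load failure.
missing = cur.execute(
    "SELECT plan_id FROM src_plans "
    "WHERE plan_id NOT IN (SELECT plan_id FROM tgt_plans)"
).fetchall()

print(f"source={src_count} target={tgt_count} missing={[r[0] for r in missing]}")
```

A row-count match plus an anti-join for missing keys is the usual first pass before field-level comparison.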

Confidential, Chicago, IL

Consultant Quality Analyst

Responsibilities:
  • Involved in analyzing, understanding, and developing business requirements for the VMS ODS Data Quality Checks, ODS-UAT2, Data mart, ECM, and Image Plus departments through a feature-driven approach, and uploaded them to Quality Center.
  • Participating and contributing to all phases of system development, including requirements, design, development, metadata conversion and testing to achieve the successful development and implementation.
  • Working extensively on the SRS and CRS of the current and major release to carve out the scope and requirements.
  • Worked in Axure and Visio to create and manage Use Case, Activity, and State Chart Diagrams for requirements related to Enterprise Health Care Management and the Data mart.
  • Generated the scope of testing by analyzing functional specifications during JAD sessions with SMEs interfacing with the Opermart, ImagePlus, and ECM systems.
  • Extensive ETL testing involved during the test planning and execution phase of opermart project for the Online and Paper Enrollment, Invoice, Billing, Refund, Membership Removal, Delinquency and Recurring ACH processes.
  • Extensive report testing in Cognos for generating various reports for the above-mentioned modules.
  • Developed positive and negative test cases in Quality Center and executed them in the Test Lab per the Test Strategy schedules.
  • Involved in the peer review of Test Plans, Test Cases with the QA team to verify implementation of new features and enhancements.
  • Developed Requirement Traceability Matrix in Quality Center by mapping the uploaded requirements to the developed test cases in Quality Center.
  • Developed and executed SQL queries for validating data integrity and performing database validations for Back End Testing.
  • Troubleshot defects and environment issues, and thoroughly documented defects in HP Quality Center along with specific error messages and steps to reproduce.
  • Designed a regression test bed of re-usable, data-driven tests decomposed to a level usable by existing and future test scenarios, reducing the time needed for test creation, maintenance, and execution.
  • Extensive use of UNIX and AS400 Mainframe commands during testing for various types of activities like checking the component status, log errors, application issues, environment issues, etc.
  • Maintained Automated Test scenarios for GUI, Functionality, Boundary, Security and Regression Testing using Quick Test Pro.
  • Resolved help desk tickets for existing ECM users.

Environment: Waterfall, HP Quality Center 10, AS400 Mainframe, J2EE, IBM DB2, Microsoft Word Suite, Axure, HP QTP.
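The Requirements Traceability Matrix work described above amounts to verifying that every uploaded requirement maps to at least one test case; a minimal sketch with made-up requirement and test-case IDs:

```python
# Hypothetical requirement and test-case IDs; in practice these would come
# from the Quality Center requirement and test-plan modules.
requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}
test_cases = {
    "TC-01": {"REQ-001"},
    "TC-02": {"REQ-002", "REQ-003"},
    "TC-03": {"REQ-002"},
}

# Union of all requirements touched by any test case.
covered = set().union(*test_cases.values())
# Requirements with no mapped test case are coverage gaps.
uncovered = sorted(requirements - covered)
print("uncovered requirements:", uncovered)
```

Surfacing the uncovered set is the point of the matrix: each gap becomes either a new test case or a documented out-of-scope item.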

Confidential, Waukegan

Sr. Quality Analyst

Responsibilities:

  • Worked with Business Analysts and Abbott Scientists to clarify detailed requirements specification and business needs and decomposed detailed requirements specifications, detail system design documents and use cases into high level test scenarios.
  • Attended daily SCRUM meetings (AGILE methodology) to give daily testing status and discuss roadblocks, if any.
  • Developed detailed System Test Plans, including Test Procedures and Test Cases for different modules/forms added in the Confidential R&D projects and created manual test cases in HP Quality Center.
  • Responsible for maintaining the Traceability Matrix which maps the test cases to the corresponding requirements.
  • Created a Test Calendar covering test data requirements, the number of batch cycles required for the project, and the dates on which each batch should run.
  • Executed SQL queries to conduct Data Integrity testing and comprehensive backend testing, ensuring coherence between data entered in the frontend UI and data reflected in the backend database.
  • Tested various Cognos reports for all IT and Business lines based on requirements provided enterprise-wide.
  • Conducted Batch processing and testing using UNIX jobs for daily information updating, automated transaction processing, and validating generated log reports as per daily/weekly/monthly basis.
  • Performed Batch testing as per required cycles- configured automated tests, distributed tests to different machines, processed results from test runs.
  • Maintained the automation scripts for enhancements and modifications to perform regression testing. Participated in peer reviews of functional specification, application previews, and test plans/test cases.
  • Responsible for attending the scheduled SIT Exit, conducting the UAT testing kickoff, and distributing UAT packages.
  • Ran daily UAT incident meetings and provided daily UAT incident reports to all participating LOB testers and development vendors.
  • Reported UAT incident status and overall testing status/metrics to project management on a daily and weekly basis.
  • Participated in the development of post-UAT defect remediation action plans.
  • Gathered sign-offs from all participating teams and prepared UAT-to-Production Gateway documentation.

Environment: HP Quality Center, Agile, SOAP Web Services, Mozilla Firefox, Spring Framework, XML, Oracle, UNIX, SnagIt, Cognos, TextPad, QTP, MS Office, and Internet Explorer.
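The batch-log validation described above boils down to scanning a job log for error records and rolling them up to a pass/fail status per cycle; a minimal sketch with an invented log format:

```python
# Sample log lines in a made-up format; real batch logs would be read from
# files on the UNIX host rather than a literal list.
log_lines = [
    "2023-04-01 02:00:01 INFO  cycle=daily job=enroll_load started",
    "2023-04-01 02:03:12 INFO  cycle=daily job=enroll_load finished rc=0",
    "2023-04-01 02:05:40 ERROR cycle=daily job=invoice_gen rc=8 file missing",
]

# Any ERROR record fails the cycle; the error lines are kept for the defect report.
errors = [line for line in log_lines if " ERROR " in line]
status = "FAIL" if errors else "PASS"
print(status, len(errors))
```

Collecting the offending lines alongside the verdict gives the exact error messages needed when logging the defect.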

Confidential, Chicago

Sr. Quality Analyst

Responsibilities:

  • Analyzed user requirements, attended Change Request meetings to document changes and implemented procedures to test changes.
  • Involved in the Design of Test Plan and Developing Test Cases based on business requirements using Quality Center.
  • Extensive nightly batch-job testing, verifying multiple data points across the application on UNIX boxes with simultaneous verification of the logs generated on the same boxes.
  • Extensive Web Service testing through SOAP-UI for validating the orchestration as mentioned in design document.
  • Performed extensive setup prior to testing the Plan Setup and Manual Review features: plan setup and participant-setup dry runs from the eligibility state to the auto-enrolled state, including opt-out scenarios.
  • Extensive black box testing of stored procedures with both negative and positive test data sets.
  • Developed a test harness enabling real-time comparisons for batch jobs, verifying results as pass or fail.
  • Performed manual validation of business scripts and recorded them using QTP.
  • Performed State based testing for various processes in the AE business process.
  • Analyzed and verified reports using Crystal Reports 8.5.
  • Executed PL/SQL scripts to ensure data integrity throughout the database.
  • Participated in the defect management life cycle all along in QA and UAT environments.
  • Involved in System Testing and Integration testing for dot releases.

Environment: Agile, UML, Magic-Draw, SOAP-UI, Crystal Reports, Quality Center, MS Word, Oracle 10g, Toad, Tibco, IBM WebSphere, J2EE, .NET.
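A batch-compare harness of the kind described above marks each record passed or failed by comparing actual job output against an expected baseline, field by field; the account records and field names here are hypothetical:

```python
# Expected baseline vs. actual batch output; in practice both sides would be
# loaded from job output files or database queries, not literals.
expected = {"acct-100": {"status": "ENROLLED", "deferral": "3%"},
            "acct-101": {"status": "OPT_OUT",  "deferral": "0%"}}
actual   = {"acct-100": {"status": "ENROLLED", "deferral": "3%"},
            "acct-101": {"status": "ENROLLED", "deferral": "0%"}}

results = {}
for acct, exp in expected.items():
    act = actual.get(acct, {})
    # Collect (expected, actual) pairs for every field that disagrees.
    diffs = {k: (v, act.get(k)) for k, v in exp.items() if act.get(k) != v}
    results[acct] = "passed" if not diffs else f"failed: {diffs}"

for acct, verdict in results.items():
    print(acct, verdict)
```

Reporting the differing fields, not just a verdict, is what makes a harness like this useful for triage.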

Confidential, Chicago

Quality Analyst

Responsibilities:

  • Involved in analyzing, understanding, and developing business requirements for the Scholarship Services, Program Evaluation, Educational Services, and Finance departments through a feature-driven approach, and uploaded them to Quality Center.
  • Interacted closely with business users from the project initiation phase through implementation to ensure successful completion of system and regression testing through Quality Center.
  • Defined and performed test strategies and associated scripts for verification and validation of the application, ensuring it met all defined business requirements and associated functionality.
  • Worked parallel with the Enterprise Test Data Management Team (ETDM) for acquiring the test data to be used during the Testing.
  • Developed and executed SQL queries for validating data integrity and performing database validations for Back End Testing.
  • Created positive and negative test cases and executed them based on available documents and on updates made to Scholarship Renewal, JD Edwards, and Manual and Stipend checks for Scholars.
  • Validated Perl-script cron jobs by retrieving data from the source database (AS400 files) and comparing it with data in the target database (PostgreSQL) per the data mapping document.
  • Designed a regression test bed of re-usable, data-driven tests decomposed to a level usable by existing and future test scenarios, reducing the time needed for test creation, maintenance, and execution.
  • Single point of contact for smoke testing existing functionality with upcoming new releases and applying database patches accordingly.
  • Used Quality Center to track and report system defects for various releases and applications.
  • Analyzed the test results and graphs by generating different reports reflecting the stability of the application with the QC, development and business team.
  • Supported Business SMEs during UAT with input test data, running cron jobs, data backup, and data refresh. Triaged defects between developers and business users.

Environment: Agile, HP Quality Center 10, UNIX, AS400 Mainframe, J2EE, PostgreSQL DB, AS400 SQL.
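The cron-job validation above, comparing source rows against the target per the data mapping document, can be sketched as a keyed join that surfaces value mismatches; sqlite stands in for both the AS400 files and the PostgreSQL target, and the schema is invented:

```python
import sqlite3

# Illustrative source/target tables keyed on a shared primary key.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE src_scholars (id INTEGER PRIMARY KEY, stipend INTEGER);
    CREATE TABLE tgt_scholars (id INTEGER PRIMARY KEY, stipend INTEGER);
    INSERT INTO src_scholars VALUES (1, 500), (2, 750);
    INSERT INTO tgt_scholars VALUES (1, 500), (2, 700);
""")

# Join on the key and report any rows whose mapped field values disagree.
mismatches = db.execute("""
    SELECT s.id, s.stipend, t.stipend
    FROM src_scholars s JOIN tgt_scholars t ON s.id = t.id
    WHERE s.stipend <> t.stipend
""").fetchall()
print("mismatched rows:", mismatches)
```

Each mapped column in the mapping document gets its own comparison clause; the join output is the evidence attached to the defect.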

Confidential, Chicago

Intern Quality Analyst

Responsibilities:

  • Participated in and contributed to all phases of system development, including requirements, design, development, data conversion, testing and maintenance to achieve the successful development and implementation of IDES business needs.
  • Involved in development of Business and Technical Requirements in preparation of System Architecture, Information Strategy, and Detailed Design documents for Business Needs of Claims, Adjudications, Appeals and Overpayment Business Processes in Requisite Pro.
  • Involved in gathering requirements from business users and created logical and physical data model using ERWIN Studio.
  • Generated the scope of testing by analyzing functional specifications during JRP sessions with change champs interfacing with the State SMEs and Business Analysts.
  • Involved in the peer review of Test Plans and Test Cases with the QA team to verify implementation of new features and enhancements.
  • Managed the process of gathering, backing and refreshing test input data for Manual, Automation and Regression testing. Test Input data includes Pre-condition, Test Input, Test Result, Test Regression data.
  • Developed requirements and tested XML blob generation for over fifty notifications/reports generated by nightly batches for different processes in IBIS dry-run tests.
  • Extensive Web Service testing through SOAP-UI for validating the orchestration as mentioned in design document.
  • Validated Informatica Data Conversion jobs by retrieving data from the source database (Oracle/SQL Server) and comparing it with data in the target database (SQL Server/Oracle) per the data mapping document.
  • Performed Defect Tracking and Change Control Procedures using Rational ClearQuest, and Configuration Management and Version Control using Rational ClearCase.

Environment: Agile, UML, Informatica, SOAP-UI, Business Objects, Rational ClearQuest, Rational ClearCase, MS Word, Oracle 10g, Tibco BusinessWorks, IBM WebSphere, J2EE.
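Validating generated notification XML blobs, as in the nightly-batch tests above, typically means checking that required fields are present and non-empty; the notification schema and field names here are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A hypothetical notification blob; the real blobs would come from the
# nightly-batch output, and the tag names are made up.
blob = """<notification>
  <claimant_id>C-1042</claimant_id>
  <notice_type>OVERPAYMENT</notice_type>
  <mail_date></mail_date>
</notification>"""

root = ET.fromstring(blob)
required = ["claimant_id", "notice_type", "mail_date"]
# findtext returns None for a missing element and "" for an empty one;
# both count as a defect here.
problems = [tag for tag in required
            if root.findtext(tag) is None or not root.findtext(tag).strip()]
print("missing or empty fields:", problems)
```

For fifty-odd notification types, the required-field lists would live in a per-type table driven by the requirements, with the same check applied to each blob.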
