Production Support/Automation Resume
Miami
SUMMARY
- Ten years of Software Quality Assurance and three years of development experience, including writing and executing test plans and automating test cases in various environments.
- Expertise in enrollment/claim processing for financial, insurance, and healthcare systems
- Extensive experience working in V-Model, Waterfall, and Agile/Scrum software development
- Expertise in leading manual and automated testing projects using HP UFT (QTP), Selenium WebDriver, HP ALM, Microsoft Visual Studio Test (TFS), and the IBM Rational suite.
- Design and maintain data-driven and hybrid automation frameworks, page object models, UI function libraries, utilities, and reporting functions.
- Experience writing test scripts using VBScript descriptive programming and Java OOP concepts
- Experience testing SOAP and REST web services using the SoapUI tool.
- Knowledge of Cucumber BDD/TDD concepts and the Maven and Jenkins continuous integration tools
- Created customized automation solutions using SSIS/MS Access macros for back-end business rule validation
- Perform data audits, assessments, root cause analysis, impact analysis, and control reporting
- Automate business processes and test data preparation using Access (web/Windows), SSIS, SharePoint, and Report Builder.
- Experience writing SQL queries to perform back-end testing, data validation, and reconciliation.
- Expertise in software development and effective QA implementation processes and methodologies.
- Knowledge of verification and validation models and review, walkthrough, and inspection processes
- Extensively used test case design techniques such as boundary value analysis, equivalence partitioning, error guessing, decision tables, and code coverage and verification techniques (a minimal sketch follows this summary).
- Knowledge of and execution experience with risk-based techniques such as all-pairs testing and exploratory testing.
- Hands-on experience with white-box, black-box, integration, system, regression, user acceptance (UAT), end-to-end, security, IVR, configuration, user interface, and performance testing.
- Hands-on experience performing data analysis and data quality assessments on business processing systems and on source and target applications.
- Experience validating ETL (Informatica/SSIS) mappings, data integrity, consistency, and business transformation rules.
- Coding and testing experience on mainframe applications using COBOL, JCL, DB2, and Natural/Adabas technologies.
- Subject matter expert on claim processing, configurations, and business rules; worked closely with developers, DBAs, and other business stakeholders
- Knowledge of data warehouse/ETL principles, schemas, data models, triggers, stored procedures, and BI reporting tools.
- Team player, self-motivated, and a quick learner with the aptitude to work in fast-paced environments
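Illustrative sketch only (referenced above): the TestNG test below applies boundary value analysis to an invented eligibility rule (valid ages 18 to 65). The rule, class name, and values are hypothetical and not drawn from any engagement described in this resume.

```java
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class AgeBoundaryTest {

    // Hypothetical validation rule: age must be between 18 and 65 inclusive.
    private boolean isEligibleAge(int age) {
        return age >= 18 && age <= 65;
    }

    // Boundary values around both edges of the valid partition.
    @DataProvider(name = "ageBoundaries")
    public Object[][] ageBoundaries() {
        return new Object[][] {
            {17, false}, // just below lower boundary
            {18, true},  // lower boundary
            {19, true},  // just above lower boundary
            {64, true},  // just below upper boundary
            {65, true},  // upper boundary
            {66, false}  // just above upper boundary
        };
    }

    @Test(dataProvider = "ageBoundaries")
    public void verifyAgeBoundary(int age, boolean expected) {
        Assert.assertEquals(isEligibleAge(age), expected,
                "Unexpected eligibility result for age " + age);
    }
}
```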
TECHNICAL SKILLS
Testing Management: HP ALM, MS-TFS, IBM Rational, MTM
Test Automation: UFT/QTP, Selenium WebDriver, SoapUI, Maven, Jenkins
Bug Reporting Tools: HP Quality Center, IBM ClearQuest, Bugzilla, qTest
Scripting: Java, VBScript, Groovy, SQL, PL/SQL, JCL, COBOL
Databases: SQL Server, Oracle, DB2, MS Access, VSAM
Web Technologies: HTML, COM, DOM, XPath, XML
Operating Systems: Windows, UNIX, MVS
Business Tools: SharePoint, Report Builder, InfoPath, Visual Studio, Eclipse
ETL Tools: SSIS, Informatica
Legacy Applications: Mainframe, iSeries, AS/400
PROFESSIONAL EXPERIENCE
Confidential, Miami
Production Support/Automation
Responsibilities:
- Perform root cause and impact analysis and data fixes; support the operations team in generating customer impact and revenue risk reports.
- Support enhancements, work requests, reporting, UAT testing, client portals, and files.
- Reconcile claims transactions and financial details and support issue resolution.
- Create control reports for temporary fixes to reduce customer impact and find defects early.
- Enhance UFT regression scripts for internal and external portals and provide release support
- Maintain object repositories, use descriptive programming, and develop common utilities
- Perform batch testing, back-end rule validation, and data integrity checks using Access macros
- Coordinate with business and development teams on reviews and process enhancements
- Work in an Agile environment, adapt to changes, and focus on working software and the customer/client.
Confidential, Miami
Test Lead/Automation Specialist
Responsibilities:
- Lead a team of four Quality Assurance Testing Specialists for the project.
- Responsible for creating test strategies, test approaches, test estimates, and test plans
- Create test coverage, design test cases, prepare test data, execute tests, and log defects
- Prepare the regression test baseline and design a hybrid automation framework using the Selenium API (a minimal sketch follows this list).
- Involved in creating page objects, UI operations, error handling, common utilities, Excel utilities, and application objects.
- Involved in setting up the Selenium environment using Eclipse, Maven, Jenkins, Java, TestNG, log4j, and Selenium WebDriver
- Performed a proof of concept comparing Selenium and UFT capabilities for MAX automation
- Developed a modular hybrid automation framework for existing and new clients' regression testing
- Developed VBScript scripts and functions using descriptive programming
- Communicate with the client and stakeholders daily on project updates and outstanding issues
- Evaluate project requirements and coordinate testing/release activities
- Work in an Agile/Scrum methodology to ensure delivery of high-quality work and support release deployments
- Work with business and development teams on defect prevention, reviews, risks, process gaps, and automation
- Automate test data preparation using SSIS/SQL procedures and schedule tasks for each release
- Perform back-end testing for business rule validation and data reconciliation using VB functions and Access forms/macros.
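Illustrative sketch only (referenced in the list above): a minimal page-object-style Selenium WebDriver test wired into TestNG, in the spirit of the hybrid framework described here. The page class, locators, URL, and credentials are placeholders invented for illustration; the actual framework components (utilities, reporting, data drivers) are not reproduced.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

// Hypothetical page object for a login screen; locators and URL are placeholders.
class LoginPage {
    private final WebDriver driver;
    private final By userField = By.id("username");
    private final By passField = By.id("password");
    private final By loginButton = By.id("login");

    LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    void login(String user, String password) {
        driver.findElement(userField).sendKeys(user);
        driver.findElement(passField).sendKeys(password);
        driver.findElement(loginButton).click();
    }
}

public class LoginRegressionTest {
    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        driver = new ChromeDriver();                // browser under test
        driver.get("https://portal.example.com");   // placeholder URL
    }

    @Test
    public void validLoginShowsDashboard() {
        new LoginPage(driver).login("testuser", "testpass"); // placeholder credentials
        Assert.assertTrue(driver.getTitle().contains("Dashboard"),
                "Expected dashboard title after login");
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```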
Confidential, Miami
Data Quality Analyst
Responsibilities:
- Involved in requirements gathering, estimating, planning, organizing, and coordinating tasks with multiple stakeholders.
- Perform root cause analysis and impact analysis on production issues or system gaps
- Analyze complex problems and produce control reports to identify the impacted population
- Analyze data and perform data fixes, validation, and reconciliation between source and target systems
- Support UAT testing, client rollout activities, process automation and data validation
- Performed data validation and automated data validation rules using Access/SSIS/DTM
- Automated the business approval process (eDCF) using SharePoint and InfoPath Designer
- Participate in monthly meetings to summarize data quality issues and plan continuous data quality improvements and measurements.
- Ensure client setup, enrollment, and claims data meet data quality standards, and report deviations to department stakeholders
- Worked closely with Accounting, Actuarial, Business Analysis, Development, Marketing, Security, Digital Communications, Compliance, and other groups on operational support.
- Developed operational reports and scheduled them using SharePoint subscriptions and Report Builder.
- Analyze data, understand trends, and identify business growth/improvement areas
- Provide continuous support and enhancements to UFT (QTP) test scripts for EPS applications
- Create and execute web service requests using the SoapUI tool for enrollment data creation and UAT support (a rough sketch of this type of call follows this list)
- Create and prioritize defects and data quality issues using HP Quality Center
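Illustrative sketch only (referenced in the list above): the enrollment web service work was done with SoapUI; as a rough Java analog of the same kind of call, this snippet posts a SOAP envelope to a placeholder endpoint and prints the HTTP status. The endpoint URL, request element, and field names are invented for illustration.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class EnrollmentSoapCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder SOAP body; the real enrollment service and schema are not shown here.
        String envelope =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<soapenv:Body><CreateEnrollmentRequest><memberId>12345</memberId>"
          + "</CreateEnrollmentRequest></soapenv:Body></soapenv:Envelope>";

        URL url = new URL("https://services.example.com/enrollment"); // placeholder endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setDoOutput(true);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }

        // A 200 response is treated as a pass for this smoke-level check.
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}
```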
Confidential, Miami
Sr Test Engineer
Responsibilities:
- Involved in reviewing business requirements (BRD), system specifications (SRS), and design documents.
- Involved in release support activities, test environment setup, and test deliverable setup
- Create test strategies, test approaches, test metrics, and test plans
- Design test conditions and test cases using boundary values, equivalence partitioning, error guessing, and negative flows to maximize test coverage
- Prepare requirement traceability and test coverage for each sprint/iteration
- Execute test cases for smoke, sanity, integration, system, functional, end-to-end, and UAT phases
- Prepare test data by executing SQL queries, SSIS workflows, and PL/SQL stored procedures
- Involved in identifying regression test cases and prioritizing them for automation testing.
- Reconcile customer claims and client reports, and analyze root causes of discrepancies.
- Responsible for automating test cases in UFT/QTP for new features and integrating the scripts with the master script.
- Worked extensively with object repositories, error handling, function libraries, descriptive programming, regular expressions, and reporting features
- Involved in developing a data-driven framework with a main driver, sub-drivers, supporting and common functions, and external data interfaces
- Automate back-end data validation using VB functions/macros and forms to validate business rules (a rough analog of this style of check follows this list)
- Verify and validate conversion data integrity, transformation rules, and mappings in the staging and target databases of the application under test.
- Involved in defect prevention meetings, risk analysis and mitigation, and quality process improvements
- Performed interface testing with external and internal systems using web services and FTP.
- Report and manage defects and generate various daily/weekly reports using TFS/ClearQuest.
- Worked closely with business, data administrators, developers, and clients
- Performed root cause analysis, impact analysis, temporary fixes, mitigation planning, and control reporting
- Assist the Business/Test Manager with test estimates, prioritize test cases, and generate status reports
- Mentor and train new team members on the testing process and application flows.
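Illustrative sketch only (referenced in the list above): the back-end validation described here was built with VB functions and Access macros; purely as a hedged Java analog of that style of check, this snippet compares row counts between a staging table and a target table over JDBC. Connection strings, table names, and credentials are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ClaimsReconciliation {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and table names; the actual schemas are not reproduced here.
        try (Connection source = DriverManager.getConnection(
                     "jdbc:sqlserver://sourcehost;databaseName=staging", "user", "pass");
             Connection target = DriverManager.getConnection(
                     "jdbc:sqlserver://targethost;databaseName=claims", "user", "pass")) {

            long sourceCount = count(source, "SELECT COUNT(*) FROM stg_claims");
            long targetCount = count(target, "SELECT COUNT(*) FROM dim_claims");

            if (sourceCount == targetCount) {
                System.out.println("Row counts match: " + sourceCount);
            } else {
                System.out.println("Mismatch: source=" + sourceCount
                        + " target=" + targetCount);
            }
        }
    }

    // Runs a single-value COUNT(*) query and returns the result.
    private static long count(Connection conn, String sql) throws Exception {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```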
Confidential, Miami
Sr Test Engineer
Responsibilities:
- Lead the offshore testing team and report to the onsite Business Manager
- Involved in designing and developing test strategies, test approaches, test estimates, test plans, and other test artifacts
- Review requirements (BRD) and the test strategy, and responsible for developing detailed test plans for each sprint
- Involved in creating UAT, system, end-to-end, and regression test cases and the smoke checklist.
- Responsible for validating data integrity and processing claims on data converted from the legacy application
- Reviewed data mappings and transformation rules between source and target systems
- Review design documents, business rules, and database designs, and report system or requirement gaps to stakeholders
- Reconcile conversion data at the client, product, coverage, and account levels and address discrepancies
- Execute smoke, system, functional, end-to-end, and regression test cases for each deployment.
- Prepare test data for each deployment and work with data administrators on environment setup.
- Run validations to verify claims data integrity, business rules, and mapping rules for the application under test.
- Create and track bugs using Quality Center and maintain various daily/weekly reports
- Responsible for designing the automation framework, identifying test cases to be automated, and coordinating tasks with script developers.
- Responsible for developing SQL validation scripts to validate claim duration, payment amounts, and other functional areas.
- Responsible for coordinating release activities, test results, and go/no-go evaluations for approvals
- Performed various tasks including security, IVR, disaster recovery, and load testing, client/customer communications, and XML files
- Used web services via SoapUI to create enrollment data and performed SOAP service testing
- Played a key role in performing root cause and impact analysis on production issues, applying data fixes for immediate resolution, and creating CAPAs for code fixes.
- Support production release tasks every month and schedule the automated functional suite against internal and external applications.
Confidential, Woonsocket, RI
System IT Analyst
Responsibilities:
- Meet with SMEs to gather business requirements and develop use cases
- Played a key role in designing and developing financial and transactional reports for daily business operations
- Meet with the development team to review design documents and provide feedback
- Played a key role in designing batch sequence scheduling using the Control-M tool
- Responsible for creating the test plan and test supplement documents
- Create test scripts for functional, sanity, integration, system, and regression test phases
- Responsible for executing functional, integration, system, regression, and UAT test scripts
- Responsible for generating requirement traceability and test coverage using Quality Center
- Report and track defects using Quality Center and coordinate defect fixes with the development team
- Design claim transformation mapping documents, coordinate tasks with ETL developers, analyze ETL scripts, and document the ETL process
- Review ETL transformations such as Source Qualifier, Normalizer, Router, Lookup, Aggregator, Expression, and Sequence Generator, and identify system or business gaps
- Perform ETL testing and validate source and target data against expected test data values
- Responsible for executing mainframe jobs to update legacy files
- Responsible for designing the automation framework, identifying test cases to be automated, coordinating tasks with script developers, and maintaining scripts.
- Allocate work to the offshore team, coordinate tasks with developers, and review deliverables and quality processes.
- Meet with SMEs (subject matter experts) to resolve business queries or system gaps
- Analyze automation, functional, end-to-end, and regression test results and assist the Test Manager with release acceptance.
Confidential, OH
Developer
Responsibilities:
- Responsible for creating the functional requirements document (FRD).
- Responsible for identifying process/functional gaps in the legacy system.
- Responsible for creating the high-level design document.
- Played a key role in reusable component design and coding
- Wrote COBOL and JCL programs and performed unit testing
- Created datasets, copybooks, and procedures
- Performed system and performance testing for the developed components
- Designed and executed regression test scripts
- Responsible for migrating the developed components to the QA environment.
Confidential, PA
Developer
Responsibilities:
- Responsible for analyzing requirements and resolving requirement issues
- Responsible for creating the impact analysis document for impacted components
- Responsible for creating and reviewing the low-level design document.
- Played a key role in reusable component design and coding
- Wrote COBOL and JCL programs and performed unit testing
- Created datasets, copybooks, procedures, database fields, and screens
- Performed system and performance testing for the developed components
- Designed and executed regression test scripts
- Responsible for migrating the developed components to the QA environment.