
POS Migration/ETL QA Analyst Resume


Phoenix, AZ

PROFESSIONAL SUMMARY

  • Over 9 years of experience in Software Quality Assurance.
  • Experience testing Data Warehouse (ETL & BO), Database, Web, and Client-Server applications in the Finance, Insurance & Retail domains.
  • Handled multiple roles: Data Warehouse QA Analyst, ETL QA Analyst and QA Tester.
  • Expertise in analyzing Business Requirements, Functional Requirements and Design Specifications, and in developing Test Plans, Test Scenarios & Test Cases for every release across all phases of the SDLC and STLC.
  • Experience with software development methodologies such as the Waterfall and Agile development models.
  • Experience working in Agile-Scrum methodology, with a solid understanding of the QA process in an Agile environment.
  • Experience in System Integration, Functional, Regression, UAT and Migration Testing of various Web and ETL applications.
  • Created Traceability Matrices to ensure that all requirements were covered.
  • Experience testing Data Warehouse/ETL applications built in DataStage, Informatica and SSIS using SQL Server, Oracle, DB2 and Linux.
  • Expertise in testing OBIEE and BO reports for various business needs, including Dashboards, Drill-Down and Master-Detail reports.
  • Experience validating end-to-end business intelligence solutions by testing metadata and the OBIEE Analytics Repository (.rpd), which consists of three layers (Physical Layer, Business Model & Mapping Layer, and Presentation Layer).
  • Extensively used automated test tools such as QuickTest Professional (QTP), WinRunner, LoadRunner and HP ALM.
  • Expertise in analyzing application structure and functionality for White Box and Black Box testing; used defect trackers and bug reports in HP Quality Center, HP ALM & Rational ClearQuest.
  • Familiar with SQL concepts spanning DML, DDL and DCL.
  • Worked on a wide range of platforms, including flavors of UNIX, Linux, AIX, Solaris and DOS.
  • Actively participated in decision-making and QA meetings, and regularly interacted with Business Analysts and the development team to gain a better understanding of the business processes and requirements.
  • Good communication and interpersonal skills; self-motivated, quick learner & team player.

TECHNICAL SKILLS

Operating Systems: Windows XP/NT, Linux, HP-UX, AIX and Solaris.

Languages: SQL, PL/SQL, C, C++, VBScript.

Databases: Oracle 8i/9i, MS SQL Server 2008, Sybase, DB2, Teradata.

Web/Internet: HTML, DHTML, XML, WSDL, JSON, FIX Protocol 4.4.

Scripting Languages: JavaScript, VBScript.

Testing Tools: WinRunner 7.2, QTP 8.2, Quality Center 9.2, ALM, JIRA, Remedy and Rational ClearQuest.

ETL: DataStage 7.5x/8.1, Informatica 9.1, IMS and File-AID.

Reports: Crystal Reports, SAP Business Objects, SAP BEx, Cognos & OBIEE 11x.

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

POS Migration/ETL QA Analyst

Responsibilities:

  • Analyzed the data model and mapping requirements for the Point of Sale (POS) tables.
  • Participated in Agile Kanban projects for the development and delivery of ETL and back-end processes supporting Finance, Sales & Marketing business reports.
  • Used VersionOne for managing and tracking stories.
  • During the production support cycle, worked on stories and incidents to clear production issues and troubleshoot DataStage jobs.
  • Analyzed POS requirements; designed Test Plans, Test Scenarios and Test Cases.
  • Worked closely with Business Analysts & Data Architects to understand the source systems and data assets.
  • Involved in daily stand-up meetings and weekly sprint meetings to discuss workflow and project specifications.
  • Supported the business requirement of using a reporting tool such as OBIEE to access data from the integrated data warehouse.
  • Verified the data in the source Teradata server (PHTDPRD) using Teradata SQL Assistant.
  • Validated scenarios such as structure, record counts, nulls and referential integrity in the staging & ODS layers (see the SQL sketch after this list).
  • Created data load jobs from flat files to Oracle Exadata for the EDW layer using SQL*Loader.
  • Verified the data across the POS environments: Dev, Test, UAT & Production.
  • Validated the Teradata source using Teradata SQL Assistant for the initial & incremental loads into the staging and ODS tables.
  • Involved in initial and incremental data load validation from the Teradata source to the Oracle target.
  • Executed and validated high-volume test data in different environments.
  • Executed and validated the initial-load test data using DataStage Designer and Director.
  • Parameterized test data to run DataStage jobs in Director against different data periods and environments.
  • Played a key role in developing Teradata SQL queries while preparing test case scripts.
  • Reviewed DataStage logs in Director for job execution status (Pass, Warning, Rejected) and performance.
  • Involved in System Integration Testing and data validation testing of OBIEE reports & DataStage mappings.
  • Created detailed Test Scenarios and Test Cases to test various aspects of the RPD for OBIEE reports.
  • Worked on the repository's Physical Layer, Business Model and Mapping Layer, and Presentation Layer using the OBIEE Administration Tool.
  • Validated reports at different levels, along with Filters and Prompts, using Oracle BI Answers.
  • Validated OBIEE Dashboards and reports in BI Answers.
  • Verified data in OBIEE reports, including drill-down and navigation to other reports.
  • Verified data sorting, graphs and dashboards in reports exported to file, Excel and PDF formats.
  • Created, reported and tracked defects.
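
Representative SQL for the staging/ODS validation scenarios noted above (counts, nulls, referential integrity); this is a minimal sketch, and the table and column names (STG_POS_SALES, ODS_POS_SALES, DIM_STORE, TRANSACTION_ID, STORE_ID) are hypothetical placeholders rather than the project's actual objects.

    -- Record count reconciliation between the staging and ODS layers
    SELECT 'STAGING' AS layer, COUNT(*) AS row_count FROM STG_POS_SALES
    UNION ALL
    SELECT 'ODS' AS layer, COUNT(*) AS row_count FROM ODS_POS_SALES;

    -- Null check on a mandatory business key
    SELECT COUNT(*) AS null_keys
    FROM ODS_POS_SALES
    WHERE TRANSACTION_ID IS NULL;

    -- Referential integrity: ODS rows whose store is missing from the dimension
    SELECT s.STORE_ID
    FROM ODS_POS_SALES s
    LEFT JOIN DIM_STORE d ON d.STORE_ID = s.STORE_ID
    WHERE d.STORE_ID IS NULL;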

Confidential, Des Moines, IA

Seattle Merger/ETL QA Analyst

Responsibilities:

  • Analyzed the data mapping requirements documents for Stocks, Collateral and Advances.
  • Involved in daily stand-up meetings and weekly sprint meetings to discuss workflow and project specifications.
  • Created detailed Test Scenarios and Test Cases for B Stock, FAS150 Stock, Collateral and Advances.
  • Verified that excess FAS150 stock eligible for redemption was redeemed prior to conversion.
  • Validated the data coming from the Seattle source in XML and CSV formats in the Bank Mate DB.
  • Verified that the SICs transform the CSV into an IMP file fitting the format specified by AdTek.
  • Verified the data in POE systems such as Calypso, Bank Mate, Sales Logic & Advantage during SIT testing.
  • Validated the Stocks, Advances & Collateral conversion logic in the target Bank Mate system and in the downstream data warehouse system.
  • Verified the accounting scenarios in the Bank Mate extract and in reports.
  • Validated RAI, Eadvantage and the data warehouse end to end.
  • Ran the EOM period-flip job, confirmed it completed successfully and verified that the SEA data was populated with the new calendar period by verifying security counts (see the SQL sketch after this list).
  • Verified that dividends were declared, calculated and paid by SEA prior to the merger.
  • Created, reported and tracked defects in JIRA.
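
A sketch of the post-period-flip security count check mentioned above; SEA_SECURITY and CALENDAR_PERIOD are assumed, hypothetical names rather than the actual SEA schema.

    -- Security counts per calendar period after the EOM period-flip job
    SELECT CALENDAR_PERIOD, COUNT(*) AS security_count
    FROM SEA_SECURITY
    GROUP BY CALENDAR_PERIOD
    ORDER BY CALENDAR_PERIOD;
    -- The new period should appear, with counts consistent with the prior period.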

Confidential, Miami, FL

Sr QA Tester

Responsibilities:

  • Created business risk levels to determine test priorities for the application.
  • Analyzed product requirements; designed Test Plans, Test Scenarios and Test Cases.
  • Led the User Acceptance Testing team with the end users and reported issues to the developers.
  • Worked with the Business Analysts to determine business requirements and set standards for test design documents.
  • Analyzed & created test data for bank mortgages audited by the FHFA and FMCC regulators.
  • Interacted with the onsite and offshore Cognizant development teams to understand the requirements.
  • Validated data imported from Oracle, then transformed and loaded into the data warehouse targets using Informatica.
  • Validated data loaded through Informatica (the ETL tool) using SQL queries (see the sketch after this list).
  • Configured the test data needed for database validation and other functional testing.
  • Validated mappings built from varied transformation logic such as Lookup, Aggregator, Joiner & Filter.
  • Retested the application after the defects were fixed.
  • Attended meetings with the offshore Cognizant testing and production teams.
  • Validated the QA tool's Administrator, QA Supervisor, Oper User and QA Auditor modules.
  • Validated data using SQL queries in Oracle for the Intellimod application.
  • Worked with the development team to ensure testing issues were resolved.
  • Validated the Loan Modification web application across the different testing phases.
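
A source-to-target reconciliation of the kind used to validate the Informatica loads above; a hedged sketch in which SRC_LOAN, DW_LOAN_FACT and the selected columns are hypothetical placeholders, not the project's actual mapping.

    -- Rows present in the Oracle source but missing or altered in the warehouse target
    SELECT LOAN_ID, LOAN_AMOUNT, MODIFICATION_STATUS
    FROM SRC_LOAN
    MINUS
    SELECT LOAN_ID, LOAN_AMOUNT, MODIFICATION_STATUS
    FROM DW_LOAN_FACT;
    -- An empty result means every straight-move source row reached the target unchanged;
    -- the reverse MINUS is run as well to catch unexpected rows in the target.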

Confidential, Miami, FL

ETL QA Analyst

Responsibilities:

  • Manually tested the Dealer Load Merge process related to Enrollment, Claims, Payments, Billing and VSC Enrollment.
  • Validated the data from the source into CDC (change data capture).
  • Involved in testing data loads into the data mart.
  • Analyzed the source-to-target mapping document and testing documents.
  • Prepared Test Scenarios and Test Cases based on change requests and executed the test cases for GPMR-SFS.
  • Documented artifacts such as ETL process flow diagrams, mapping specs, source-to-target matrices and test documentation.
  • Extensively used DataStage tools such as InfoSphere DataStage Designer and DataStage Director to validate jobs and to view log files for execution errors.
  • Tested dimensional hierarchies for drill-down functionality and implemented business logic for facts.
  • Verified error messages that did not convey the correct business message; error messages should be configurable through Translation.
  • Tested business scenarios for data warehouse systems using DataStage (ETL).
  • Verified data in Crystal Reports, including drill-down and jumping to other reports.
  • Verified data sorting, graphs and dashboards in reports exported to file, Excel and PDF formats.
  • Performed backend testing using SQL queries to confirm that the record count in the header table (ELP DEALERFILE PROCESSED) matches the actual records in the detail table (ELP DEALER RECON WRK); see the sketch after this list.
  • Verified the Contract, Dealer, Company and Company Group bulk loading using SQL queries.
  • Wrote several complex SQL queries to validate the data transformation rules for ETL testing.
  • Validated the Trans all notification email and bad file generated against the SQL log, data and FTP file.
  • Maintained HP Quality Center for bug tracking and defect reporting.
  • Validated the backend flag validation tables such as elp contract, elp dealer and elp product.
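
A sketch of the header/detail count reconciliation described above; the underscored table names (ELP_DEALER_FILE_PROCESSED, ELP_DEALER_RECON_WRK) and the FILE_ID / RECORD_COUNT columns are assumptions about the actual schema.

    -- Flag any processed file whose header record count disagrees with its detail rows
    SELECT h.FILE_ID,
           h.RECORD_COUNT   AS header_count,
           COUNT(d.FILE_ID) AS detail_count
    FROM ELP_DEALER_FILE_PROCESSED h
    LEFT JOIN ELP_DEALER_RECON_WRK d ON d.FILE_ID = h.FILE_ID
    GROUP BY h.FILE_ID, h.RECORD_COUNT
    HAVING h.RECORD_COUNT <> COUNT(d.FILE_ID);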

Confidential, Houston, TX

ETL QA Analyst

Responsibilities:

  • Worked on FASTER, the system that holds core financial data (P&L, Cash Flow, Balance Sheet, Volume and Stock Volume); numerous legacy applications had to access FASTER for their reporting purposes.
  • Prepared Test Scenarios and Test Cases based on change requests and executed the test cases for GPMR-SFS.
  • Validated the retail and non-retail data coming from GSAP.
  • Validated the data in the GPMR and FIM source tables and in the target tables using business rules (see the sketch after this list).
  • Used DataStage Designer to test the processes for extracting, cleansing, transforming, integrating and loading data into the data warehouse database.
  • Extensively used DataStage tools such as InfoSphere DataStage Designer and DataStage Director to validate jobs and to view log files for execution errors.
  • Scheduled job runs using DataStage Director for debugging and testing.
  • Validated the data in Business Objects using drill-down and drill-up approaches.
  • Validated the InfoCube and query data from the reporting system.
  • Validated the reports generated in Business Objects using SQL queries.
  • Worked closely with the developers to isolate, track and troubleshoot defects.
  • Executed queries in Oracle and SQL Server for the FASTER Facade.
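
A business-rule reconciliation of the kind used for the GPMR/FIM-to-target checks above; a minimal sketch with hypothetical names (GPMR_SALES, EDW_SALES_FACT, PERIOD_ID, NET_AMOUNT) standing in for the actual source and warehouse objects.

    -- Period-level totals on both sides should agree once the documented
    -- transformation rules have been applied
    SELECT 'SOURCE' AS side, PERIOD_ID, SUM(NET_AMOUNT) AS total_amount
    FROM GPMR_SALES
    GROUP BY PERIOD_ID
    UNION ALL
    SELECT 'TARGET' AS side, PERIOD_ID, SUM(NET_AMOUNT) AS total_amount
    FROM EDW_SALES_FACT
    GROUP BY PERIOD_ID
    ORDER BY PERIOD_ID, side;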

Confidential, Littleton, MA

Sr ETL QA Analyst

Responsibilities:

  • Performed ETL application testing for the deliverables of CIBC bank's Cash Management Online (CMO) platform.
  • Analyzed the requirements behind business scenarios, understood the scope of the project and divided the work appropriately among the team.
  • Extracted data as per vendor requirements and performed code, job and peer reviews before delivery.
  • Created/updated test cases based on the technical specifications of the jobs tested.
  • Tested modified DataStage jobs on demand from the business.
  • Tested the requirements against different browsers (IE11, Firefox and Chrome).
  • Validated the data in the source and target tables using the business transformation rules.
  • Executed UNIX scripts to transfer/load data from flat files into the database and compared the results by running SQL queries (see the sketch after this list).
  • Followed the approved processes for resolving, reporting and escalating load/extraction issues.
  • Validated cube and query data from the reporting system back to the source system.
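
A sketch of the flat-file-to-database load comparison referenced above; CMO_STG_PAYMENT (the table populated from the flat file) and CMO_PAYMENT are hypothetical names, not CMO's actual tables.

    -- Row counts and amount totals should match once the UNIX load script completes
    SELECT 'STAGE'  AS side, COUNT(*) AS row_count, SUM(PAYMENT_AMOUNT) AS total_amount
    FROM CMO_STG_PAYMENT
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_count, SUM(PAYMENT_AMOUNT) AS total_amount
    FROM CMO_PAYMENT;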

Confidential, Littleton, MA

Senior ETL QA Analyst

Responsibilities:

  • Analyzed the Business Requirements document for LHS & RHS.
  • Involved in high-level design, taking inputs from the Functional Specification document.
  • Resolved data transformation issues using knowledge of PX stages for multiple databases; worked with RDBMS systems such as DB2 and Oracle.
  • Created business risk levels to determine test priorities for the regression portion of the checkout test cases in DE, based on business and functional requirements.
  • Involved in thin-slice testing and created high-level Test Scenarios and Test Cases.
  • Prepared Test Scenarios and Test Cases for IMAP & Manual Adjustments.
  • Validated the different source data coming from DB2, IMS, ADAM and CIRVIE.
  • Validated annuities-related data flowing from Manual Adjustments to RDS and ACVS.
  • Verified data cleansing in the key processes and performed database-level backend testing in Oracle (see the sketch after this list).
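
A sketch of the data-cleansing checks used in this kind of backend testing; RDS_ANNUITY, POLICY_NUMBER and the assumed 10-character key format are hypothetical, not the project's actual schema or rule.

    -- Duplicate business keys that cleansing should have removed
    SELECT POLICY_NUMBER, COUNT(*) AS occurrences
    FROM RDS_ANNUITY
    GROUP BY POLICY_NUMBER
    HAVING COUNT(*) > 1;

    -- Keys that are blank or do not match the assumed fixed-length format
    SELECT POLICY_NUMBER
    FROM RDS_ANNUITY
    WHERE TRIM(POLICY_NUMBER) IS NULL
       OR LENGTH(TRIM(POLICY_NUMBER)) <> 10;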

Confidential, Littleton, MA

DataStage ER - ETL QA Analyst

Responsibilities:

  • Created jobs in DataStage to process data from heterogeneous sources such as SQL Server, Oracle and flat files.
  • Expertise in DataStage issue resolution, debugging and application performance tuning.
  • Performed patch verification testing: set up the environment, installed the patch and executed the regression test bed.
  • Extensive experience using DataStage 7.5/8.0/8.0.1/8.1 (Manager, Designer, Director and Administrator) and DataStage PX (Parallel Extender).
  • Experience with L3 DataStage product support and CRM handling activities.
  • Involved in extracting data from staging to relational targets and provided analytical validation of the data.
  • Changed node settings in the configuration file for processing and storage resources within DataStage Manager.
  • Used multiple stages such as Sequential File, FTP Enterprise, Sort, Merge, Join, Filter, Transformer and Aggregator during ETL development.
  • Performed client-side and server-side testing for all DataStage stages.
  • Proficient in Rational ClearQuest and Quality Center for defect tracking.
  • Worked with different IBM teams in the UK (Milton Keynes) and Japan (Yamato SL).

Confidential, Columbus, OH

QA Analyst

Responsibilities:

  • Analyzed the mutual fund trades in the SunGard Investor One system.
  • Analyzed accounts receivable for credit, cash and customer management in a timely manner with accurate, up-to-date information.
  • Validated the Crossroads database for the migration from Oracle to MS SQL Server (see the sketch after this list).
  • Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries.
  • Tested reports generated from the Crossroads and PIT databases made available to Custody Operations for moving money with the brokers/banks.
  • Validated the trade data in the BSMS BISYS Cash Management system and Crossroads using different queries.
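
A sketch of the Oracle-to-MS SQL Server migration check for Crossroads: the same profiling query is run on both databases and the outputs compared. CR_TRADE, TRADE_AMOUNT and TRADE_DATE are hypothetical names, not the actual Crossroads objects.

    -- Run once against the Oracle source and once against the migrated SQL Server copy;
    -- the two result rows must match column for column.
    SELECT COUNT(*)          AS row_count,
           SUM(TRADE_AMOUNT) AS total_amount,
           MIN(TRADE_DATE)   AS first_trade,
           MAX(TRADE_DATE)   AS last_trade
    FROM CR_TRADE;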

Confidential, Franklin, MA

Quality Assurance Analyst

Responsibilities:

  • Analyzed the derivative trades (CDS, IRS, TRS) from the Kondor and XFI system feeds.
  • Tested the new trade tickets created as part of this initiative, which are sent from TR to the official record-keepers in the new environment.
  • Tested reports generated from FIX Protocol to DE and made available to Custody Operations for moving money with the brokers/banks.
  • Analyzed derivatives in the Trade Scrub region, including splitting using MTP and TID.
  • Used the Trade Control System to control trades coming from XFI, TOP and Kondor into the GPS, UCM and accounting systems.
  • Validated the Int Prem bought/sold, principal and net amounts for the traded currency in the TR region.
  • Verified the CDS, IRS and TRS leg info and contract info in the Derivatives Engine.

Confidential, Seattle, WA

Quality Assurance Tester

Responsibilities:

  • Involved in front-end functionality testing of the Checkout Pipeline process for DE and JP.
  • Tested the DE and JP web applications in German and Japanese.
  • Created new checkout functionality tests for DE and JP based on business requirements.
  • Logged and updated all issues with the developers in the Remedy defect tool.
