DWH/ETL Tester Resume
Warren, NJ
SUMMARY:
- 6+ years of solid experience in Quality Assurance and Data Analysis of various business applications in client/server environments, web-based applications, Data Warehousing solutions, ETL and Business Intelligence solutions. Excellent knowledge of Healthcare Insurance, Banking and Financial (Mortgage Loan Origination Systems), Wireless (Telecom) and Airlines domain applications.
- Experience in implementing various QA methodologies, including preparing Test Plans and writing and executing Test Cases.
- Exposure to Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification.
- Experience in Functional testing, System Testing, Regression Testing, Integration testing, GUI Testing, Security testing, Smoke testing, System Integration Testing (SIT), Software Validation testing, Configuration testing, Interface testing, Stress and Performance Testing.
- Developed and maintained Test Matrix and Traceability Matrix.
- Involved in testing and analysis of Guidewire BillingCenter, Guidewire PolicyCenter and Guidewire ClaimCenter.
- Very strong in handling the defect tracking and reporting process using ALM, Rational ClearQuest, Team Foundation Server (TFS), JIRA, Rally and Bugzilla.
- Worked on database query tools such as TOAD, SQL Navigator, Rapid SQL and SQL*Plus.
- Extensive experience in writing complex SQL and PL/SQL scripts.
- Worked with different schema models such as Star and Snowflake, with strong working knowledge of dimensional modeling.
- Experience in banking domain involving residential Mortgage and Loan Origination Systems.
- Solid knowledge of test management using HP Quality Center; maintained versioned objects using Rational ClearCase.
- Experienced with ETL tools such as Informatica, SSIS, DataStage and Ab Initio.
- Used ETL methodology to support data extraction, transformation and loading processes in a corporate-wide ETL solution using Informatica.
- Used Informatica Workflow Manager to run the workflows and monitored the status and logs in Workflow Monitor.
- Validated data consistency, data completeness and de-duplication logic in data migration and data conversion projects.
- Experience working on requirement upgrades and change requests during UAT.
- Skilled in the different modules within the Healthcare Claims Adjudication process (membership, billing, enrollment and claims processing).
- Experienced in testing Crystal Reports, SSRS, Cognos, Business Objects and OBIEE reports.
- Experience in data analysis, data validation, data verification, data profiling and identifying data mismatches, with good experience working with varied data sources.
- Strong in UNIX shell scripting, writing UNIX wrapper scripts and monitoring the UNIX logs to check for any errors.
- Extensively worked on the batch scheduling tools Autosys and Control-M to run and monitor ETL jobs.
- Knowledge of the phases of Agile methodology.
- Expert in writing complex SQL queries for backend testing; created ad hoc queries for user-specific requirements and for troubleshooting day-to-day production issues to identify root causes (a representative validation query is sketched after this summary).
- Experienced in analyzing Data Flow Diagrams (DFD), data models and entity-relationship diagrams.
- Performed well under pressure, testing multiple applications concurrently, coordinating QA activities with various teams and supporting project management teams in release activities.
- Coordinated Release management for Application Maintenance Releases and Production Defects fixes.
- Ensured SDLC standards and application quality goals were met through the Quality Assurance testing strategy.
- Excellent communication, coordination and interpersonal skills.
- Experienced in leading software testing projects and expert in coordinating the testing effort between onshore and offshore teams.
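The backend validation queries referenced above typically follow a few recurring patterns. A minimal sketch in SQL, assuming hypothetical staging and warehouse tables STG_CUSTOMER and DW_CUSTOMER (Oracle syntax):

    -- Row-count reconciliation between staging and target (hypothetical tables)
    SELECT (SELECT COUNT(*) FROM stg_customer) AS src_count,
           (SELECT COUNT(*) FROM dw_customer)  AS tgt_count
    FROM dual;

    -- Completeness: business keys present in staging but missing from the target
    SELECT customer_id FROM stg_customer
    MINUS
    SELECT customer_id FROM dw_customer;

    -- De-duplication: business keys loaded more than once into the target
    SELECT customer_id, COUNT(*) AS dup_count
    FROM   dw_customer
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;

In practice the keys and rules for each check come from the mapping document for the table under test.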
TECHNICAL SKILLS:
Methodologies: Agile Modeling, SDLC model, Waterfall Model
Languages: SQL, PL/SQL, T-SQL, XML
Databases: Oracle 11g, MS SQL Server 2012/2008/2005, DB2, Teradata
Testing Tools: HP Quality Center / ALM, QTP, TFS, JIRA
Operating Systems: Windows 95/98/2000/XP, UNIX, Linux
Defect Management Tools: Rational ClearQuest and HP Quality Center / ALM
ETL Tools: Informatica, DataStage, SSIS
Reporting Tools: OBIEE, MicroStrategy, Cognos, SSRS
PROFESSIONAL EXPERIENCE:
Confidential, Warren, NJ
DWH/ETL Tester
Responsibilities:
- Analyzed Business Requirements and the Design Specification Document to determine the functionality of the ETL processes.
- Prepared Test Plan from the Business Requirements and Functional Specification.
- Developed Test Cases for ETL Data Validation and Report testing.
- Developed validation steps and SQL using business requirements and rules.
- Identified Test Cases for Automation for repeatable test steps.
- Analyzed the log files of ETL jobs in DataStage Director and reported issues to the developers.
- Interacted with DBA for setting up test environment in Oracle database.
- Executed complex SQL queries using Advanced Query Tool to validate ETL jobs.
- Wrote complex SQL queries against the Oracle database to test DataStage ETL code.
- Involved in Regression, Functional, Integration and User Acceptance Testing.
- Used Ab Initio components like Reformat, Rollup, Join, Sort, Partition by Key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
- Performed database validation using SQL according to the business rules and specifications.
- Created test data to test the functionality of the DataStage jobs.
- Wrote complex SQL queries against the DB2 database.
- Developed regression test scripts for the application; involved in metrics gathering, analysis and reporting to the concerned teams; and tested the testing programs.
- Entered test scripts and logged defects in HP Quality Center.
- Used the pmcmd command to start, stop and ping the Informatica server from UNIX and created UNIX shell scripts to automate the process.
- Configured the Informatica MDM platform, including IDD.
- Used HP ALM for Test management/Defect Tracking.
- Gathered the data required for analysis from several sources and compiled it into a common record format to send through Trillium.
- Standardized rules in Trillium for profiling, matching and cleansing logic.
- Configured messages to be sent to MDM (Master Data Management) in XML format.
- Successfully set up the data and configured the components needed by Hierarchy Manager for the MDM Hub implementation, including hierarchies, relationship types, packages and profiles, using the Hierarchies tool in the Model workbench.
- Experienced with UNIX commands such as grep, ps, ls and chmod.
- Worked with Users to develop Test cases for user acceptance testing.
- Validated the data through the various stages of data movement from staging to the data warehouse tables (see the SQL sketch after this list).
- Validated business reports such as drill-down and cross-tab reports developed in Cognos.
- Maintained the test logs, test reports, test issues and defect tracking using ClearQuest.
- Involved in preparation of the Defect Report using ClearQuest.
- Performed integration and performance testing on DataStage ETL jobs.
- Used Database links in SQL for Querying the Oracle database.
- Attended daily project status meetings and weekly QA status meetings.
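An illustrative staging-to-warehouse comparison of the kind described above, assuming hypothetical tables STG_POLICY and DW_POLICY and a simple documented decode rule (Oracle syntax):

    -- Rows where the target value does not match the documented transformation rule
    SELECT s.policy_id, s.status_cd, t.status_desc
    FROM   stg_policy s
           JOIN dw_policy t ON t.policy_id = s.policy_id
    WHERE  t.status_desc <> CASE s.status_cd
                              WHEN 'A' THEN 'Active'
                              WHEN 'C' THEN 'Cancelled'
                              ELSE 'Unknown'
                            END;

Any row returned is a candidate defect, logged in Quality Center with the source and target values attached.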
Environment: Oracle 10g, DataStage 7.0, HP Quality Center 10, Ab Initio, Informatica, WebSphere, AQT, HP ALM, DB2, UNIX Shell Scripting, ClearQuest, SQL, PL/SQL, Cognos 8.0, Informatica Data Director (IDD), Business Intelligence (BI), Data Modeling, UNIX, Windows XP.
Confidential, Plano, TX
ETL Tester (Ab Initio)
Responsibilities:
- Interacted with clients to create the Business Assurance Strategy (BAS) document as per user requirements.
- Participated in analysis of Business and functional requirements and developed Traceability Matrix of Business Requirements mapped to Test Scripts based on the risks involved in the ETL process.
- Ran complex SQL queries to verify the number of records from source to target and validated referential integrity, time variance, missing records and null/default/trim-space rules as per the design specifications (see the SQL sketch after this list).
- Tested several Ab Initio graphs, wrapper scripts and psets based on the transformation rules that were applied.
- Involved in testing the metadata to test the integrity of the Business Objects Universe (SAS).
- Involved in Testing Business Objects reports.
- Involved in testing complicated graphs with various Ab Initio components such as Join, Join with DB, Validate, Generate Records, Rollup, Partition by Key, Filter by Expression, Gather, Reformat, Merge, Dedup Sorted and Scan to validate business requirements.
- Prepared test data for various upstream source systems by raising a TDM request using HP Service Manager.
- Elicited, analyzed, documented, and communicated requirements with Business Analysts in Agile environment for conversion of legacy systems to EDW (Enterprise Data Warehouse) standards.
- Wrote and executed data-centric test cases to validate data.
- Used Agile strategies to provide quick and feasible solutions to the organization.
- Configured Informatica Data Director (IDD) applications.
- Automated the functional testing framework for all modules using Selenium.
- Provided Informatica MDM Hub development support.
- Used the data modeling tool Erwin 7.3.3 to create models, ER diagrams, DDL (Data Definition Language) scripts, and logical and physical data models.
- Analysis of application requirements and entering of requirements into HP ALM.
- Responsible for validating Data Definition Language (DDL) scripts and improving data quality, and for presenting conclusions drawn from data analysis using tools such as Microsoft Excel, Teradata SQL Assistant, DBVisualizer and others.
- Used HP/Mercury Quality Center (QC) to create and execute test cases/scripts, maintain and track requirements, and generate standard and customized reports.
- Executed regression test scripts along with tests of new enhancements using QTP and analyzed the results.
- Moved data from one database to another, using SQL*Loader for data migration.
- Verified correctness of data after the transformation rules were applied on source data.
- Used various air commands to check the error, log and recovery files after project setup in the sandbox path.
- Scheduled and executed ETL jobs using DataStage Director.
- Worked with IBM WebSphere Application Server and WebSphere Process Server.
- Used FTP to move multifiles and serial files from UNIX to the local host for data mocking based on the requirement.
- Performed the land process to load data into the landing tables of the MDM Hub using external batch processing for the initial data load into the Hub Store.
- Involved in UNIX shell scripting and configuring cron jobs.
- Mocked data for multifiles (positive and negative scenarios) by converting them to serial files and back to multifiles.
- Ran the DTS packages to load the data into Staging and Target tables and validated the data based on transformation rules that were applied.
- Coordinated execution of User Acceptance Testing, regression, System Testing (positive and negative scenarios) and integration testing with multiple departments.
- Worked with the development team to ensure testing issues were resolved based on defect reports.
- Used Quality Center as the test management tool for storing automated test scripts, from where scripts could be executed directly by manual testers.
- Wrote complex SQL to validate target data and Staging data based on the business requirements.
- Used SQL and PL/SQL scripts to perform backend database testing.
- Involved in coordinating white-box and black-box testing for the data warehouse by checking ETL procedures/mappings.
- Wrote complex SQL queries to validate that actual test results matched expected results.
- Checked naming standards, data integrity and referential integrity.
- Responsible for monitoring data for porting to current versions.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Modified Stored Procedures and Functions to generate reports based on the downstream requirements (Naming Standards, data types).
- Conducted black-box testing (functional, regression and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
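A simplified sketch of the source-to-target checks listed above, assuming hypothetical tables SRC_ACCOUNT, TGT_ACCOUNT and DIM_CUSTOMER and an assumed default date for missing values:

    -- Source-to-target record count comparison
    SELECT 'SRC' AS side, COUNT(*) AS cnt FROM src_account
    UNION ALL
    SELECT 'TGT' AS side, COUNT(*) AS cnt FROM tgt_account;

    -- Null / default / trim-space rule violations in the target
    SELECT account_id, account_name
    FROM   tgt_account
    WHERE  account_name IS NULL
       OR  account_name <> TRIM(account_name)
       OR  open_dt = DATE '1900-01-01';   -- assumed default used for missing dates

    -- Referential integrity: target rows with no matching dimension record
    SELECT t.account_id
    FROM   tgt_account t
    LEFT JOIN dim_customer d ON d.customer_key = t.customer_key
    WHERE  d.customer_key IS NULL;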
Environment: HP Quality Center 10, QTP 9.0, Oracle BI Applications 7.9.5.2, Ab Initio, Teradata, SQL Server Enterprise Manager, SQL Server Query Analyzer, DataStage 7.1, SQL Server Management Studio, PL/SQL, Oracle Database 10g, IBM WebSphere Application Server, Informatica Data Director (IDD), Data Modeling, Oracle EBS 11, SQL Server, HP ALM, UNIX Shell Scripting, Cognos 8, Windows 2003/2007, Business Intelligence (BI), UNIX, SAS
Confidential, Rensselaer, NY
QA Analyst | Data Analyst
Responsibilities:
- Involved in requirements analysis, identification, and documentation of required system and functional testing efforts for all test scenarios (Positive and Negative).
- Involved in cross-functional, end-to-end testing of the 834 transaction across different releases; logged related defects and resolved them.
- Wrote complex queries against the Oracle and DB2 databases for data validation and tested the Java web interface application against the legacy mainframe programs.
- Reviewed use cases to test critical areas of the application.
- Created test cases for Smoke testing, System testing and Regression testing.
- Coordinated the Testing efforts during Integration, System and User Acceptance Testing phases.
- Verified the application outputs like reports on the different browsers like Safari, IE, Chrome and Firefox.
- Maintained and managed the various versions of documents generated during the project using Rational ClearCase.
- Performed usability testing on the final application to verify that all user requirements were met by the application.
- Tested all HIPAA transactions for multi-version support (4010 and 5010) and validated database-to-file elements.
- Involved in testing HIPAA EDI Transactions and mainly focused on PA and Eligibility Transactions.
- Traced Requirements using Requirement Traceability Matrix.
- Tested Business Requirement/ETL Mapping Rules/Reports against various GW functionalities/modules and data flows from UI to OLTP DB to Replication DB to Staging to Data Mart/EDW to Reports.
- Performed integration testing between the systems (Policy, Billing & Claims) and created test data/scenarios for ETL, BI and UAT testing, manually as well as by employing UFT.
- Performed backend testing using SQL queries to compare and retrieve data via database checkpoints (see the SQL sketch after this list).
- Prepared Requirement Traceability Matrix (RTM), Test Status reports and Test Sign off documents.
- Involved with project backlogs and sprint backlogs.
- Extensively involved in System testing / Functional, Regression testing and Integration testing.
- Wrote complex SQL queries and PL/SQL subprograms to support test case results and SQL performance tuning.
- Performed negative testing using SSIS to determine how the packages perform when they encounter invalid and unexpected values.
- Worked on verification of ETL logs and reject/discard files for errors and reported findings to the development team.
- Attended review sessions with the Data Analyst for structural changes (due to incomplete business requirements or granularity issues) and upgrades to the data mart/table solutions.
- Tracked defects and generated defect reports in ALM/Quality Center based on severity and priority.
- Performed Negative testing using Informatica to find how the workflow performs when it encounters invalid and unexpected values.
- Involved in validating SSIS and SSRS packages according to functional requirements.
- Worked on Informatica Workflow Manager to run the workflows and monitored the status and logs in Workflow Monitor.
- Created shell scripts and used UNIX commands to manipulate input files for test execution.
- Created test data for all ETL mapping rules to test the functionality according to requirements.
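A representative database checkpoint of the kind referenced above, assuming hypothetical tables STG_CLAIM (staging detail) and FACT_CLAIM (data mart), in Oracle syntax:

    -- Monthly claim amounts in the mart should reconcile to the staging detail
    SELECT COALESCE(s.claim_month, f.claim_month) AS claim_month,
           s.stg_total,
           f.fact_total
    FROM  (SELECT TRUNC(claim_dt, 'MM') AS claim_month, SUM(claim_amt) AS stg_total
           FROM   stg_claim
           GROUP  BY TRUNC(claim_dt, 'MM')) s
    FULL OUTER JOIN
          (SELECT month_start_dt AS claim_month, SUM(claim_amt) AS fact_total
           FROM   fact_claim
           GROUP  BY month_start_dt) f
      ON  f.claim_month = s.claim_month
    WHERE f.fact_total IS NULL
       OR s.stg_total  IS NULL
       OR f.fact_total <> s.stg_total;

Any rows returned indicate months where the mart and staging disagree, which are then investigated against the ETL mapping rules.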
Environment: Informatica 9.1, TOAD, SQL, PL/SQL, UFT, DB2, Oracle 11g, XML, Putty, HP ALM/Quality Center 10, TFS (Team Foundation Server), Cognos, MS Visual Studio, EDI, Autosys, SQL Server 2008 R2, SSIS, SSRS, SSAS, Windows XP, UNIX, DOORS.
Confidential, Bloomfield, CT
ETL Tester | Web QA Analyst
Responsibilities:
- Involved in Design and development of test plans based on high-level documents (BRD & FRS).
- Involved in writing and implementing the Test plan, Test cases, Test Procedures, Test sets using Use cases and requirement specifications.
- Analyzed customer and business needs to determine business and functional requirements.
- Gathered and documented business requirements for system gaps that require development in credit card project.
- Configured Test Environment Management (TEM) for specific test cases, created test data, executed automated or manual unit tests, documented results and updated defect-tracking systems.
- Participated in functional requirement reviews and code reviews.
- Analyzed existing business processes and provided recommendations for improvements and efficiencies in credit card project.
- Worked with Business Analysts and Developers to resolve defects.
- Designed and developed Use Cases, Use Cases diagrams, Activity Diagrams, Sequence Diagrams.
- Extensively involved in testing the application developed in Agile Methodology and detailed designs.
- Analyzed the user/business requirements and functional specification documents.
- Analyzed and optimized the use cases and created the Test cases based on them for Change Requests.
- Led a small offshore team in developing the Test Plan and Test Scripts.
- Validated the responses of web services by passing various requests using SoapUI.
- Parameterized variables for test cases, test suites and projects using Groovy scripts.
- Created different assertions in SoapUI for request coverage, to confirm that all expected parameters were present in the response.
- Executed all Test cases in different phases of testing like Smoke, Regression, and System testing of the application.
- Validated SSIS and SSRS packages according to business and functional specification documents.
- Validated the custom reports developed in Cognos Report Studio, the ad hoc reports in Cognos Query Studio and the multi-dimensional cubes in Cognos Analysis Studio.
- Tested scheduler scenarios on Autosys.
- Validated the Informatica workflows according to business requirement documents.
- Used Informatica workflow manager to run the workflows and workflow monitor to verify their status and logs.
- Wide experience in verifying the logs of ETL jobs.
- Validated the SSAS cubes for various dimensions.
- Validated the data warehouse (DWH) table structures as part of ETL testing.
- Executed batch processing and verified ETL job status and the data in database tables.
- Validated the parameter-driven ETL process mapping source systems to the target data warehouse, with complete source system profiling in Informatica.
- Tested all the rules implemented by the ETL jobs that move data from source to the target data warehouse (DWH).
- Developed complex SQL queries to test ETL jobs from source to the target data warehouse (DWH); see the SQL sketch after this list.
- Involved in defining Test Scenarios for the applications and performed manual testing in HP Quality Center.
- Created and executed different types of test cases for the Change Request and existing functionality of the application.
- Responsible for checking of data in database by writing and executing SQL statements.
- Involved in meetings with the Automation team for development of automation scripts for change requests, to run along with the regression test suites for business priorities on the Loan Application.
- Involved in backend testing using UNIX commands.
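A simplified example of the source-to-target SQL referenced above, assuming hypothetical tables SRC_LOAN (source) and DW_LOAN (target) on Oracle:

    -- Structure check: target columns, compared manually against the mapping specification
    SELECT column_name, data_type, data_length, nullable
    FROM   user_tab_columns
    WHERE  table_name = 'DW_LOAN'
    ORDER  BY column_id;

    -- Content check: rows that differ between source and target in either direction
    (SELECT loan_id, loan_amt, loan_status FROM src_loan
     MINUS
     SELECT loan_id, loan_amt, loan_status FROM dw_loan)
    UNION ALL
    (SELECT loan_id, loan_amt, loan_status FROM dw_loan
     MINUS
     SELECT loan_id, loan_amt, loan_status FROM src_loan);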
Environment: HP ALM, Oracle 11g, AS/400 IBM iSeries DW, Informatica PowerCenter 9.5, SQL Server 2012, SSIS, SSRS, SSAS, Autosys 11, Toad 10.0, WinSCP, Cognos 10, SoapUI, UNIX, DB Visualizer 10.0, VB Script, Windows, Web Services, SQL, PL/SQL, XML
Confidential, Salt Lake City, UT
ETL | DWH | BI QA Analyst
Responsibilities:
- Worked with developers to gather information for a set of Development Plan templates documenting the service's requirements and functional and technical design, including SDLC documentation.
- Involved in development and implementation of new processes and systems in support of bank's Residential Mortgage Management business line.
- Extensively worked on Mortgage default functions, including collections, loss mitigation, short sale, foreclosure and bankruptcy.
- Tested all the ETL processes developed for fetching data from OLTP Systems to the target using complex SQL queries.
- Executed test cases and test scenarios for User Acceptance (UAT), Functional and Regression testing.
- Reviewed unit test cases and test results and certified the code for Integration testing.
- Created user acceptance testing documents and reviewed them with the customer.
- Involved in entering and tracking defects using Rational ClearQuest.
- Validated several SSIS and SSRS packages to verify that they are working according to BRS.
- Involved in testing the batch programs by using the Autosys tool.
- Involved in testing Stored Procedures, Functions, Triggers and packages utilizing PL/SQL.
- Prepared Test Scenarios by creating Mock data based on the different test cases.
- Effectively communicated testing activities and findings in oral and written formats.
- Extensively used SQL programming in backend and front-end functions, procedures, packages to test business rules and security.
- Performed data validations for Business Objects reports.
- Extensively used Ab Initio Co-op to execute the graphs and verify the logs.
- Performed Backend testing by writing SQL queries and running PL/SQL scripts in TOAD.
- Involved in testing the reports and UI screens containing those reports.
- Assisted in promotion of Informatica code and UNIX Shell Scripts from UAT to production.
- Extensively used Informatica Workflow Manager to run the workflows/mappings and monitored the session logs in Informatica Workflow Monitor.
- Involved in validating web services using SoapUI.
- Involved in testing SSIS Packages, and in Data Migration.
- Wrote several complex SQL queries for data verification and data quality checks.
- Utilized DOORS as a requirements repository (RTM) and synchronized it with Quality Center.
- Tested reconcile and incremental ETL loads for the project (see the SQL sketch after this list).
- Worked with XML feeds from multiple source systems and loaded them into the Enterprise Data Warehouse.
- Performed data validation on the flat files that were generated in UNIX environment using UNIX commands.
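A simplified illustration of the reconcile/incremental load checks mentioned above, assuming hypothetical tables SRC_TXN and EDW_TXN, an ETL_BATCH_ID audit column and an assumed load-control table EDW_LOAD_CONTROL:

    -- Expected delta: source rows changed since the previous load
    SELECT COUNT(*) AS expected_delta
    FROM   src_txn
    WHERE  last_updt_dt > (SELECT MAX(load_dt) FROM edw_load_control);

    -- Loaded delta: rows tagged with the latest batch in the EDW
    SELECT COUNT(*) AS loaded_delta
    FROM   edw_txn
    WHERE  etl_batch_id = (SELECT MAX(etl_batch_id) FROM edw_txn);

    -- Changed source rows that never arrived in the EDW
    SELECT s.txn_id
    FROM   src_txn s
    LEFT JOIN edw_txn t ON t.txn_id = s.txn_id
    WHERE  s.last_updt_dt > (SELECT MAX(load_dt) FROM edw_load_control)
      AND  t.txn_id IS NULL;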
Environment: Oracle 9i, TOAD, DB2, SQL, Ab Initio 2.12, PL/SQL, Business Objects XI, MS Visio, Rapid SQL, Mainframes, .Net, XML, COTS, RUP, Windows, Autosys, TIBCO, Remedy, SQL Server 2003/2005, SSIS, SSRS, HP Quality Center, UNIX, Putty, SQL Navigator