
ETL Tester/Big Data Tester Resume

Mclean, VA

PROFESSIONAL SUMMARY:

  • Over 5 years of IT experience in software testing of web-based, ETL, SOAP services, and client-server applications, with a full understanding of the Software Development Life Cycle (SDLC).
  • Extensive experience working in financial applications.
  • Well versed in manual and automation testing.
  • Proficient in using Quality Center, JIRA, and ALM, and in automated tools such as QuickTest Pro, Sauce Labs, Applitools, and Cucumber with Gherkin scenarios.
  • Extensive experience in developing Test Plan, Test Cases, Test Scenarios, Test Scripts, Test Reports.
  • Expertise in QA process and different levels of testing like Functional, Regression, and Integration testing with business scenarios.
  • Good understanding of Business Requirement Documents (BRD), Functional Documentation, and System Requirement Specifications (SRS).
  • Prepared Traceability Matrix for Test Coverage using Test Management Tools.
  • Proficient in System Testing, Integration Testing, Functional, Regression, GUI, Smoke Testing and End-to-End Testing.
  • Performed Data validation testing and verification.
  • Experience in database testing against Oracle and SQL Server databases using joins and SQL queries.
  • Worked on projects with SOA Architecture.
  • Good experience in testing SOAP web services and REST APIs using SoapUI and Postman.
  • Utilized tools such as XMLSpy and JSON validators to validate XML and JSON payloads.
  • Good experience with the Agile-Scrum and SDLC models.
  • Excellent knowledge in testing applications developed in JAVA, JSP, XML, HTML, .NET and JavaScript
  • Good Experience in ETL testing in Unix Environment.
  • Meets testing schedules and project timelines; able to handle multiple tasks and to work independently as well as in a team.
  • Experience working on multiple applications during the same period.
  • Worked with the ETL group to understand DataStage graphs for dimensions and facts.
  • Experienced in coordinating resources within Onsite-Offshore model
  • Strong experience with ETL systems, with expertise in tools such as Informatica and DataStage.
  • Strong experience in testing Data Warehouse (ETL) and Business Intelligence (BI) applications.
  • Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading.
  • Experience in dimensional data modeling using Star and Snowflake schemas.
  • Validated data loaded from files into HDFS using Python scripting.
  • Validated flat-file loads to HDFS in Hadoop and validated the Raw and Archival zones in HDFS.
  • Validated MapReduce jobs.
  • Validated mainframe source files loaded into the Hive/Impala databases.
  • Validated data in SQL Server against the data loaded in Hive/Impala using Python and shell scripts.
  • Validated and analyzed data in the Hive warehouse using Hive Query Language (HQL).
  • Ran scripts on the edge node server to land files into HDFS and ran loop wrapper scripts to validate and populate the data into Hive/Impala.
  • Strong working experience in data analysis and in testing data warehouses through data conversion, data extraction, data transformation, and data loading (ETL).
  • Experience in maintaining Full and Incremental loads through ETL tools on different environments.
  • Experienced in Defect Management using ALM, HP Quality Center
  • Good experience writing SQL for data validation during migration as part of backend testing.
  • Good exposure to databases such as Oracle 10g, DB2, and SQL Server 2012.
  • Good expertise in using TOAD, SQL*Plus and Query Analyzer.
  • Worked with different data sources ranging from VSAM files and flat files to DB2, Oracle, and SQL Server databases.
  • Experience in debugging the issues by analyzing the SQL queries.
  • Good experience executing UNIX scripts and validating logs to investigate issues.
  • Experience with testing reporting tools (Cognos/Business Objects).
  • Excellent analytical, problem-solving, decision-making, and presentation skills. Can adapt to new technologies and software quickly. Able to define and fulfill a project's goals.
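The backend SQL data-validation work summarized above usually reduces to comparing source against target: first row counts, then a set-difference ("MINUS query") check for rows that never arrived. A minimal sketch of that pattern, using sqlite3 in-memory databases as stand-ins for the actual Oracle/Teradata and SQL Server connections (all table and column names here are hypothetical):

```python
import sqlite3

# Stand-in source and target; in practice these would be separate
# Oracle/Teradata and SQL Server connections.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cho")])

tgt.execute("CREATE TABLE dim_customer (id INTEGER, name TEXT)")
tgt.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])  # one row missing: a load defect

def row_count(conn, table):
    """Row-count check: the first test run after any ETL load."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def missing_rows(src_conn, tgt_conn, src_table, tgt_table):
    """Set-difference check (the 'MINUS query' pattern): rows present
    in the source that never arrived in the target."""
    src_rows = set(src_conn.execute(f"SELECT * FROM {src_table}"))
    tgt_rows = set(tgt_conn.execute(f"SELECT * FROM {tgt_table}"))
    return src_rows - tgt_rows

print(row_count(src, "customers"), row_count(tgt, "dim_customer"))  # 3 2
print(missing_rows(src, tgt, "customers", "dim_customer"))  # {(3, 'Cho')}
```

On a real engine the set difference would be a single `SELECT ... MINUS SELECT ...` (or `EXCEPT`) query rather than a Python-side comparison; the Python version is shown only to keep the sketch self-contained.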

TECHNICAL SKILLS:

Testing tools: QuickTest Professional (QTP), ALM, JIRA, Quality Center, Team Foundation Server, SoapUI, Postman, LoadRunner, Parasoft, Informatica DVO.

Programming Languages: C, C++, Java, J2EE, ASP.NET, Unix Shell Scripting, SQL, XML, HTML, VBScript, Gherkin (Cucumber).

Operating Systems: MS Windows, UNIX, Linux.

Big Data: HDFS, MapReduce, HIVE, IMPALA, Sqoop.

BI Tools: Business Objects XI R4, Cognos 8.0 Series, Crystal Reports, Hyperion, OBIEE

ETL Tools: Ab Initio CO>OP, Informatica (PowerMart & PowerCenter), SSIS, DataStage

Databases: Oracle, Microsoft SQL Server, DB2, Teradata, Hive, Impala

PROFESSIONAL EXPERIENCE:

Confidential, Mclean, VA

ETL Tester/Big Data Tester

Responsibilities:

  • Analyzed business and user requirements to prepare test plans, test procedures, test cases, and test scripts.
  • Worked in the Agile software development methodology and participated in daily scrum meetings to provide daily updates and collectively document team updates.
  • Used defect tracking tool JIRA to handle issues.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Performed ETL testing and extensively used SQL functionalities.
  • Involved in backend testing for the front end of the application using SQL queries against the Teradata database.
  • Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.
  • Used Autosys for job scheduling.
  • Designed and kept track of Requirement Traceability Matrix
  • Updated Quality Center, loaded test cases, wrote the Test Plan, executed test cases, and produced status reports for team meetings.
  • Created and Developed Reports and Graphs using ALM
  • Involved in Integrating and Functional System Testing for the entire Data Warehousing Application.
  • Assisted on the business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse project.
  • Validated data loaded from files into HDFS using Python scripting.
  • Validated flat-file loads to HDFS in Hadoop and validated the Raw and Archival zones in HDFS.
  • Validated MapReduce jobs.
  • Validated mainframe source files loaded into the Hive/Impala databases.
  • Validated data in SQL Server against the data loaded in Hive/Impala using Python and shell scripts.
  • Validated and analyzed data in the Hive warehouse using Hive Query Language (HQL).
  • Ran scripts on the edge node server to land files into HDFS and ran loop wrapper scripts to validate and populate the data into Hive/Impala.
  • Used Putty to access files and directories in UNIX environment.
  • Extracted data from Teradata using Informatica PowerCenter ETL and DTS packages into target databases, including SQL Server, and used the data for reporting purposes.
  • Wrote several complex SQL queries to validate Cognos reports.
  • Validated the entire historical and incremental loads in DVO and documented the results.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping
  • Tested the Informatica ETL mappings and other ETL processes (data warehouse testing) and created the metadata for the sources and targets using the PowerCenter Designer.
  • Tested the ETL process both before and after data validation.
  • Tested the messages published by ETL tool and data loaded into various databases
  • Created and executed test cases based on test strategy and test plans based on business requirement documents and ETL design documents
  • Tested a number of complex ETL mappings and reusable transformations for daily data loads.
  • Used Informatica Workflow Manager to run the ETL mappings and Monitor to monitor the status.
  • Created test cases for ETL mappings and design documents for production support.
  • Promoted Unix/Informatica application releases from development to QA and to UAT environments
  • Worked with the ETL group to understand Informatica mappings and DataStage graphs for dimensions and facts.
  • Tested various ETL transformation rules based on log files, data movement and with help of SQL
  • Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards
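A typical check behind the flat-file-to-HDFS and Hive/Impala validation bullets above is reconciling the landed file against the loaded table: record counts plus a control total on a numeric column. A self-contained sketch of that check, with an inline CSV standing in for the file on the edge node and sqlite3 standing in for the Hive/Impala table (names are illustrative only):

```python
import csv
import io
import sqlite3

# The flat file as landed on the edge node (inline here for illustration;
# in practice this would be read from HDFS).
flat_file = io.StringIO("id,amount\n1,100\n2,250\n3,75\n")

# sqlite3 stands in for the Hive/Impala table the file was loaded into.
hive = sqlite3.connect(":memory:")
hive.execute("CREATE TABLE txn (id INTEGER, amount INTEGER)")
hive.executemany("INSERT INTO txn VALUES (?, ?)",
                 [(1, 100), (2, 250), (3, 75)])

file_rows = list(csv.DictReader(flat_file))

# Check 1: record counts match (file rows vs. table rows).
file_count = len(file_rows)
table_count = hive.execute("SELECT COUNT(*) FROM txn").fetchone()[0]
assert file_count == table_count, f"count mismatch: {file_count} vs {table_count}"

# Check 2: a control total (sum of a numeric column) matches.
file_sum = sum(int(r["amount"]) for r in file_rows)
table_sum = hive.execute("SELECT SUM(amount) FROM txn").fetchone()[0]
assert file_sum == table_sum, f"sum mismatch: {file_sum} vs {table_sum}"

print("flat-file load validated:", file_count, "rows, control total", file_sum)
```

Against an actual cluster the same two queries would be issued through a Hive/Impala client (e.g. over JDBC/ODBC) after the wrapper scripts finish loading; the reconciliation logic is unchanged.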

Environment: Informatica 9.x,10.x, DVO, Oracle, TOAD for Data Analysts, MLOAD, Teradata SQL Assistant, Hive, Impala, SQL Server, HP Quality Center, ALM, SQL/PLSQL, SOA Test, IBM Infosphere MDM, Autosys, Cognos, UNIX, Agile, Windows XP

Confidential, MD

ETL Tester/Big Data Tester

Responsibilities:

  • Analysis of Business requirements & Design Specification Document to determine the functionality of the ETL Processes.
  • Validated the processing of claims, enrollment, billing, and eligibility, with good business knowledge of medical, dental, and pharmacy processes.
  • Worked with EDI transactions (270, 271, 276, 277, 837, 835, 997) and interface testing.
  • Involved in HIPAA/EDI Medical Claims Analysis, Design, Implementation and Documentation.
  • Involved in validation of HIPAA/EDI 837 and 835 claims used for professional, institutional, and dental billing by writing test cases and test plans and by testing claim processing in BizTalk, which processes incoming claims from various vendors and generates XML files.
  • Created Test Plan, Test strategy documents, Test Summary Reports and other test artifacts.
  • Created Manual Test Suites for various modules
  • Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.
  • Wrote SQL queries for data validation during migration as part of backend testing.
  • Validated data loaded from files into HDFS using Python scripting.
  • Validated flat-file loads to HDFS in Hadoop and validated the Raw and Archival zones in HDFS.
  • Validated MapReduce jobs.
  • Validated mainframe source files loaded into the Hive/Impala databases.
  • Validated data in SQL Server against the data loaded in Hive/Impala using Python and shell scripts.
  • Validated and analyzed data in the Hive warehouse using Hive Query Language (HQL).
  • Ran scripts on the edge node server to land files into HDFS and ran loop wrapper scripts to validate and populate the data into Hive/Impala.
  • Involved in extensive data validation using SQL queries and backend testing.
  • Extensively used SQL to verify and validate the data load.
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Tested newly developed features to ensure proper functioning prior to release to QA for multiple processes.
  • Involved in testing the batch jobs, using UNIX and Autosys.
  • Assisted with HBX enrollment and claims processes; details captured in the HBX status report.
  • Reviewed Data modeling, Data dictionary, Data mapping and architecture for Data flow during architecture and design phase in various projects.
  • Prepared test report in QA phase for SIT and E2E testing in QA environment.
  • Prepared and conducted UAT test report based on E2E testing in QA environment for Business approval to deploy the design code for production.
  • Tested various tools such as Facets, PMP (Provider Management Platform), and the ICD converter; involved in testing Facets application modules such as Provider, Enrollment, ICD-10, and Claims.
  • Worked on DWH testing for ETL process in Teradata Integrated Data Warehouse Tower as well as Oracle Database Tower.
  • Checked the HIPAA compliance of the manually created 27X transactions using the Edifecs Analyzer tool
  • Involved in the projects related to FACETS, NASCO, FEP, NCAS, Dental and Pharmacy.
  • Wrote various documents describing DVO functionality, and how DVO should be used by group based on discussions with team lead.
  • Worked on environment issues and compatibility checks after migrating applications from Unix to Linux.
  • Executed UNIX scripts and validated logs to investigate issues.
  • Reported bugs and tracked defects using HP Quality Center/ALM
  • Used FTP and Telnet protocols to migrate files across heterogeneous operating systems such as UNIX, Linux, and Windows.
  • Submitted weekly issue report updates to the Project Manager in the form of the QA Error Log.
  • Involved in functional testing, end-to-end testing, and regression testing.
  • Understanding of Functional Requirement Specifications and System Requirement Specifications.
  • Responsible for creating Test Plans, Test cases, Test data based on functional and non-functional requirements (i.e. Interfaces) to test the Cognos reports
  • Tested reports from Cognos data warehouse system and used HP Quality Center for test cases execution and defect tracking.
  • Responsible for ETL batch jobs execution using UNIX shell scripting to load data from core database to Staging and Data mart tables.
  • Wrote Complex SQL queries to verify report data with the source and target system (data warehouse) data.
  • Assisted business users to execute UAT test scenarios and test data as part of Validation Testing
  • Performed data validation, data merging, data cleansing, and data aggregation activities.
  • Validated the data against staging tables and the target warehouse.
  • Found report defects and subsequently validated the fixes, repeating the process until done.
  • Performed sanity testing, data-driven testing, and ad-hoc testing when required.
  • Perform system testing to ensure the validity of the Report requirements and mitigation of risks prior to formal acceptance.
  • Performed Interface, Integration (SIT) Testing and UAT Testing.
  • Extensively used SQL queries for data validation and backend testing.
  • Worked on database testing; involved in data migration testing and document preparation, as well as functionality, interface, and regression testing.
  • Prepared SQL queries to fetch data from databases.
  • Validated the Cognos reports generated from the data marts, including the drill-through and drill-down reports.
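The staging-versus-warehouse validation described above goes one step beyond row counts: joining the two tables on the business key and flagging rows where a measure disagrees, which catches transformation defects rather than just missing loads. A minimal sketch, again with sqlite3 standing in for the Teradata/Oracle towers (table and column names are illustrative only):

```python
import sqlite3

# Staging and target-warehouse tables; C2's amount was transformed wrongly.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stg_claims (claim_id TEXT, amount REAL)")
db.execute("CREATE TABLE dw_claims  (claim_id TEXT, amount REAL)")
db.executemany("INSERT INTO stg_claims VALUES (?, ?)",
               [("C1", 120.0), ("C2", 85.5), ("C3", 40.0)])
db.executemany("INSERT INTO dw_claims VALUES (?, ?)",
               [("C1", 120.0), ("C2", 80.0), ("C3", 40.0)])

def value_mismatches(conn, stg, dw, key, col):
    """Join staging to warehouse on the business key and report rows
    where the measure disagrees (a column-level reconciliation check)."""
    sql = (f"SELECT s.{key}, s.{col}, d.{col} "
           f"FROM {stg} s JOIN {dw} d ON s.{key} = d.{key} "
           f"WHERE s.{col} <> d.{col}")
    return conn.execute(sql).fetchall()

bad = value_mismatches(db, "stg_claims", "dw_claims", "claim_id", "amount")
print(bad)  # [('C2', 85.5, 80.0)]
```

Each mismatch tuple (key, staging value, warehouse value) maps directly onto a defect report in Quality Center/ALM, which is how a failing reconciliation run turns into a tracked defect.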

Environment: Informatica 8.x, 9.x, DVO, Oracle, DB2, Teradata SQL Assistant, Hive, Impala, SQL Server, XML, HP Quality Center, ALM, SQL/PLSQL, SOAPUI, Postman, Autosys, Cognos, OBIEE, UNIX, Agile, Windows XP
