Sr. ETL/QA Tester—Big Data Tester Resume
Woonsocket, RI
PROFESSIONAL SUMMARY:
- 7+ years of professional experience in Software Quality Assurance (QA) and testing in different environments and platforms including Data Warehousing, Business Intelligence, Client/Server and Web-based applications.
- Experience in Data Warehouse applications testing using Informatica, Ab Initio, DataStage and SSIS on multiple platforms.
- Involved in all phases of the QA life cycle and SDLC, delivering on time against aggressive deadlines using QA methodologies such as Waterfall and Agile.
- Extensive experience in writing SQL and PL/SQL scripts to validate database systems and for back-end database testing.
- Experienced in working with Pig, Hive, Sqoop and MapReduce.
- Experienced in using test management and bug reporting tools including HP Quality Center, JIRA, Pivotal Tracker and BugHerd.
- Extensive working knowledge in UNIX/Linux operating systems.
- Experience in User Acceptance Testing, Performance Testing, GUI Testing, System & Functional Testing, Integration Testing, Regression Testing, and Data-Driven and Keyword-Driven Testing, performed both manually and with automated testing tools including WinRunner, LoadRunner and QTP.
- Experience in testing Business Intelligence reports generated by various BI Tools like MicroStrategy, Cognos and Business Objects.
- Experience in using Oracle Databases, DB2 UDB, Sybase, SQL Server, Netezza, and Teradata.
- Experience in UNIX shell scripting and in configuring cron jobs and Autosys for scheduling.
- Involved in front-end and back-end testing; strong knowledge of RDBMS concepts.
- Good knowledge of Oracle Data Integrator; worked with data files and the label parameter of data files; strong in writing UNIX Korn shell scripts.
- Worked with XML feeds from multiple source systems and loaded them into the EDW.
- Solid experience in black-box and white-box testing techniques.
- Excellent understanding of the System Development Life Cycle. Involved in analysis, design, development, testing, implementation, and maintenance of various applications.
- Performed Manual and Automated Testing on Client-Server and Web-based Applications.
- Extensive experience in drafting Test Flows, Test Plans, Test Strategies, Test Scenarios, Test Scripts, Test Specifications, Test Summaries, Test Procedures, Test cases & Test Status Reports.
- Strong knowledge of test methodologies: object-oriented test methodology, Service-Oriented Architecture testing, top-down and bottom-up test methodologies, and QA validations and compliance to ensure quality assurance control.
TECHNICAL SKILLS:
ETL Tools : Informatica, SSIS, DataStage, Ab Initio (GDE 1.14, Co>Operating System)
Programming : SQL, PL/SQL, UNIX Shell Scripting, PERL, XML, XSLT
Operating Systems : Windows 95/98/NT/2000, UNIX (Sun Solaris 2.6/2.8, Linux 2.4, HP-UX, IBM AIX 5.5)
Databases : Oracle 9i/10g/11g, IBM DB2 9.x, Sybase 12.5, SQL Server 2008, Teradata V2R6 and Informix
Testing Tools : HP Quality Center 10/11, Rational ClearQuest
Version Control Tools : ClearCase, Ab Initio EME, CVS, PVCS 7.0
PROFESSIONAL EXPERIENCE:
Confidential, Woonsocket, RI
Sr. ETL/QA Tester—Big Data Tester
Responsibilities:
- Responsible for gathering test requirements and scoping out the test plan.
- Wrote complex SQL queries.
- Reported bugs and tracked defects using HP ALM/Quality Center.
- Performed extensive data validation by writing complex SQL queries, performed back-end testing and worked through data quality issues (a representative validation query is sketched at the end of this list).
- Involved in data warehouse testing by checking ETL procedures/mappings.
- Involved in the Agile Scrum Process
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
- Used Business Objects functionalities like Slice and Dice, Drill Down, Cross Tab, Functions and Master Detail etc.
- Extensively tested Web Services using SoapUI (SOAP/REST).
- Participated in Integration, System, Smoke and User Acceptance Testing and production testing using QTP/UFT.
- Tested Workflow Purchasing, which integrates with Oracle Workflow technology to create standard purchase orders or blanket releases automatically from approved requisition lines.
- Managed and executed the test process, using Agile Methodology.
- Queried the Teradata database and validated the data using SQL Assistant.
- Validated data moving from different sources to target using Informatica.
- Involved in Teradata SQL Development, Unit Testing and Performance Tuning.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Created the test environment for the staging area and loaded the staging area with data from multiple sources.
- Imported budget values from CSV files into Oracle Apps and tested the loads.
- Generated automated scripts using QTP and documented them. Created and maintained functional & regression test suites.
- Reports were formatted according to the user requirements using Business Objects functionalities like Breaks, Calculations, Slice and Dice, Drill Down and Variables.
- Tested request and response XMLs of web service interfaces using SoapUI.
- Demonstrated expertise in the bug tracking system and process using Quality Center/ALM.
- Imported and exported data between relational sources and HDFS/Hive using Sqoop.
- Extracted test data from tables and loaded it into SQL tables.
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
- Tested the data load process using the Informatica Power Center ETL tool.
- Validated the reports to make sure all the data is populated as per requirements.
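A representative back-end validation query of the kind referenced in the list above, reconciling a source staging table against its warehouse target. This is a minimal sketch only; the schema, table and column names (stg.SRC_CUSTOMER, edw.DW_CUSTOMER) are illustrative assumptions rather than the actual project objects.

```sql
-- Row-count reconciliation between a hypothetical staging source and its target.
SELECT 'SRC_CUSTOMER' AS table_name, COUNT(*) AS row_count FROM stg.SRC_CUSTOMER
UNION ALL
SELECT 'DW_CUSTOMER', COUNT(*) FROM edw.DW_CUSTOMER;

-- Rows present in the source but missing or transformed incorrectly in the target.
-- MINUS is the Oracle/Teradata set operator (use EXCEPT on SQL Server).
SELECT customer_id, first_name, last_name, status_cd
FROM   stg.SRC_CUSTOMER
MINUS
SELECT customer_id, first_name, last_name, status_cd
FROM   edw.DW_CUSTOMER;
```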
Environment: Informatica Power Center, Business Objects, SQL, Agile, Teradata, Visio, Teradata SQL Assistant, Web Services, ALM/HP Quality Center 11, QTP/UFT, Oracle, TOAD, T-SQL, XML Files, XSD, XML Spy, Flat Files
Confidential, Cleveland, OH
ETL/QA Tester—Big Data Tester
Responsibilities:
- Created test cases and test plans for user acceptance testing and system testing based on functional specifications.
- Tested PL/SQL procedures that were developed to load the data from temporary tables in staging to target tables in the data warehouse
- Worked on UNIX, Hadoop, Hive, Impala and shell scripting for big data testing, and led the team in this effort.
- Extensively used HP ALM to upload requirements and write test cases.
- Worked in an Agile environment, attending daily stand-up and Scrum meetings.
- Provided support to offshore QA team by knowledge transfer and helping them with closure of the defects.
- Part of on-call support for issues relating to the internal application, Autosys and Informatica.
- Produced ETL detailed designs and documentation for Informatica PowerCenter.
- Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.
- Involved in loading data from UNIX file system to HDFS, configuring Hive and writing Hive UDFs
- Used Flume NG to move data from individual data sources into the Hadoop system.
- Ran UNIX shell scripts to count the records for EDW source to staging tables.
- Reported defects using HP ALM, verified fixes and closed bugs during regression testing.
- Tested the XML feeds received from a third-party source system for data consistency.
- Tested the ETL with XML as source and tables in the Data Warehouse as target.
- Tracked defects to closure by coordinating with the dev team.
- Implemented automated processes and scheduling using Autosys; created Autosys scripts to install jobs for particular scheduling and alerting scenarios.
- Defined testing criteria, planned, created and executed test plans in a mainframe environment.
- Tested source data for data completeness, correctness and integrity (representative checks are sketched after the environment listing below).
- Performed End to End testing starting from the source to the report.
- Wrote Hive jobs to parse the logs and structure them in tabular format to facilitate effective querying of the log data (see the HiveQL sketch after this list).
- Conducted and coordinated integration testing and regression testing.
- Participated in business requirements gathering and in modifications of the requirements based on scope.
- Prepared test data to cover all the test scenarios.
- Validated the MapReduce, Pig and Hive scripts by pulling data from Hadoop and comparing it with the data in the files and reports.
- Coordinated with the Release Management team to get jobs executed in the test environment (Autosys).
- Prepared UNIX scripts to run the Informatica ETL jobs from command line.
- Maintained all the test cases in HP Quality Center and logged all the defects into the Defects module.
- Tested the migration of reports from Business Objects XIR2 to XIR3.
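The Hive log-parsing work mentioned above generally follows the pattern of exposing the raw log files through an external table and then projecting them into a structured, partitioned table. A minimal HiveQL sketch; the HDFS path, column layout and table names are assumptions for illustration only.

```sql
-- External table over raw, tab-delimited log files already landed in HDFS.
CREATE EXTERNAL TABLE IF NOT EXISTS raw_etl_logs (
  log_ts    STRING,
  job_name  STRING,
  log_level STRING,
  message   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/etl/logs/raw';

-- Structured, partitioned copy that makes day-by-day querying efficient.
CREATE TABLE IF NOT EXISTS etl_logs (
  log_ts    TIMESTAMP,
  job_name  STRING,
  log_level STRING,
  message   STRING
)
PARTITIONED BY (log_date STRING)
STORED AS ORC;

-- Allow dynamic partitioning on log_date for the load below.
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE etl_logs PARTITION (log_date)
SELECT CAST(log_ts AS TIMESTAMP),
       job_name,
       log_level,
       message,
       to_date(log_ts) AS log_date
FROM   raw_etl_logs;
```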
Environment: Agile, SQL, PL/SQL, HP ALM/Quality Center, XML, Java, Oracle 10g, Informatica, Business Objects XIR3, Toad for Oracle
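Completeness and integrity testing of the kind noted above typically reduces to a handful of assertion-style queries against the target: mandatory columns must not be NULL and every foreign key must resolve. A sketch with hypothetical table and column names (DW_ORDERS, DW_CUSTOMER), not the project's actual schema:

```sql
-- Mandatory columns that arrived NULL in the target (expected result: zero rows).
SELECT order_id
FROM   edw.DW_ORDERS
WHERE  customer_id IS NULL
   OR  order_date  IS NULL;

-- Orphaned foreign keys: orders whose customer never reached the dimension table.
SELECT o.order_id, o.customer_id
FROM   edw.DW_ORDERS o
LEFT JOIN edw.DW_CUSTOMER c
       ON o.customer_id = c.customer_id
WHERE  c.customer_id IS NULL;
```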
Confidential
ETL QA Tester
Responsibilities:
- Translated business-reporting requirements into Data Warehouse architectural designs and analyzed source and target data models and made necessary changes.
- Supported the extraction, transformation and load process (ETL) for a Data Warehouse from the legacy systems using Informatica and provided technical support and hands-on mentoring in the use of Informatica for testing.
- Identified the primary key (logical/physical) and applied update-or-insert logic (a MERGE-style sketch appears at the end of this section).
- Deleted the target data before processing based on the logical or physical primary key.
- Designed and executed the test cases on the application based on company standards.
- Prevented occurrences of multiple runs by flagging processed dates.
- Wrote Test Cases for ETL to compare Source and Target database systems.
- Tested records with logical deletes implemented using flags.
- Interacted with senior peers and subject matter experts to learn more about the data.
- Identified duplicate records in the staging area before the data was processed (see the duplicate-check sketch after this list).
- Extensively wrote test scripts for back-end validations.
- Ensured that the mappings were correct.
- Extensively used Toad to verify records that had been populated by the ETL process.
- Responsible for de-duping and consolidating customer records from various sources to create the master customer list.
- Prepared and implemented data verification and testing methods for the Data Warehouse; designed and implemented data staging methods and stress testing of ETL routines to make sure that they don’t break on heavy loads.
- Performed functionality, validation and system testing manually on the first release of the application.
- Extensively used QTP as the automation tool for automating regression test scripts.
- Used Test Director as a defect-tracking tool to keep track of defects during the testing process.
- Performed End-to-End testing after bug fixes and modifications.
- Used Tivoli as a scheduling software to schedule the Informatica jobs.
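Duplicate identification in the staging area, as described above, is usually a grouping query over the business key, followed by a windowed query to pick the record to keep. A sketch assuming a hypothetical staging table stg.SRC_CUSTOMER keyed on customer_id with a load_ts column:

```sql
-- Business keys that occur more than once in staging (candidates for de-duping).
SELECT customer_id, COUNT(*) AS dup_count
FROM   stg.SRC_CUSTOMER
GROUP BY customer_id
HAVING COUNT(*) > 1;

-- Keep only the most recent record per key when consolidating (Oracle syntax).
SELECT *
FROM  (SELECT s.*,
              ROW_NUMBER() OVER (PARTITION BY customer_id
                                 ORDER BY load_ts DESC) AS rn
       FROM   stg.SRC_CUSTOMER s)
WHERE  rn = 1;
```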
Environment: Informatica, Big Data/Hadoop, Hive, SQL, XML Files, XSD, DTD, XSLT, PL/SQL, SQL*Plus, UNIX Scripts, Toad, ClearQuest, ClearCase, QTP.
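The update-or-insert logic keyed on the primary key, referenced earlier in this section, is commonly verified against a MERGE of this shape. This is an Oracle-style sketch with illustrative table and column names, not the project's actual mapping:

```sql
-- Upsert from staging into the target, keyed on the (hypothetical) primary key.
MERGE INTO edw.DW_CUSTOMER tgt
USING stg.SRC_CUSTOMER src
   ON (tgt.customer_id = src.customer_id)
WHEN MATCHED THEN
  UPDATE SET tgt.first_name = src.first_name,
             tgt.last_name  = src.last_name,
             tgt.status_cd  = src.status_cd,
             tgt.updated_ts = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (customer_id, first_name, last_name, status_cd, created_ts)
  VALUES (src.customer_id, src.first_name, src.last_name, src.status_cd, SYSDATE);
```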