
Hadoop/ETL Tester Resume


Charlotte, NC

SUMMARY:

  • 6 years of IT experience in Quality Assurance for ETL, BI, Web-based and Client/Server applications using manual and automated testing tools.
  • Expert in analyzing errors that cause ETL load failures by reviewing the log files and reporting them to the corresponding team for rectification.
  • Strong knowledge of working with Data Marts (Star Schema and Snowflake Schema), including Slowly Changing Dimensions.
  • Writing test Hive queries to validate development algorithms on the Hadoop cluster (a sample query appears after this list).
  • Expert in Data Validation, Data Conditioning, Data Verification, Data Profiling, Data Mapping, Data Governance and Data Cleansing.
  • Well experienced in developing solutions on the Hadoop ecosystem (Pig, Hive, Flume, Sqoop, Oozie).
  • Experience in User Acceptance Testing, Performance Testing, GUI Testing, System & Functional Testing, Integration Testing, Regression Testing, and Data-Driven and Keyword-Driven Testing.
  • Good knowledge of Hadoop, with hands-on experience working with ecosystem components such as Hive, Pig, MapReduce, Sqoop, Oozie and Spark.
  • Tested the ETL Informatica mappings and other ETL processes (Data Warehouse Testing).
  • Expert in testing ETL, BI and Data Warehousing Applications.
  • Experience in deployment of Informatica, Pentaho, UNIX scripts, database objects, Jasper reports and Control-M jobs.
  • Expert in writing complex SQL queries and stored procedures.
  • Used Test Director, Quality Center and Business Process Testing to manage the test plan, test design, test execution and defect logging phases; used Rational Quality Manager to manage the STLC, including requirements gathering, risk analysis, project planning, scheduling, testing, defect tracking and reporting.
  • Experience with the IBM Rational Suite: maintained test requirements and test flows using RequisitePro, automated tests using Rational Robot and Rational Functional Tester with Rational Manager, tracked defects using Rational ClearQuest, and handled version control using Rational ClearCase.
  • Extensively used Informatica PowerCenter for the extraction, transformation and loading process.
  • Proficient in writing SQL queries for data-driven tests; involved in both front-end and back-end testing. Strong knowledge of RDBMS concepts; developed SQL queries against Oracle databases for DB testing. Good knowledge of Oracle Data Integrator. Worked with data files and the label parameter of data files; strong in writing UNIX Korn shell scripts.
  • Excellent analytical, multi-tasking, problem-solving, time-management and communication skills, with particular emphasis on clearly communicating and documenting detailed test requirements and tests.
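
As an illustration of the test Hive queries referenced above, the minimal sketch below reconciles row counts between a staging table and its warehouse target after a load; the table names (stg_orders, dw_orders) and the load_dt column are hypothetical.

```sql
-- A minimal sketch of a reconciliation test query in HiveQL.
-- stg_orders, dw_orders and load_dt are illustrative names, not from a real project.
SELECT s.cnt         AS source_count,
       t.cnt         AS target_count,
       s.cnt - t.cnt AS diff   -- a non-zero diff means rows were dropped or duplicated
FROM  (SELECT COUNT(*) AS cnt FROM stg_orders WHERE load_dt = '2016-01-31') s
CROSS JOIN
      (SELECT COUNT(*) AS cnt FROM dw_orders  WHERE load_dt = '2016-01-31') t;
```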

TECHNICAL SKILLS:

BI Tools: Business Objects, Power BI

ETL Tools: Informatica 8.1/7.1.2/6.1.x (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Talend

RDBMS: Oracle 10g/9i/8i, MS Access 2000, MS SQL Server 7.0/2000

Web/Application Servers: Apache Tomcat, JBoss, WebLogic, WebSphere

O/S: Windows NT/XP Prof/2003 Server/2000/98/95, UNIX (Solaris 9), Linux (SuSE 9.0, Red Hat 7.0/8.0/9.1)

Languages: Java, JSP, C, C++, VB, SQL, PL/SQL, HTML, DHTML, XML, XSLT, JavaScript

Others: MS Office 2003, MS Project.

PROFESSIONAL EXPERIENCE:

Confidential - Charlotte, NC

Hadoop/ETL Tester

Responsibilities:

  • Involved in all phases of Software Testing Life Cycle (STLC) and Software Development Life Cycle (SDLC) - Testing methodologies, Disciplines, Tasks, Resources & Scheduling.
  • Participated in business requirement walk through, design walk through and analyzed Business requirements.
  • Created the test plan, test design and test scripts, and was responsible for implementing test cases as manual test scripts.
  • Extracted test data from tables and loaded data into SQL tables.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Tested several stored procedures.
  • Tested Big Data Hadoop Hive datasets and worked with ecosystem components such as Flume, Pig, Sqoop, ETL, HDFS and MapReduce jobs in a Cloudera environment (sample data-quality checks appear after this list).
  • Participated in scheduling Hadoop jobs configured with Talend.
  • Wrote customized SQL queries for business needs.
  • Executed cluster batch jobs and Hadoop workflows to generate test data in Hive and HBase tables.
  • Automated tasks using Perl and UNIX shell scripting.
  • Triggered Autosys jobs that pull batch data from source systems and monitored the associated workflows and logs using the Oozie Editor (i.e., the Cloudera workflow dashboard).
  • Reviewed system use cases and functional specs with business and System Analysts.
  • Developed ETL processes in Informatica and Pentaho Data Integration to generate invoice XML.
  • Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.
  • Responsible for data-mapping testing by writing complex SQL queries using WinSQL.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Involved in testing data mapping and conversion in a server-based data warehouse.
  • Involved in Security testing for different LDAP roles.
  • Created and maintained Confluence reference pages on analyst processes, the Jira SQL schema, and Power BI report updating and use.
  • Used TOAD to confirm the correctness of the data in the backend.
  • Provided end-to-end support for testing activities during System Testing and UAT.
  • Performed positive and negative functional testing against the Pentaho data-migration job flow by manipulating test data and parameter values.
  • Used Quality Center to track and report system defects.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Used Data Lake concepts to store data in HDFS.
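
To illustrate the kind of data-quality checks mentioned above for Hive datasets loaded via Sqoop, here is a hedged sketch; the customer_txn table and its columns are hypothetical stand-ins.

```sql
-- Illustrative Hive data-quality checks; customer_txn, txn_id, customer_id
-- and txn_amount are assumed names, not from an actual project.

-- Duplicate check: the business key should be unique after the load.
SELECT txn_id, COUNT(*) AS dup_count
FROM   customer_txn
GROUP  BY txn_id
HAVING COUNT(*) > 1;

-- Null check: mandatory columns must be populated.
SELECT COUNT(*) AS null_violations
FROM   customer_txn
WHERE  txn_id IS NULL OR customer_id IS NULL OR txn_amount IS NULL;
```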

Confidential, Mclean, VA

DWH QA Analyst

Responsibilities:

  • Set up environments to be used for testing and defined the range of functionality to be tested per the technical specifications.
  • Worked with the development team to ensure testing issues were resolved based on defect reports.
  • Identified and reported various computer problems within the company to upper management.
  • Reported on emerging trends to identify changes or trouble within the systems, using Access and Crystal Reports.
  • Developed mapping document for ETL development efforts.
  • Fine-tuned complex mappings for external sources.
  • Involved in testing ETL reports using Pentaho reporting tool.
  • Responsible for database (SQL) testing, performing functional and system testing for major Power BI Gateway releases.
  • Tested ETL mappings, enabling the extract, transform and loading of the data into target tables.
  • Involved in Source System Analysis.
  • Worked with memory cache (static and dynamic) for better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Generated detailed bug reports, go/no-go reports and comparison charts.
  • Conducted black-box testing (functional, regression and data-driven) and white-box testing (unit and integration), covering positive and negative scenarios.
  • Tracked, reviewed and analyzed defects and compared results using Quality Center.
  • Defined the scope for System and Integration Testing.
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in uploading master and transactional data from flat files, preparing test cases and subsystem testing.
  • Documented and published test results; troubleshot and escalated issues.
  • Preparation of various test documents for ETL process in Quality Center.
  • Involved in test scheduling and milestone planning, including dependencies.
  • Identified, assessed and communicated potential risks to the testing scope, product quality and schedule.
  • Tested complex objects added to the universe to enhance report functionality.
  • Optimized the usage of user objects, conditions and universe-level filters.
  • Involved in documenting changes to the universe and report functionality.
  • Identified and presented testing effort estimates to the Project Management Team.
  • Responsible for understanding enhancements and new features developed, and for training others on them.
  • Conducted load testing and provided input into capacity-planning efforts.
  • Supported the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
  • Sending package install requests for new builds and verifying proper packages are installed.
  • Tested the database to check field-size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata (a sample query appears after this list).
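
As a sketch of the field-size cross-check described above, the query below compares column definitions in the Oracle data dictionary against an assumed reference table; expected_column_spec and its columns are hypothetical.

```sql
-- Hedged sketch: cross-verify application field sizes against Oracle metadata.
-- expected_column_spec is an assumed table holding the sizes from the spec.
SELECT c.table_name,
       c.column_name,
       c.data_type,
       c.data_length,        -- actual size in the database
       e.expected_length     -- size documented in the application spec
FROM   user_tab_columns c
JOIN   expected_column_spec e
       ON  e.table_name  = c.table_name
       AND e.column_name = c.column_name
WHERE  c.data_length <> e.expected_length;   -- mismatches are logged as defects
```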

Confidential, Jersey City, NJ

ETL QA Tester

Responsibilities:

  • Created source-to-target mappings including business rules and transformation logic (a sample source-vs-target comparison query appears after this list).
  • Tested complex reports using Report Studio.
  • Used Query Studio to develop ad hoc reports.
  • Tested XML feeds loaded into the data warehouse.
  • Performed unit testing for all reports and packages.
  • Created a Retirement analysis document of the existing system to enable retiring and merging its functionality to the new system.
  • Created UNIX scripts for a variety of purposes, such as data cleansing, FTP, and mainframe triggers to run SQL queries and to load and unload table data using BCP commands.
  • Analyzed the data and the data flow of the existing system to better understand the compatibility of the data with the data model of the new system.
  • Developed the data model and the data dictionary for the existing systems for a better understanding of the systems and clarity on the data transfer process.
  • Involved in various meetings to determine the data transfer process flow before terminating the application and developed the UML diagrams for the process using MS Visio.
  • Analyzed the application front end, developed in .NET, to identify the SQL procedures used by particular functions of the application.
  • Part of on call support for issues relating to the internal application, Autosys and Informatica.
  • Created Universes, Reusable reports and Ad-hoc reports using Business Objects.
  • Used Rational ClearQuest for defect reporting and tracking.
  • Identified and recorded defects with enough information for the development team to reproduce the issues.
  • Created batch test for overnight execution of SQA test scripts.
  • Used Rational ClearCase for version control.
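
To illustrate how a source-to-target mapping like the one above can be verified, here is a minimal sketch using a set-difference query; src_policy, tgt_policy and the UPPER/TRIM rule are hypothetical stand-ins for the actual mapping.

```sql
-- Illustrative source-vs-target comparison in Oracle SQL; all names and
-- transformation rules here are assumptions, not the actual project mapping.
SELECT policy_id,
       UPPER(TRIM(policy_holder)) AS policy_holder,  -- apply the mapping rule to the source
       premium_amt
FROM   src_policy
MINUS
SELECT policy_id,
       policy_holder,
       premium_amt
FROM   tgt_policy;
-- Any rows returned were lost or transformed incorrectly during the load;
-- run the reverse MINUS as well to catch unexpected extra rows in the target.
```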
