
Senior ETL Tester / Hadoop Tester Resume

El Segundo, CA

PROFESSIONAL SUMMARY:

  • 6 years of experience as a Quality Analyst performing manual and automation testing of web, SQL/backend, ETL, and business intelligence solutions, including Hadoop QA and Hadoop ETL.
  • Wrote test plans, test cases, and scripts, prepared test data, and followed the STLC extensively for DWH testing practices.
  • Extensive working experience with Hadoop ecosystem components such as Hive, Pig, Sqoop, and Hue.
  • Experience in writing MapReduce programs and using Apache Hadoop API for analyzing the structured and unstructured data.
  • Experience in Selenium automation using Selenium WebDriver, Cucumber, Selenium Grid, Java, JUnit, and Jenkins.
  • Experience in Importing/ exporting the data using Sqoop from HDFS to Relational Database systems and vice-versa.
  • Experience on working with Big Data and Hadoop file system (HDFS).
  • Knowledge of testing with Big Data technologies such as Hadoop, MapReduce, Hive, Pig, HBase, Kafka, and Spark.
  • Strong in data warehousing concepts and dimensional Star Schema and Snowflake Schema methodologies.
  • Extensively worked with large Databases in Test environments.
  • Exposure to multiple ETL tools including DataStage, Informatica, Ab Initio, and SSIS.
  • Extensive work in ETL testing using SQL, UNIX, PL/SQL.
  • Profound experience in testing web applications using Quality Center and Quick Test Pro (QTP).
  • Strong SQL query skills across enterprise-wide RDBMSs (Oracle, MS SQL Server, and MS Access), with a solid grasp of SQL concepts spanning DML and DDL.
  • Strong expertise in relational database systems such as Oracle, SQL Server, MS Access, MySQL, and Teradata.
  • Extensively tested data validation, load processes (ETL) using Toad and SQL Developer.
  • Data testing experience with Data Mart applications, mainly transformation processes, using Informatica and DataStage tools.
  • Extensive experience in tracking bugs using Bug tracking tool Quality Center
  • Experienced in testing BI reports generated using different BI tools such as MicroStrategy and Business Objects.
  • Performed Backend Testing for Database integrity by executing SQL queries for validating the data in the backend database tables.
  • Good testing experience in developing reports using SQL Server Reporting Services (SSRS).
  • Experienced in working in Agile Scrum and SDLC methodology environments
  • Experience in testing/validating the Source, Stage and Target (End-to-End).
  • Experience in Unit, Functional, System, Integration, Regression, Performance, and User Acceptance Testing, as well as production release testing.
  • Proficient in understanding business processes / requirements and translating them into technical requirements.
  • Worked with QC-ALM and JIRA as test management tools for defect tracking.
  • Team Player with excellent communication, analytical, verbal, writing and interpersonal skills.
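
The end-to-end (Source, Stage, Target) validation described above largely reduces to reconciliation queries. A minimal sketch using Python's sqlite3 as a stand-in for the actual source and target databases (all table and column names are illustrative, not from any specific engagement):

```python
import sqlite3

# In-memory stand-ins for the source and target databases (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 7.25)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 7.25)])

# Row-count reconciliation: source and target should match after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# A checksum on a measure column catches truncated or duplicated loads
# that a bare row count would miss.
src_sum = cur.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
tgt_sum = cur.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
assert src_sum == tgt_sum, f"sum mismatch: {src_sum} vs {tgt_sum}"
print("counts:", src_count, tgt_count)
```

The same count-and-checksum pattern applies whether the target is Oracle, Teradata, or Hive; only the connection layer changes.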

TECHNICAL SKILLS:

Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Kafka, and Spark.

Data Warehousing ETL: Informatica PowerCenter 9/8.x/7.x (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Informatica Server), DataStage, SSIS, Repository, Metadata, Data Mart, Fact & Dimension tables.

Data Modeling: Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes, Cardinality, ER Diagrams

BI Tools: Cognos 8.x/7.0, SSRS, Microstrategy, Business Objects XIR3/XIR2/6.x

Databases: Oracle 11g/10g/9i/8.0, MySQL 5.4, SQL Server, Teradata, DB2

Programming: SQL, PL/SQL, SQL*Loader, UNIX Shell Scripting, C, Java, HTML, XML, Perl, PHP, JavaScript

Tools: TOAD, SQL*Plus, Dreamweaver

Environment: UNIX(IBM AIX, Sun Solaris), Windows XP/Vista, Linux(RHEL)

Testing Tools: HP Quality Center 10, QTP, Selenium, Test Director, ALM, TDP

PROFESSIONAL EXPERIENCE:

Confidential, El Segundo, CA

Senior ETL Tester/ Hadoop Tester

Responsibilities:

  • Experienced working on a large implementation of a big data analytics platform using NoSQL and analytical databases and data visualization tools.
  • Perform functional testing activities such as pre-Hadoop processing, MapReduce processing, and structured and unstructured data validation.
  • Validate pre-Hadoop processing to ensure that data is processed without errors.
  • Validate data from different source systems as it is loaded into HDFS using Sqoop.
  • Validate Hadoop MapReduce output data to ensure that it is processed without errors.
  • Create Hive tables and partitions using HQL.
  • Perform MapReduce operations, processing the input files and applying map and reduce steps to produce the desired output.
  • Write SQL scripts and PL/SQL queries using TOAD and Oracle SQL Developer to query databases and analyze the results.
  • Strong experience in writing HQL queries in HUE tool.
  • Involved in loading data from LINUX file system to HDFS.
  • Validate data moved from Oracle to Hadoop via Sqoop, used by the BI team to generate reports.
  • Perform Schema validation using Pig Scripts for data quality on 3rd party datasets.
  • Responsible for validating the files availability, format and data on UNIX server.
  • Involved in testing the reports by writing complex SQL queries.
  • Validate reports after ETL/transformation work flows are executed for all source systems.
  • Test several complex ETL mappings and reusable transformations for daily data loads.
  • Design the data driven tests in Quick Test Professional (QTP) script to validate with different sets of test data.
  • Involved in monitoring Hadoop cluster job performance using Cloudera Manager.
  • Experienced in managing Hadoop clusters using the Cloudera Manager tool.
  • Trigger workflows using Autosys and validate data according to business rules.
  • Work closely with DataTorrent developers during workflow failures.
  • Perform functional, integration, regression, and end-to-end testing using automated scripts.
  • Responsible to gauge the Test progress activities and forecast the Risks involved in the testing process.
  • Participate in QA best practice initiatives across QA organization.
  • Provide mockup data for different scenarios.
  • Validate transformation rules are applied correctly.
  • Validate stored data to ensure that it is correct and of good quality.
  • Perform non-functional testing such as performance and failover testing to ensure the whole process is scalable and completes within the specified SLA.
  • Report the defects to developers using ALM/Quality Center Defect Manager.
  • Discuss issues/defects with the business and development teams.
  • Attend daily status meetings with the Project Manager and the development team.
  • Mentor offshore team on validation tasks.
  • Provide overview on project testing approach, setting up testing standards and processes.
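
The map and reduce steps described above can be sketched in miniature with a pure-Python word count (the input lines here are made up for illustration; the real jobs ran on the Hadoop cluster, but the map/shuffle/reduce shape is the same):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key, then sum the counts per word.
    grouped = defaultdict(int)
    for word, count in pairs:
        grouped[word] += count
    return dict(grouped)

lines = ["hadoop hive pig", "hive hive sqoop"]
counts = reduce_phase(map_phase(lines))
print(counts["hive"])  # 3
```

Output validation then amounts to asserting the reduced counts against independently derived expected values, which is exactly the shape of the MapReduce output checks above.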

Environment: Informatica, Oracle SQL Developer, Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Cloudera, Autosys, DataTorrent, MRUnit, HP Quality Center (ALM), Agile, UNIX, DB2, and SQL Server

Confidential, Redwood City, CA

ETL Tester/ Database Tester

Responsibilities:

  • Analyzed business requirements and data mapping specifications; developed test plans, test cases, and expected results, and prioritized tests for various modules using Quality Center.
  • Developed the detailed ETL Test Plan, Test Resource Utilization, and Close Down documents for multiple products.
  • Generated reports using the Business Objects Report Designer.
  • Experienced in creating SSIS and DTS packages.
  • Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Involved in automated Regression Testing using Quick Test Professional (QTP).
  • Created complex SQL queries to verify data against different databases.
  • Created Test cases for Data Acquisition and Data Delivery and tested the data accuracy for all the transformations.
  • Tested data extracted from flat files and the Oracle database, and tested the business logic applied in the transformations.
  • Tested mappings, reusable objects, transformations, and mapplets using Informatica PowerCenter.
  • Used Informatica Power Center for extraction, loading and transformation (ETL) of data in the data warehouse.
  • Strong knowledge of data warehouse tools such as Informatica PowerCenter, Business Objects, and MicroStrategy.
  • Used Cucumber by creating the Features and Step Definition files to execute test scripts.
  • Designed and created new reports, including Freeform SQL reports, in MicroStrategy.
  • Validated that data on front-end applications reflects the data in the back end using SQL queries.
  • Expertise as a planner/designer in mappings, transformations, data validation, and Oracle Warehouse Builder.
  • Performed manual testing of each build, then regression testing on each build using Selenium WebDriver.
  • Tested various mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Analyzed test cases for automation and created scripts for regression testing using QTP 9.2. Automated Test cases for the purpose of regression testing.
  • Worked in AGILE Methodology.
  • Tested reports using Business Objects functionality like Queries, Slice and Dice, Drill down, @Functions, Formulae etc.
  • Developed Teradata and Oracle Structured Query Language (SQL) test scripts from test procedures and executed them with Oracle's SQL*Plus and Teradata's Queryman or BTEQ.
  • Used SQL queries to retrieve data from tables and to perform back-end testing through TOAD.
  • Wrote SQL, PL/SQL, stored procedures & triggers, cursors for testing business rules and transformations.
  • Worked on UNIX for testing sample data and comparing the data between the source and target systems.
  • Performed integration, regression, and retesting.
  • Used Testing Annotations in Selenium WebDriver and executed a batch of tests as testing suite.
  • Involved in Relational modeling and Dimensional Modeling Techniques to design ER data models and Star Schema designs
  • Wrote and reviewed test cases and participated in test effort estimation.
  • Worked with the team to set up the test bed.
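
The complex verification queries mentioned above are typically minus-style comparisons between source and target. A small sketch using Python's sqlite3 in place of the production databases (tables, columns, and rows are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "b")])  # row 3 not loaded

# Minus query: rows present in the source but absent from the target
# indicate a load defect (Oracle uses MINUS; sqlite/Teradata use EXCEPT).
missing = cur.execute(
    "SELECT id, name FROM src EXCEPT SELECT id, name FROM tgt"
).fetchall()
assert missing == [(3, "c")], f"unexpected diff: {missing}"
print("rows missing from target:", missing)
```

Running the comparison in both directions (source-minus-target and target-minus-source) also surfaces spurious rows that appear only in the target.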

Environment: HP Quality Center 10, Informatica PowerCenter 9.1, QTP, MicroStrategy, Teradata, Selenium, AutoSys, Oracle 11g/10g, XML, TOAD 6.0, Java, JavaScript, Agile, SQL, PL/SQL, IBM AIX UNIX.

Confidential, Cleveland, OH

SQL Tester

Responsibilities:

  • Involved in understanding Business Process and coordinated with Business Analysts to understand specific user requirements.
  • Tested different sources such as flat files, PeopleSoft accounting data, and Oracle loaded into the Teradata data warehouse.
  • Tested data extracted from heterogeneous sources such as .dat, Excel, .csv, and flat files and Oracle, and loaded into Oracle target tables.
  • Performed quality, integration, UAT, and regression testing.
  • Involved in validating the data that has been populated into Data warehouse Fact and dimensional tables using Ab Initio Data Warehouse tool.
  • Strong knowledge of working with Star Schema and Snowflake Schema
  • Extensively used the Sequential File, Dataset, Join, Lookup, Transformer, Funnel, XML, and Connector stages provided by DataStage to perform transformations and load the data.
  • Involved in extensive DATA validation using SQL queries and back-end testing
  • Involved in developing detailed Test strategy, Test plan, Test cases and Test procedures using Quality Center for Functional and Regression Testing
  • Tested Business Objects reports and Web Intelligence reports.
  • Worked in an Agile technology with Scrum
  • Prepared Test Scripts and test Plan for automated testing using QTP
  • Tested all the reports by running queries against the warehouse using TOAD, and compared those queries with the queries generated by the Business Objects SQL engine.
  • Used standard testing techniques to test the data validity before deploying the Business Objects Reports to Users.
  • Tested several Informatica ETL mappings, ran them on UNIX for loading, and checked the log files.
  • Involved in SQL Optimizations, Performance Analysis, and future growth analysis for OLTP and data warehouse applications
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse
  • Used Aggregator stages to sum the key performance indicators used in decision support systems.
  • Used SQL functions, clauses, wildcards, etc. for extensive testing of the warehouse.
  • Used SQL queries to validate the data on a daily basis.
  • Queried the Teradata database and validated the data using SQL Assistant.
  • Ran SQL queries in TOAD to count the number of rows in the tables.
  • Monitored the ETL jobs and coordinated the bug fixes.
  • Translated business-reporting requirements into data warehouse architectural designs, analyzed source and target data models, and made necessary changes.
  • Used Quality Center for Test cases, Bug reporting, tracking & generation of Test Metrics
  • Participated in the review and maintenance of data security standards to protect information assets.
  • Extensive experience in testing and implementing extraction, transformation, and loading of data from multiple sources into the data warehouse using Informatica.
  • Tracked the defects to closure.
  • Supported the extraction, transformation and load process (ETL) for a Data Warehouse from their legacy systems using Informatica and provide technical support and hands-on mentoring in the use of Informatica for testing.
  • Involved in scrum meetings with the Project Manager and provided updates of the testing team.
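
Validating an aggregation (such as the summed key performance indicators above) reduces to re-deriving the totals from the detail rows and diffing them against the loaded summary table. A sketch with sqlite3 (schema and figures are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales_detail (region TEXT, amount REAL)")
cur.execute("CREATE TABLE sales_summary (region TEXT, total REAL)")
cur.executemany("INSERT INTO sales_detail VALUES (?, ?)",
                [("east", 100.0), ("east", 50.0), ("west", 75.0)])
# Summary rows as the ETL aggregator loaded them.
cur.executemany("INSERT INTO sales_summary VALUES (?, ?)",
                [("east", 150.0), ("west", 75.0)])

# Re-derive the aggregation from the detail rows and diff it against the
# summary; any surviving row marks a group the aggregator mishandled.
diff = cur.execute("""
    SELECT region, SUM(amount) FROM sales_detail GROUP BY region
    EXCEPT
    SELECT region, total FROM sales_summary
""").fetchall()
assert diff == [], f"aggregation mismatch: {diff}"
print("aggregation verified")
```
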

Environment: Informatica 9, Oracle 10g/9i, QTP, Business Objects, IBM DB2, Quality Center, ClearCase 8.0, ClearQuest 8.0, MySQL, XML, SQL, TOAD, SQL*Plus, Teradata, Agile, AutoSys, Java, JavaScript

Confidential, Minneapolis, MN

QA Tester

Responsibilities:

  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Interacted with senior peers or subject matter experts to learn more about the data.
  • Validated the workflows and ran them using Workflow Manager.
  • Prepared test cases and test plans for the mappings developed with the DataStage ETL tool.
  • Tested different detail, summary reports and on demand reports. Extensively used Cognos for report generation.
  • Excellent knowledge of HIPAA standards, EDI (Electronic Data Interchange) transaction syntax such as ANSI X12, HIPAA code sets, ICD-9 and ICD-10 coding, and HL7.
  • Tested different modules such as Billing, Claims, Subscriber/Family, and Customer Care in Facets.
  • Experienced in writing complex SQL queries for extracting data from multiple tables and validating with target tables.
  • Experience testing SCD Type 1, Type 2, and Type 3 tables; verified history-load and incremental-load data.
  • Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
  • Validated target tables DDL, counts, data (Business rules), negative testing and regression testing.
  • Experience in writing test cases for the Facets applications
  • Worked with FACETS Team for HIPAA Claims Validation and Verification Process (Pre-Adjudication).
  • Worked on HIPAA EDI transactions and code set standards according to the test scenarios, such as the 270/271, 276/277, and 837/835 transactions.
  • Strong technical Knowledge of UNIX Utilities, Shell Script to automate process.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Analyzed trading partner specifications and created EDI mapping guidelines.
  • Involved extensively in doing back end testing of the data quality by writing complex SQL.
  • Exported Manual Test Cases from MS Excel template directly to QC and executed the Test Cases in Test Director with Pass/Fail/Blocked status.
  • Executed the developed test cases and test plans, and was responsible for creating the test strategy for the data warehouse.
  • Highly Experienced in HIPAA (Health Insurance Portability and Accountability Act), Health Care industry providing Business Process Assessment, Requirements Gathering, Gap Analysis, Implementation and Testing
  • Did data analysis for various version changes of EDI messages on different sub-systems.
  • Performed defect tracking and defect report generation using ALM/Quality Center.

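The SCD Type 2 history checks noted above typically assert that each business key has exactly one current row and that effective-date ranges do not overlap. A minimal Python sketch over rows as they might come back from the target table (the column layout and keys are assumed for illustration):

```python
from collections import defaultdict

# (business_key, effective_from, effective_to, is_current) — illustrative SCD2 rows.
rows = [
    ("CUST1", "2019-01-01", "2020-06-30", 0),
    ("CUST1", "2020-07-01", "9999-12-31", 1),
    ("CUST2", "2019-03-15", "9999-12-31", 1),
]

by_key = defaultdict(list)
for key, frm, to, flag in rows:
    by_key[key].append((frm, to, flag))

for key, versions in by_key.items():
    # Exactly one current row per business key.
    assert sum(v[2] for v in versions) == 1, f"{key}: bad current-flag count"
    # History rows must not overlap: each version starts after the previous ends
    # (ISO date strings compare correctly as plain strings).
    versions.sort()
    for (f1, t1, _), (f2, t2, _) in zip(versions, versions[1:]):
        assert t1 < f2, f"{key}: overlapping date ranges"
print("SCD2 checks passed")
```

The same assertions run as SQL against the real target (a GROUP BY on the current flag and a self-join on date ranges); the Python form is simply easier to show self-contained.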
Environment: IBM DataStage, Oracle 10g, SQL, PL/SQL, flat files, Excel files, UNIX, ALM/Quality Center, Cognos 8.4.1
