
Sr. Test Analyst ETL/Big Data Resume


Eagan, MN

SUMMARY

  • 8+ years of experience as a Big Data/ETL Test Analyst.
  • Experienced in all stages of Software Development Life Cycle and Software Testing Life Cycle.
  • Good knowledge of and experience with metadata and Star schema/Snowflake schema designs.
  • Analyzed source systems, the staging layer, and fact and dimension tables in the target data warehouse.
  • Well-versed with different components of Hadoop such as HDFS, Hive, Sqoop, Pig, and HBase.
  • Knowledge of both Hadoop 1.0 and Hadoop 2.0 architectures.
  • Worked with the Cloudera distribution as the client tooling for Hadoop.
  • Wrote several Hive scripts for data validation between the source and the staging layer in HDFS.
  • Wrote several Hive queries to validate data between the hops in the data lake across the raw, staging, consolidation, mapping, and outgoing layers (a sample query appears after this list).
  • Used HDFS scripts to move data from files into HDFS and executed Sqoop commands to import data from an Oracle database into HDFS.
  • Professional experience in integration, functional, regression, system, load, UAT, black-box, and GUI testing.
  • Collaborated with project team members to discuss and resolve issues.
  • Experience testing web-based applications in domains such as online banking, transaction processing, and healthcare.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Experience in creating Requirements Traceability Matrix (RTM) and Test Summary Report documents.
  • Experience in developing Test Plans, preparing Test Strategies, writing detailed Test Cases and Test Scripts, executing test cases based on Business Requirements Documents, and developing Test Scenarios to support quality deliverables.
  • Extensively worked with HP ALM to upload and execute test cases.
  • Logged defects, reported defect status, and flagged requirement and design inconsistencies for resolution.
  • Expertise in querying and testing RDBMSs such as Oracle and MS SQL Server using SQL to verify data integrity.
  • Tested XML/SOAP web services using the SoapUI tool.
  • Extensive experience in testing BI reports generated by Tableau.
  • Strong working experience in Windows NT/2000/XP and UNIX environments.
  • Clear understanding of business procedures and ability to work individually and as part of a team.
  • Experience interacting with business analysts, developers, and technical support to help them baseline the requirement specifications.
  • Detail-oriented professional with excellent communication and interpersonal skills.
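
A minimal sketch of the kind of layer-to-layer Hive validation described above; the database, table, and key names (raw_db.customer, staging_db.customer, customer_id) are hypothetical placeholders, not names from any actual engagement.

    -- Row counts must match between the raw and staging layers;
    -- a non-zero diff signals a load problem to investigate.
    SELECT r.cnt AS raw_count,
           s.cnt AS staging_count,
           r.cnt - s.cnt AS diff
    FROM (SELECT COUNT(*) AS cnt FROM raw_db.customer) r
    CROSS JOIN (SELECT COUNT(*) AS cnt FROM staging_db.customer) s;

    -- Spot records present in raw but missing from staging, by business key.
    SELECT r.customer_id
    FROM raw_db.customer r
    LEFT JOIN staging_db.customer s
      ON r.customer_id = s.customer_id
    WHERE s.customer_id IS NULL;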

TECHNICAL SKILLS

Operating Systems: Windows, LINUX, UNIX

Big Data Technology: HDFS, MapReduce, Hive, Pig, Sqoop

Test Management: Tricentis qTest, HP ALM 12.0/11.0, AQT, SAS EG

Project Management: JIRA

ETL Tools: Talend, Informatica 10.x/9.x, DataStage 8.7/8.1

Schedulers: ASG Zena, Robot Scheduler

Web Services: SoapUI

RDBMS/ Databases: Teradata, Oracle 11g/10g, SQL Server 2008/2005

BI Tools: Tableau, Cognos, Denodo

PROFESSIONAL EXPERIENCE

Sr. Test Analyst ETL/Big Data

Confidential, Eagan, MN

Responsibilities:

  • Worked collaboratively with the team as Test Lead and allocated tasks to team members per project estimates and projections.
  • Collaborated with the offshore team on project work.
  • Attended meetings with QA managers to discuss new projects and provide updates on existing ones.
  • Created test cases in HP ALM based on the STT document and executed them.
  • Wrote SAS scripts to validate the Import, Expected, Actual, and Compare programs and generate output.
  • Used the Beyond Compare tool to validate data and schema between the staging layer and the distribution layer.
  • Tested fact and dimension tables based on requirements.
  • Wrote complex SQL queries against different databases for the data verification process using Teradata and AQT (a sample verification query appears after this list).
  • Prepared technical specifications and source-to-target mappings.
  • Defects identified in the test environment were communicated to the developers using the defect-tracking tool Quality Center.
  • Wrote several complex SQL queries to validate views for Confidential Reporter.
  • Tested several complex ETL mappings, mapplets, and reusable transformations for daily data loads.
  • Created test cases for ETL mappings and design documents for production support.
  • Extensively worked with flat-file and Excel data sources and wrote scripts to convert Excel sheets to flat files.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director.
  • Worked with the ETL group to understand mappings for dimensions and facts.
  • Extracted data from various sources such as Oracle, flat files, and SQL Server.
  • Prepared all test deliverables for the projects, including the Test Plan, Test Strategy, Test Cases, SAS Scripts, Biweekly Status Report, Weekly ITQA Status Report, Defect Status Report, Test Closure Report, QA Sign-Off, and Schedule Trackers.
  • Validated report views from Impala against the Denodo environment.
  • Gained exposure to AWS for creating and initializing test environments.
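
A minimal sketch of the kind of Teradata verification query mentioned above; the view and table names (rpt.v_claim_summary, dw.claim_fact) and the FINAL status filter are hypothetical placeholders.

    -- Reconcile row counts and a key aggregate between a reporting view
    -- and its base table; the two result rows should agree.
    SELECT 'view' AS side, COUNT(*) AS row_cnt, SUM(claim_amt) AS total_amt
    FROM rpt.v_claim_summary
    UNION ALL
    SELECT 'base' AS side, COUNT(*) AS row_cnt, SUM(claim_amt) AS total_amt
    FROM dw.claim_fact
    WHERE claim_status = 'FINAL';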

Environment: SAS Scripting, MDM, Teradata, Advanced Query Tool (AQT), HP ALM, Beyond Compare, BOBJ, Denodo, Impala, Hue, AWS

Big Data Test Analyst

Confidential, Chicago, IL

Responsibilities:

  • Worked collaboratively with the data analyst and system analyst to understand the requirements.
  • Created subtasks on the Jira board for each assigned user story and tracked task progress.
  • Ran ASG Zena processes to load data into the data lake layers as the first step for validation.
  • Actively involved in all phases of the Software Testing Life Cycle.
  • Scripted HQL in Beeline and SQL in Teradata to validate data in the data lake (a sample hop-to-hop query appears after this list).
  • Maintained proper Test Plan documents as QA deliverables.
  • Created test cases on a regular basis and reviewed them with the team for feedback.
  • Used Tricentis qTest to upload and execute test cases.
  • Ran automated Python scripts, including Beeline queries, for QA validations.
  • Raised in-sprint bugs in Jira and linked them to the relevant requirements in qTest for traceability.
  • Referred to the Interface Data Deliverable (IDD) document as the source of truth for validations.
  • Extracted reports and metrics from the qTest application on a weekly basis for review with upper-level management.
  • Participated in business discussions with the business, product owner, DA, and SAs to define test requirements and to provide test estimations that satisfy the business objectives.
  • Accomplished testing goals covering data completeness, data transformation, data quality, performance, and scalability.
  • Performed extract validation and DQ validation, along with error-report and archival validation, to enforce data quality thresholds before data was sent to downstream applications.
  • Validated data through Infogix Assure to verify balancing and controls.
  • Completed deliverables and received sign-off and approval from the Test Manager in a timely manner.
  • Performed System Integration Testing, Retesting, Functional Testing and End to End testing in the project.
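
A minimal sketch of a hop-to-hop validation of the sort run in Beeline; the layer databases (staging_db, consol_db), the enrollment table, and the key columns are hypothetical placeholders. Summing a per-row hash catches value-level drift that a plain count comparison would miss.

    -- Compare row counts and a hash-based checksum between two hops;
    -- both columns should match across the two layers.
    SELECT 'staging' AS layer,
           COUNT(*) AS row_cnt,
           SUM(CAST(hash(member_id, plan_cd, eff_dt) AS BIGINT)) AS chk
    FROM staging_db.enrollment
    UNION ALL
    SELECT 'consolidated' AS layer,
           COUNT(*) AS row_cnt,
           SUM(CAST(hash(member_id, plan_cd, eff_dt) AS BIGINT)) AS chk
    FROM consol_db.enrollment;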

Environment: HDFS, Hive, Sqoop, Pig, Tricentis qTest, ASG Zena, Infogix, Talend, WinSCP, MS Excel, Teradata, HQL, PL/SQL, UNIX, Windows

ETL/Hadoop Tester

Confidential, Des Moines, IA

Responsibilities:

  • Worked closely with business users to understand, design, and document the functional test plan, then wrote and executed tests, documented the results, and logged defects.
  • Created Hive queries that helped analysts spot emerging trends by comparing fresh data with historical claim metrics (a sample trend query appears after this list).
  • Used Sqoop to move data from individual data sources to the Hadoop system.
  • Used Teradata as the database and wrote complex queries in Teradata SQL Assistant to view tables and data for validation purposes.
  • Compared the data in the tables by writing SQL queries in Teradata.
  • Validated the Pig and Hive scripts by pulling data from Hadoop and validating it against the data in the files and reports.
  • Responsible for validating data by accessing the Cloudera Impala database, where daily feeds are processed through Hive and stored in HDFS.
  • Involved in complete life-cycle activities, including the pre-testing and testing phases.
  • Created test procedures and test cases to perform system testing and regression testing.
  • Used Jira for Agile project management.
  • Participated in business discussions with the business, product owner, BA, and tech lead to define test requirements and to provide test estimations that satisfy the business objectives.
  • Reached testing goals covering data completeness, data transformation, data quality, performance, and scalability.
  • Tested that BI reports developed in Tableau met company standards.
  • Created various user-defined functions for script enhancements and to verify the business logic.
  • Extensively used SQL queries to perform database validation.
  • Responsible for reviewing complete test plans, test cases, and test data, and ensuring accurate coverage of requirements and business processes for the big data warehouse.
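
A minimal sketch of the kind of trend-spotting Hive query described above; the table (claims_db.claim_detail), its columns, and the month literal are hypothetical placeholders.

    -- Compare the current month's average claim amount per claim type
    -- against the historical average, to surface emerging trends.
    SELECT claim_type,
           AVG(CASE WHEN claim_month =  '2016-06' THEN claim_amt END) AS current_avg_amt,
           AVG(CASE WHEN claim_month <> '2016-06' THEN claim_amt END) AS historical_avg_amt,
           COUNT(CASE WHEN claim_month = '2016-06' THEN 1 END) AS current_claims
    FROM claims_db.claim_detail
    GROUP BY claim_type;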

Environment: HDFS, Hive, Sqoop, Pig, HP ALM 12.0, Informatica PowerCenter 10, MS Excel, Oracle 11g, Teradata, SQL, PL/SQL, UNIX, Windows, Tableau

Big Data/ ETL Tester

Confidential, Boston, MA

Responsibilities:

  • Tested fact and dimension tables in the star schema model based on requirements.
  • Created and executed test cases based on the test strategy and test plans derived from the ETL mapping document.
  • Wrote complex SQL queries against different databases for the data verification process (a sample source-to-target check appears after this list).
  • Prepared technical specifications and source-to-target mappings.
  • Defects identified in the test environment were communicated to the developers using the defect-tracking tool Quality Center.
  • Wrote several complex SQL queries to validate Cognos reports.
  • Worked with the business team to test the reports developed in Cognos.
  • Tested several complex ETL mappings, mapplets, and reusable transformations for daily data loads.
  • Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
  • Created test cases for ETL mappings and design documents for production support.
  • Extensively worked with flat-file and Excel data sources and wrote scripts to convert Excel sheets to flat files.
  • Scheduled and automated jobs to run as batch processes.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director.
  • Worked with the ETL group to understand mappings for dimensions and facts.
  • Extracted data from various sources such as Oracle, flat files, and SQL Server.
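
A minimal sketch of a source-to-target check for a dimension load; the staging and warehouse names (stg.product_src, dw.product_dim) and the current_flag convention are hypothetical placeholders.

    -- Every natural key in the source staging table must reach the
    -- target dimension; any row returned here is a missed load.
    SELECT s.product_code
    FROM stg.product_src s
    LEFT JOIN dw.product_dim d
      ON s.product_code = d.product_code
     AND d.current_flag = 'Y'
    WHERE d.product_code IS NULL;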

Environment: UNIX Shell Scripting, Oracle 11g, Informatica PowerCenter 9.2 (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL, Cognos, Windows, TOAD, PL/SQL

DWH Tester

Confidential, Marlborough, MA

Responsibilities:

  • Scripted complex SQL queries against different databases for the data verification process.
  • Involved in error checking and testing of the ETL procedures and programs using the Informatica session log.
  • Participated in defining and executing test strategies using agile methodology.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created Test cases for Data Acquisition and Data Delivery and tested the data accuracy for all the transformations.
  • Executed test cases and reported bugs in HP ALM.
  • Executed the test cases for Cognos reports.
  • Tested XML/SOAP web services using the SoapUI tool.
  • Performed database testing with SQL queries to verify data integrity.
  • Conducted backend testing by querying databases to keep the test databases synchronized, checking data integrity and proper routing against the workflow rules at each step (a sample integrity check appears after this list).
  • Generated detailed bug reports and tracked, reviewed, and analyzed defects.
  • Involved in meetings to discuss the findings of executed tests and to decide on next steps.
  • Validated that the application met requirements and functioned per the technical and functional specifications.
  • Interacted with the business users to identify the process metrics and various key dimensions and measures.
  • Involved in testing the XML files and checked whether data was parsed and loaded into the staging tables.
  • Tested the data and data integrity across various sources and targets.
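
A minimal sketch of two common backend integrity checks of the kind described above; the fact and dimension names (dw.order_fact, dw.customer_dim) and their keys are hypothetical placeholders.

    -- Duplicate-key check: the target should have one row per order.
    SELECT order_id, COUNT(*) AS dup_cnt
    FROM dw.order_fact
    GROUP BY order_id
    HAVING COUNT(*) > 1;

    -- Orphan check: fact rows whose customer key has no dimension match.
    SELECT f.customer_key
    FROM dw.order_fact f
    LEFT JOIN dw.customer_dim c
      ON f.customer_key = c.customer_key
    WHERE c.customer_key IS NULL;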

Environment: Informatica, SQL, PL/SQL, UNIX, Teradata, Jira, SOAP UI, Cognos, MS Excel, Agile, XML

ETL TESTER

Confidential, Wilmington, DE

Responsibilities:

  • Responsible for writing Test Cases, Test Plans, Test Scripts, and other test documents based on business requirements.
  • Implemented various big data strategies in all stages of SDLC by following Agile.
  • Extensively tested data warehouse and big data applications built with ETL.
  • Extensively tested transformations in the ETL application.
  • Designed, developed, and executed test cases.
  • Worked with query and scripting languages such as SQL.
  • Designed test cases for positive, negative, alternative flows and ensured complete requirement coverage.
  • Identified and documented testing issues and quality risks, as well as participated in defect remediation.
  • Designed test cases in Rally and tested the ETL transformations using functional testing, system testing, integration testing, regression testing, and UAT.
  • Worked in a diverse global team environment sharing ideas.
  • Evaluated and implemented new initiatives on process improvement and technology projects.
  • Participated in and conducted weekly issue-log status meetings, report status meetings, and project status meetings to discuss issues and workarounds.
  • Communicated with both onsite and offshore teams throughout all phases of project development to eliminate issues and roadblocks. Worked on data validation, constraints, record counts, source-to-target row counts, random sampling, and error processing (a sample row-count reconciliation appears after this list).
  • Developed test reports and participated in testing prioritization and archived test results.
  • Involved in extensive data validation using SQL queries and back-end testing.
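
A minimal sketch of a batch-level row-count reconciliation; the audit and target table names (etl.load_audit, dw.transaction_fact) and their columns are hypothetical placeholders.

    -- Compare the source record count recorded by the ETL audit table
    -- against the rows actually landed in the target, per batch; any
    -- row returned indicates dropped or duplicated records.
    SELECT a.batch_id,
           a.source_record_count,
           COUNT(t.batch_id) AS target_record_count
    FROM etl.load_audit a
    LEFT JOIN dw.transaction_fact t
      ON t.batch_id = a.batch_id
    GROUP BY a.batch_id, a.source_record_count
    HAVING a.source_record_count <> COUNT(t.batch_id);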

Environment: HP ALM 11, Cognos 8.4, DataStage 8.7/8.1, UNIX, TOAD, SQL Server, Flat Files, CSV, DSV
