
QA Tester / Azure Cloud Resume

Tampa, FL

SUMMARY

  • Around 6 years of experience as an ETL/Hadoop tester.
  • Experienced in all stages of Software Development Life Cycle and Software Testing Life Cycle.
  • Good knowledge of and experience with metadata and star/snowflake schemas; analyzed source systems, the staging area, and fact and dimension tables in the target data warehouse.
  • Experience and knowledge in Microsoft Azure tools such as ADLS, ADF, Azure DevOps and Dremio.
  • Well-versed with different components of Hadoop, such as HDFS, Hive, Sqoop, HBase, Pig, and MapReduce.
  • Knowledge in Hadoop 2.0 Architecture and Hadoop 1.0 Architecture.
  • Worked with Cloudera as the client tool for Hadoop.
  • Wrote several Hive scripts to validate data between the source and the staging layer in HDFS.
  • Wrote several Hive queries to validate data between the staging and integration layers in Hadoop.
  • Used HDFS scripts to load data from files into HDFS and executed Sqoop commands to import data from an Oracle database into HDFS.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
  • Professional experience in Integration, Functional, Regression, System Testing, Load Testing, UAT Testing, Black Box and GUI testing.
  • Collaborated with the project team to discuss and resolve issues.
  • Experience with web-based applications, such as transaction and healthcare applications.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Experience in creating Requirement Traceability Matrix (RTM) documents.
  • Experience in preparing Test Strategy, developing Test Plan, Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Worked extensively with HP Quality Center 10.0 and HP ALM 12.x to upload and execute test cases, log and track defects, and track progress status.
  • Experience in working with the software development team to resolve defects, present defect status reports, and resolve requirement and design inconsistencies.
  • Expertise in querying and testing RDBMS such as Oracle, MS SQL Server using SQL for data integrity.
  • Experience in Reports testing in Crystal Reports XI Environment.
  • Tested Web Services /XML /SOAP services using SoapUI tool.
  • Extensive experience in testing reports generated by MicroStrategy.
  • Strong working experience in Windows NT/2000/XP and UNIX environments.
  • Clear understanding of business procedures and ability to work as an individual and as a part of a team.
  • Experience in interacting with business analysts, developers, and technical support, and helping them baseline the requirement specifications.
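The source-to-staging validation described above (comparing a source extract against the data landed in the staging layer) can be sketched in Python. This is a hypothetical illustration, not the actual project script; the file names, delimiter, and column layout are assumptions.

```python
# Hypothetical sketch of source-vs-staging extract validation:
# compare row counts and an order-sensitive content checksum
# between two delimited extract files.
import csv
import hashlib

def extract_stats(path, delimiter=","):
    """Return (row_count, digest) for a delimited extract file."""
    digest = hashlib.md5()
    rows = 0
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter=delimiter):
            rows += 1
            digest.update("|".join(row).encode("utf-8"))
    return rows, digest.hexdigest()

def validate(source_path, staging_path):
    """Compare two extracts; both checks pass only if content is identical."""
    src_rows, src_digest = extract_stats(source_path)
    stg_rows, stg_digest = extract_stats(staging_path)
    return {"row_count_match": src_rows == stg_rows,
            "checksum_match": src_digest == stg_digest}
```

A row-count mismatch flags dropped or duplicated records; a checksum mismatch with equal counts flags corrupted or transformed values.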

TECHNICAL SKILLS

Operating Systems: Windows, LINUX, UNIX

Big Data Technology: HDFS, MapReduce, Hive, Pig, HBase, Sqoop

Test Management: HP ALM 12.0/11.0, JIRA, QTest

ETL Tools: Informatica 10.x/9.x, DataStage 8.7/8.1

Web Services: SoapUI, Postman

RDBMS/ Databases: Oracle 11g/10g, SQL Server 2008/2005, SSMS

BI Tools: Tableau, Cognos, MicroStrategy

PROFESSIONAL EXPERIENCE

Confidential, Tampa, FL

QA Tester / Azure Cloud

Responsibilities:

  • Involved in creating virtual datasets (VDS) in Dremio spaces and physical datasets (PDS) in Dremio sources.
  • Identified active and inactive tables in the control database and wrote queries to delete the inactive tables.
  • Validated JSON/XML file using Postman.
  • Worked in the Anaconda Prompt to run Python scripts that generate reports from source and target.
  • Wrote complex SQL queries for validation in SSMS and confirmed data was ingested accurately into ADLS by querying through Dremio.
  • Good knowledge and understanding of Azure Data Factory (ADF), including mapping data flows and pipelines.
  • Used Azure DevOps to log bugs and track tasks.
  • Actively participated in all team meetings, including daily stand-ups, refinement, sprint planning, and retrospectives.

Environment: Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, Dremio, SSMS, Anaconda Prompt, Postman, Parquet Viewer.
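The JSON file validation in this role (checked via Postman and Python scripts run from the Anaconda Prompt) can be sketched with the standard library. The required field names below are hypothetical placeholders, not the project's actual schema.

```python
# Minimal sketch of JSON ingestion validation: confirm every record in an
# extract carries the required fields before source/target comparison.
# REQUIRED_FIELDS is an assumed, illustrative schema.
import json

REQUIRED_FIELDS = {"id", "name", "load_date"}

def validate_records(path):
    """Return a list of (record_index, missing_fields) for failing records."""
    with open(path) as f:
        records = json.load(f)  # expects a JSON array of objects
    failures = []
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            failures.append((i, sorted(missing)))
    return failures
```

An empty result means every record passed; otherwise each tuple pinpoints the record and the fields it lacks, which maps naturally onto a logged defect.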

Confidential, Chicago, IL

ETL/Big Data Tester

Responsibilities:

  • Prepared test plan and test strategy documents based on business requirements.
  • Wrote complex Hive queries to validate data between the source and target layers.
  • Ran scheduler jobs through Zena to load data from the source to the trigger drop location.
  • Validated data by pulling it from HDFS with Hive scripts and comparing it against the source files and reports.
  • Prepared test cases and executed using the Tricentis qTest tool to perform system testing and regression testing.
  • Used Jira to track user stories and in-sprint bugs.
  • Worked on Teradata to perform validation on the complete source to target mappings along with transformation rules.
  • Worked on python automation tool for complete validation.
  • Ingested JSON, XML, and CSV files into the raw and curated layers and published them to the consumption layer.
  • Generated Test Status Reports and defect reports.
  • Prepared documents for QA sign-off and approval.
  • Actively participated in Daily Scrum meeting, Sprint Planning meeting and Retrospective meeting in agile process.
  • Presented test case reviews every sprint and kept the project team informed of the testing approach.
  • Represented the team as lead and delivered updates to the test manager.
  • Involved in Business analysis and requirements gathering.

Environment: HDFS, Hive, qTest, UNIX, Teradata, MS Excel, Zena, Agile, XML, JSON, Jira, WinSCP.
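The "Python automation tool for complete validation" of source-to-target mappings mentioned above can be sketched as a key-set reconciliation. This is an illustrative assumption about how such a tool works, with the key assumed to sit in the first column of each row.

```python
# Hedged sketch of source-to-target reconciliation: compare the key sets
# extracted from the two layers and report records that failed to land
# or appeared unexpectedly. Row shape (key at index 0) is an assumption.

def reconcile(source_rows, target_rows, key_index=0):
    """Return keys missing from the target and keys extra in the target."""
    src_keys = {row[key_index] for row in source_rows}
    tgt_keys = {row[key_index] for row in target_rows}
    return {"missing_in_target": sorted(src_keys - tgt_keys),
            "extra_in_target": sorted(tgt_keys - src_keys)}
```

Both result lists empty means every source record reached the target and nothing spurious was loaded; non-empty lists become the evidence attached to in-sprint bugs.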

Confidential

ETL/DWH Tester

Responsibilities:

  • Worked on creating the test plan, test design, and test scripts, and was responsible for implementing test cases as manual test scripts.
  • Tested several data migration applications for security, data protection, and data corruption during transfer.
  • Responsible for testing ETL packages to verify data completeness, data transformation, and data quality, along with integration testing, UAT, and regression testing.
  • Created test cases and executed test scripts using HP ALM/Quality Center.
  • Tested the data and data integrity among various sources and targets.
  • Tested the ETL Informatica mappings and other ETL processes (data warehouse testing).
  • Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
  • Involved in preparing test data for UAT and participated in UAT signoff.
  • Involved in Data migration and Data distribution testing.
  • Worked on testing the application in a UNIX environment and collected test data from the business team.
  • Involved in testing the Cognos reports by writing complex SQL queries.
  • Performed Functional, Data Validation, Integration, regression and User Acceptance testing.
  • Participated in defining and executing test strategies using agile methodology.
  • Developed Oracle SQL test scripts from test procedures and expected results and executed them in various test environments.
  • Responsible for integration testing activities with other Data Warehouse teams and upstream/downstream application test teams.
  • Adopted Agile/Scrum methodology for risk mitigation, monitoring, and test management (risk analysis) at every stage of the project cycle, with transparency in planning and module development.

Environment: SQL, PL/SQL, SQL SERVER, Oracle 11g, Unix, Jira, Informatica power center 9.5, XML, Agile, SoapUI, Teradata, Cognos, HP ALM.
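The completeness checks described in this role (Oracle SQL test scripts verifying that every source record reached the target) follow a standard pattern: a set-difference query that should return zero rows. Below is a hedged sketch using Python's built-in sqlite3 in place of Oracle, with hypothetical table names; the pattern itself (source EXCEPT/MINUS target) is what the test scripts exercise.

```python
# Illustrative completeness check: rows present in the source table but
# absent from the target indicate dropped records. SQLite's EXCEPT plays
# the role of Oracle's MINUS here; table names are assumed placeholders.
import sqlite3

def completeness_gap(conn, source_table, target_table):
    """Return rows in source_table that never arrived in target_table."""
    sql = f"SELECT * FROM {source_table} EXCEPT SELECT * FROM {target_table}"
    return conn.execute(sql).fetchall()
```

An empty result is the pass condition; any returned rows are the exact records to raise as defects against the ETL load.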
