QA Data Engineer/ETL Azure Test Lead Resume
CA
PROFESSIONAL SUMMARY:
- 7+ years of professional experience in Database, Hadoop, and manual functional testing of web-based applications.
- 3 years of working knowledge of Cloudera Hadoop and its stack, including HDFS, MapReduce, Pig, Hive, and Sqoop.
- Good knowledge of HBase and experience in Kafka testing.
- Good knowledge of Python programming.
- AWS Certified, with good knowledge of cloud computing on Amazon Web Services (EC2, S3) for fast, efficient processing of Big Data.
- Experience in Azure Snowflake.
- Experience in API testing using the Postman tool.
- Experience using Kafka for data testing.
- Experience using Informatica as an ETL tool to transfer data from source to staging and from staging to target.
- Well-versed in all stages of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
- In-depth knowledge of testing methodologies, concepts, phases, and types of testing; experienced in developing Test Plans, Test Scenarios, Test Cases, and Test Reports and documenting test results after analyzing Business Requirements Documents (BRD) and Functional Requirement Specifications (FRS).
- Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
- Experience in importing and exporting data between relational databases and the Hadoop cluster using Sqoop.
- Experience in creating custom Hive, Pig, and MapReduce programs for analyzing data.
- Good knowledge of SDLC and software testing phases.
- Worked with different platforms such as Oracle, Teradata, and SQL Server.
- Expert in test case preparation, manual functional testing, defect management, regression and sanity testing, test plan building, test report generation, and test case review and maintenance.
TECHNICAL SKILLS:
Big Data Technology: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Kafka
Cloud technology: AWS, Azure
Databases: Oracle 11g/10g/9i, Teradata, DB2, SQL Server
Methodologies: Waterfall, Agile/Scrum
ETL Tools: Informatica PowerCenter 10.2/10.1/9.6, DataStage
Reporting Tools: Tableau, Cognos 10.0, Power BI
Test Tracking Tools: HP ALM 12.0/11.0, Quality Center 10.x/9.x, Jira, Postman, Kafka
Tools: TOAD, SQL Navigator, Oracle SQL Developer, Azure Snowflake
Automation: iCEDQ, QuerySurge, Selenium
Languages/Programming: SQL, PL/SQL, Python
PROFESSIONAL EXPERIENCE:
Confidential, CA
QA Data Engineer/ETL Azure Test Lead
Responsibilities:
- Test Lead for the EDIM (Enterprise Data Integration Management) end-to-end testing QA team.
- Support the teams in testing with high-quality deliverables and perform reviews to meet the defined quality SLA (Service Level Agreement).
- Work collaboratively and proactively with QA, Development, Business, and other IT teams.
- Work with design and mapping documents to ensure proper transformation rules are applied and to identify any gaps before completing testing.
- Develop the overall Test Strategy, lead testing of all impacted applications/infrastructure, and lead post-production support activities.
- Write test scripts for inbound and outbound ETL processes and interfaces with other systems and streams.
- Lead and manage the offshore team: work allocation, issue support, and other activities.
- Build an automation framework using Shell and Python scripts to validate source-to-target testing, generate reports, and publish them to the project dashboard (see the sketch after this entry).
- Perform functional analysis and data validation activities.
- Perform front-end testing for the use-case scenarios.
- Work within the Agile Scrum approach across all testing activities.
- Perform the defect management process: log all bugs in JIRA and share them with the business for review.
- Perform data quality and audit checks on the source and target data.
- Prepare mock-up/synthetic data to test possible positive and negative scenarios against the business requirements.
- Create test-related documents and provide results to various client systems to support applications that use Azure technology.
- Perform API testing using the Postman tool.
- Validate Power BI reports against the front-end data and verify the same data is populated in the database.
- Write complex SQL queries in Azure Snowflake to test pre- and post-ingestion activities.
- Monitor ingestion jobs for different SORs (systems of record) to capture framework-related metrics for the specific build pipeline.
- Generate and analyze metrics, fix issues, and send the testing metrics to the client. The metrics include requirement coverage, requirement traceability, defect leakage by phase, defect density, defect removal rate, and automation vs. manual test execution; the reporting tools are Power BI and JIRA.
Environment: Azure Snowflake SQL, Jira, XML, JSON, Python scripts, Power BI, Shop site UI, Postman, Kafka, MongoDB
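A minimal sketch of the kind of source-to-target check the automation framework above performed, written in Python against Azure Snowflake. It assumes the snowflake-connector-python package; the account, credentials, and table names are placeholders rather than values from the actual project.

```python
# Hypothetical source-to-target row-count check against Azure Snowflake.
# Assumes: pip install snowflake-connector-python; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="QA_USER",                      # placeholder credentials
    password="********",
    account="xy12345.east-us-2.azure",   # placeholder Azure account locator
    warehouse="QA_WH",
    database="EDIM_QA",
)

def row_count(cur, table):
    """Return COUNT(*) for a fully qualified table name."""
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

cur = conn.cursor()
try:
    src = row_count(cur, "RAW.CUSTOMER_SRC")      # pre-ingestion layer (hypothetical)
    tgt = row_count(cur, "CURATED.CUSTOMER_TGT")  # post-ingestion layer (hypothetical)
    result = "PASS" if src == tgt else "FAIL"
    # The real framework published results like this line to the project dashboard.
    print(f"CUSTOMER load: source={src}, target={tgt}, result={result}")
finally:
    cur.close()
    conn.close()
```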
Confidential, Charlotte, NC
ETL/Big Data Tester
Responsibilities:
- Identified the test scope for a release and committed the stories for that release.
- Understood the customer demand story design and documented the acceptance criteria and Test Plan.
- Involved in designing and reviewing test cases and test scenarios to confirm that the business acceptance criteria were met 100%.
- Performed sanity checks and verified the systems prior to starting any testing.
- Identified defects and followed them up with the component teams.
- Wrote complex SQL queries.
- Used Sqoop to move data from individual data sources to the Hadoop system (see the sketch after this entry).
- Validated the Pig and Hive scripts by pulling the data from Hadoop and comparing it with the data in the files and reports.
- Coordinated various project-specific access requests for Hadoop components such as HDFS directories and Hive tables.
- Performed various Hadoop controls such as date checks, record checks, balance checks, and threshold limits for records and balances.
- Tested MapReduce programs that parse the raw data, populate staging tables, and store the data in HDFS.
- Reported bugs and tracked defects using ALM/Quality Center.
- Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
- Used database query tools such as TOAD for Oracle, SQL, and UNIX.
- Performed data analysis using SQL, PL/SQL, and other query-based applications.
- Involved in extensive data validation using SQL queries and back-end testing.
- Validated the Tableau insight center reports to make sure all the data was populated as per requirements.
Environment: MicroStrategy, SQL, PL/SQL, Agile, SoapUI, ALM/HP Quality Center 11, Oracle, UNIX, TOAD, T-SQL, SQL Server, XML files, flat files
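An illustrative sketch of the Sqoop load check referenced above: compare the row count of an Oracle source table with the Hive table the data landed in. The cx_Oracle and PyHive libraries, the connection details, and the ACCOUNTS table are assumptions for illustration, not details from the project.

```python
# Hypothetical source-vs-Hive count comparison after a Sqoop import.
# Assumes: pip install cx_Oracle pyhive; all names and hosts are placeholders.
import cx_Oracle
from pyhive import hive

TABLE = "ACCOUNTS"  # hypothetical table moved by the Sqoop job

# Source side: Oracle
ora = cx_Oracle.connect("qa_user", "********", "oradb-host:1521/ORCL")
ora_cur = ora.cursor()
ora_cur.execute(f"SELECT COUNT(*) FROM {TABLE}")
source_count = ora_cur.fetchone()[0]

# Target side: Hive table populated by the import
hv = hive.Connection(host="hadoop-edge-node", port=10000, username="qa_user")
hv_cur = hv.cursor()
hv_cur.execute(f"SELECT COUNT(*) FROM staging.{TABLE.lower()}")
target_count = hv_cur.fetchone()[0]

status = "MATCH" if source_count == target_count else "MISMATCH"
print(f"{TABLE}: oracle={source_count}, hive={target_count}, {status}")
```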
Confidential, Malvern, PA
ETL Tester
Responsibilities:
- Assisted in developing test plans based on the test strategy. Created and executed test cases based on the test strategy and the ETL mapping document.
- Wrote complex SQL queries against different databases for the data verification process (see the sketch after this entry).
- Performed UAT.
- Worked with data investigation, discovery, and mapping tools to scan every data record from many sources.
- Performed data management projects and fulfilled ad hoc requests according to user specifications.
- Wrote test cases in HP ALM. Defects identified in the testing environment were communicated to the developers using the defect-tracking tool HP ALM.
- Used Informatica as the ETL tool to transfer data from source to staging and from staging to target.
- Wrote several SQL queries for validating Cognos reports.
- Worked with the business team to test the reports developed in Cognos.
- Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
- Created test cases for ETL mappings and design documents for production support.
- Extensively worked with flat file and Excel sheet data sources.
- Effectively communicated testing activities and findings in oral and written formats.
- Reported bugs and tracked defects using ALM
- Worked with the ETL group to understand mappings for dimensions and facts.
- Extracted data from various sources such as Oracle, flat files, and SQL Server.
- Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetic issues.
Environment: UNIX shell scripting, Oracle 10g, Informatica 10.0, HP ALM 11.0, SQL*Loader, Cognos 8.4, SQL Server, Windows, TOAD, PL/SQL, SOAP
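A hedged sketch of the data verification query pattern referenced above: a MINUS comparison run from Python against Oracle to flag source rows that never reached the target of the Informatica load. cx_Oracle and all schema, table, and column names are illustrative assumptions.

```python
# Hypothetical source-to-target MINUS comparison for an Informatica load.
# Assumes: pip install cx_Oracle; schema/table/column names are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("qa_user", "********", "dwh-host:1521/DWH")
cur = conn.cursor()

# Rows present in the source staging table but missing (or changed) in the target.
cur.execute("""
    SELECT cust_id, cust_name, balance FROM src_stage.customer_src
    MINUS
    SELECT cust_id, cust_name, balance FROM edw.customer_dim
""")
missing = cur.fetchall()

if missing:
    # In practice each mismatch would be logged as a defect in HP ALM with the keys involved.
    print(f"{len(missing)} source rows missing from target, sample: {missing[:5]}")
else:
    print("Source-to-target comparison passed: no missing rows.")

cur.close()
conn.close()
```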
Confidential
ETL TESTER
Responsibilities:
- Interacted with Business Analysts, Developers, and the support team, along with the QA Manager, to define the testing requirement documents and mentored the QA team accordingly.
- Drafted the Test Plan document and presented it to the IT and Business groups for sign-off.
- Escalated unresolved bugs to the concerned developers and module leaders.
- Used UNIX commands to access the error log files and troubleshoot errors.
- Participated in the design phase together with members of the product team, developers, and DBAs.
- Performed smoke, system, system integration, user acceptance, database, and regression testing.
- Prepared test cases and procedures; tracked, logged, and reported bugs using Quality Center.
- Prepared test documentation such as test cases, bug reports, and use cases.
- Designed various test cases with different test conditions.
- Prepared and executed scenarios for regression testing on new builds.
- Conducted User Acceptance Testing (UAT) with users and customers and wrote the issues log based on the outcome of UAT.
- Developed test cases and executed them in HP QC.
- Involved in creating the test plan to test the ETL system thoroughly.
- Extensively interacted with developers to analyze and resolve issues encountered while testing the application.
- Executed test cases using positive and negative data in the HP ALM Test Lab and reported results and defects using the HP QC Defects module.
Environment: DataStage 8, Oracle 11g, SQL, PL/SQL, MS Office Suite, HP QC 10, UNIX, Windows, Cognos, TOAD, SQL Developer