
ETL Test Lead Resume


SUMMARY

  • Around 11 years of experience in Software Quality Assurance spanning ETL/DWH testing, Big Data/Hadoop testing, data analytics, back-end testing using SQL, and manual and automation testing for software products and applications to meet business, market and customer needs.
  • Good understanding of the various phases of the SDLC and involved in the complete test lifecycle. Versatile with Waterfall, V-model and Agile methodologies.
  • Strong experience in back-end testing using the query languages PL/SQL and SQL.
  • Expertise in understanding requirement specifications, inspecting functional requirements, designing test cases with test data/conditions and executing test cases per the schedule.
  • Good knowledge of designing and developing mappings with varied transformations such as Connected Lookup, Unconnected Lookup, Source Qualifier, Expression, Router, Filter, Joiner and Aggregator.
  • Good knowledge of relational and dimensional data models (Star and Snowflake schemas), fact and dimension tables, SCDs, normalized data, data extraction and data integration.
  • Extensive experience in the onshore/offshore model.
  • Experienced in preparing testing-related documents such as test strategies, test plans, test schedules, test scripts, test cases, test data, test summary reports and test execution reports.
  • Experience in HiveQL, Hive table creation, schema validation and Hive table partitions.
  • Validated source table data migrated into the Hive database.
  • Proficient in writing complex Hive queries.
  • Experience importing and exporting data between HDFS and Hive using Sqoop.
  • Professional experience in integration, functional, regression, system, performance, load, UAT, black-box and GUI testing.
  • Hands-on experience with HP QuickTest Professional (QTP) and HP Quality Center (QC/ALM).
  • Experience designing and developing automation scripts in BPT (Business Process Testing).
  • Hands-on experience testing applications in healthcare insurance, life insurance, gaming and lottery, and media & publications projects.
  • Good knowledge of Informatica Data Quality (IDQ) tools.
  • Involved in various HIPAA validations for EDI transactions 820, 834, 835 and 837.
  • Good experience testing Windows and web-based applications.
  • Experience in testing web services over SOAP using the open-source testing tool SoapUI.
  • Experience in BI report testing (history reports, standard reports, drill-downs and cube validation).
  • Expertise in reporting bugs in conjunction with the development team using bug logging and tracking tools such as HP Quality Center (ALM), Rational Clear Quest, MUTT, JIRA and TFS.
  • Exposure to test case management tools such as Quality Center (QC/ALM) to design and execute test cases and to map requirements to test cases.
  • Adherence to quality goals by conducting and ensuring reviews of all deliverables.
  • Quick learner and excellent team player with the ability to meet tight deadlines and work under pressure.
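The source-to-target completeness checks mentioned above follow a standard ETL testing pattern: reconcile row counts, then run a MINUS-style query to find rows the load missed. A minimal sketch using SQLite (table names, columns and data are hypothetical, chosen only to illustrate the pattern):

```python
import sqlite3

# Hypothetical source-to-target completeness check; table and column
# names are illustrative, not from any real project described above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cho');
    INSERT INTO tgt_customer VALUES (1, 'Ann'), (2, 'Bob');
""")

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]

# MINUS-style query (EXCEPT in SQLite) to find rows missed by the load.
missing = cur.execute("""
    SELECT id, name FROM src_customer
    EXCEPT
    SELECT id, name FROM tgt_customer
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 'Cho')]
conn.close()
```

In Oracle or Teradata the same diff would use MINUS instead of EXCEPT; the reconciliation logic is otherwise identical.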

TECHNICAL SKILLS

Databases: Oracle 9i/11g, SQL Server 2012, Teradata and MS Access 2000

Scheduling Tools: CA Workload Automation, Autosys 4.0, Informatica Scheduler

Tools: Informatica PowerCenter 9.x/8.x, SQL Server Integration Services (SSIS) 2012, SSRS, IBM DataStage, Cognos Reports, HP Quality Center 9.0/11.0, TOAD, HDFS, MapReduce, Hive, Sqoop, TIBCO, Rally, Data Validation Option (DVO), CA Workload Automation, SharePoint and MTM

Programming: C, C++, JAVA, Python, VBA, UNIX Shell Scripting and XML

Operating Systems: Windows 8/7/XP/2003/2000/98, UNIX

Web: HTML, VBScript, XML, XHTML, XSLT, SoapUI and WebLogic

Automation Tools: QuickTest Professional (QTP), SoapUI, LoadRunner and Selenium

PROFESSIONAL EXPERIENCE

Confidential

ETL Test Lead

Responsibilities:

  • Analyzed the business requirements and documented the test plan and test strategy for validating data completeness and correctness.
  • Participated in daily SCRUM meetings in an Agile environment with both onsite and offshore teams, providing day-to-day updates on testing schedules.
  • Reviewed the test artifacts prepared by the offshore team.
  • Involved in design reviews, meetings and test activities on a regular basis.
  • Used Informatica ETL tools to load data from flat files to relational tables and from relational tables to flat files.
  • Defined the data cleansing strategy based on exploratory data analysis reports.
  • Wrote SQL statements to create test data for test cases and data validation tests to extract data from the tables.
  • Performed data quality analysis using advanced SQL skills.
  • Created PL/SQL stored procedures and implemented them through the Stored Procedure transformation.
  • Created and identified ETL test data for all the ETL mapping rules.
  • Executed test scripts and verified the results; reported bugs to developers through JIRA.
  • Documented and tracked defects identified during testing; reported accurate testing status and recommended releases to the next environment.
  • Identified the automation scope and developed the automation suite using the DVO tool.
  • Worked closely with development teams and customers to design, architect, implement and support new projects.
  • Ability to clearly present information, verbally and in writing, to audiences of diverse technical backgrounds.

Platform: Informatica 9.x, SQL Server 2012, SSIS Reports and Pega

Tools Used: JIRA and DVO

Confidential

Sr. QA

Responsibilities:

  • Analyzed the business requirements and documented the test plan and test strategy for validating data completeness and correctness.
  • Involved in creating Hive tables, and loading and analyzing data using Hive queries.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Created Hive queries that help analysts spot emerging trends by comparing fresh data with historical claim metrics.
  • Used the Sqoop tool to export/import data between HDFS and relational databases.
  • Validated the MapReduce, Pig and Hive scripts by pulling data from Hadoop and validating it against the data in the files and reports.
  • Tested MapReduce programs that parse the raw data, populate staging tables and store the data in HDFS.
  • Validated Pig UDFs used to pre-process the data for analysis.
  • Performed data quality analysis using advanced SQL skills.
  • Involved in design reviews, meetings and test activities on a regular basis.
  • Performed data profiling for different formats of data files, such as flat files and CCDA.
  • Defined the data cleansing strategy based on exploratory data analysis reports.
  • Validated medical claims data per HEDIS 2017 and HEDIS 2018 measures.
  • Wrote SQL statements to create test data for test cases and data validation tests to extract data from the tables.
  • Verified the gap closure report in SDM (Stars data mart) by writing SQL queries against the Teradata DB.
  • Identified risks and provided mitigation plans to avoid issues.
  • Validated data between source and target after running mappings via the SSIS jobs.
  • Involved in troubleshooting, resolving and escalating data-related issues and validating data to improve data quality.
  • Tracked and reported issues to the project team and management during test cycles.
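The HEDIS-style claims validation mentioned above boils down to checking members' claims against a measure's qualifying criteria and flagging gaps. A simplified, hypothetical sketch (the measure logic and procedure codes below are illustrative, not actual HEDIS specifications):

```python
from datetime import date

# Assumed office-visit codes; real HEDIS value sets are far larger and
# published by NCQA, so treat these as placeholders.
QUALIFYING_CODES = {"99213", "99214"}

claims = [
    {"member": "M1", "code": "99213", "dos": date(2017, 3, 5)},
    {"member": "M2", "code": "80050", "dos": date(2017, 6, 1)},
    {"member": "M1", "code": "80050", "dos": date(2017, 8, 9)},
]

def members_with_gap(claims, year):
    """Return members with no qualifying visit in the measurement year."""
    seen = {c["member"] for c in claims}
    compliant = {
        c["member"] for c in claims
        if c["code"] in QUALIFYING_CODES and c["dos"].year == year
    }
    return sorted(seen - compliant)

print(members_with_gap(claims, 2017))  # ['M2']
```

In practice the same set logic is expressed as SQL against the claims tables, and the flagged members feed the gap closure report verified in the Stars data mart.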

Platform: SQL Server Integration Services (SSIS) 2012, SQL Server 2012, HDFS, MapReduce, Hive, Sqoop, Pig, Teradata and J2EE

Tools Used: SharePoint, ServiceNow, Ipswitch WS_FTP 12, UltraEdit, Rally and ECG Quick Connect

Confidential

Sr. ETL/Big Data QA

Responsibilities:

  • Responsible for creation of Test strategy, Test plans, Test schedule, Test scenarios, Test data, Test reports and other testing deliverables in accordance with SDLC guidelines.
  • Provided test solutions for application components, application functionality and regression.
  • Analyzed requirement changes relevant to the test team and functional processes to ensure all impacts were identified for testing.
  • Conducted test team meetings to ensure business and testing expectations were met and test deliverables were traceable to business requirements.
  • Reviewed and ensured all deliverables (test plan, test cases, test summary reports, test defect reports) within a release.
  • Utilized the Clear Quest issue tracking tool to log and track all defects identified during testing.
  • Collaborated with the development and configuration management teams for deployments into the test environments.
  • Involved in System and Regression testing.
  • Extensively used the Informatica ETL tool to load data from flat files to relational tables and from relational tables to flat files.
  • Validated data between source and target after running mappings via the Informatica jobs.
  • Tested the inbound and outbound flat files generated per DTCC layout standards.
  • Involved in writing SQL statements to create test data for test cases and data validation tests to extract data from the tables.
  • Performed system testing of Cognos reports for an e-commerce data warehouse system, which involved analyzing functional specifications, writing test analyses, executing reports with test data for test scenarios, writing complex test SQL for the reports and logging defects in Clear Quest.
  • Responsible for system testing of ad-hoc query packages in Cognos, which involved developing and executing ad-hoc reports and corresponding test SQL based on various join conditions, filters and calculations.
  • Wrote UNIX shell scripts and used the pmcmd command line utility to interact with the Informatica Server from command mode.
  • Tested triggers enforcing integrity constraints and stored procedures implementing complex business logic complementing the Informatica sessions.
  • Extensively tested several Cognos reports for data quality, fonts, headers and other cosmetic issues.
  • Analyzed application data using SQL to validate the Informatica ETL processes.
  • Used the CA Workload Automation tool to run the scheduled Informatica jobs.
  • Validated the data using the Data Validation Option (DVO).
  • Supported UAT testing.
  • Prepared a test matrix to give a better view of the testing effort.
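Driving Informatica workflows from UNIX shell scripts, as noted above, is done with the pmcmd startworkflow command. A small sketch that builds such a command line (domain, service, folder and workflow names are placeholders; verify the flags against your Informatica version's pmcmd documentation):

```python
# Hedged sketch of assembling a pmcmd startworkflow invocation; all
# names below are hypothetical placeholders, not real environments.
def build_pmcmd_start(domain, service, user, folder, workflow):
    return [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # Informatica domain
        "-u", user,       # run-as user (password usually passed via -pv env var)
        "-f", folder,     # repository folder containing the workflow
        "-wait",          # block until the workflow completes, for scripting
        workflow,
    ]

cmd = build_pmcmd_start("Domain_Dev", "IS_Dev", "qa_user",
                        "DWH_LOADS", "wf_load_customer")
print(" ".join(cmd))
```

With -wait, the shell script can check pmcmd's exit status before kicking off the downstream validation SQL, which is how the job dependencies were sequenced under CA Workload Automation.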

Platform: Informatica 9.x, PL/SQL Developer 10, SQL Server 2008, Java, J2EE, WebLogic, Cognos Reports

Tools Used: IBM Clear Quest 7.1, UltraEdit 32, OnBase image generator, MTM, CA Workload Automation 11.3 and Data Validation Option (DVO)
