
Big Data Developer

SUMMARY:

  • ISTQB-certified analyst with more than 8 years of experience in Big Data, ETL, and Business Intelligence projects.
  • Tricentis Tosca-certified technical specialist with around 1 year of experience in Tosca automation.
  • Extensive work experience in the Telecom, Banking, Financial, Retail, and Healthcare domains.
  • Experience in testing Big Data environments including Hadoop, Hive, Sqoop, MapReduce, and NoSQL (HBase) databases.
  • Good understanding of SDLC, Quality Assurance concepts (e.g. Functional, Integration, Systems, Regression testing) and agile development methodologies.
  • Experience in working with business analysts/customers to understand customer requirements and convert them into test plans and test cases.
  • Experience in design and execution of test cases and scenarios. Participated in Defect management and Bug Reporting.
  • Proficient in database testing using SQL queries.
  • Thorough knowledge of data warehouse concepts such as Star Schema, Snowflake Schema, and Dimension and Fact tables.
  • Proficient in Functionality Testing, Regression Testing and Smoke testing. Proficient in ETL, Cubes and Reports Testing for BI applications.
  • Experience in writing and executing complex SQL queries to validate actual vs. expected results and to validate data from source to target.
  • Strong in SQL Server 2008, 2008 R2, Teradata, and Oracle. Experience in using SSIS, Ab Initio, and Informatica tools for validating ETL workflows.
  • Experience in using SSRS, SSAS and Cognos tools for Reports and Analytics validation.
  • Experience in validating SCD type 1 and SCD type 2 tables.
  • Experience in validating Static and Incremental loads. Experience in Big data testing in Hadoop environment using Hive query language.
  • Well versed in user-level Unix commands. Exposure to all stages of the Software Development Life Cycle.
  • Experience in defect management tools like JIRA and Quality Center/ALM.
  • Excellent problem - solving skills with a strong technical background and good interpersonal skills.
  • Quick learner and excellent team player having ability to meet tight deadlines and work under pressure.
  • Flexible and versatile; able to adapt to any new environment and work on any project. Created VB scripts for unzipping and copying/moving sets of files.
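The source-to-target SQL validation described in the summary can be sketched as follows; this is a minimal illustration using SQLite in place of the actual SQL Server/Teradata/Oracle databases, with hypothetical `src_orders`/`tgt_orders` tables:

```python
import sqlite3

# Minimal sketch of source-to-target validation. Table and column
# names are invented for illustration; SQLite stands in for the
# actual warehouse database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

# Row-count reconciliation: source vs. target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# MINUS-style query: rows present in source but missing from target.
missing = cur.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 30.0)]
```

The same count-and-minus pattern applies unchanged in Teradata or Oracle, where `MINUS` replaces SQLite's `EXCEPT`.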

TECHNICAL SKILLS:

Technical Skills: SQL, PL/SQL, T-SQL, HiveQL

Scripting: Unix Shell, VBScript

Databases: SQL Server 2008 R2, Teradata, Oracle, Redshift, and HBase

Operating Systems: Windows, Linux

Project Management Tools: JIRA, QC, RTC, RQM, Subversion

Automation Tool: Tosca

Hadoop components: Hadoop, Hive, Sqoop

PROFESSIONAL EXPERIENCE:

Big Data Developer

Confidential, Columbus, Ohio

Responsibilities:

  • Validated SCD Type 1 and SCD Type 2 tables, as well as static and incremental loads.
  • Tested PL/SQL procedures that were developed to load the data from temporary tables in staging to target tables in the data warehouse.
  • Prepared and executed test cases for data validation using Hive QL queries. Developed strategies to validate incremental data loads.
  • Validated Cognos reports such as standard reports, cross-tab reports, and ad hoc reports.
  • Adhere to formal QA processes, ensuring that the Systems Implementation (SI) team is using industry-accepted best practices.
  • Oversee all aspects of quality assurance including establishing and reporting on metrics, applying industry best practices, and developing new tools and processes to ensure quality goals are met.
  • Act as key point of contact for all QA aspects of releases, providing test services and coordinating QA resources internally and externally.
  • Collaborated with the data warehouse architect, the ETL lead, and data warehouse developers in the construction and execution of test scenarios, including those applicable to the development, test, and production data warehouse environments.
  • Created and provided feedback on test cases, scripts, plans, and procedures (manual and automated), and was responsible for executing them.
  • Diagnose defects and track them from discovery to resolution.
  • Ensure all testing work is carried out in accordance with the Testing Plan, including schedule and quality requirements
  • Respond to all requests from stakeholders in a timely professional manner.
  • Developed and executed test cases, scripts, plans, and procedures to support various development methodologies, including waterfall and agile.
  • Worked across the entire software development life cycle and test cycles (Unit, Regression, Functional, Systems, Stress & Scale, Smoke & Sanity).
  • Adhere to best practices and methodologies to design, implement and automate processes
  • Using a metrics-driven approach and closed-loop feedback to improve software deliverables and improve predictability and reliability of releases
  • Documented features tested and bugs found with detailed, effective written communication.
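An SCD Type 2 validation of the kind listed above might look like this sketch, assuming a hypothetical `dim_customer` dimension with effective/end dates and a current-row flag (SQLite stands in for the warehouse database):

```python
import sqlite3

# Hedged sketch of SCD Type 2 checks. Table, columns, and data are
# hypothetical; the two queries are the generic consistency rules.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (
        cust_id INTEGER, city TEXT,
        eff_date TEXT, end_date TEXT, is_current INTEGER
    );
    INSERT INTO dim_customer VALUES
        (1, 'Columbus', '2015-01-01', '2016-06-30', 0),
        (1, 'Chicago',  '2016-07-01', '9999-12-31', 1),
        (2, 'Dallas',   '2015-03-01', '9999-12-31', 1);
""")

# Rule 1: each business key must have exactly one current row.
bad_current = cur.execute("""
    SELECT cust_id FROM dim_customer
    GROUP BY cust_id
    HAVING SUM(is_current) <> 1
""").fetchall()

# Rule 2: history rows for the same key must not overlap in time
# (standard interval-overlap test on each pair of rows).
overlaps = cur.execute("""
    SELECT a.cust_id FROM dim_customer a
    JOIN dim_customer b
      ON a.cust_id = b.cust_id AND a.rowid <> b.rowid
    WHERE a.eff_date <= b.end_date AND b.eff_date <= a.end_date
""").fetchall()

print(bad_current, overlaps)  # [] []
```

Both queries returning empty result sets indicates the dimension passed the check; any returned key is a defect candidate.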

Big Data Developer

Confidential

Responsibilities:

  • Actively involved in Requirement Analysis and business user walk through.
  • Involved in identifying Key Business scenarios for the E2E delivery.
  • Prepared and executed test cases for data validation using Hive QL queries. Developed strategies to validate incremental data loads.
  • Prepared and executed test cases for Business Scenarios Validation.
  • Prepared test scenarios to validate MapReduce jobs written in Java and validated the load process.
  • Wrote Hive queries for data analysis to meet the business requirements. Validated Hive managed and external table types.
  • Creating Hive tables for testing purpose and working on them using Hive QL. Validated data imported to Hadoop or HDFS environment from RDBMS.
  • Validated Partitioning and Bucketing process using Hive queries. Validated Cognos reports.
  • Validated SCD Type 1 and SCD Type 2 tables, as well as static and incremental loads.
  • Tested PL/SQL procedures that were developed to load the data from temporary tables in staging to target tables in the data warehouse.
  • Automated Cognos report validation for data and lineage using Tosca.
  • Raised and tracked defects in JIRA/QC and shared defect reports with the manager and all key stakeholders.
  • Validated importing of data from various data sources; performed transformations using Hive, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.
  • Actively participated in defect calls and followed up through to closure. Involved in UAT support activities.
  • Prepared key learning documents for every release. Responsible for re-testing defects fixed by the Dev team.
  • Involved in preparing the test environment. Followed the peer review process for test cases. Mentored and nurtured new team members.
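The incremental-load validation above can be illustrated with a small sketch: yesterday's target snapshot plus today's delta should reproduce today's source snapshot. All keys and values here are invented:

```python
# Hedged sketch of incremental-load reconciliation. In practice the
# three datasets would come from Hive/HDFS (target, delta) and the
# RDBMS (source); plain dicts stand in for them here.
target_yesterday = {101: "A", 102: "B"}
delta_today      = {102: "B2", 103: "C"}   # one update, one insert
source_today     = {101: "A", 102: "B2", 103: "C"}

# Apply the delta the way the incremental load would.
reconstructed = dict(target_yesterday)
reconstructed.update(delta_today)

# Reconcile: keys missing from target, extra in target, or mismatched.
missing = set(source_today) - set(reconstructed)
extra = set(reconstructed) - set(source_today)
mismatched = {k for k in source_today
              if reconstructed.get(k) != source_today[k]}

print(missing, extra, mismatched)  # set() set() set()
```

Three empty sets mean the incremental load reproduced the source; any non-empty set pinpoints the failing keys.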

Confidential

Big Data Developer

Database and tools: Teradata, Ab Initio

Responsibilities:

  • Actively involved in Requirement Analysis and business user walk through. Involved in identifying Key Business scenarios for the E2E delivery.
  • Extensively worked in data Extraction, Transformation and Loading from source to target system using BTEQ, FastLoad and MultiLoad.
  • Identified appropriate index usage in Teradata in various situations as part of performance tuning.
  • Collected statistics and used join indexes.
  • Experienced in writing/tuning BTEQ, Trigger, Macro, MultiLoad, and FastLoad scripts for better performance.
  • Involved in preparation of loading and extracting scripts using Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ).
  • Analyzed the Teradata EXPLAIN plan and optimized it to reduce total execution time.
  • Through the EXPLAIN plan, analyzed and modified indexes, and rewrote queries with derived or temporary tables to improve performance.
  • Prepared and executed test cases for ETL validation of multiple layers using SQL queries. Prepared and executed test cases for Business Scenarios Validation.
  • Involved in analyzing Ab Initio workflows and executing them manually to ensure the workflows met quality expectations.
  • Validated SCD Type 1 and SCD Type 2 tables, as well as static and incremental loads.
  • Prepared test data to cover all scenarios in testing all possible transformations.
  • Prepared test cases to validate end to end data flow from source or inbound files to EDW or outbound files.
  • Prepared possible test scenarios to validate Ab Initio graphs when the source is a file.
  • Raised and tracked defects in JIRA/QC and shared defect reports with the manager and all key stakeholders.
  • Followed Agile methodology for software development lifecycle.
  • Participated in daily scrum calls and meetings with scrum master.
  • Prepared Agile sprint burndown chart.
  • Actively participated in defect calls and followed up through to closure. Involved in UAT support activities.
  • Prepared key learning documents for every release. Responsible for re-testing defects fixed by the Dev team.
  • Involved in preparing the test environment. Followed the peer review process for test cases.
  • Mentored and nurtured new team members.
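The loading-script preparation described above might be sketched as a small generator for a Teradata FastLoad script. Table, file, and logon values below are hypothetical placeholders, and the exact FastLoad syntax should be confirmed against the Teradata utilities manual:

```python
# Hedged sketch: generating a Teradata FastLoad script from a column
# list. All names (table, file, logon) are invented placeholders.
def fastload_script(table, columns, data_file, delimiter="|"):
    defines = ", ".join(f"{c} (VARCHAR(64))" for c in columns)
    params = ", ".join(f":{c}" for c in columns)
    cols = ", ".join(columns)
    return "\n".join([
        "LOGON tdpid/user,password;",  # placeholder credentials
        f"BEGIN LOADING {table} ERRORFILES {table}_err1, {table}_err2;",
        f'SET RECORD VARTEXT "{delimiter}";',
        f"DEFINE {defines} FILE = {data_file};",
        f"INSERT INTO {table} ({cols}) VALUES ({params});",
        "END LOADING;",
        "LOGOFF;",
    ])

script = fastload_script("stg_sales", ["sale_id", "amount"], "sales.txt")
print(script)
```

Generating the script from a column list keeps the DEFINE clause and the INSERT parameter list in sync, which is a common source of FastLoad errors when scripts are edited by hand.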

Confidential

Big Data Developer

Responsibilities:

  • Actively involved in Requirement Analysis and business user walk through.
  • Involved in identifying Key Business scenarios for the E2E delivery
  • Prepared and executed test cases for Data Model Testing, ETL validation of multiple layers using SQL queries
  • Prepared and executed test cases for Business Scenarios Validation.
  • Involved in analyzing SQL Server packages (SSIS, SSAS, and SSRS) and executing them manually to ensure the packages met quality expectations.
  • Validated SCD Type 1 and SCD Type 2 tables.
  • Validated static and incremental loads.
  • Analyzed SSRS report, KPI, and dashboard output data against the source data.
  • Validated and executed ETL jobs using SQL Server Agent.
  • Raised and tracked defects in JIRA and shared defect reports with the manager and all key stakeholders.
  • Actively participated in defect calls and followed up through to closure. Involved in UAT support activities.
  • Prepared key learning documents for every release. Responsible for re-testing defects fixed by the Dev team.
  • Involved in preparing the test environment. Followed the peer review process for test cases.
  • Mentored and nurtured new team members.
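Report-versus-source validation of the kind described above can be sketched by recomputing a KPI from source data and comparing it to the figures shown on the report; the table and values are hypothetical, with SQLite standing in for SQL Server:

```python
import sqlite3

# Hedged sketch of validating report/KPI output against source data.
# "fact_sales" and the KPI figures are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE fact_sales (region TEXT, amount REAL);
    INSERT INTO fact_sales VALUES
        ('East', 100.0), ('East', 50.0), ('West', 75.0);
""")

# KPI values as displayed on the (hypothetical) SSRS dashboard.
report_kpis = {"East": 150.0, "West": 75.0}

# Recompute the same KPI directly from the source data.
source_kpis = dict(cur.execute(
    "SELECT region, SUM(amount) FROM fact_sales GROUP BY region"
).fetchall())

# Any entry here is a report defect: (recomputed, displayed).
diffs = {r: (source_kpis.get(r), v)
         for r, v in report_kpis.items()
         if source_kpis.get(r) != v}
print(diffs)  # {}
```

An empty diff means the report figures match the source; otherwise each entry names the region and the two disagreeing values for the defect report.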

Confidential

Big Data Developer

Responsibilities:

  • Developed Test Cases/Test Strategy with input from the assigned Business Analysts
  • Prepared and executed test cases for Data Model Testing, ETL validation using SQL queries
  • Prepared and executed test cases for the requirements.
  • Validated SCD Type 1 and SCD Type 2 tables, as well as static and incremental loads.
  • Raised and tracked defects in JIRA until closure. Responsible for re-testing defects fixed by the Dev team.
  • Involved in UAT support activities and helped users in understanding the data warehouse.
  • Followed the peer review process for test cases.
  • Processed transactions from system entry to exit. Tested functionality across applications and workflows.
  • Involved in testing the design and development of the data warehouse environment.
  • Optimized SQL queries for better performance using hints, indexes, and EXPLAIN plans.
  • Coordinated system walkthroughs, training, and User Acceptance Testing (UAT), and facilitated issue resolution.
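Index-based query tuning with an explain plan, as described above, can be illustrated with SQLite's EXPLAIN QUERY PLAN standing in for a vendor EXPLAIN (table and index names are hypothetical):

```python
import sqlite3

# Hedged sketch: compare the query plan before and after adding an
# index. SQLite's EXPLAIN QUERY PLAN stands in for Oracle/Teradata
# EXPLAIN; names are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, cust_id INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, i % 100) for i in range(1000)])

query = "SELECT COUNT(*) FROM orders WHERE cust_id = 42"

plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
cur.execute("CREATE INDEX ix_orders_cust ON orders (cust_id)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # full table scan of orders
print(plan_after[0][-1])   # index search via ix_orders_cust
```

The plan text moving from a table scan to an index search is the signal that the tuning worked; the same before/after comparison applies to a vendor EXPLAIN output.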
