Consultant - Hadoop Testing / Python Testing Resume
SUMMARY:
- QA Analyst with 8+ years of hands-on experience in Hadoop testing, Python testing, ETL testing, BI testing, Cognos report testing, Pentaho BI testing, and MSBI testing, in depth and across platforms (Windows, UNIX).
- 2 years' experience in Hadoop testing and Python testing on HDP (Hortonworks Data Platform), HDFS, and Hive in Linux, CentOS, and UNIX environments.
- Good exposure to writing HiveQL scripts, SQL scripts, and shell scripts (a validation sketch follows this summary).
- Good Knowledge in Control-M scheduling tool.
- Worked with large datasets distributed across multiple servers, ranging from gigabytes to petabytes.
- Good understanding of dimensional modeling, including star and snowflake schemas.
- Good knowledge of ETL tools such as Informatica and DataStage, and of the Cognos reporting tool.
- Good knowledge of Hive, Sqoop, shell scripting, and UNIX commands.
- Good knowledge of HDFS, Hadoop security, and Ambari.
- Analyzed software problems, provided clear defect reports and followed defect tracking practices based on internal QA standards and processes.
- Exposure to Agile testing and the Scrum process across BO/BI testing, DataStage testing, and Cognos testing.
- Activities included estimation, test plan creation, test case creation, test strategy creation, execution, defect tracking and management, scope finalization, resource management, and reporting.
- Experience analyzing record sets for data quality, data validation, and metrics using SQL/NoSQL, including very high data volumes.
- Experience writing complex SQL queries and working with UNIX.
- Experience in Shell Scripting.
- Proficient in Test Case Authoring, Execution, Bug Tracking and Reporting.
- Understanding of the Software Testing Life Cycle and Software Development Life Cycle.
- Database experience using Oracle 11g/10g/9i, SQL, SQL*Loader, MySQL, and SQL Server.
- Experience with data cleansing, data merging, data aggregation, and data scrubbing validation techniques.
- Experience with bug reporting tools such as ALM, QC, and Jira.
- Worked with TCG (test case generator) and TDG (test data generator) tools.
- Experience in the Banking, Payments, Finance, and Insurance domains.
- Experience with databases such as Oracle, MySQL, and SQL Server.
- Strong in test documentation, requirement traceability matrices, and reporting of testing data (size, effort, defects).
- Strong Problem Analysis & Resolution skills and ability to work in Multi-Platform Environments
- Designed, built and managed complex test environments
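For illustration, a minimal sketch of the source-to-target count validation described above, assuming the PyHive and cx_Oracle libraries are available; hosts, credentials, and table names are placeholders, not project specifics:

    # Hypothetical sketch: compare source (Oracle) and target (Hive) row counts.
    import cx_Oracle                 # Oracle client library (assumed available)
    from pyhive import hive          # PyHive for HiveServer2 (assumed available)

    def oracle_count(table):
        """Row count from the source Oracle table."""
        conn = cx_Oracle.connect("qa_user", "qa_pass", "oradb-host:1521/ORCL")
        try:
            cur = conn.cursor()
            cur.execute("SELECT COUNT(*) FROM " + table)
            return cur.fetchone()[0]
        finally:
            conn.close()

    def hive_count(table):
        """Row count from the target Hive table."""
        conn = hive.Connection(host="hdp-edge-node", port=10000, username="qa_user")
        try:
            cur = conn.cursor()
            cur.execute("SELECT COUNT(*) FROM " + table)
            return cur.fetchone()[0]
        finally:
            conn.close()

    if __name__ == "__main__":
        src, tgt = oracle_count("CLAIMS_SRC"), hive_count("claims_tgt")
        # Any mismatch is raised as a defect with both counts attached.
        assert src == tgt, "Row count mismatch: source=%d target=%d" % (src, tgt)
        print("Source-to-target count check passed:", src)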
TECHNICAL SKILLS:
Operating System: Windows XP, Windows 7, UNIX, Ubuntu, CentOS
Environment: ETL, BI environment, Control-M, Guidewire BillingCenter, Agile Center
Database: Oracle, MySQL, SQL Server, Hive
Bigdata/ETL/BI Tools: Hadoop, HDFS, Sqoop, DataStage, Pentaho, Informatica, Microsoft BI, SAS DI, Cognos Reporting, SAP BO, VMware, FCCM, Python
Other: WinMerge, QC 9.0, ALM, Bugzilla, Jira, Test Director, Toad, TDM
PROFESSIONAL EXPERIENCE:
Confidential
Consultant - Hadoop Testing / Python Testing
Responsibilities:- Confidential is a banking company dealing with GE Capital, Westpac, and Citibank data. AIMS is the Inventory Finance system that processes all inventory finance invoices.
- AIMS sends Direct Debit and Direct Credit payment files to GAMS (Global Application Middleware System), a batch integration platform that interfaces with AIMS to originate payment requests to external partner banks. Payment messages are enriched and processed via GAMS at specified frequencies and sent to partner banks, which return payment status updates to AIMS. Downstream reporting and all types of batch processing are done via GAMS (a REST status-check sketch follows the technologies line).
- GABS (Global Application Business Services) is a sanctions-processing platform that interacts with GSMOS (Global Sanction Service) in real time for payments destined to credit GE Capital customers' bank accounts.
- GABS acts as a source to GSDL (Global Service Data Lake), which maintains AIMS, GAMS, and GABS data: partner bank accounts and transaction data, including receipts, direct debit information, urgent and non-urgent transaction details, and SEPA payment details.
Technologies: Python testing, Hadoop (HDP) testing, Oracle Flexcube, REST/SOAP/XML
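For illustration, a minimal sketch of a Python REST status check of the kind used against payment interfaces; the endpoint, fields, and status values are hypothetical, not the actual GAMS API:

    # Hypothetical sketch of a REST payment-status check; the URL and the
    # JSON schema are illustrative placeholders only.
    import requests

    def check_payment_status(payment_id):
        # Assumed REST endpoint returning JSON status for a payment message.
        resp = requests.get(
            "https://example-gams-host/api/payments/%s/status" % payment_id,
            timeout=30,
        )
        resp.raise_for_status()          # fail the test on HTTP errors
        body = resp.json()
        # Validate the status field against an assumed lifecycle vocabulary.
        assert body.get("status") in {"ORIGINATED", "SENT", "ACKNOWLEDGED", "FAILED"}, body
        return body

    if __name__ == "__main__":
        print(check_payment_status("PAY-0001"))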
Confidential
Consultant - Hadoop Testing/ Report Testing
Responsibilities:- Confidential is an insurance company that designs, markets, and underwrites property and casualty insurance products for niche markets, with value-added coverages and services. PHLY competes on coverage, customized solutions, and consistent pricing, with disciplined underwriting. PHLY approaches the market through multiple distribution channels: preferred producers, firemarked producers, independent insurance producers, wholesalers, and the internet.
- In the PHLY GWBC conversion project, data moves from Guidewire BillingCenter and legacy systems to the Data Lake, and reports are generated through Cognos. The core systems work together as designed to meet the defined business objectives. The BillingCenter integration components and their respective integrating applications are tied together and made available in the Data Hub.
- Reporting is implemented once data has moved from source to the Data Hub (a Sqoop-based load sketch follows the tools line).
Tools: - ETL, Informatica, Hadoop, Hive, Sqoop, DataStage, UNIX, Cognos
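For illustration, a minimal sketch of loading a source table into the Data Hub with Sqoop, driven from Python; the connection string, credentials, and table names are placeholders:

    # Hypothetical sketch of a Sqoop import into Hive; all names are
    # illustrative, not the project's actual configuration.
    import subprocess

    SQOOP_CMD = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@oradb-host:1521/ORCL",
        "--username", "qa_user",
        "--password-file", "/user/qa_user/.sqoop.pwd",
        "--table", "BILLING_TXN",
        "--hive-import",                      # load straight into Hive
        "--hive-table", "datahub.billing_txn",
        "-m", "4",                            # four parallel mappers
    ]

    # check=True raises CalledProcessError if the import fails, which
    # surfaces as a test failure in the run log.
    subprocess.run(SQOOP_CMD, check=True)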
Confidential
Consultant - ETL Testing
Responsibilities:- Confidential provides solutions for ZNAW (Zurich North America Warehouse). ZNA holds information on crop insurance for various products, along with transaction status with third parties such as risk-share partners and federal policies, all maintained to comply with federal insurance regulations.
- RCIS offers a broad portfolio of products, including hail, named-peril, supplemental, and stand-alone insurance products, as well as multi-peril insurance policies available through the USDA's Risk Management Agency. Additionally, RCIS offers a suite of technology and service solutions that helps agents, farmers, and team members simplify communication and work together more effectively across federal programs, private products, agent tools, and producer services.
Confidential
Consultant - BIG DATA Test Engineer/Hadoop tester
Responsibilities:- The AML (Anti-Money Laundering) project is for HSBC, covering enterprise-wide policies and procedures related to anti-money laundering, anti-terrorist financing, customer due diligence, and economic sanctions.
- Data comes from different servers and loads into the Data Lake. From the Data Lake, Hadoop jobs run and move the data into Hive tables; these scripts run automatically under the Control-M scheduling tool. Once the data is loaded into Hive tables, the FCCM (Financial Crime and Compliance Management) tool interacts with the data and generates alerts for customers who fall into the high-risk category (an alert-coverage sketch follows the tools line).
Tools: - Hadoop, Hive, Sqoop, DataStage, UNIX, WinSCP, Control-M, Oracle FCCM, AML.
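For illustration, a minimal sketch of an alert-coverage check against the Hive layer, assuming PyHive; the table and column names are hypothetical, not the actual FCCM schema:

    # Hypothetical sketch: verify every high-risk customer in the Hive layer
    # has at least one alert. Table and column names are illustrative.
    from pyhive import hive

    QUERY = """
    SELECT COUNT(*)
    FROM customers c
    LEFT JOIN fccm_alerts a ON c.customer_id = a.customer_id
    WHERE c.risk_category = 'HIGH' AND a.alert_id IS NULL
    """

    conn = hive.Connection(host="hdp-edge-node", port=10000, username="qa_user")
    try:
        cur = conn.cursor()
        cur.execute(QUERY)
        missing = cur.fetchone()[0]
        # Zero means every high-risk customer generated an alert.
        assert missing == 0, "%d high-risk customers have no alert" % missing
    finally:
        conn.close()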
Confidential
Consultant - ETL Test Engineer
Responsibilities:- CDMP (Claims Data Migration Project): ICBC has maintained insurance claims data dating back to 1976; at the intended time of migration this constitutes approximately 38 years of data.
- The CDMP project migrates data from the different source systems (legacy and CDW) into the same format/structure as the ClaimCenter staging database (GW Staging), from which it is loaded into the EDW environment, validated, and integrated with the other enterprise data in the EDW (a completeness-check sketch follows).
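For illustration, a minimal sketch of a MINUS-based completeness check between GW Staging and the EDW, assuming cx_Oracle; the schema, table, and column names are placeholders:

    # Hypothetical sketch: rows in GW Staging that never reached the EDW.
    import cx_Oracle

    MINUS_SQL = """
    SELECT claim_id, claim_amount FROM gw_staging.claims
    MINUS
    SELECT claim_id, claim_amount FROM edw.claims
    """

    conn = cx_Oracle.connect("qa_user", "qa_pass", "oradb-host:1521/ORCL")
    try:
        cur = conn.cursor()
        cur.execute(MINUS_SQL)
        leftovers = cur.fetchmany(20)   # sample of rows missing from the EDW
        assert not leftovers, "Rows lost in migration, e.g.: %r" % (leftovers,)
    finally:
        conn.close()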
Confidential
Consultant - ETL Test Engineer
Responsibilities:- The purpose of this project is to provide a new internally hosted source of Worker data for DLG applications. HR delivered a series of cloud-based services to achieve the separation of worker data and the transformation of HR services in Direct Line Group.
- Batch jobs pull Workday data from a SQL Server database in .XML format and Apollo data from a SAS database in .CSV format, and push the data to the FLA (File Landing Area), which resides in a UNIX environment. Incremental loads move data from the FLA to the RLA through batch jobs, and SAS Data Integrator merges the Workday and Apollo data into the Confidential data warehouse tables (a file-count reconciliation sketch follows).
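For illustration, a minimal sketch of reconciling record counts in the File Landing Area using only the Python standard library; the file paths and the XML record tag are placeholders:

    # Hypothetical sketch: count records in the landed XML and CSV extracts
    # so they can be reconciled against the warehouse load.
    import csv
    import xml.etree.ElementTree as ET

    def workday_xml_count(path):
        """Count <worker> records in the Workday extract (tag is assumed)."""
        return len(ET.parse(path).getroot().findall(".//worker"))

    def apollo_csv_count(path):
        """Count data rows in the Apollo extract, excluding the header."""
        with open(path, newline="") as f:
            return sum(1 for _ in csv.reader(f)) - 1

    if __name__ == "__main__":
        xml_rows = workday_xml_count("/fla/workday/workers.xml")
        csv_rows = apollo_csv_count("/fla/apollo/workers.csv")
        # The merged warehouse count (queried separately) should equal the sum.
        print("FLA record counts - Workday: %d, Apollo: %d" % (xml_rows, csv_rows))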
Confidential
Consultant - ETL Test Engineer
Responsibilities:- Confidential is a UK insurance group headquartered in Paris. DLG is a conglomerate of independently run businesses, operated according to the laws and regulations of many different countries. The DLG group of companies engages in life insurance, health insurance, and other forms of insurance, as well as investment management.
- DLG deals with different products such as Home, Pet, Motor, and Travel, and brands such as ICICI Pru, Churchill, Privilege, and Green Flag. Data is integrated from the different products and brands to give management a consolidated view of insurance policies and brand details.
Confidential
Senior BI Test Engineer
Responsibilities:- The Confidential Card Services product mainly involves five modules: card services, wallet services, cashier services, SVA services, and authorization services.
- Card Services mainly interacts with plastic card data: once a card is approved by FIS-UK, it is processed with the user data in the program manager, where cardholder data is retrieved and extracted for the different cards. Each wallet has an SVA (stored value account) ID and links to Cashier Services, which checks and monitors transactions, including closed-loop and open-loop transactions, alongside Wallet Services.
Environment: Pentaho ETL testing, BI report testing, Windows, UNIX, Pentaho Business Intelligence, MySQL
Confidential
ETL Test Engineer
Responsibilities:- Confidential is a user-friendly intranet application developed to maintain clean data for customers worldwide from different systems. The application is web based. Confidential is an internal data editing, resource, and computing application.
- Confidential acts as a custodian between the company and the web solutions. Idearc is an application that performs various functions such as instruction management, grid management, reporting, settlement dates, and agent deadline dates.
- Confidential is the application customers use to submit their instructions online. Confidential depends purely on corporate actions and sends messages via SWIFT.
- A corporate action is an action announced by a company, based on its annual growth, that affects its market positively or negatively.
Environment: DataStage testing, report testing, MSBI testing
Confidential
ETL Test Engineer
Environment: Informatica testing, Oracle, Database testing.
Responsibilities:
- Analyzed software problems, provided clear defect reports, and followed defect-tracking practices based on internal QA standards and processes.
- Prepared test scenarios for system testing and SIT.
- Designed automation scripts using Shell/Python.
- Designed and executed test cases based on the test scenarios and business requirements.
- Tracked test case execution and defects in QC.
- Executed shell scripts to run DataStage and Hadoop jobs (see the job-trigger sketch after this list).
- Maintained data as per client requests; debugged for nulls and blank spaces in the target systems.
- Validated data in fault files and special files against target systems using Oracle, MySQL, and SQL Server.
- Validated data within target tables to ensure it is present in the required format and that there is no data loss or bad data from source to target tables.
- Prepared test cases, reviewed them, and executed them.
- Checked stages such as Filter, Merge, Lookup, Join, Switch, SCD1, and SCD2 in DataStage jobs.
- Wrote complex HiveQL queries to validate the data flow from the DML layer to the TL layer (see the validation sketch after this list).
- Sent daily status reports to onsite counterparts.
- Performed naming-convention testing, constraint testing, and source-to-target counts.
- Performed source-to-target data validation and verified transformations and business rules.
- Performed sampling, duplicate testing, and connectivity tests, confirming no data loss.
- Verified correct transformation rules through data validation and regression testing.
- Worked on one-shot/retrospective testing, ad-hoc testing, and snapshot testing.
- Performed data integration testing in the ETL test environment for incoming data.
- Maintained cleansed data from source systems to target systems.
- Sent traceability matrix reports for test cases and failed test cases, and reviewed the requirements for those test cases.
- Participated in reviews, releases, FSD documentation, dashboards, and report generation.
- Performed data validation; identified defects, logged them in QC, and tracked them to closure.
- Involved in system testing, integration testing, re-testing, and regression testing.
- Designed test scenarios for validating reports as per business requirements.
- Wrote complex SQL queries against mappings to perform report validation.
- Communicated with the development team for regular updates on changes.
- Validated and analyzed Informatica mappings as per requirements.
- Triggered Informatica mappings using shell scripts and from Workflow Manager when needed (see the job-trigger sketch below).
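For illustration, a minimal sketch of the layer-to-layer HiveQL validation referenced in the list above, assuming PyHive; the database, table, and key names are placeholders:

    # Hypothetical sketch: rows present in the DML layer but missing from
    # the TL layer, surfaced as a defect with a sample of offending keys.
    from pyhive import hive

    LAYER_DIFF = """
    SELECT d.txn_id
    FROM dml_layer.transactions d
    LEFT JOIN tl_layer.transactions t ON d.txn_id = t.txn_id
    WHERE t.txn_id IS NULL
    LIMIT 20
    """

    conn = hive.Connection(host="hdp-edge-node", port=10000, username="qa_user")
    try:
        cur = conn.cursor()
        cur.execute(LAYER_DIFF)
        missing = cur.fetchall()
        assert not missing, "Transactions missing in TL layer, e.g.: %r" % (missing,)
    finally:
        conn.close()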
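Similarly, a minimal sketch of triggering DataStage and Informatica jobs from a Python wrapper via their command-line clients (dsjob and pmcmd); the server, project, folder, and workflow names are placeholders, and real invocations vary by installation (pmcmd typically also needs credentials):

    # Hypothetical sketch: drive batch jobs from shell-style commands.
    import subprocess

    def run_datastage_job(project, job):
        # dsjob -run blocks until completion when -wait is given.
        subprocess.run(["dsjob", "-run", "-wait", project, job], check=True)

    def run_informatica_workflow(service, domain, folder, workflow):
        # pmcmd startworkflow launches the workflow on the Integration Service.
        subprocess.run([
            "pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-f", folder, "-wait", workflow,
        ], check=True)

    if __name__ == "__main__":
        run_datastage_job("QA_PROJECT", "load_claims_job")
        run_informatica_workflow("IntSvc", "Domain_QA", "QA_FOLDER", "wf_load_claims")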