
Senior Data Analyst/ETL Lead Resume


SUMMARY:

  • 15+ years of experience in the Information Technology industry.
  • Strong hands-on experience with Snowflake Cloud Data Warehouse, Teradata, IBM BigInsights (Hadoop), data analysis, data warehouse architecture and design, and verification and validation of report components using MSTR and OLAP tools.
  • 10+ years of strong experience writing scalable, high-performing SQL queries for complex data scenarios across Oracle, SQL Server and Teradata databases.
  • 8+ years of experience in ETL data validation, covering data comparison, data analysis, data cleansing and data verification, with the ability to pinpoint data mismatches.
  • Good experience using Python to automate data validation processes, including reading/writing CSV files and connecting to databases from scripts (see the sketch after this list).
  • Good experience preparing automation scripts with the Robot Framework to load source feeds and validate data, executed via pybot.
  • Experience writing Python scripts that load source feeds in different file formats, transform/filter record sets and load them into the source layer.
  • Good knowledge of SAS code in SAS Enterprise Guide for data-comparison validation, which eases mismatch analysis.
  • 5+ years of experience in Hadoop testing, big data technologies and ETL & BI tools, along with team lead management activities.
  • 8+ years of experience preparing UNIX shell scripts, maintaining global environment- and application-level variables.
  • Strong hands-on experience with business intelligence reporting tools such as MicroStrategy 9.4 and Tableau, used to analyze different data sets for analytics and to implement reporting components.
  • Strong understanding of all aspects of the software testing lifecycle, from requirements analysis through test planning, test estimation, test metrics, test case development and review, defect tracking and test reporting.
  • Strong hands-on experience validating SQL Server Reporting Services, running different model runners to verify false alerts.
  • Proficient in translating business rules and requirements into technical SQL scripts and testing data structure and content from source to staging to target.
  • Held multiple sessions with BAs/DAs to understand the transformation mapping logic and validate that converted loans were transformed according to it.
  • 5+ years of experience in UAT support and business requirement decisions.
  • 6+ years of experience in credit card processing and payments on the TSYS platform.
  • Strong knowledge of all types of software testing, including functional, integration, regression, system and end-to-end testing.
  • Certified by the International Software Testing Qualifications Board (ISTQB).
  • Worked across the entire project lifecycle, including functional requirement documentation, deployment, defect reviews with the project team and daily status reporting.
  • Played a key role with QuickTest Professional and Rational Functional Tester as automated testing tools, alongside ClearQuest and Quality Center.
  • As part of the data conversion projects (MISER/CACS into MSP, and ALS into MSP-HELOC), developed a thorough understanding of the requirements and transformation logic between the legacy and new systems.
  • Good knowledge of data conversion between the MISER/CACS systems and MSP across product codes, account status codes (Active, Closed, Paid-off), account type codes (D1A, D1B, D1C, etc.), branch codes, interest type codes, bank numbers, escrow bill frequency codes, collateral codes, fee assessment codes, bankruptcy codes, charge-off reasons, etc.
  • Played a key onsite/offshore coordination role, leading the offshore team, tracking day-to-day tasks and delivering a consolidated daily status from offshore to the onsite lead.
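A minimal sketch of the Python CSV read/write and database-connection validation pattern referenced above. The file, table and column names are hypothetical, and sqlite3 stands in for the Oracle/Teradata connections used in practice:

```python
# Hypothetical sketch: sqlite3 stands in for the production Oracle/Teradata
# connections, and file/table/column names are illustrative only.
import csv
import sqlite3

def load_expected(path):
    """Read the expected records from a CSV feed, keyed by account id."""
    with open(path, newline="") as fh:
        return {row["account_id"]: row for row in csv.DictReader(fh)}

def fetch_actual(conn):
    """Pull the corresponding rows from the loaded target table."""
    cur = conn.execute("SELECT account_id, balance FROM target_accounts")
    return {str(acct): str(bal) for acct, bal in cur.fetchall()}

def write_mismatches(expected, actual, out_path):
    """Write any differing rows to a CSV file for mismatch analysis."""
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["account_id", "expected_balance", "actual_balance"])
        for key, row in expected.items():
            got = actual.get(key, "MISSING")
            if got != row["balance"]:
                writer.writerow([key, row["balance"], got])

if __name__ == "__main__":
    conn = sqlite3.connect("validation.db")  # stand-in for the real DSN
    write_mismatches(load_expected("expected_feed.csv"),
                     fetch_actual(conn), "mismatches.csv")
```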

TECHNICAL SKILLS

Database: Oracle 11g, Teradata 14, SQL Server, Netezza, DB2

Operating System: Windows NT, Windows 2000 Advanced Server, UNIX, LINUX and MS-DOS

Software Testing & Methodologies: Big Data Technologies, Hadoop Testing, ETL Testing, System Integration Testing, Regression Testing, Functional Testing, UAT; SDLC models: Waterfall and Agile methodologies; basics of UNIX shell scripting

Tools & Scripting Languages: SAS scripting code, Python 3.7, Linux scripting, HP ALM 11, QTP, Rational ClearQuest, SOAP UI, REST Client, MicroStrategy 9.4, SAS Enterprise Guide, SSRS reports (Model Runner), Jira, Confluence, Tableau, Robot Framework, PyCharm and Spyder IDEs

Other Utilities: Rational ClearCase, AccVerify 8.x (508 Compliance), GitHub, UCD

Domain knowledge: Banking and Finance (Credit Cards), Mortgage, Capital Markets and Trade Compliance

PROFESSIONAL EXPERIENCE:

Confidential

Senior Data Analyst/ETL Lead

RESPONSIBILITIES:

  • Onsite test lead for the Confidential project; worked with business analysts and PMs to gather project requirements and produce the project estimate.
  • Built the test plan document covering tasks and dependencies, highlighting the automation to be utilized in later stages.
  • Prepared Python loading scripts for each source feed, since feeds arrive in different file layouts and extensions: flat files, CSV, delimited, PSV, XML and Excel (see the loader sketch after this list).
  • Wrote Python scripts to automate the data validation process, such as comparing source and target data using the transformation logic in the requirements.
  • Prepared a regression suite with the Robot Framework for the different source feeds, which loads the source files and validates the data comparison.
  • Read/wrote CSV files and connected to the database from Python scripts; captured field-level mismatch details in separate CSV files and emailed them to the respective teams.
  • Validated the model runner built with the SSRS tool to determine false alerts based on the requirements.
  • Used Tableau to validate data cleansing and data profiling against the rule sets in the test requirements.
  • Updated comments and tracked stories to closure in Jira after QA validation.
  • Reviewed ETL mapping and transformation specifications against the business requirements.
  • Loaded data into the FDM tables from the different source feeds via the Python scripts.
  • Analyzed the stages of extract-transform-load processing and validated and reviewed ETL mapping and transformation specifications against the business requirements.
  • Prepared test cases, test scripts and test scenarios for data and functional testing; attended daily status calls and shared the weekly status report with the client, covering delivery highlights, testing results, test case coverage, required resources, defects discovered and their status, and performance baselines.
  • Prepared the sign-off document for each release and published it to the respective product owner.
  • Interacted with client project managers, ensured resources were billable and tracked billing until the vendor invoice was generated.
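As an illustration of the multi-format loading described above, a minimal Python sketch assuming pandas is available; the extension-to-reader dispatch and the source-layer table name are assumptions, not the project's actual configuration:

```python
# Hypothetical sketch of a multi-format feed loader; feed formats, the
# dispatch table and the "source_layer" target table are illustrative.
import pathlib
import pandas as pd

def read_feed(path: str) -> pd.DataFrame:
    """Dispatch on file extension so one loader handles every feed layout."""
    suffix = pathlib.Path(path).suffix.lower()
    if suffix == ".csv":
        return pd.read_csv(path)
    if suffix == ".psv":
        return pd.read_csv(path, sep="|")
    if suffix in (".txt", ".dat"):
        return pd.read_fwf(path)          # fixed-width flat files
    if suffix == ".xml":
        return pd.read_xml(path)
    if suffix in (".xls", ".xlsx"):
        return pd.read_excel(path)
    raise ValueError(f"Unsupported feed format: {suffix}")

def load_to_source_layer(path: str, conn) -> int:
    """Filter obviously bad records, then land the feed in the source layer."""
    df = read_feed(path)
    df = df.dropna(how="all")             # drop fully empty records
    df.to_sql("source_layer", conn, if_exists="append", index=False)
    return len(df)
```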

Confidential

Senior Data Analyst/ETL Lead

RESPONSIBILITIES:

  • Offshore test lead for the data conversion project; worked with business, developers and PMs to gather project requirements and produce the project estimate.
  • Built the test plan document covering tasks and dependencies, highlighting the automation to be utilized in later stages.
  • Wrote Python scripts to automate the data validation process, such as comparing source and target data using the transformation logic in the requirements.
  • Read/wrote CSV files and connected to the database from Python scripts; captured field-level mismatch details in separate CSV files and emailed them to the respective teams (see the sketch after this list).
  • Used SAS for data validation, easing file-to-file and table-to-table comparisons; it also reads mainframe files via their copybook details using SAS dataset queries.
  • Developed SAS programs to read instream data with DATALINES or CARDS statements and external raw data files with INFILE statements.
  • Used the LIBNAME statement to create new SAS data libraries and to manage permanent and temporary data sets.
  • Created bar charts, histograms and scatter plots with SAS graphics functions to analyze and report data.
  • Created various SAS reports, tables, graphs and summary analyses on the mart systems used in these properties.
  • Extensively used the SAS DATA step and PROC SQL to generate reports for ad-hoc requirements.
  • As part of validation, used SAS Enterprise Guide code to compare source data against target data with the appropriate transformation logic, easing the validation process with mismatch details.
  • Responsible for importing and manipulating data and generating weekly and monthly list reports using SAS Enterprise Guide.
  • Compared multiple extracts, such as the finance extract, downstream extract and other core interface extracts, against the legacy system's production data, and validated the new logic applied to individual fields in the test extracts.
  • Reviewed ETL mapping and transformation specifications against the business requirements.
  • Prepared test cases, test scripts for 200+ highly critical fields, and test scenarios for data testing; attended daily status calls and shared the weekly status report with the client, covering delivery highlights, testing results, test case coverage, required resources, defects discovered and their status, and performance baselines.
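A rough sketch of the field-level mismatch capture and email notification described above; the key column, SMTP host and addresses are placeholders:

```python
# Hypothetical sketch: the "account_id" key, SMTP host and email addresses
# are placeholders, not the project's actual configuration.
import csv
import smtplib
from email.message import EmailMessage

def compare_rows(source_rows, target_rows, key="account_id"):
    """Yield one record per mismatched field, keyed by the business key."""
    target_by_key = {r[key]: r for r in target_rows}
    for src in source_rows:
        tgt = target_by_key.get(src[key])
        for field, expected in src.items():
            actual = tgt.get(field) if tgt else None
            if actual != expected:
                yield {"key": src[key], "field": field,
                       "source": expected, "target": actual}

def report_and_notify(mismatches, out_path="mismatches.csv"):
    """Write field-level mismatches to CSV and email the team a summary."""
    rows = list(mismatches)
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["key", "field",
                                                "source", "target"])
        writer.writeheader()
        writer.writerows(rows)
    msg = EmailMessage()
    msg["Subject"] = f"Data validation: {len(rows)} field-level mismatches"
    msg["From"] = "qa-automation@example.com"       # placeholder sender
    msg["To"] = "data-team@example.com"             # placeholder recipients
    msg.set_content(f"See {out_path} for the field-level details.")
    with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder host
        smtp.send_message(msg)
```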

Confidential

Senior Data Analyst

RESPONSIBILITIES:

  • Understood the business requirements, interacted with BAs/DAs to gather them and produced the project estimate; built the test plan document covering tasks and dependencies, highlighting the automation to be utilized in later stages.
  • Strong expertise in writing complex SQL queries to check data integrity during database testing and to validate that converted loan numbers were transformed per the new transformation rules (see the reconciliation sketch after this list).
  • Validated the critical financial, downstream and core extracts, and performed mart validation using the Platinum tool.
  • Helped establish an automated Hadoop integration testing system that shortened data validation turnaround, and prepared test cases, test scripts and test scenarios for data testing.
  • Used SAS for data validation, easing file-to-file and table-to-table comparisons; it also reads mainframe files via their copybook details using SAS dataset queries.
  • Used the SAS Import/Export Wizard as well as SAS programming techniques to extract data from Excel; compared data across various data sets and analyzed data dimensions.
  • Created various SAS reports, tables, graphs and summary analyses on the PMS systems used in these properties, checking for invalid/out-of-range data.
  • Created bar charts, histograms and scatter plots with SAS graphics functions to analyze and report data.
  • Good knowledge of SAS code in SAS Enterprise Guide for data-comparison validation, which eases mismatch analysis.
  • Attended daily status calls and shared the weekly status report with the client, covering delivery highlights, testing results, test case coverage, required resources, defects discovered and their status, and performance baselines.
  • Tracked all defects in HP ALM and shared trend and summary graphs with management on a daily/weekly/monthly basis.
  • Prepared UNIX commands to validate dataset file contents in the Hadoop environment and used the Copybot tool to transfer files to a different region for further analysis and validation.
  • Debugged and scheduled ETL jobs/mappings and monitored error logs in the mainframe environment; verified the initial- and daily-load Ab Initio graph ETL jobs and tested reconcile and incremental ETL loads for the project.
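For illustration, a minimal source-to-target reconciliation sketch of the kind of SQL validation described above; the loan tables and columns are invented, and sqlite3 again stands in for the production databases:

```python
# Hypothetical sketch: table/column names are invented and sqlite3 stands
# in for the production Oracle/Teradata connections.
import sqlite3

RECON_QUERIES = {
    # Rows present in source but missing (or different) in target.
    "source_minus_target": """
        SELECT loan_number, principal_balance FROM source_loans
        EXCEPT
        SELECT loan_number, principal_balance FROM target_loans
    """,
    # Aggregate totals should match once transformation rules are applied.
    "balance_delta": """
        SELECT (SELECT SUM(principal_balance) FROM source_loans) -
               (SELECT SUM(principal_balance) FROM target_loans)
    """,
}

def run_reconciliation(conn: sqlite3.Connection) -> None:
    """Run the row-level and aggregate checks and print the results."""
    missing = conn.execute(RECON_QUERIES["source_minus_target"]).fetchall()
    delta = conn.execute(RECON_QUERIES["balance_delta"]).fetchone()[0]
    print(f"{len(missing)} rows fail the source-minus-target check")
    print(f"Principal balance delta across systems: {delta}")
```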

Confidential

ETL Test Lead

RESPONSIBILITIES:

  • Onsite test lead for the data conversion project; worked with business, developers and PMs to gather project requirements and produce the project estimate.
  • Built the test plan document covering tasks and dependencies, highlighting the automation to be utilized in later stages.
  • Analyzed the stages of extract-transform-load processing and validated and reviewed ETL mapping and transformation specifications against the business requirements.
  • Prepared test cases, test scripts and test scenarios for data and functional testing; attended daily status calls and shared the weekly status report with the client, covering delivery highlights, testing results, test case coverage, required resources, defects discovered and their status, and performance baselines.
  • Experienced in SAS programming for auditing data, developing data, performing data validation QA and improving the efficiency of SAS programs.
  • Used the SAS Import/Export Wizard as well as SAS programming techniques to extract data from Excel (see the sketch after this list).
  • Prepared the requirements traceability matrix from the business requirements document for 120 report IDs; worked on test plan preparation and test effort estimates for the execution plan, and provided project- and domain-related information to team members.
  • Reviewed the design of the test scenarios, test data and test cases prepared for the different report IDs based on the requirements.
  • Acted as onsite/offshore coordinator and conducted daily status calls, helping the team interact with the onsite team and build knowledge of project information and client expectations for each report ID.
  • Prepared a defect summary file shared with business partners and clients to communicate the status of each report ID.
  • Prepared the SIT exit criteria document and provided UAT support for several requests; analyzed UAT requests against the legacy reports versus the MSTR reports, found numerous requirements gaps, tracked each as a separate defect and saw them fixed appropriately.
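The Excel extraction above was done with the SAS Import/Export Wizard; purely for illustration, the same extract-and-compare idea sketched in Python with pandas, with the workbook, sheet and key-column names invented:

```python
# Hypothetical Python analogue of the SAS Import/Export extraction step;
# workbook, sheet names and the report_id key are invented for illustration.
import pandas as pd

def extract_and_compare(workbook: str) -> pd.DataFrame:
    """Extract two report extracts from Excel and flag unmatched rows."""
    legacy = pd.read_excel(workbook, sheet_name="legacy_extract")
    rebuilt = pd.read_excel(workbook, sheet_name="rebuilt_extract")
    merged = legacy.merge(rebuilt, on="report_id", how="outer",
                          suffixes=("_legacy", "_rebuilt"), indicator=True)
    # Rows present on only one side indicate extraction or conversion gaps.
    return merged[merged["_merge"] != "both"]

if __name__ == "__main__":
    gaps = extract_and_compare("report_extracts.xlsx")
    print(f"{len(gaps)} rows do not line up across the two extracts")
```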

Confidential

ETL Test Lead

RESPONSIBILITIES:

  • Analyzed the system requirements of the RESPA/TILA architecture and prepared test scenarios and test cases based on the transformation logic.
  • Prepared regression test cases and SQL queries to validate the transformation conditions, and created an Excel macro to compare table structures between the IST and UAT environments based on the mapping requirements in the requirements traceability matrix (see the sketch after this list).
  • Analyzed the stages of extract-transform-load processing and verified the data models and data maps (extract, transform and load analysis) of the data marts and feeder systems in the aggregation effort.
  • Managed and validated sequences and jobs through the different ETL stages.
  • Validated and reviewed ETL mapping and transformation specifications against the business requirements, and prepared detailed test logs for defects with the necessary screenshots/queries.
  • Tracked defects and updated defect status in Rational ClearQuest, and retested defects to ensure a regression- and bug-free ETL environment.
  • Studied the project requirements and held discussions with BAs to clarify gaps in the requirements.
  • Prepared test estimates and acted as the single point of contact onsite, managing a 15-member offshore team.
  • Prepared test plan activities for every release and assigned modules to team members after assessing their knowledge of each module.
  • Tracked individual modules for each work order in the SIT phase based on execution progress, worked on risk assessment and mitigation planning, and conducted daily/weekly/monthly SIT calls to gather the status of test modules.
  • Tracked the progress of each work order in the SIT phase and attended UAT calls to provide updates on UAT defects.
  • Tracked defects and updated defect status in Quality Center 11, prepared new and enhanced scripts on the QTP platform, and performed regression testing, validating sequences and jobs through the different ETL stages.
  • Validated and reviewed ETL mapping and transformation specifications against the business requirements.
  • Supported the UAT team with data set-up, functionality query clarification and batch job execution, and prepared the SIT exit criteria and closure report documents for each release.
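The IST-versus-UAT structure comparison above was built as an Excel macro; below, the same idea sketched in Python, with sqlite3's PRAGMA table_info standing in for the real data-dictionary views:

```python
# Hypothetical Python analogue of the Excel macro: sqlite3's PRAGMA
# table_info stands in for the production data-dictionary views.
import sqlite3

def table_structure(conn: sqlite3.Connection, table: str) -> dict:
    """Return {column_name: declared_type} for one table."""
    return {name: ctype
            for _, name, ctype, *_ in conn.execute(f"PRAGMA table_info({table})")}

def diff_structures(ist_conn, uat_conn, table: str) -> None:
    """Report columns that exist in one environment only or change type."""
    ist = table_structure(ist_conn, table)
    uat = table_structure(uat_conn, table)
    for col in sorted(set(ist) | set(uat)):
        if ist.get(col) != uat.get(col):
            print(f"{table}.{col}: IST={ist.get(col)} UAT={uat.get(col)}")
```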

Confidential

System Analyst

RESPONSIBILITIES:

  • Understood the business requirements through team discussions and prepared the unit testing checklist and functional test cases.
  • Prepared the environment set-up using Apache 1.3.28 and J2SDK 1.4.2 to build the SIT environment.
  • Prepared test data, raised query clarifications with the development team to understand the functional requirements, and tracked them to closure.
  • Executed functional testing, regression testing and release management activities for the assigned modules.
  • Tracked defects and updated defect status in Quality Center; monitored the project, tracked execution, provided UAT support to clarify functional queries and conducted checkout during production implementation.
  • Attended a Hitachi training session on network storage devices and the RAID system, which helped in understanding the storage and network security of the file system.
  • Prepared a UNIX script to validate file copying between two servers, recording copy times for multiple very large files (see the sketch after this list), and used the CLMAN tool to prepare test case activities for functional scenarios.
  • With a complete understanding of the functional requirements, prepared around 2,000 functional test cases and set up the environment for SIT testing.
  • Performed manual and integration testing covering the network storage, tracked defects for UAT issues and interacted directly with the client to respond to their requests.
  • Installed the Neevia Document Converter and MSMQ software for the web application to prepare the environment for the testing phase; studied the functional requirements and raised queries for further clarification.
  • Used the OpenSTA tool to check the performance of the application, prepared test cases for each functional specification using the CLMAN tool and supported client requests.
  • Used the Neevia Document Converter to convert files of many types into PDF; in this project, multiple documents were converted into a single PDF and the order and sequence of the files was checked.
  • Used OpenSTA to check application performance when the system processed multiple files at the same time.
  • Used the CLMAN 3.7 tool to prepare test cases for different functional scenarios.
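The copy validation above was a UNIX shell script; the sketch below shows the same idea in Python, timing the copy and confirming the destination is byte-identical via checksum (paths are placeholders):

```python
# Hypothetical Python analogue of the UNIX copy-validation script; source
# and destination paths are placeholders.
import hashlib
import shutil
import time

def sha256(path: str) -> str:
    """Checksum a file in 1 MiB chunks so huge files do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def timed_copy(src: str, dst: str) -> float:
    """Copy src to dst, verify the contents match, return elapsed seconds."""
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    assert sha256(src) == sha256(dst), "copied file differs from source"
    return elapsed
```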
