ETL Tester Resume
Chicago, IL
SUMMARY:
- About 8 years of end-to-end ETL, BI, and Hadoop testing experience in the Insurance, Media Research, and Retail domains
- Worked extensively on requirement analysis, test planning, test case development, and test execution (the complete STLC) using Teradata, Oracle, SQL Server, Confidential SSIS, and ETL/BI tools in an Agile methodology
- Solid knowledge of and experience in writing complex SQL and Hive queries (joins, temporary tables) to perform ETL backend/database and migration testing
- Data testing experience focused on OLTP, OLAP, and MDM systems
- Experience in integration testing of various data sources such as SQL Server, Oracle, DB2, and flat files (CSV, PDF, etc.)
- Proficient in the ETL process and expert in testing SCD Type 2 data
- Proficient in Extraction, Transformation, and Loading (ETL) solutions
- Proficient in ETL and BI standards and best practices
- Expert in risk-based testing for new features
- Expert in prioritizing and executing test cases in Dev/QA/Prod environments
- Solid experience in preparing test estimates and test point analysis
- Expert in testing dimension and fact tables (dimensional modeling)
- Optimized SQL test scripts using techniques such as sub-tables, execution plans, collect statistics, and indexes to improve performance
- Expert in testing self-service BI reports in Oracle OBIEE by generating drill-down, slicing, dicing, and parameterized reports and graphs
- Good knowledge of designing and developing mappings using transformations such as Lookup, Source Qualifier, Router, Sequence Generator, Aggregator, Rank, Filter, Joiner, and Sorter
- Good knowledge of the Hadoop ecosystem, including Hive, Sqoop, Pig, and HBase
- Knowledge of API testing using the Postman tool and cross-browser UI testing; worked on testing web applications using Selenium and Firebug
- Worked extensively to ensure QA issues are resolved and appropriately documented using defect-tracking tools
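The SCD Type 2 testing mentioned above usually reduces to a couple of SQL assertions against the dimension; the sketch below illustrates the idea using Python's sqlite3 as a stand-in for the warehouse (all table and column names are invented for illustration, not from any actual project):

```python
import sqlite3

# Hypothetical SCD Type 2 dimension: one open-ended "current" row per key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    eff_date    TEXT,
    end_date    TEXT,      -- '9999-12-31' marks the open-ended current row
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES
    (1, 'Chicago',  '2015-01-01', '2016-06-30', 0),
    (1, 'Stamford', '2016-07-01', '9999-12-31', 1),
    (2, 'Dallas',   '2015-03-15', '9999-12-31', 1);
""")

# Check 1: every business key has exactly one current record.
bad_current = conn.execute("""
    SELECT customer_id
    FROM dim_customer
    GROUP BY customer_id
    HAVING SUM(is_current) <> 1
""").fetchall()

# Check 2: only current rows carry the open-ended end date.
bad_end_date = conn.execute("""
    SELECT customer_id
    FROM dim_customer
    WHERE (is_current = 1) <> (end_date = '9999-12-31')
""").fetchall()

print(bad_current, bad_end_date)   # both empty -> the dimension passes
```

Any row returned by either query would be logged as a defect against the Type 2 load logic.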
TECHNICAL SKILLS:
Databases: Teradata 13.x/14.x, Oracle 11g, DB Visualizer, SQL Server 2014, MySQL 5.x, Toad, HIVE
ETL Tools: DataStage, Informatica Power center 9.x, Informatica DVO & MDM & Address Doctor, Pentaho DI 4.x, Data Dictionary, Data flux, SQL Developer, Confidential SSIS
Reporting Tools: OBIEE 12c, Pentaho BI Community Edition, Confidential SSRS (SQL Server Reporting Services), Micro Strategy 9.4x
Version Control Tools: SharePoint, PVCS, WINCVS, Star Team
Operating Systems: All Windows client and server platform, Linux
Programming: HiveQL, SQL, T-SQL (Stored Procedures, Triggers), PL/SQL, Teradata SQL, UNIX shell scripting, XML, HTML, Python
Software: HUE (Big data), SuperPutty, Query Surge, JIRA 6.x, Issue Log, HP QC, HP ALM, WinSCP, TFS 2013, Postman, MS-Office, Confidential SharePoint, Win Merge, FileZilla, Firebug, Selenium, Control M, Jenkins
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
ETL Tester
Responsibilities:
- Understanding the project requirements, system functionality, and scope
- Developing test plans/strategy based on the requirements document (190/340)
- Developed test scripts per the ETL mapping documents
- Worked on complex Hive SQL scripting to ensure that metadata, object definitions, and test data are accurate in the Hadoop environment
- Tested MDM master data sourced from landing and staging to the target DB, verifying that outbound files are generated with golden records and fed to downstream systems
- Extensive experience in preparing root cause analysis (RCA) documents for data mismatches
- Understand the business logic used to populate dashboards and reports
- Expert in testing BI report data (drill-down, slice, dice, sub-reports) to ensure business logic and calculations are in sync with the warehouse
- Responsible for the execution of automated test scripts
- Streamlining the test phase through process improvements
- Involved in the study of existing operational systems, data modeling, and analysis
- Running the batch processing workflows
- Identify defects by debugging scripts and log those defects in Jira and HP QC
- Work collaboratively with developers and maintain detailed testing and RCA (root cause analysis) documents
- Identified business processes, related them to system flows, and addressed the gaps
- Identify technical staffing requirements and prepare estimates
- Lead and provide technical guidance for the offshore team
Environment: DataStage, Hadoop ecosystem, SOA, HiveQL, Pig, Sqoop, HBase, Python, UNIX, Oracle, HP QC, JIRA, WinSCP, Scrum Agile, Query Surge
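Backend validation like the Hive SQL scripting described in this role typically reconciles source and target with count checks and minus-style queries; a minimal sketch of the pattern, using sqlite3 in place of Hive/Teradata (the policy tables and figures are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_policy (policy_id INTEGER, premium REAL);
CREATE TABLE tgt_policy (policy_id INTEGER, premium REAL);
INSERT INTO src_policy VALUES (101, 250.0), (102, 300.0), (103, 175.5);
INSERT INTO tgt_policy VALUES (101, 250.0), (102, 300.0), (103, 175.5);
""")

# Count reconciliation: row counts must match after the load.
src_count = conn.execute("SELECT COUNT(*) FROM src_policy").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_policy").fetchone()[0]

# Minus-style checks in both directions (EXCEPT is SQLite's MINUS);
# any row returned is a data mismatch to log as a defect.
missing_in_tgt = conn.execute(
    "SELECT * FROM src_policy EXCEPT SELECT * FROM tgt_policy").fetchall()
extra_in_tgt = conn.execute(
    "SELECT * FROM tgt_policy EXCEPT SELECT * FROM src_policy").fetchall()

print(src_count == tgt_count, missing_in_tgt, extra_in_tgt)
```

Running the comparison in both directions catches both dropped rows and rows that appear in the target without a source counterpart.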
Confidential, Stamford, CT
ETL & BI Lead
Responsibilities:
- Analyze requirements documents and develop test cases to ensure full test data coverage
- Document test plans/cases and estimates in Confluence
- Tested business reports and dashboards to ensure the functionality meets end-user business requirements
- Expert in testing BI report data (drill-down, slice, dice, sub-reports) to ensure business logic and calculations are in sync with the warehouse
- Developing test scripts for the warehouse based on SSIS ETL mapping documents (D1/D2)
- Testing ETL data feeds from various data sources to the warehouse
- Continuously improve test coverage and the validation cycle through the design of test data sets on the SSIS platform
- Expert in testing ETL transformations from source systems to the warehouse
- Worked collaboratively with developers and maintained detailed testing documentation
- Coordinated with onshore/offshore resources in executing tasks
- Optimized SQL scripts using techniques such as sub-tables, execution plans, collect statistics, and indexes to improve performance
- Involved in the study of existing operational systems, data modeling, and analysis
- Running the workflows and populating the data into the data mart
- Resolve bugs in mappings by debugging across multiple platforms
- Designing report test cases and validating them across dashboards/browsers
- Identify high-impact defects by debugging scripts and log those defects in Jira/Confluence
- Work collaboratively with developers and business analysts on complex projects and maintain detailed documentation for exploratory/ad-hoc testing
- Walk through weekly status reports with both technical and non-technical stakeholders
Environment: Confidential SSIS, MDM architecture, Teradata, Oracle OBIEE, Win Merge, Toad, Data flux, FileZilla, Selenium IDE, Firebug, Query Surge, SAFe Agile
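Report-versus-warehouse checks like those in this role often compare a dashboard aggregate directly against the fact table it should reflect; a hedged sketch of that comparison, with sqlite3 standing in for the warehouse and hypothetical region totals standing in for figures exported from a report:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_sales (region TEXT, amount REAL);
INSERT INTO fact_sales VALUES
    ('East', 100.0), ('East', 150.0), ('West', 200.0);
""")

# Aggregate exactly as the report's business logic defines it ...
warehouse_totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM fact_sales GROUP BY region"))

# ... and compare against the figures taken from the report/dashboard.
report_totals = {"East": 250.0, "West": 200.0}

mismatches = {r: (report_totals.get(r), warehouse_totals.get(r))
              for r in set(report_totals) | set(warehouse_totals)
              if report_totals.get(r) != warehouse_totals.get(r)}

print(mismatches)   # empty -> report is in sync with the warehouse
```

Iterating over the union of keys ensures regions missing from either side surface as mismatches rather than being silently skipped.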
Confidential
ETL Tester
Responsibilities:
- Developing test plans based on the business requirements document
- Understand the transformation logic used to populate warehouse tables
- Developed test scripts per the requirements documents
- Worked on building complex SQL test scenarios to test Electronic Data Interchange (EDI) 834/835 files
- Testing data with metadata, MDM, and full minus queries to ensure test data accuracy
- Extensively used ETL methodologies for testing and supporting data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica
- Expert in testing dimension and fact tables
- Responsible for the execution of software test scenarios in TEST/QA/PROD environments
- Understanding the project's data modeling and relationships
- Running the batch processing workflows
- Performing regression testing on design changes
- Experienced in preparing test estimations and test strategy, developing test plans and detailed test cases, writing test scripts by decomposing business requirements, and developing test scenarios to support on-time quality deliverables; reported bugs in Jira and Mantis
- Communicated effectively with developers and the business team to stay current with design changes
- Performed peer reviews on developed test scenarios and scripts
Environment: Teradata, Informatica, DataStage, Ab Initio, Netezza, Data flux, SQL Developer, Mantis, Jira, Query Surge, Scrum
Confidential
ETL / SQL Developer
Responsibilities:
- Understanding the functionality, scope, and overall architecture of the system
- Involved in requirements walkthroughs to identify the development and unit-testing scope and feasibility
- Understand the data mapping from source to target tables and the business logic used to populate the target tables
- Developing mappings from source to target based on the ETL mapping documents
- Developed scripts per the ETL mapping documents
- Worked on monitoring and production support activities
- Prepared unit testing documentation and defect log sheets for developed mappings
- Running the workflows and populating the data into the data mart
- Involved in the study of existing operational systems, data modeling, and analysis
- Resolve bugs in mappings by debugging
- Designing unit test cases and test scripts per the ETL mapping logic
- Log mismatches in the bug tracking tool
- Document results that are inconsistent with expected results
- Coordinating with data analysts and the QA team for quick resolution of testing-related issues
Environment: Teradata, Data flux, Pentaho, Informatica, Waterfall, Agile
Confidential
ETL QA
Responsibilities:
- Involved in data mapping from source to target tables and the business logic used to populate the target tables
- Generating ER diagrams using the data dictionary and understanding the hierarchy
- Perform basic DML and DDL operations per the business requirements
- Migrated data from heterogeneous data sources (Access, Excel, flat files, Oracle) to a centralized Teradata database
- Created objects per the user requirements
- Involved in the development and maintenance of database systems to extract, cleanse, transform, load, and maintain data
- Worked on data analysis and table-level analysis
- Involved in debugging scripts and performing root cause analysis on issues
- Involved in peer reviews and unit testing
- Actively participated in project review meetings
- Involved in developing backend code; altered tables to add new columns, constraints, sequences, and indexes per business requirements
- Adhere to the defined delivery process/guidelines
Environment: Teradata, Data flux, SQL Server, MySQL