
ETL Tester Resume


VA

SUMMARY

  • Over 6 years of experience in the IT industry across all phases of software development, with emphasis on Quality Assurance & ETL Testing.
  • Experience in reviewing and testing of data in relational databases.
  • Experience in Integration, Functional, Regression, System, Load, UAT & Black Box testing.
  • Planned, documented and tested an extensive data integration process (PL/SQL) in support of financial reference data.
  • Tested reports developed by Business Objects Universes for both Adhoc & Canned Reporting users of Business Objects XI R
  • Responsible for creating manual test scripts to include Functional Test, Regression Test, UAT, Data Migration Test and Study Configuration Test.
  • Used SQL Profiler for troubleshooting, monitoring and optimization of SQL Server and non-production database code, as well as T-SQL code from developers and QA.
  • Used different DataStage components effectively to develop and maintain the database.
  • Extensive experience in using Tableau functionalities for creating different Reports, Interactive Dashboards with page and dashboard Prompts.
  • Created ETL test data for all ETL mapping rules to test the functionality of the ETL components.
  • Good knowledge and experience in Metadata and Star schema/Snowflake schema.
  • Reviewed database business and functional requirements, reviewed mapping transformations.
  • Tested ETL applications in relational databases and flat files using Informatica, and tested the generated reports.
  • Interacted with Users to understand the bugs in Business Reports generated from OBIEE
  • Preparation of Test Cases based on ETL Specification Document, Use Cases, Low Level Design document.
  • Proficient in quality assurance testing, both manually and using automation tools (QTP, ALM/Quality Center).
  • Extensive experience with the ETL tools Informatica PowerCenter, IBM DataStage and Ab Initio.
  • Used Informatica Data Validation Option (DVO), with its large set of pre-built operators, to build this type of ETL testing with no programming skills required.
  • Supported presales activity for data-centric warehouse testing projects.
  • Worked with the ETL group to understand DataStage graphs for dimensions and facts.
  • Experienced in coordinating resources within Onsite-Offshore model
  • Strong experience with ETL systems, with expertise in Informatica and DataStage.
  • Strong experience in testing Data Warehouse (ETL) and Business Intelligence (BI) applications.
  • Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading.
  • Experience in Dimensional Data Modeling using Star and Snow Flake Schema
  • Strong working experience in the Data Analysis and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL)
  • Experience in maintaining Full and Incremental loads through ETL tools on different environments.
  • Experienced in Defect Management using ALM, HP Quality Center
  • Good experience in writing SQL for data validation during migration as part of backend testing.
  • Good exposure with databases like Oracle 10g, DB2, SQL Server 2012
  • Good expertise in using TOAD, SQL*Plus and Query Analyzer.
  • Worked with different data sources ranging from VSAM and flat files to DB2, Oracle and SQL Server databases.
  • Good experience working with quality assurance methodologies like Waterfall, Scrum and Agile.
  • Experience in preparing Test Strategy, developing Test Plans and Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
  • Well versed with writing detailed test cases for functional & nonfunctional requirements.
  • Experience in Automation using Selenium.
  • Experience in using Bugzilla, Jira and HP Quality Center for Bug Tracking and Defect Reporting.
  • Proficiency in Back-End Testing/Database Testing specifically in developing and executing SQL queries to interact with databases.
  • Experience in Data Warehousing/UAT Testing within the environment of Oracle, SQL Server, Informatica PowerCenter 9.1/8.6/8.1.
  • Validating the data files from source to make sure correct data has been captured to be loaded to target tables.
  • Clear understanding of business procedures and ability to work as an individual and as a part of a team.
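The load-completeness checks described above usually start with a simple row-count reconciliation between source and target. A minimal sketch, with hypothetical table names and sqlite3 standing in for the actual warehouse databases:

```python
import sqlite3

# In-memory database stands in for the warehouse; table names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cam")])
cur.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cam")])

def row_count(table):
    # Basic completeness check: the source count must equal the target count.
    return cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src, tgt = row_count("src_customers"), row_count("tgt_customers")
print("counts match" if src == tgt else f"mismatch: {src} vs {tgt}")
```

In practice the same query runs against the real source and target connections, and a mismatch is logged as a defect before any column-level comparison begins.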

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Informatica Analyst

Data modeling tools: Erwin, MS Visio

Reporting tools: OBIEE, Business Objects, MS Excel

Databases: Oracle 11g/10g/9i, SQL Server 2008, DB2

Big Data: HDFS, MapReduce, Hive, Impala, Sqoop

Languages: SQL, PL/SQL, C, UNIX Shell Scripting

Operating Systems: UNIX, Windows 10/8/7, Linux, Sun Solaris

DB Tools: TOAD, SQL *Plus, PL/SQL Developer

Scheduler: Autosys

Testing Tools: HP Quality Center, HPQC-ALM11, HPQC-ALM12, DOORS, VersionOne

MS Suite/ Project Tools: MS Office (Word, Excel, PowerPoint, Outlook), MS Project, MS SharePoint

PROFESSIONAL EXPERIENCE

Confidential, VA

ETL Tester

Responsibilities:

  • Responsible for coordinating testing across both the ODS and EDW systems, which were handled by two different teams.
  • Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards
  • Promoted Unix/Informatica application releases from development to QA and to UAT environments
  • Proficient in quality assurance testing, both manually and using automation tools (QTP, ALM/Quality Center).
  • Involved in backend testing for the front end of the application using SQL queries in the Teradata database.
  • Used Agile test methods to provide rapid feedback to the developers, significantly helping them uncover important risks.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various sources systems including Oracle and Teradata
  • Used Control-M for job scheduling.
  • Experience in leading the offsite project, managing a team of offshore and onsite consultants.
  • Responsible for leading the team and coordinating with the offshore team.
  • Developed data quality test plans and manually executed ETL and BI test cases.
  • Worked in Waterfall and AGILE Methodology.
  • Experienced working with Customer, Item and Vendor Data Marts for business reporting through MDM.
  • Tested data quality for MDM, customer data integration and Match-Merge process.
  • Tested Address Standardization process for MDM.
  • Tested the data acquisition, data transformation and data cleansing approach for the MDM implementation.
  • Designed and kept track of Requirement Traceability Matrix
  • Updated Quality Center, loaded and wrote test cases, wrote the Test Plan, executed Test Cases and printed status reports for the team meetings.
  • Created and Developed Reports and Graphs using ALM
  • Involved in Integrating and Functional System Testing for the entire Data Warehousing Application.
  • Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.
  • Performed data quality analysis using advanced SQL skills.
  • Tested slides for data flow and process flows using PowerPoint and Microsoft Visio
  • Wrote various documents describing DVO functionality and how DVO should be used by the group, based on discussions with the team lead.
  • Performed the Extract, Transform and Load (ETL) testing from source to target, utilizing Informatica PowerCenter and the Data Validation Option (DVO).
  • Developed test cases, test plan to ensure accurate coverage of requirements and business processes.
  • Executed test cases to verify accuracy and completeness of ETL process.
  • Validated the ETL load process to make sure the target tables are populated according to the data mapping provided, satisfying the transformation rules; good understanding of BI business models.
  • Involved in Functional Testing, Regression Testing, Integration & System Testing.
  • Tracked defects using JIRA and generated defect summary reports.
  • Prepared documentation for some of the recurring defects and resolutions and business comments for those defects.
  • Extensive knowledge of shell scripts and understanding of the source systems for ETL and DWH needs and downstream applications.
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Tuned ETL jobs/procedures/scripts, SQL queries, PL/SQL procedures to improve the system performance.
  • Worked as Onsite Coordinator for getting the work done from offshore team.
  • Tested and maintained daily, monthly and yearly reports.
  • Extensively tested several Cognos reports for data quality, fonts, headers & cosmetic issues.
  • Involved in Writing Detailed Level Test Documentation for reports and Universe testing.
  • Involved in data warehouse testing by checking ETL procedures/mappings
  • Implemented and maintained tools required to support data warehouse testing.
  • Performed the tests in the FIT, QA and contingency/backup environments.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
  • Worked on test data and completed unit testing to check that all business rules and requirements are met; also tested with negative data to check that the job fails on any critical error.
  • Validated the Data from files loaded into the HDFS through Python scripting.
  • Validated the Flat files load to the HDFS in Hadoop and validated the Raw, Archival zones in the HDFS.
  • Validated the MapReduce jobs.
  • Validated the Mainframe source files loaded into the Hive/Impala databases.
  • Validated the data in SQL server across the data loaded in Hive/Impala using python scripts and shell scripts.
  • Validated data using Hive queries for analyzing data in the Hive warehouse using Hive Query Language (HQL).
  • Ran the scripts on the edge node server to land the files into HDFS and ran the loop wrapper scripts to validate and populate the data into Hive/Impala
  • Tested several data migration applications for security, data protection and data corruption during transfer.
  • Helped the testing team create test cases to make sure the data originating from the source makes it into the target in the right format.
  • Functioned as the Onsite / Offshore coordinator and Team Lead
  • Tested Cognos reports and written test cases using HP Quality Center.
  • Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
  • Written several complex SQL queries for validating Cognos Reports.
  • Created different user defined functions for applying appropriate business rules
  • Involved in testing the OBIEE reports by writing complex SQL queries
  • Created, optimized, reviewed, and executed SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables.
  • Written several complex SQL queries for validating OBIEE Reports.
  • Performed analysis of Mapping Documents and 'Schema Compare' with Database tables, logged the defects and worked with the Database Modeling Team to resolve them.
  • Used Agile testing methodology for achieving deadlines in UAT.
  • Tested several UNIX shell scripts for file validation.
  • Executed and monitored jobs through Autosys.
  • Verified the logs to identify the errors occurred during the ETL execution.
  • Built Complex SQL queries and used TOAD for Oracle, WinSQL for DB2 and SQL Server BI Studio for SQL Server and Teradata SQL Assistant for Teradata.
  • Worked with all kinds of Ab Initio components, including Dedup, Denormalize, Normalize, Rollup, Scan, Reformat, Redefine, Sort, Joiner, XML Read, XML Write, FBE and Partition components.
  • Created positive and negative test cases to test the Business rules.
  • Developed UNIX scripts to validate the flat files and to automate the manual test cases.
  • Developed a Test Matrix to give a better view of the testing effort.
  • Involved in communication between UAT team members and business.
  • Attended defect resolution meetings with the development teams and worked towards bug resolution.
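The source-to-target validation queries described in these bullets are typically set-difference checks: rows in the source that never reached the target, and rows in the target that match nothing in the source. A minimal sketch with illustrative table names, using sqlite3's EXCEPT (the equivalent of Oracle's MINUS) in place of the actual Oracle/Teradata databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

# Rows present in source but missing or transformed incorrectly in target,
# and vice versa. Non-empty results are logged as defects.
missing_in_tgt = cur.execute(
    "SELECT * FROM src_orders EXCEPT SELECT * FROM tgt_orders").fetchall()
extra_in_tgt = cur.execute(
    "SELECT * FROM tgt_orders EXCEPT SELECT * FROM src_orders").fetchall()
print("in source only:", missing_in_tgt)
print("in target only:", extra_in_tgt)
```

When the mapping applies transformation rules, the source side of the query first reproduces the rule (derivations, lookups, type casts) so that the difference isolates genuine load defects.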

Environment: Informatica 10x/9.1, MDM, Oracle 11i/11g/12c, TOAD for Data Analysts, TOAD 9.7, MLOAD, Teradata SQL Assistant 6.0, Hive, Impala, HDFS, Python, SQL Server 2005, SQL, PL/SQL, SOA Test, SoapUI, Postman, Control-M, Cognos 8.0 Series, Mainframe Flat Files, COBOL II, Korn Shell Scripting, DB2, Ab Initio CO>OP 2.15, GDE 1.15, OBIEE, Informatica Power Exchange 10.2, Quality Center & ALM 11.0, JIRA, Agile, UNIX, Windows XP/8

Confidential, VA

ETL Tester

Responsibilities:

  • Reviewed requirements together with QA Manager, QA Lead & Business Analyst.
  • Responsible for creating complete test cases, test plans, test data, and reporting status, ensuring accurate coverage of requirements and business processes.
  • Involved in the error checking and testing of the ETL procedures and programs
  • Tested Ab Initio graphs and used Ab Initio as an ETL tool to load the final loads
  • Tested all OBIEE Dashboards according to the requirement
  • Analyzing requirements and creating and executing test cases.
  • Validating the reporting objects in the reporter against the design specification document.
  • Validating the data files from source to make sure correct data has been captured to be loaded to target tables.
  • Responsible for testing initial and daily loads of ETL jobs.
  • Experienced in working with DB2, Teradata, SQL Server, Oracle etc.
  • Tested Address Standardization process for MDM.
  • Tested the data acquisition, data transformation and data cleansing approach for the MDM implementation.
  • Designed and kept track of Requirement Traceability Matrix
  • Updated Quality Center, loaded and wrote test cases, wrote the Test Plan, executed Test Cases and printed status reports for the team meetings.
  • Created and Developed Reports and Graphs using ALM
  • Involved in Integrating and Functional System Testing for the entire Data Warehousing Application.
  • Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.
  • Performed data quality analysis using advanced SQL skills.
  • Tested slides for data flow and process flows using PowerPoint and Microsoft Visio
  • Wrote various documents describing DVO functionality and how DVO should be used by the group, based on discussions with the team lead.
  • Performed the Extract, Transform and Load (ETL) testing from source to target, utilizing Informatica PowerCenter and the Data Validation Option (DVO).
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Tested several data validation graphs developed in Ab Initio environment.
  • Developed and executed SQL queries using SQL*Plus for data verification.
  • Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
  • Designed and developed UNIX shell scripts as part of ETL process to automate the loading and pulling the data for testing ETL loads.
  • Extensively written test scripts for back-end validations.
  • Worked on delimited flat file sources.
  • Involved in writing complex SQL queries to verify data from Source to Target
  • Used Quality Center for creating and documenting Test Plans and Test Cases and register the expected results.
  • Tested ETL mappings to extract and load data from different databases such as Oracle, SQL Server and flat files, and loaded them into Oracle.
  • Tested complex SQL scripts for Teradata database for creating BI layer on DW for tableau reporting.
  • Worked with all kinds of components with Ab Initio including Dedup, Denormalize, Normalize, Rollup, Scan, Reformat, Redefine, Sort, Joiner, XML Read, XML Write, FBE, Partition Components.
  • Tested the reports like Drilldown, Drill Up and Pivot reports generated from OBIEE
  • Used TOAD for SQL Server to write SQL queries for validating constraints, indexes.
  • Tested records with logical delete using flags.
  • Developed and executed various manual testing scenarios and neatly documented the process to perform Functional testing of the application.
  • Involved in testing the ETL mappings by validating whether the mappings adhere to the development standards and naming conventions, whether the mappings do what the technical design says they should do, and whether the mappings work correctly in relation to other processes in the data flow.
  • Used Rapid SQL tool to query the DB2 data and verified the results.
  • Provided weekly status report to the Project Manager and discuss issues related to quality and deadlines.
Environment: Ab Initio CO>OP 2.15, GDE 1.15, Co-Op 3.0.1, OBIEE, Informatica PowerCenter 9.6, Informatica Power Exchange 9.6, MDM, Hive, Impala, DB2, Teradata SQL Assistant, Teradata V2R6, SQL & PL/SQL, Oracle 11g, Outlook, and Excel
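The delimited flat-file checks mentioned above (field counts, mandatory fields) are commonly scripted before the file is allowed into the staging area. A sketch with a made-up pipe-delimited layout; the three-field record format is purely illustrative:

```python
import io

# Hypothetical pipe-delimited feed: id|name|amount (3 fields per record).
# io.StringIO stands in for the real file handle.
feed = io.StringIO("101|Ann|10.50\n102|Bob|20.00\n103||15.25\n")

errors = []
for lineno, line in enumerate(feed, start=1):
    fields = line.rstrip("\n").split("|")
    if len(fields) != 3:                 # structural check: field count
        errors.append((lineno, "bad field count"))
    elif not fields[1]:                  # mandatory-field check: name present
        errors.append((lineno, "missing name"))

print(errors)
```

A rejected-record report like `errors` feeds the defect log; the same loop structure extends to type checks and referential lookups against staging tables.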

Confidential, MD

ETL Tester

Responsibilities:

  • Responsible for creating complete test cases, test plans, test data, and reporting status, ensuring accurate coverage of requirements and business processes.
  • Analyzed requirements and created and executed test cases.
  • Tested reports in Cognos using Analysis studio, Report studio and query studio.
  • Experienced in analyzing issues by checking the log files in the AIX environment.
  • Used Mercury Quality Center for Test Planning, Test Designing, Test Analysis, Test Execution, Defect Tracking and Test Result
  • Extensive querying using TOAD to run SQL queries and monitor quality & integrity of data.
  • Extracted Data from Teradata using Informatica PowerCenter ETL and DTS packages to the target database including SQL server and used the data for Reporting purpose.
  • Actively participated in creating requirements Traceability matrices and Test plans
  • Tested graphs for extracting, cleansing, transforming, integrating, and loading data using Ab Initio ETL Tool.
  • Interacted with Users to understand the bugs in Business Reports generated from OBIEE
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Involved in formulating the test plan and procedures, and wrote test cases.
  • Involved in Data Extraction from Teradata and Flat Files using SQL assistant.
  • Conducted and actively participated in reviews, walkthroughs of test cases.
  • Involved in creating the test data for generating sample test reports before releasing to production.
  • Wrote complex SQL scripts using joins, sub queries and correlated sub queries.
  • Performed Integration, End-to-End, system testing.
  • Mapped metadata graphs from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
  • Performed front-end testing on OBIEE Executive Dashboard portal
  • Involved in Functional Testing & Regression Testing.
  • Developed and executed SQL queries using SQL*Plus for data verification.
  • Also worked on Integration of all the processes in UAT.
  • Validated the data in SQL server across the data loaded in Hive/Impala using python scripts and shell scripts.
  • Validated data using Hive queries for analyzing data in the Hive warehouse using Hive Query Language (HQL).
  • Ran the scripts on the edge node server to land the files into HDFS and ran the loop wrapper scripts to validate and populate the data into Hive/Impala.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Involved in the projects related to FACETS, NASCO, FEP, NCAS, SALESFORCE, CVS Caremark, Argus, CMS, Davis Vision, Magellan, Healthy Blue, ICD 10, ACA (Affordable Care Act), Lab Corp, MLR, Quest etc.
  • Experienced in creating multiple XML files for 5010 & ICD10 837 & 835 XML manually using XML SPY.
  • Tested total life cycle modules of Health Care Solutions (Enrolling the Providers, Approve Providers in Process Manager, Member enrollment, claims submissions, Flexi Financial Services).
  • Validated Provider Certifications according to HIPAA standards.
  • Worked with EDI transactions (270,271,276,277,837,835,997) and interfaces testing.
  • Created Provider authorizations, submitted claims (Professional, Institutional and Dental) and validated claim adjudication in QNXT.
  • Created Claims Flexi Financial payment cycles and validated the Claims payments in the SQL Server Database.
Environment: Ab Initio CO>OP 2.15, GDE 1.15, OBIEE, Informatica PowerCenter 9.6, Informatica Power Exchange 9.6, Teradata SQL Assistant, Teradata V2R6, PowerConnect, SQL Server 2000, SQL*Loader, Oracle 9i
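Several of the checks above (Ab Initio Dedup validation, claims loaded into SQL Server) reduce to verifying that a natural key is unique in the target after the load. A minimal sketch; the member table and its key are hypothetical, and sqlite3 stands in for the target database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tgt_members (member_id INTEGER, name TEXT)")
# Seeded with one deliberate duplicate to show what the check catches.
cur.executemany("INSERT INTO tgt_members VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (2, "Bob")])

# After a Dedup step, the natural key should be unique in the target;
# any row returned here is a defect.
dupes = cur.execute(
    "SELECT member_id, COUNT(*) FROM tgt_members "
    "GROUP BY member_id HAVING COUNT(*) > 1").fetchall()
print("duplicate keys:", dupes)
```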

Confidential, DC

QA Engineer

Responsibilities:

  • Responsible for dealing with developers and business analysts to better understand requirements, functionality and business process for QA testing.
  • Created Initial test plan and developed test cases and test scripts manually.
  • Reviewed Business requirements, IT Design documents and prepared Test Plans which involved various Test Cases for all assigned module/projects.
  • Involved in writing white-box test cases based on the User and Business Requirements.
  • Documented manual testing procedures for the entire application with strong emphasis on regression and integration testing.
  • Tested the reports generated by OBIEE and verified and validated the reports using SQL.
  • Worked on Test Cases and attached them in Bugzilla.
  • Validated the data in SQL server across the data loaded in Hive/Impala using python scripts and shell scripts.
  • Validated data using Hive queries for analyzing data in the Hive warehouse using Hive Query Language (HQL).
  • Ran the scripts on the edge node server to land the files into HDFS and ran the loop wrapper scripts to validate and populate the data into Hive/Impala.
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing).
  • Involved in the projects related to FACETS, NASCO, FEP, NCAS, SALESFORCE, CVS Caremark, Argus, CMS, Davis Vision, Magellan, Healthy Blue, ICD 10, ACA (Affordable Care Act), Lab Corp, MLR, Quest etc.
  • Experienced in creating multiple XML files for 5010 & ICD10 837 & 835 XML manually using XML SPY.
  • Tested total life cycle modules of Health Care Solutions (Enrolling the Providers, Approve Providers in Process Manager, Member enrollment, claims submissions, Flexi Financial Services).
  • Validated Provider Certifications according to HIPAA standards.
  • Worked with EDI transactions (270,271,276,277,837,835,997) and interfaces testing.
  • Created Provider authorizations, submitted claims (Professional, Institutional and Dental) and validated claim adjudication in QNXT.
  • Created Claims Flexi Financial payment cycles and validated the Claims payments in the SQL Server Database.
  • Created, prioritized and attached bugs in Bugzilla.
  • Updated manual testing procedures as and when application functionality changed.
  • Worked with development team to fix defects and re-test manually.
  • Interacted with Developers and Business Analysts to perform various types of testing throughout Software Testing Life Cycle (STLC) and Bug Life Cycle (BLC).
  • Extensively tested several OBIEE reports for data quality
  • Database Change Verification testing using Oracle-TOAD and PL/SQL queries/procedures.
  • Verified the application on different web browsers such as IE, Mozilla Firefox and Opera.
  • Tested the web-based application on different operating systems such as Windows 7 and UNIX.
  • Developed and documented complete testing process with well-written test cases.
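The database change verification described above (new columns introduced by a change request, backfilled values) can be scripted as a schema-plus-content check. A sketch with illustrative names, using sqlite3 in place of Oracle/TOAD:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Table as it looks after a hypothetical change request added a status column.
cur.execute("CREATE TABLE claims (claim_id INTEGER, status TEXT)")
cur.executemany("INSERT INTO claims VALUES (?, ?)",
                [(1, "PAID"), (2, "DENIED")])

# Verify the new column exists in the schema...
cols = [row[1] for row in cur.execute("PRAGMA table_info(claims)")]
# ...and that it was backfilled for every existing row.
nulls = cur.execute(
    "SELECT COUNT(*) FROM claims WHERE status IS NULL").fetchone()[0]
print("status column present:", "status" in cols, "| unfilled rows:", nulls)
```

Against Oracle the schema half of the check would query the data dictionary (e.g. ALL_TAB_COLUMNS) instead of SQLite's PRAGMA, but the structure of the test is the same.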

Environment: Informatica 8x, MDM, Oracle, TOAD for Data Analysts, MLOAD, Teradata SQL Assistant, Hive, Impala, HDFS, Python, SQL Server, SQL, PL/SQL, SOA Test, SoapUI, Postman, Control-M, Cognos, Mainframe Flat Files, COBOL II, Korn Shell Scripting, DB2, Ab Initio CO>OP, OBIEE, Informatica Power Exchange, Toad, Quality Center & ALM, JIRA, Agile, UNIX, Windows XP
