Sr. ETL SQL BI DWH Test Analyst Resume

St Petersburg, FL

SUMMARY

  • 7+ years of IT experience in manual and automated testing of web applications, data warehouses, client-server applications, database migrations, and databases.
  • Experience in all phases of the Software Development Life Cycle under models such as Agile, Scrum, Waterfall, and RUP.
  • Good understanding of ETL Test Processes and Test Methodologies.
  • Extensive experience with bug tracking and source code/version control systems, bug reporting, root cause analysis, documenting test results, and meeting delivery schedules.
  • Proficient in the data warehouse and backend application testing life cycle.
  • Strong understanding of Test Methodologies and Test Plans in Project Life Cycles.
  • Experience in preparing test strategies, developing test plans and test cases, and writing test scripts by decomposing business requirements and developing test scenarios to support quality deliverables.
  • Experience in unit testing, integration testing, system testing, functionality, regression, performance testing, and user acceptance testing.
  • Extensive experience in testing and implementing extraction, transformation, and loading of data from multiple sources into data warehouses using Ab Initio, Informatica, and SSIS (see the count reconciliation sketch after this list).
  • Expertise in developing PL/SQL packages, stored procedures/functions, and triggers.
  • Expertise in the Oracle utility SQL*Loader, in Toad for developing Oracle applications, and in Teradata SQL Assistant for Teradata.
  • Highly experienced with different RDBMSs such as Oracle 10g, MS SQL Server 2000, and Teradata.
  • Worked with Informatica transformations such as multi-joins of heterogeneous flat files, connected/unconnected Lookups, Stored Procedures, and Routers.
  • Experience in Performance Tuning of SQL and Stored Procedures.
  • Automated and scheduled Informatica jobs using UNIX shell scripting; strong at UNIX shell scripting and automation of the ETL process.
  • Data warehousing application experience in banking, prime brokerage, financial services, retail, and insurance.
  • Experience in scheduling Informatica, Ab Initio, and DataStage jobs using Appworx, Control-M, and Autosys.
  • Performed Unit, Integration and System Testing.
  • Good analytical and problem solving skills.
  • Team player with excellent Organization and Interpersonal skills and ability to work independently and deliver on time.
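
For illustration, a minimal sketch of the kind of source-to-target count reconciliation query this sort of ETL testing relies on; the schema and table names (stg.customer_stg, dw.dim_customer) are hypothetical.

    -- Hypothetical source-to-target row count reconciliation;
    -- a mismatch between the two counts flags a load problem to investigate.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM stg.customer_stg
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM dw.dim_customer;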

TECHNICAL SKILLS

ETL Tools: Informatica 8.6.1, Ab Initio (GDE 1.15, Co>Op 2.15), SSIS

Databases: Teradata V2R6, Oracle 10g/9i/8i/7.3, MS SQL Server 6.5/7.0/7.5/2005, Sybase 12.0/11.0, MS Access

BI/Reporting Tools: Business Objects XI R3.1/R2/6.5/6.0, Cognos BI 8.0

QA Tools/Bug Reporting: QTP 9.5/9.2/9.0, Rational ClearQuest, Test Director, HP Quality Center, PVCS Tracker, Remedy, MS Access, Bugzilla, Siebel Service Request

Version Control: Synergex PVCS Tracker, Rational ClearCase 7.1/7.0.1/7.0.0, Visual SourceSafe 2005, Concurrent Versions System (CVS)

Programming: TSL, SQABasic, JavaScript, VBScript, VBA, Shell Scripting, XSLT, Perl, HTML, C, C++, Java, Visual Basic 5.0, AS/400 CL

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Operating Systems: Windows XP/NT/2000, DOS, Mac OS X 10.4, AIX, UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential, St Petersburg, FL

SR. ETL SQL BI DWH Test Analyst

Responsibilities:

  • Involved in project planning, coordination and implementation of QA methodology based on the Business requirement and Design documents.
  • Created test plan and test cases from the business requirements to match the project’s initiatives in Quality Center.
  • Used a comprehensive approach to data warehouse testing, i.e., validating the data at each transformation point: first the source tables, then the staging tables, the loading tables, and finally the destination tables.
  • Extensively used Informatica for the extraction, transformation, and loading process and validated the tables using SQL queries.
  • Validated the data files from source systems to make sure the correct data was captured for loading into target tables.
  • Extensively used the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.
  • Extracted data from Oracle and uploaded it to Teradata tables using the Teradata FastLoad utility.
  • Ran several Informatica UNIX scripts from the backend to get data counts and verify the reconciliation process.
  • Created public folders, private folders, personalized pages, and custom views on the Cognos Connection portal.
  • Extensively wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Validated the ETL load process to make sure the target tables were populated according to the data mapping provided and satisfied the transformation rules.
  • Wrote complex SQL queries using case logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle (a MINUS-based validation is sketched after this list).
  • Wrote SQL scripts to test the mappings and extensively used Cognos for report generation.
  • Used SQL and PL/SQL scripts to perform backend database testing.
  • Identified duplicate records in the staging area before the data was processed and loaded by Informatica into the data warehouse.
  • Checked for any inconsistent joins and resolved loops in the catalog.
  • Tested UNIX shell scripts that automate the ETL process: loading, pulling data for testing ETL loads, and checking the status of ETL jobs in UNIX.
  • Arranged meetings with business customers to better understand the reports, using business naming conventions.
  • Involved in validating the aggregate table based on the rollup process documented in the data mapping.
  • Tested mappings for extracting, cleansing, transforming, integrating, and loading data using the Informatica ETL tool.
  • Created and maintained release jobs in Control-M and submitted Transition Service Requests on behalf of the release.
  • Involved in testing the Cognos reports by writing complex SQL queries.
  • Tested different master detail, summary reports, ad-hoc reports and on demand reports using Cognos Report Studio.
  • Distributed the reports to the users via Cognos Connection.
  • Involved in testing Cognos reports and worked closely with operations and release teams to resolve production issues.
  • Extensively involved with backend testing by writing complex SQL queries.
  • Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
  • Checked the reports for naming inconsistencies and to improve user readability.
  • Retested the modifications once bugs were fixed and the application was reinstalled.
  • Made sure the QA design and best practices were in compliance with Albertson’s guidelines.
  • Developed presentations and shared testing implementation learnings with other testing resources for cross-functional knowledge transfer.
  • Experience with job scheduling tools like Control-M.
  • Reported the bugs through email notifications to developers using Rational ClearQuest.
  • Generated Problem Reports for the defects found during execution of the Test Cases and reviewed them with the Developers. Worked with Developers to identify and resolve problems.
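
As referenced in the list above, a minimal sketch of the MINUS-based source-to-target comparison and the staging-area duplicate check; the table and column names (stg.orders_stg, dw.fact_orders, order_id, order_amt) are assumptions for illustration.

    -- Rows present in staging but missing or different in the target;
    -- an empty result means the load matched on these columns.
    SELECT order_id, order_amt FROM stg.orders_stg
    MINUS
    SELECT order_id, order_amt FROM dw.fact_orders;

    -- Duplicate records in the staging area before the Informatica load.
    SELECT order_id, COUNT(*) AS dup_cnt
    FROM stg.orders_stg
    GROUP BY order_id
    HAVING COUNT(*) > 1;

An empty result from both queries is the pass condition; any returned row becomes a defect candidate to triage.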

Environment: Informatica 9.1/8.6.1, Cognos 8.4 series, TOAD, DB2, Oracle 11g, Teradata 12.0, Teradata SQL Assistant 7.0, SQL, PL/SQL, XML Files, XSD, XML Spy 2008, HP QTP 11.0, HP ALM Quality Center 11.0, Rational ClearQuest 7.0, Control-M, JavaScript, HTML, MS SQL Server 2010, Shell Scripting, UNIX.

Confidential, Downers Grove, IL

Sr. ETL SQL QA BI Data warehouse Test Engineer

Responsibilities:

  • Involved in project planning, coordination and implementation of QA methodology based on the Business requirement and Design documents.
  • Analyzed system issues and prepared business and technical requirements documents using established CMM Level 2 standards.
  • Created test plan and test cases from the business requirements to match the project’s initiatives in Quality Center.
  • Used a comprehensive approach to data warehouse testing, i.e., validating the data at each transformation point: first the source tables, then the staging tables, the loading tables, and finally the destination tables.
  • Extensively used Ab Initio for the extraction, transformation, and loading process and validated the tables using SQL queries.
  • Validated the data files from source systems to make sure the correct data was captured for loading into target tables.
  • Extensively used the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.
  • Extracted data from Oracle and uploaded it to Teradata tables using the Teradata FastLoad utility.
  • Ran several Ab Initio UNIX scripts from the backend to get data counts and verify the reconciliation process.
  • Created public folders, private folders, personalized pages, and custom views on the Cognos Connection portal.
  • Extensively wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Validated the ETL load process to make sure the target tables were populated according to the data mapping provided and satisfied the transformation rules.
  • Wrote complex SQL queries using case logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle.
  • Wrote SQL scripts to test the mappings and extensively used Cognos for report generation.
  • Used SQL and PL/SQL scripts to perform backend database testing.
  • Identified duplicate records in the staging area before the data was processed and loaded by Ab Initio into the data warehouse.
  • Checked for any inconsistent joins and resolved loops in the catalog.
  • Tested UNIX shell scripts that automate the ETL process: loading, pulling data for testing ETL loads, and checking the status of ETL jobs in UNIX.
  • Arranged meetings with business customers to better understand the reports, using business naming conventions.
  • Involved in validating the aggregate table based on the rollup process documented in the data mapping (see the rollup check sketched after this list).
  • Tested graphs for extracting, cleansing, transforming, integrating, and loading data using the Ab Initio ETL tool.
  • Created and maintained release jobs in Control-M and submitted Transition Service Requests on behalf of the release.
  • Involved in testing the Cognos reports by writing complex SQL queries.
  • Tested different master detail, summary reports, ad-hoc reports and on demand reports using Cognos Report Studio.
  • Distributed the reports to the users via Cognos Connection.
  • Involved in testing Cognos reports and worked closely with operations and release teams to resolve production issues.
  • Extensively involved with backend testing by writing complex SQL queries.
  • Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
  • Checked the reports for naming inconsistencies and to improve user readability.
  • Retested the modifications once bugs were fixed and the application was reinstalled.
  • Made sure the QA design and best practices were in compliance with Albertson’s guidelines.
  • Developed presentations and shared testing implementation learnings with other testing resources for cross-functional knowledge transfer.
  • Experience with job scheduling tools like Control-M.
  • Reported the bugs through email notifications to developers using Rational ClearQuest.
  • Generated Problem Reports for the defects found during execution of the Test Cases and reviewed them with the Developers. Worked with Developers to identify and resolve problems.
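
A sketch of the aggregate-table rollup validation referenced above, assuming a hypothetical detail table dw.sales_detail and a one-row-per-region aggregate table dw.sales_agg.

    -- Recompute the rollup from the detail table and compare it with the
    -- stored aggregate; any row returned is a region whose totals drifted.
    SELECT d.region_id,
           SUM(d.sales_amt)       AS detail_total,
           MAX(a.total_sales_amt) AS agg_total
    FROM dw.sales_detail d
    JOIN dw.sales_agg a
      ON a.region_id = d.region_id
    GROUP BY d.region_id
    HAVING SUM(d.sales_amt) <> MAX(a.total_sales_amt);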

Environment: Ab Initio Co>Op 2.15, Ab Initio GDE 1.15, EME, Cognos 8.1 series, TOAD, DB2, Oracle 10g, Teradata V2R6, Teradata SQL Assistant, SQL, PL/SQL, XML Files, XSD, XML Spy 2008, QTP 9.5, Mercury Quality Center 10.0, Rational ClearQuest 7.0, Control-M, JavaScript, HTML, MS SQL Server 2008, Shell Scripting, UNIX.

Confidential, San Francisco, CA

Sr. ETL Tester/QA Analyst/Data warehouse - Data Migration

Responsibilities:

  • Worked with the business analysis team to prepare detailed business requirement documents.
  • Involved in writing & implementation of the test plan, various test cases & test scripts.
  • Developed unit test cases, configured environments and prepared test data for testing.
  • Performed data quality tests to determine whether processed data was correctly extracted, transformed, and loaded.
  • Performed functionality, stress, security, GUI, and regression testing of web-based applications.
  • As a QA tester, performed both functional and backend testing.
  • Wrote SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.
  • Worked with Production environment to resolve several data issues.
  • Experience in integrating various data sources with multiple relational databases like Oracle and SQL Server, plus MS Excel, MS Access, and flat files.
  • Used PL/SQL programs for performance testing and wrote PL/SQL and complex SQL queries for system testing.
  • Performed incremental development/testing of the Enterprise Data Warehouse.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling the data.
  • Tested several reports and automated using Quick Test Pro.
  • Written several Stored Procedures and modified existing stored procedures.
  • Involved with extraction routines to verify all the required data loaded into target systems.
  • Extensively executed SQL queries in order to view successful transactions of data and for validating data.
  • Extensively used Informatica tool to extract, transform and load the data from Oracle to DB2.
  • Involved in analyzing the scope of testing the application; created a test strategy to test all modules of the data warehouse.
  • Created the test data for interpreting positive/negative results during functional testing.
  • Involved in the whole testing life cycle, covering the entire project from initiation to completion.
  • Clearly communicated and documented test results and defect resolution during all phases of testing.
  • Focused on data quality issues/problems including completeness, conformity, consistency, accuracy, duplicates, and integrity (sample checks are sketched after this list).
  • Used SQL queries and custom programs to perform data validation.
  • Prepared test cases by understanding the business requirements, Data Mapping documents and technical specifications.
  • Developed and documented data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Involved in the development of Informatica Mappings and also tuned them for better performance.
  • Designed and directed the preparation of test data; prepared test data, documented detailed results, and provided regular and ad-hoc test reports.
  • Performed data analysis and data testing, and verified that the fields present in the reports matched the ETL specifications.
  • Verified the Business Objects universe with appropriate Joins, Aliases and Aggregate Awareness tables and built customized Classes and Objects.
  • Verified the applied security features of Business Objects, like report-level and object-level security in the universe, so as to keep sensitive data secure.
  • Tested reports developed from Business Objects universes for both ad-hoc and canned reporting users of Business Objects XI R3.1.
  • Tested that the Business Objects WebI reports were created and deployed through InfoView.
  • Tested whether the Business Objects reports developed in OLAP met company standards.
  • Used the ETL data transformation tool and Business Objects for data mining and front-end reporting.
  • Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
  • Maintained the test logs, test reports, test issues, defect tracking using Quality Center.
  • Involved in preparing the Requirements Traceability Matrix (RTM), software metrics, defect reports, weekly status reports, and the SQA report using Quality Center.
  • Worked closely with the development teams to replicate end-user issues.
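
As referenced in the data quality bullet above, two illustrative probes of the kind used for completeness and integrity checks; the names (dw.fact_sales, dw.dim_customer, customer_key) are hypothetical.

    -- Completeness: fact rows that lost their customer key during the load.
    SELECT COUNT(*) AS null_customer_keys
    FROM dw.fact_sales
    WHERE customer_key IS NULL;

    -- Integrity: fact rows whose customer key has no matching dimension row.
    SELECT f.customer_key, COUNT(*) AS orphan_rows
    FROM dw.fact_sales f
    LEFT JOIN dw.dim_customer c
      ON c.customer_key = f.customer_key
    WHERE c.customer_key IS NULL
    GROUP BY f.customer_key;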

Environment: Oracle 9i, SQL Server 2000, Informatica PowerCenter 7.1, SQL, PL/SQL, Stored Procedures, HP QTP 9.0, HP Quality Center 9.2, Points Portal, Autosys, MS Office Suite, Cognos 7.0 series, HTML, Java, JSP, JavaScript, ClearCase, RUP, TOAD, Requisite Pro.

Confidential, Herndon, VA

SR. ETL Tester

Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Tested and found defects in universes and reports; used Mercury Quality Center for tracking the defects.
  • Tested all data reports published to the web, including dashboards and summarized, master-detail, and KPI reports.
  • Worked as an ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
  • Created shortcut joins, aliases, and contexts to make the universe loop-free.
  • Established a test reporting environment combining Distribution Work Management and PeopleSoft information in a consolidated place, allowing combined reports to be generated.
  • Developed ETL test plans based on the test strategy; created and executed test cases and test scripts based on the test strategy, test plans, and the ETL mapping document.
  • Extensively used Ab Initio for the extraction, transformation, and loading process in the project.
  • Observed the extract, transform, load process with Ab Initio, tested the data load process using the Ab Initio ETL tool, and validated the data using SQL queries.
  • Tested and worked on creating openDocument reports for the business.
  • Used various @Functions like @Prompt (for user defined queries), @Where (For creating conditional filters), and @Select for testing Business Reports with various boundary conditions.
  • Prepared technical specifications and source-to-target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, and packages to implement business rules and security.
  • Experienced using query tools for Oracle to validate reports and troubleshoot data quality issues.
  • Solid testing experience in working with SQL Stored Procedures, triggers, views and worked with performance tuning of complex SQL queries.
  • Validated format of the reports and feeds.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Worked with ETL group for understanding Ab Initio graphs for dimensions and facts.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Designed and created complex mappings using SCD Type II, involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy, and Filter (an SCD Type II sanity check is sketched after this list).
  • Optimized/tuned several complex SQL queries for better performance and efficiency.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Worked on issues with migration from development to testing.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling the data.
  • Tested the ETL Ab Initio graphs and other ETL processes (data warehouse testing).
  • Written several shell scripts using UNIX Korn shell for file transfers, data parsing, text processing, job scheduling, job sequencing and cleanup process.
  • Validated cube and query data from the reporting system back to the source system.
  • Used the ETL data transformation tool and Business Objects for data mining and front-end reporting.
  • Tested several business reports developed using Business Objects including dashboard, drill-down, summarized, master-detail & Pivot reports.
  • Responsible for testing business reports developed with Business Objects XI R2.
  • Tested Business Objects reports to verify and validate the data; tested several complex reports generated by the reporting tool, including dashboards, summary reports, master-detail, drill-down, and scorecards.
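
A minimal sketch of the SCD Type II sanity check referenced above: each natural key should carry at most one open (current) row. The table and column names (dw.dim_customer, customer_nk, eff_end_date) and the NULL end-date convention are assumptions.

    -- Natural keys with more than one open row violate the Type II design.
    SELECT customer_nk, COUNT(*) AS open_rows
    FROM dw.dim_customer
    WHERE eff_end_date IS NULL  -- some designs use a high date such as 9999-12-31
    GROUP BY customer_nk
    HAVING COUNT(*) > 1;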

Environment: Ab Initio (GDE 1.12, Co>Op 2.12), EME, SQL*Loader, Business Objects XI R2, PL/SQL, SQL, Stored Procedures, Test Cases, Test Scripts, Test Plan, Oracle 8i/9i, SQL Server 2000/2005, Erwin 3.5/4.1, Windows 2000, TOAD 7

Confidential, Hopewell, NJ

Sr. ETL QA Backend Tester

Responsibilities:

  • Reviewed business requirement documents and technical specifications.
  • Worked with Informatica for the extract, transform, and load process and loaded data from source databases to the target database.
  • Documented Test Cases corresponding to business rules and other operating conditions.
  • Tested multiple Informatica mappings for data validation and data conditioning.
  • Carried out ETL Testing using Informatica.
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Extensively wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Involved in developing test cases to test Teradata scripts (BTEQ, MultiLoad, FastLoad).
  • Updated the QA team on testing status and regularly reported accomplished tasks for the assigned work to the Project Management team.
  • Wrote complex SQL scripts to validate target data based on the business requirements.
  • Implemented data verification procedures for ETL processes in load testing.
  • Wrote SQL queries to validate that actual test results matched expected results (see the comparison sketch after this list).
  • Checked naming standards, data integrity, and referential integrity.
  • Responsible for monitoring data for porting to current versions.
  • Checked the reports for any naming inconsistencies and to improve user readability.
  • Involved in testing data reports; wrote several SQL queries to validate the front-end data against the backend.
  • Compared actual results with expected results; validated the data by a reverse engineering methodology, i.e., backward navigation from target to source.
  • Used ClearQuest for defect tracking and reporting, updated bug status, and discussed with developers to resolve the bugs.
  • Responsible for testing the reports according to client’s requirement using Business Objects 6.5.1.
  • Prepared defect reports and test summary report documents.
  • Checked the status of ETL jobs in UNIX.
  • Interacted with Business Analyst, Database Administrators, development team and End Users.
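
A sketch of the actual-versus-expected comparison referenced above, assuming a tester-prepared expected-results table qa.expected_balance alongside a hypothetical target table dw.acct_balance.

    -- A full outer join surfaces value mismatches and rows missing on either side.
    SELECT COALESCE(a.acct_id, e.acct_id) AS acct_id,
           a.balance AS actual_balance,
           e.balance AS expected_balance
    FROM dw.acct_balance a
    FULL OUTER JOIN qa.expected_balance e
      ON e.acct_id = a.acct_id
    WHERE a.acct_id IS NULL
       OR e.acct_id IS NULL
       OR a.balance <> e.balance;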

Environment: Informatica 7.1, Teradata V2R5, DB2, SQL, PL/SQL, Flat Files, SQL Assistant 7.2, UNIX, Autosys, Rational ClearCase, XML Files, VSAM Files, Business Objects 6.5.1, JCL, COBOL II, Quality Center 8.0, Perl Scripting, Shell Scripting

Confidential

Backend Developer/Tester

Responsibilities:

  • Used SQL to query the DB and created/modified SQL scripts to validate backend data.
  • Interacted with the Business Users for user acceptance testing.
  • Created Test Cases using the SDLC procedures and reviewed them with the Test lead and Development team.
  • Executed all the test cases in the test environment, maintained them, and documented the test queries and results for future reference.
  • Developed complex mappings using Informatica Power Center Designer to transform and load the data from various source systems like SQL Server, DB2 into the Oracle target database.
  • Worked with memory caches (static and dynamic) for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Created and scheduled worklets; set up workflows and tasks to schedule the loads at the required frequency using Workflow Manager.
  • Worked in the performance tuning of the programs, ETL Procedures and processes.
  • Developed test cases, prepared test data sets, performed system integration testing defined by business rules and executed through ETL.
  • Validated the ETL load process to make sure the target tables were populated according to the data mapping provided and satisfied the transformation rules.
  • Validated the data files from source systems to make sure the correct data was captured for loading into target tables.
  • Extensive experience using SQL to develop queries.
  • Full lifecycle design and development in the Teradata database, including dimensional data modeling, the data dictionary, and the metadata repository.
  • Experience in documenting traceability of requirements to tests and capturing evidence of test pass/fail.
  • History of successful analysis, metrics, and risk/change management.
  • Validated the archive process to purge data that met the defined business rules (a retention check is sketched after this list).
  • Deployed code to the UNIX boxes for testing purposes.
  • Involved in validating the aggregate table based on the rollup process documented in the data mapping.
  • Developed a traceability matrix from the requirements document and detailed design.
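
A sketch of the archive/purge validation referenced above; the table name, date column, and seven-year retention window are illustrative assumptions, not the project's actual rule.

    -- After the purge runs, no active rows should remain past the retention cutoff.
    SELECT COUNT(*) AS unpurged_rows
    FROM dw.fact_txn
    WHERE txn_date < ADD_MONTHS(CURRENT_DATE, -84);  -- 84 months = assumed 7-year retention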

Environment: Informatica 6.1, SAP, Business Objects 6.5, Teradata V2R4, Queryman, Oracle 8i, Test Director 7.5, JavaScript, HTML, Java, Control-M, HP-UX, StarTeam, HP Service Manager, Web Services, WebLogic, Windows XP, UNIX Shell Scripting, XML Files, VSAM Files, COBOL II, JCL, Flat Files.
