
ETL QA Test Analyst Resume Profile - Oakbrook, IL


PROFESSIONAL SUMMARY

  • Over 7 years of Software Quality Assurance (QA) experience testing Data Warehouse, Database/ETL/BI, Web, and Client-Server systems and applications across various industries.
  • Experience in defining testing methodologies, creating Test Plans and Test Cases, and verifying and validating application software and documentation against development standards for effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
  • Strong in software analysis, planning, design, development, testing, maintenance and enhancement of applications in data warehousing, metadata repositories, data migration, data mining and enterprise Business Intelligence.
  • Expert in ETL, Data Warehousing, front-end and BI testing.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Extensive experience in writing SQL to validate the database systems and for backend database testing.
  • Extensively worked on design and implementation of Database Management Systems such as Oracle 10g / 9i / 8i / 7.x, DB2 UDB, MS SQL Server, and MS Access.
  • Implemented all stages of the development process, including extraction, transformation and loading (ETL) of data from various sources into Data Warehouses and Data Marts, using Informatica PowerCenter: Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapping and Mapplet Designer), Repository Manager, Workflow Manager and Workflow Monitor.
  • Good knowledge of HIPAA 4010/5010 versions.
  • Profound understanding of insurance plans such as HMO and PPO, and proven experience with HIPAA 4010 EDI transaction codes such as 270/271 (eligibility inquiry/response for health care benefits), 276/277 (claim status), 834 (benefit enrollment), 835 (payment/remittance advice) and 837 (health care claim).
  • Experienced in working with Business Users, Business Analysts, IT leads and Developers in identifying and gathering Business requirements to further translate requirements into functional and technical design specifications.
  • Worked extensively in UNIX environments.
  • Comprehensive knowledge of Ralph Kimball's data modeling concepts including Dimensional Data Modeling and Star/Snowflake Schema Modeling.
  • Extensive experience in performing Black Box, Regression, Integration, and User Acceptance testing.
  • Experienced working with Excel Pivot and macros for various business scenarios.
  • Effective both independently and in a team. Excellent communication and interpersonal skills, with the ability to convey technical information at all levels. Excels in research, analysis and problem solving.

TECHNICAL SKILLS

ETL Tools

Informatica PowerCenter 9.0/8.x/7.x, Ab Initio GDE 1.15/1.14/1.13/1.12 with Co>Operating System 2.15/2.14/2.12/2.11, EME, DataStage 7.x/6.x

Data Bases

Oracle 10g/9i/8i/7.x, Mainframe (via web3270), DB2 UDB, MS SQL Server 2008/2005/2000/7.0, MS Access, Teradata V2R6

Development Languages

SQL*Plus, T-SQL, PL/SQL 2.2/8.x, UNIX Shell Scripting, TOAD 7/7.6, SQL*Loader, VBA

Operating Systems

UNIX, Windows Vista/XP/2000/NT/98/95

BI Tools

Cognos 8.4, Business Objects XIR3, SSRS, SSAS, OLAP

Data Modeling Tools

ERwin 4.1 / 4.0

Methodologies

Ralph Kimball's Data modeling, Star and Snowflake Schema Modeling

Testing Tools

Mercury Quality Center 9.0/8.0, Test Director 7.6/7.0, QuickTest Professional 6.5/5.6, WinRunner 7.5/7.0

CAD and CAM Tools

AutoCAD 2000/14/12, MicroStation 95, GerbTool 7.3

Workflow Tools

PuTTY, WinSCP3, MS-Project, MS-Excel, MS-PowerPoint, MS-Word.

Management Tools

Peregrine ServiceCenter 5.1, HP Project and Portfolio Management Center ITG 7.1

PROFESSIONAL EXPERIENCE

Confidential

ETL QA Test Analyst

Responsibilities:

  • Analyzed Business Requirements and Design Specification documents to determine the functionality of the ETL processes.
  • Good understanding of EDI (Electronic Data Interchange) implementation and knowledge of HIPAA code sets.
  • Tested ETL Informatica mappings and other ETL processes (Data Warehouse testing).
  • Worked on HIPAA Transactions and Code Sets Standards according to the test scenarios, such as 270/271, 276/277 and 837/835 transactions.
  • Worked with Agile Methodology.
  • Performed HIPAA analysis and testing on 4010/5010 transactions, including claims handling, with payer and provider experience and EDI file generation and testing.
  • Involved in code changes for SAS programs and UNIX shell scripts
  • Tested SAS Programs, to create customized ad-hoc reports, processed data for publishing business reports.
  • Involved in testing the Cognos reports by writing complex SQL queries.
  • Created flow charts to exhibit the flow of data from source datasets to the final reports using MS VISIO.
  • Interacted with end users to obtain specific system requirements and for User Acceptance Testing (UAT).
  • Performed day-to-day Cognos administration activities such as monitoring scheduled jobs (cube refreshes, Impromptu scheduled reports) and backup and recovery maintenance.
  • Created Test input requirements and prepared the test data for Data Driven testing.
  • Generated reports using Informatica Power Analyzer.
  • Prepared extensive set of validation test cases to verify the data
  • Tuned stored procedures/scripts, SQL queries to improve the system performance
  • Developed Test Plan and Test Cases using HP Quality Center to test the application for the new code.
  • Prepared Test Cases for the mappings developed through the ETL Informatica tool and executed those Test Cases.
  • Used Quality Center as central repository to maintain the test scripts and print the status report of the same.
  • Worked with Rational Clear Quest for bug tracking.
  • Wrote several UNIX scripts for invoking data reconciliation.
  • Involved in extensive data validation using SQL queries and back-end testing.
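As an illustration of the data-reconciliation scripts mentioned above, a minimal sketch might compare row counts between a source extract and a target extract. The file names, pipe delimiter and demonstration data here are assumptions for illustration, not artifacts of the actual project:

```shell
#!/bin/sh
# Minimal sketch of a flat-file reconciliation check (illustrative;
# file names and demo data are assumptions, not from the project).

# Compare row counts of a source extract and a target extract.
reconcile_counts() {
    src_count=$(wc -l < "$1")
    tgt_count=$(wc -l < "$2")
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "PASS: row counts match ($src_count)"
    else
        echo "FAIL: source=$src_count target=$tgt_count"
    fi
}

# Demonstration with throwaway files.
printf 'a|1\nb|2\n' > /tmp/src_extract.dat
printf 'a|1\nb|2\n' > /tmp/tgt_extract.dat
reconcile_counts /tmp/src_extract.dat /tmp/tgt_extract.dat
```

A production reconciliation script would typically extend this with column-level sums or checksums and be scheduled alongside the ETL jobs.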

Environment: Informatica 8.6, HIPAA, EDI 5010/4010, Cognos 8 BI Report Studio, Cognos 7.3 Series, Oracle 10g, SQL Server 2005/2000, HP Quality Center 9.2, SQL, PL/SQL, Sybase, QTP, TOAD, XML, Korn Shell Scripts, UNIX, Windows XP

Confidential
ETL/BI Analyst (EDWH)
  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.
  • Worked with Informatica PowerCenter tools: Source Analyzer, Data Warehouse Designer, Mapping/Mapplet Designer and Transformations.
  • Verified that all data remained synchronized after troubleshooting, and used SQL to verify/validate test cases.
  • Worked on a Business Intelligence reporting system that ran primarily against an Oracle Applications OLTP environment, with Business Objects used for BI reporting.
  • Tested the reports using Business Objects functionalities like Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and Formulae etc.
  • Responsible for testing Business Reports developed by Business Objects XIR2.
  • Tested the reports created in Business Objects by running the corresponding SQL statements.
  • Supported the extraction, transformation and load (ETL) process for a Data Warehouse fed from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica for testing.
  • Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Wrote several UNIX scripts for invoking data reconciliation.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Performed testing based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Performed Functional, Regression, Data Integrity, System, Compatibility testing
  • Extensively executed T-SQL queries to view successful data transactions and to validate data in the SQL Server database.
  • Extensively used Informatica PowerCenter for the extraction, transformation and loading process.
  • Used TOAD to perform manual tests on a regular basis; wrote shell scripts and SQL queries in the project's UNIX and Oracle environments.
  • Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.
  • Experienced in writing test cases, test scripts, test plans and execution of test cases reporting and documenting the test results using Mercury Quality Center
  • Prepared Test status reports for each stage and logged any unresolved issues into Issues log.
  • Used T-SQL for Querying the SQL Server database for data validation.
  • Wrote test scripts for manual testing.
  • Created ETL test data for all the ETL mapping rules.
  • Prepared and supported the QA and UAT test environments.
  • Tested different detail, summary reports and on demand reports.
  • Communicated discrepancies determined in testing to impacted areas and monitored resolution.
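The duplicate-record identification described above can also be sketched at the UNIX level over a flat-file extract. The pipe delimiter, key column and sample data are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of duplicate-key detection on a pipe-delimited extract,
# mirroring the duplicate-record SQL check described above.
# Delimiter, key position and sample data are assumptions.

find_duplicate_keys() {
    # Print every key (first field) that occurs more than once.
    cut -d'|' -f1 "$1" | sort | uniq -d
}

printf '101|Smith\n102|Jones\n101|Smyth\n' > /tmp/feed_extract.dat
find_duplicate_keys /tmp/feed_extract.dat   # prints: 101
```

The equivalent warehouse-side check is a `GROUP BY ... HAVING COUNT(*) > 1` query against the key columns.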

Environment: SQL, PL/SQL, UNIX, DB2, Teradata V2R6 (MLOAD, FLOAD, FEXPORT, BTEQ, TPUMP), PuTTY, Shell Scripting, Business Objects XIR2/6.5.1/6.0/5.1.x/4.0, XML Files, IBM, Informatica PowerCenter 9.0/8.x/7.x, AutoSys, Oracle 10g, TOAD 10, CA ERwin 4.0

Confidential
ETL/BI Test Analyst
  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Carried out data profiling for multiple loan feeds.
  • Performed ETL testing based on ETL mapping document for data movement from source to target.
  • Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata.
  • Wrote several complex SQL queries to validate the Data Transformation Rules for ETL testing.
  • Wrote extensive UNIX shell scripts for data and text parsing needs, including archiving old data, running backend jobs and setting up job dependencies.
  • Performed extensive data validations against Data Warehouse
  • Loaded flat-file data into Teradata tables using UNIX shell scripts.
  • Responsible for verifying business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse using Informatica and Shell Scripting
  • Tested several Informatica Mappings to validate the business conditions.
  • Performed conditional testing of constraints based on the business rules.
  • Designed and executed the test cases on the application as per company standards and tracked the defects using HP Quality Center 9.2
  • Designed and prepared scripts to monitor uptime/downtime of different system components
  • Wrote test cases for ETL to compare source and target database systems.
  • Monitored the data movement process through Data Extract to flat files through Informatica execution flows and Loading data to Data mart through utilities.
  • Tested the ETL data movement from the Oracle Data Mart to Teradata on both incremental and full-load bases.
  • Developed the ETL process to automate the testing process and also to load the test data into the testing tables.
  • Used Excel Pivoting for various running totals, total sales for trades, highest performance of trades
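A script to monitor uptime/downtime of system components, as mentioned above, might be sketched as follows. The PID-based probe and the component names are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of a component up/down check of the kind described above.
# The PID-based probe and component names are assumptions.

check_component() {
    name="$1"; pid="$2"
    # kill -0 sends no signal; it only tests whether the PID exists.
    if kill -0 "$pid" 2>/dev/null; then
        echo "$name UP"
    else
        echo "$name DOWN"
    fi
}

# Demonstration: the current shell is alive; PID 99999999 is not.
check_component "etl_daemon" "$$"
check_component "report_svc" 99999999
```

In practice such a check would be driven by a list of monitored components and its output appended to a status log for review.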

Environment: VBA, SQL, PL/SQL, Excel Pivot, Informatica PowerCenter 8.6.1, Oracle 10g/9i, Mainframe via Web3270, VSAM Files, Copy Books, Cognos 8 BI, Cognos Connection, Cognos 8 BI Query Studio, DB2, WinSQL, UNIX (AIX), Linux, PuTTY, Mercury Quality Center 9.0/8.0, AutoSys, Teradata, TOAD, XML

Confidential
ETL Tester
  • Executed campaign based on customer requirements
  • Did extensive work with ETL testing including Data Completeness, Data Transformation Data Quality for various data feeds coming from source.
  • Did functional testing using QTP
  • Involved in automation of test cases using QTP.
  • Performed extraction, transformation and loading using different components and expressions in Ab Initio to create test data.
  • Used different Ab Initio components effectively to develop and maintain the database.
  • Developed inline view queries and complex SQL queries and improved the query performance for the same.
  • Delivered files in various formats, e.g. Excel, tab-delimited text, comma-separated text and pipe-delimited text.
  • Responsible for analyzing data from various heterogeneous sources such as flat files and relational databases (Oracle, DB2 UDB, MS SQL Server).
  • Followed company code standardization rules.
  • Identified issues, information and behaviors during the adoption of a proprietary information management system.
  • Accelerated the rate of adoption of the system, improved the quality of the data being input and generated, and promoted accountability among staff and users.
  • Developed code to introduce additional reports, reverse-engineered data models to their business meaning, and instructed users on the account management and implementation process advantages derived from the system.
  • Identified and documented deficiencies in the proprietary information management system during initial implementation.
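Delivering one feed in several delimited formats, as described above, can be sketched as a simple conversion step. The field layout is assumed, as is the absence of embedded delimiter characters inside field values (a real job would need proper quoting):

```shell
#!/bin/sh
# Sketch of a delimiter conversion for multi-format file delivery,
# as described above. Assumes no delimiter characters occur inside
# field values; a real delivery job would need proper quoting.

pipe_to_csv() {
    tr '|' ',' < "$1"
}

printf 'id|name|amount\n1|Acme|250\n' > /tmp/feed.txt
pipe_to_csv /tmp/feed.txt
# id,name,amount
# 1,Acme,250
```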

Environment: VBA, Excel, Ab Initio, ERwin 4.1, Oracle 9i, DB2, SQL Server 2000, SQL, PL/SQL, TOAD, SQL Developer, Mercury Quality Center 8.0, Microsoft Office 2003, Windows XP/2000

Confidential
ETL Analyst/Tester
  • Responsible for building the management process, making sure controls are being followed, assisting in the requirements design, and coordinating the efforts of performers in the Enterprise Information Management Department.
  • Performed unit and integration testing at the Informatica and database levels.
  • Created technical specifications, mapping documents and managed test cases.
  • Participated in development of an estimation tracking tool for level of effort projections.
  • Gathered information, compiled findings and prepared management reports for staffing levels.
  • Developed database applications for managing IT staffing requirements and monitoring the status of outstanding requisitions.
  • Wrote several complex SQL queries to validate the data conditions as per the mapping document.
  • Provided data analysis, identified trends, summarized results and generated reports for Card Solutions Delivery reorganization effort.
  • Extracted data from different sources such as Oracle, flat files and XML, loaded it into the Operational Data Store and tested the same.
  • Extensively used SQL and PL/SQL Procedures and Worked in both UNIX and Windows environment.
  • Worked on loading of data from several flat files to XML Targets.
  • Created UNIX shell scripts for Informatica ETL tool to automate sessions.
  • Developed graphic representation of various metrics used in the forecasting, budgeting and procurement processes for the Merchandising Department.
  • Utilized Access database to collect data from multiple database sources using ODBC methods.
  • Monitored sessions using the workflow monitor, which were scheduled, running, completed or failed. Debugged mappings for failed sessions.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Responsible for Unit Testing of the Mappings created according to the business requirements.
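A shell wrapper for automating Informatica sessions, as described above, might look like the sketch below. pmcmd is Informatica's standard command-line client, but the integration service, domain and folder names here are placeholders, and the PMCMD variable can be overridden to dry-run the wrapper without an Informatica installation:

```shell
#!/bin/sh
# Sketch of a session-automation wrapper around pmcmd (Informatica's
# command-line client). Service, domain and folder names are
# placeholders; PMCMD can be overridden for a dry run.

PMCMD=${PMCMD:-pmcmd}

start_workflow() {
    $PMCMD startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" \
        -p "$INFA_PASS" -f DW_FOLDER -wait "$1"
}

# Dry run: substitute echo for pmcmd to inspect the generated command.
PMCMD=echo
INFA_USER=qa_user
INFA_PASS=secret
start_workflow wf_load_claims
```

A scheduler such as AutoSys would then invoke the wrapper per workflow and act on its exit status.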

Environment: Oracle 7.3, SQL, PL/SQL, SQL*Plus, Windows 2000/NT/98/95, Microsoft Office 97, Informatica 6.1, UNIX, PERL, Shell Scripting, XML

Confidential

Backend/ETL/SQL Tester

Responsibilities:

  • Used Quest Software's TOAD application to set up data and to create and execute SQL scripts on an Oracle 9i database; closely involved in database testing.
  • Designed and developed ETL processes using the DataStage ETL tool for dimension and fact file creation.
  • Validated the complete source-to-target mappings along with the transformation rules.
  • Created and executed detailed test cases with step-by-step procedures and expected results.
  • Experience in ETL Data Warehousing, database testing using DataStage for Workflow process.
  • Extracted data from Relational Sources, Flat Files as well as Excel sheets.
  • Profound insight to determine priorities, schedule work, and meet critical deadlines within budget guidelines.
  • Experience in working independently and multi task without negative impact to timelines or quality.
  • Strong knowledge of the phases of the Software Development Life Cycle (SDLC) methodology.
  • Utilized Cognos 7.3 Series for building the Reports.
  • Experience in preparing the traceability matrix, test scenarios and test cases.
  • Involved in preparation of the Requirement Traceability Matrix (RTM), software metrics, defect reports, weekly status reports and SQA reports using Test Director.
  • Performed black-box testing by designing and constructing test cases, test data and test execution. Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Reviewed the test cases written based on the Change Request document.
  • Performed testing based on Change Requests and Defect Requests.
  • Prepared System Test Results after test case execution.
  • Experienced in writing UNIX scripts.
  • Extensively used PL/SQL features such as procedures, functions, packages database triggers for maintaining complex integrity constraints and implementing the complex business rules.
  • Tested data migration to ensure that integrity of data was not compromised.
  • Created database objects such as tables, views, synonyms, indexes and sequences as well as custom packages tailored to business requirements.
  • Created Indexes and partitioned the tables to improve the performance of the queries.
  • Tested Cognos ReportNet reports migrated into Cognos 8 BI.
  • Extensively worked with XML, Flat Files, DAT, CSV, Excel Sheets and Fixed Length Files
  • Designed and Developed partition exchange process to minimize user access downtime to the warehouse.
  • Developed SQL loader scripts to load the data into the custom tables.
  • Involved in debugging and tuning PL/SQL code, tuning queries and optimizing the Oracle database, and wrote UNIX Korn shell scripts to schedule the business process.

Environment: DataStage 7.x/6.x (Designer, Director, Manager, Parallel Extender), ERwin, Oracle 9i, TOAD, XML, XSLT, XML Spy 2008, SQL Server 2000, DB2, MS Access, XML files, PL/SQL, Cognos 8 BI Report Studio, Cognos 7.3 Series, UNIX, Shell, Windows NT
