
ETL/QA Analyst Resume Profile


Albany, NY

Summary:

  • 7 years of IT experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP and Client/Server applications.
  • Worked on numerous ETL testing and BI testing projects.
  • Solid back-end testing experience writing and executing SQL and PL/SQL queries.
  • Experience in Data Analysis, Data Validation, Data Cleansing, Data Standardization, Data Verification and identifying data mismatches.
  • Excellent testing experience in all phases and stages of the Software Testing Life Cycle and Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling.
  • Very good understanding of Data Warehousing concepts, Data Analysis, Data Warehouse Architecture and Design.
  • Experience in Data Modeling using Erwin in Client/Server and distributed applications development.
  • Ability to quickly adapt to different project environments, work in teams and accomplish difficult tasks independently within the given timeframe.
  • Expert in writing SQL queries and Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
  • Experience in UNIX shell scripting and configuring cron jobs for ETL loads.
  • Extensive experience in providing end-to-end business intelligence solution by dimensional modeling design, building business models, configuring metadata, Answers, Delivers/iBots, creating Reports/Dashboards, cache monitor, and building Siebel Analytics/OBIEE Repository.
  • Extensively used ETL methodology to support data extraction, transformation and loading in a corporate-wide ETL solution using tools such as Informatica, SSIS, DataStage and Ab Initio.
  • Expertise in testing complex business rules by creating mappings and various transformations.
  • Strong working experience with DSS (Decision Support System) applications and Extraction, Transformation and Load (ETL) of data from legacy systems.
  • Extensive experience working with databases including Oracle 10g/9i/8i and MS SQL Server 2008/2005/2000/7.0.
  • Strong in testing Stored Procedures, Functions, Triggers and packages utilizing PL/SQL.
  • Good experience in Data Modeling using Star Schema and Snowflake Schema, and well versed with UNIX shell wrappers, KSH and Oracle PL/SQL programming.
  • Extensive experience in writing SQL to validate database systems and for back-end database testing (see the sketch after this list).
  • Good experience with data sources, data profiling, data validation and developing low-level design patterns based on business and functional requirements.
  • Excellent communication, analytical and interpersonal skills.
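
A minimal sketch of the kind of back-end validation query referenced above; the table and column names (stg_customer, dw_customer_dim, customer_id) are hypothetical and for illustration only:

    -- Compare row counts between a hypothetical staging source and its warehouse target.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM stg_customer
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM dw_customer_dim;

    -- Keys present in the source but missing from the target (Oracle MINUS).
    SELECT customer_id FROM stg_customer
    MINUS
    SELECT customer_id FROM dw_customer_dim;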

Technical Skills:

OPERATING SYSTEMS

Windows XP/NT/95/2000, OS/2, Sun Solaris 2.6/2.7, Linux 6.x/7.x/8.x

LANGUAGES KNOWN

C, PL/SQL 8.x/2.x, SQL Plus, SAS 9.1/8.1

RDBMS

Oracle 7.X/8/9i/10g, Teradata V2R6, MS SQL Server 6.5/7.0/2000, MS Access 7.0/'97/2000, SQL Server 11.x/12.0

REPORTING TOOLS

Business Objects XI R3/R2/R1/6.0/5.1, OBIEE 10.1.3, Cognos 6.5/7.0/8.0

SCRIPTING LANGUAGES

VB Script, Java Script, XSLT, PERL, UNIX Shell Scripting

DATABASES

IBM DB2 8.x, Oracle 9i/10g, Teradata V2R6, Sybase 12.3, SQL Server 2000/2005, Netezza NPS 8050

TECHNOLOGIES

Active X, OLEDB, and ODBC

WEB SERVERS

Java Web Server 2.0, Netscape Enterprise Server, WebLogic 6.0

DATA MODELING TOOLS

Erwin 3.5.1/4.x, Designer 2000

ETL TOOLS

Informatica PowerCenter / PowerMart 8.6.1/8.1/7.1/6.1/5.1, DataStage 8.1/7.x, ETL, Data Mining, DataMart, OLAP, OLTP, SQL Loader, TOAD 7.5.2, DataFlux, WinSQL

Apr 2013 to Present

Confidential

Role: ETL/QA Analyst

Responsibilities:

  • Participated in analysis of business and functional requirements; reviewed and understood the product specification document.
  • Developed test scripts based on defined test cases utilizing the testing tool.
  • Involved in developing detailed test plan, test cases and test scripts for Functional, Security and Regression Testing.
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Wrote complex SQL queries for data validation for verifying the ETL mapping rules.
  • Involved in extensive DATA validation using SQL queries and back-end testing.
  • Used SQL tools to run SQL queries and validate the data loaded into the target tables.
  • Tested UNIX Shell Scripts and SQLs to get data from Oracle tables before executing Informatica workflows.
  • Prevented occurrences of multiple runs by flagging processed dates.
  • Tested records with logical deletes using flags.
  • Identified duplicate records in the staging area before the data was processed (see the sketch after this list).
  • Extensively wrote test scripts for back-end validations.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Extensively used Informatica PowerCenter to run workflows and individual sessions; used the Workflow Monitor and session logs in the UNIX environment to analyze defects while running workflows.
  • Validated the data transformations developed by DB2 stored procedures and Informatica.
  • Parameterized the scripts to validate the different Data Fields.
  • Tested the source and target systems; extracted data from Teradata tables and uploaded the data to Oracle.
  • Performed smoke testing, system testing, user acceptance testing, regression testing and client update testing.
  • Ran test scripts and captured test results.
  • Extensively worked on data validation of the data in the OBIEE reports and OBIEE dashboards.
  • Responsible for tracking issues encountered during testing of the various modules of the application.
  • Performed quality assurance reviews across all System Development Life Cycle phases.
  • Consolidated bugs at the end of the day using Rational Clear Quest.
  • Designed, built and tested reports, dashboards and other functionality within the OBIEE suite.
  • Took specifications for defect reports and created those reports.
  • Interacted with developers on bug fixes and problem resolution.
  • Coordinated meetings with Business, Project Managers, Technical Analysts and Developers.
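
A minimal sketch of the staging-area duplicate check and processed-date guard mentioned above; the table and column names (stg_claims, etl_batch_control, processed_flag) are hypothetical:

    -- Flag natural keys that appear more than once in the staging table before the load runs.
    SELECT claim_id, load_date, COUNT(*) AS dup_cnt
    FROM   stg_claims
    GROUP  BY claim_id, load_date
    HAVING COUNT(*) > 1;

    -- Guard against multiple runs: check whether the business date is already flagged as processed.
    SELECT business_date
    FROM   etl_batch_control
    WHERE  business_date = TO_DATE('2013-06-30', 'YYYY-MM-DD')
    AND    processed_flag = 'Y';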

Environment: Informatica PowerCenter 8.6.1, UNIX Korn Shell Scripting, Windows XP, Rational ClearQuest 8.0, Oracle 11g/10g, TOAD, Oracle SQL Developer, OBIEE 11.1.1.5, SQL, PL/SQL, MS Excel, Rational ClearCase 8.0

Confidential

ETL/QA Analyst, Medicaid Data Mart

Responsibilities:

  • Analyzed the Business and User requirements and Participated in the creation, preparation, and conduct of quality assurance reviews and Functional/Regression testing.
  • Reviewed project documentation, business requirements to prepare detailed test schedules and plans.
  • Performed Integration, System, Regression, Performance and User Acceptance testing of an application.
  • Wrote test scripts based on the business requirements and executed functional testing, data validation and data reconciliation with defect correction and retesting, followed by regression and performance testing (see the reconciliation sketch after this list).
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Involved in Testing of ETL processes using DataStage ETL tool for dimension and fact file creation
  • Used SQL for Querying the DB2 database in UNIX environment
  • Tested Oracle applications and portals such as the Executive Dashboard, drill-down reports, summary reports and the portal built on Oracle.
  • Extracted data from Oracle and uploaded it to Teradata tables using the Teradata FASTLOAD utility.
  • Used Teradata SQL Assistant to access the Teradata database in the DEV, QA and Production environments.
  • Used TOAD for querying Oracle and Teradata SQL Assistant for querying Teradata.
  • Involved in testing the extraction of data from different source systems into the target database using DataStage Server 8.1.
  • Responsible for different data mapping activities from source systems to Teradata.
  • Executed DataStage ETL jobs using UNIX to load the data into target tables.
  • Involved in creating both positive and negative test data to cover all business scenarios.
  • Used Clear Quest to track and report system defects and bug fixes. Written modification requests for the bugs in the application and helped developers to track and resolve the problems.
  • Involved in other test planning meetings and submitted test metrics daily to the management
  • Participated in defining and executing test strategies using agile methodology.
  • Wrote and executed functional and regression test scripts using QTP.
  • Designed, built and tested reports, dashboards, BI Publisher and Microsoft integration, and other functionality within the OBIEE suite.
  • Tested different types of customized reports (drill-down, aggregation) created in OBIEE to meet client requirements.
  • Recorded and programmed QuickTest scripts through Expert View by adding conditional statements, loops and other user-defined functions.
  • Involved in Performance testing, Regression testing, Unit Testing, System Testing and User Acceptance Testing.
  • Ran SQL queries and stored procedures for database testing to verify the results retrieved.
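
A minimal sketch of the data reconciliation referenced above, assuming the Oracle extract has been staged into Teradata (for example via FastLoad); the table and column names (stg_member_extract, medicaid_dm.claim_fact, paid_amount) are hypothetical:

    -- Compare row counts and a control total between the staged extract and the target fact table.
    SELECT 'EXTRACT' AS side, COUNT(*) AS row_cnt, SUM(paid_amount) AS total_paid
    FROM   stg_member_extract
    UNION ALL
    SELECT 'TARGET'  AS side, COUNT(*) AS row_cnt, SUM(paid_amount) AS total_paid
    FROM   medicaid_dm.claim_fact;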

Environment: DataStage 8.1, Oracle 10g, IBM DB2, QTP, TOAD, OBIEE 10.1.3.1, SQL, PL/SQL, MS Excel, Teradata V2R6, MLOAD, FLOAD, FEXPORT, TPUMP, BTEQ, Teradata SQL Assistant, Windows 2000, Windows XP, Rational Clear Quest, PERL, UNIX Shell Scripting, Control M

Confidential

ETL QA Analyst

Responsibilities:

  • Involved in understanding Logical and Physical Data model using Erwin Tool.
  • Wrote test plans, test strategies, test scripts and test cases; reported bugs and tracked defects using Quality Center 10.0.
  • Performed all aspects of verification and validation, including functional, structural, regression, load and system testing.
  • Wrote complex SQL queries for data validation to verify the ETL mapping rules (see the sketch after this list).
  • Collected requirements and tested several business reports.
  • Extensively tested the reports for data accuracy and universe-related errors.
  • Tested several dashboards and deployed them across the organization to monitor performance.
  • Involved in user training sessions and assisted in UAT (User Acceptance Testing).
  • Experience testing data conversions and migrations in cross-platform scenarios using Informatica PowerCenter.
  • Strong ability in developing advanced SQL queries to extract, manipulate and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Tested standard and ad-hoc Impromptu reports.
  • Tested multidimensional cubes using models and implemented incremental updates for PowerCubes.
  • Created Impromptu reports and modified existing reports per user requirements.
  • Experience in creating UNIX scripts for file transfers and file manipulation.
  • Developed and executed Informatica Workflows and verified the results.
  • Created ETL execution scripts for automating jobs.
  • Involved in extensive DATA validation using SQL queries and back-end testing
  • Tested the different sources, such as flat files, mainframe legacy flat files, SQL Server 2005 and Oracle, loaded into the Teradata data warehouse.
  • Wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Performed periodic cross-checks against the QA/SIT/PROD environments to ensure they were up and running.
  • Tested various drill-through reports from report to report.
  • Performed front-end testing on the Data Portal and the Executive Sales Dashboard portal.
  • Tested the Merchandise Data Mart (DM) of the Oracle system and its stored procedures.
  • Tested the Operational Data Store process of the ODS/DWL product.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
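
A minimal sketch of a mapping-rule verification query like those referenced above; the rule (the target column should hold the trimmed, upper-cased source value) and the table names (stg_orders, dw_order_fact) are hypothetical:

    -- Return any rows where the target value does not match the mapping rule applied to the source.
    SELECT s.order_id,
           s.cust_name AS source_value,
           t.cust_name AS target_value
    FROM   stg_orders s
    JOIN   dw_order_fact t
           ON t.order_id = s.order_id
    WHERE  t.cust_name <> UPPER(TRIM(s.cust_name));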

Environment: Informatica PowerCenter 8.6.1, XML, XSLT, XSD, WinSQL, Windows NT 4.0, SQL Server 2008, Quality Center 10.0, DTS, T-SQL, SQL, Teradata V2R6, TOAD, Teradata SQL Assistant 6.0, Oracle 10g, PL/SQL, IBM DB2, Cognos Series 8.0, Microsoft Visio 2003, MS Access, XML/VSAM/Flat Files, Autosys, PERL

Confidential

Sr. ETL/QA Analyst, Equity Financing Data Warehouse

Responsibilities:

  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping
  • Tested Informatica ETL mappings and other ETL processes (data warehouse testing).
  • Wrote several complex SQL queries for validating Business Objects reports (see the sketch after this list).
  • Tested several stored procedures.
  • Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Validating the data passed to downstream systems.
  • Worked on testing data Extraction, Transformation and Loading (ETL).
  • Involved in testing data mapping and conversion in a server-based data warehouse.
  • Involved in testing the UI applications.
  • Created Test input requirements and prepared the test data for Data Driven testing.
  • Tested the reports based on data generated using BO connected to different data sources.
  • Wrote test cases to test the application manually in Quality Center and automated them using QTP.
  • Tested the catalog in Impromptu Administrator used to create reports.
  • Tested cubes using BO PowerPlay for analysis.
  • Worked with the business team to test the reports developed in BO.
  • Tested multidimensional cubes for reporting using Transformer.
  • Used Informatica Designer to test mappings, transformations and source and target tables
  • Tested various drill-through reports from report to report.
  • Tested report operations such as sum, average and intersection.
  • Tested whether the reports developed in BO are as per company standards.
  • Used Quality Center to track and report system defects
  • Involved in testing the XML files and checked whether the data was parsed and loaded into the staging tables.
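
A minimal sketch of the report-validation queries referenced above: the aggregate shown on a Business Objects report is recomputed directly against the warehouse and compared with the report output; the table and column names (equity_finance_fact, notional_amount, trade_date) are hypothetical:

    -- Recompute the daily totals a report displays so they can be compared with the report output.
    SELECT trade_date,
           COUNT(*)             AS trade_cnt,
           SUM(notional_amount) AS total_notional
    FROM   equity_finance_fact
    WHERE  trade_date BETWEEN DATE '2010-01-01' AND DATE '2010-01-31'
    GROUP  BY trade_date
    ORDER  BY trade_date;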

Environment: Informatica 8.1/7.1, Informix, DB2, Java, Business Objects, SQL, QTP, SQL Server 2000/2005, Teradata V2R5 (MLOAD, FLOAD, FASTEXPORT, BTEQ), Teradata SQL Assistant 7.0, XML, XSLT, IBM AIX 5.3, UNIX, Shell Scripting, WinSQL, UltraEdit, Rumba UNIX Display, Quality Center 8.2

Confidential

Software Specialist

Responsibilities:

  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling data for testing ETL loads.
  • Wrote several shell scripts using UNIX Korn shell for file transfers, data archiving, error log creation and log file cleanup.
  • Developed and tested UNIX shell scripts as part of the ETL process to automate loading and pulling the data.
  • Written several complex PL/SQL statements for various business scenarios.
  • Loaded data from the operational data store (ODS) to data warehouse tables, writing and executing foreign key validation programs to verify the star schema relationships between fact tables and dimension/lookup tables (see the sketch after this list).
  • Wrote triggers enforcing integrity constraints, stored procedures for complex mappings, and cursors for data extraction.
  • Worked extensively with mappings using expressions, aggregators, filters, lookups and procedures to develop and feed the Data Mart.
  • Performed data parsing, text processing and database connectivity using PERL.
  • Developed UNIX Shell scripts to automate repetitive database processes
  • Tested several ETL routines and procedures.
  • Identified the logical/physical primary key and applied update or insert logic.
  • Deleted target data before processing based on the logical or physical primary key.
  • Designed and executed test cases on the application per company standards.
  • Prevented occurrences of multiple runs by flagging processed dates.
  • Wrote Teradata MLOAD, FLOAD, BTEQ, TPUMP and FEXPORT scripts with CASE statements.
  • Tuned database and SQL statements and schemas for optimal performance.
  • Expertise in SQL queries for the cross verification of data.
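
A minimal sketch of the foreign key validation referenced above: fact rows whose dimension key has no matching row in the dimension/lookup table are reported as orphans; all table and column names (sales_fact, product_dim, product_key) are hypothetical:

    -- Orphan check: fact rows pointing at a product key that does not exist in the dimension.
    SELECT f.sales_id, f.product_key
    FROM   sales_fact f
    LEFT   JOIN product_dim d
           ON d.product_key = f.product_key
    WHERE  d.product_key IS NULL;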

Environment: Oracle 7.0, SQL Plus, SQL, Test Director, SQL Server 2000, T-SQL, SQL, PL/SQL, Visual Basic 6.0, Windows 95, XML, XSLT, XSD, UNIX, Korn shell scripting, PERL, MVS, JCL, ISPF, VSAM Files, OS/390
