Sr. Ab Initio ETL / DW Test Analyst Resume

Memphis, Tennessee

SUMMARY:

  • 7+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications.
  • Hands-on experience in writing test scripts, preparing test data, testing Informatica mappings, and creating SQL scripts using stored procedures, functions, and PL/SQL
  • Expert in Equity Sales System, Home Mortgage, Securities Lending, Compliance, and Middle Office Reconciliation
  • Solid back-end testing experience writing and executing SQL queries (a representative validation sketch follows this list).
  • Experience in Data Analysis, Data Validation, Data Verification, Data Cleansing, and identifying data mismatches.
  • Excellent testing experience in all phases and stages of the Software Testing Life Cycle (STLC) and Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Developed an Ab Initio test environment for UAT, used all Ab Initio components while testing graphs, and ran the jobs in the background.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Ab Initio graphs
  • Tested the ETL Ab Initio graphs and other ETL Processes (Data Warehouse Testing)
  • Expert in writing SQL queries and has expertise in Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
  • Experience in UNIX shell scripting and configuring cron jobs for ETL loads
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using DataStage
  • Expertise in Testing complex Business rules by creating mapping and various transformations
  • Experience in testing XML files and validating the data loaded to staging tables.
  • Experience in testing and writing SQL and PL/SQL statements.
  • Strong working experience with DSS (Decision Support Systems) applications and the Extraction, Transformation, and Loading (ETL) of data from legacy systems using DataStage, SSIS, Ab Initio, and Informatica
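
The back-end validation noted above typically reduces to paired SQL checks. Below is a minimal, hypothetical sketch; src_orders, stg_orders, and their columns are placeholder names, not tables from any actual engagement:

    -- Row-count reconciliation between a source table and its ETL target.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM src_orders
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM stg_orders;

    -- Rows present in the source but missing from the target
    -- (Oracle MINUS syntax).
    SELECT order_id, order_amt FROM src_orders
    MINUS
    SELECT order_id, order_amt FROM stg_orders;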

TECHNICAL SKILLS:

Data Warehousing: Ab Initio CO>OP 2.15, GDE 1.15, Informatica 8.6.1/8.1/7.1/6/5.1.1/1.75, DataStage 8.x

Reporting Tools: Business Objects 6.5/XIR3, Cognos 8.0 Suite, Crystal Reports, SSAS, SSRS, MicroStrategy 8.x

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Testing Tools: WinRunner, LoadRunner, TestDirector, HP Quality Center, Rational Tools

RDBMS: Oracle 10g/9i/8i/7.x, MS SQL Server 2005/2008, UDB DB2, Sybase, Teradata V2R6, MS Access 2008

Programming: UNIX Shell Scripting, Korn Shell, C Shell, Bourne Shell, Bash, SQL, SQL*Plus, PL/SQL, TOAD, C++

Web Technologies: JavaScript, HTML 4.0, DHTML, .NET, Java, J2EE, XML, XSD, XSLT

Environment: UNIX, MVS, HP-UX, IBM AIX 4.2/4.3, Hyperion, Novell NetWare, Win 3.x/95/98, NT 4.0, Sun Ultra, Sun SPARC, Sun Classic, and SCO

MPP Databases: Netezza NPS 8050

PROFESSIONAL EXPERIENCE:

Confidential, Memphis, Tennessee

Sr. Ab Initio ETL / DW Test Analyst

Responsibilities:

  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Developed and executed various manual testing scenarios and thoroughly documented the process used to perform functional testing of the application
  • Managed multiple OLAP and ETL projects for various testing needs.
  • Debugged SQL statements and stored procedures for various business scenarios.
  • Wrote extensive Perl and UNIX shell scripts for data parsing and text parsing needs, including archiving old data, running back-end jobs, and setting up job dependencies.
  • Performed extensive Data Validation and Data Verification against the Data Warehouse
  • Loaded flat file data into Teradata tables using UNIX shell scripts.
  • As an ETL Tester, was responsible for business requirements, ETL analysis, MVS, ETL testing, and design of the flow and logic for the Data Warehouse project using Informatica, shell scripting, and Perl
  • Tested several Cognos reports for various business needs, including Dashboards, Drill-Down, Master-Detail, Aggregated, KPI, and Web reports.
  • Tested and validated the ReportNet reports by running similar SQL queries against the source system(s).
  • Tested metadata graphs mapping the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
  • Created ETL test data for all ETL mapping rules to test the functionality of the SSIS Packages.
  • Tested and validated the cube data, ensuring that the data is correct by comparing the data results to comparable source system reports or by querying individual transactions and forms.
  • Tested several Informatica Mappings to validate the business conditions.
  • Testing the source and target databases for conformance to specifications
  • Conditional testing of constraints based on the business rules
  • Involved in developing Unix Korn Shell wrappers to run various Ab Initio Scripts
  • Designed and executed test cases on the application per company standards and tracked defects using HP Quality Center 10
  • Designed and prepared scripts to monitor uptime/downtime of different system components
  • Prevented occurrences of multiple runs by flagging processed dates (a sample run-control check follows this list)
  • Wrote test cases for ETL to compare source and target database systems.
  • Testing of records with logical delete using flags
  • Worked with all kinds of Ab Initio components, including Dedup, Denormalize, Normalize, Rollup, Scan, Reformat, Redefine, Sort, Join, XML Read, XML Write, FBE, and Partition components.
  • Tested Ab Initio graphs and used Ab Initio as an ETL tool to load the final loads.
  • Worked with all kinds of derivatives, including options, swaps, forwards, and futures, and other derivatives such as ELNs, foreign exchange, and fund derivatives.
  • Interacting with senior peers or subject matter experts to learn more about the data
  • Identified duplicate records in the staging area before data was processed (a sample duplicate check follows this list)
  • Extensively wrote test scripts for back-end validations
  • Tested Netezza database for Data Validation, Data Verification using NZ SQL.
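
A minimal sketch of the staging-area duplicate check mentioned above; stg_customer and its key columns are hypothetical placeholders:

    -- Flag natural keys that appear more than once in staging
    -- before the data is processed downstream.
    SELECT cust_no, src_system, COUNT(*) AS dup_cnt
    FROM   stg_customer
    GROUP  BY cust_no, src_system
    HAVING COUNT(*) > 1
    ORDER  BY dup_cnt DESC;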
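
A sketch of the processed-date flag check used to prevent multiple runs; the etl_run_control table, its columns, and the job name are assumptions for illustration only:

    -- A load date is eligible only if it has not already been flagged
    -- as completed in the run-control table; the wrapper script would
    -- test this count before launching the graph.
    SELECT COUNT(*) AS already_run
    FROM   etl_run_control
    WHERE  load_date  = DATE '2010-01-15'
    AND    job_name   = 'DAILY_POSITIONS'
    AND    run_status = 'COMPLETED';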

Environment: Ab Initio CO>OP 2.15, GDE 1.15, MVS, ISPF, COBOL II, VSAM Files, Copy Books, Data Profiler, Informatica 8.5.1, SSIS/SSAS/SSRS, XML, XSLT, PERL, WinSQL, Windows NT 4.0, SQL Server 2008, QTP 9.2, Netezza NPS 8050, NZ SQL, Quality Center 9.2, DTS, T-SQL, SQL, Teradata V2R6, TOAD, Teradata SQL Assistant 6.0, Oracle 9i/10g, PL/SQL, IBM DB2, SAS, Cognos ReportNet 8.0, MS Access/Excel/Visio, UNIX Shell Scripting, Hyperion

Confidential, San Francisco, CA

DWH / ETL Tester (BASEL II Compliance Data Mart)

Responsibilities:

  • Analyzed the functional requirements using scenarios and the DDD (Detailed Design Document)
  • Worked with the development team to ensure testing issues were resolved on the basis of defect reports.
  • Generated detailed reports of bugs, go/no-go decisions, and comparison charts.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration), covering positive and negative scenarios.
  • Tracked, reviewed, and analyzed defects and compared results using Quality Center.
  • Participated in the MR/CR review meetings to resolve issues.
  • Defined the Scope for System and Integration Testing
  • Identified field and data defects, with the required supporting information, in DataStage ETL and MVS processing across various jobs and one-to-one mappings.
  • Prepared and submitted summarized audit reports and took corrective actions
  • Involved in uploading master and transactional data from flat files, and in the preparation of test cases and subsystem testing.
  • Tested several data validation graphs developed in Ab Initio environment.
  • Worked with all kinds of Ab Initio components, including Dedup, Denormalize, Normalize, Rollup, Scan, Reformat, Redefine, Sort, Join, XML Read, XML Write, FBE, and Partition components.
  • Preparation of various test documents for ETL process in Quality Center.
  • Involved in Test Scheduling and milestones with the dependencies
  • Performed functionality testing of email notifications for DataStage job failures, aborts, and data issue problems.
  • Identified, assessed, and communicated potential risks associated with testing scope, product quality, and schedule
  • Created and executed test cases for DataStage jobs to upload master data to repository.
  • Identified & presented effort estimations related to testing efforts to Project Management Team
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data.
  • Tested data flow and process flow slides created using PowerPoint and Microsoft Visio
  • Tested graphs for extracting, cleansing, transforming, integrating, and loading data using Ab Initio ETL Tool.
  • Used all Teradata utilities, including FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ, and TPump
  • Deployed reports from the Dev system to the QA box using the Import Wizard.
  • Used Sets, Metrics, Productive Analysis, and Enterprise Performance Management (Scorecard and Dashboard).
  • Tested ad hoc and canned reports for Business objects.
  • Tested reports for various portfolios using universe as the main data provider.
  • Tested reports using Business Objects functionality like Queries, Slice and Dice, Drill down, @Functions, Formulae etc.
  • Used T-SQL for querying the SQL Server 2008 database for data validation and data conditioning.
  • Tested graphical representation of reports such as Bar charts, Pie charts as per End user requirements.
  • Worked with SSIS system variables, passing variables between packages.
  • Tested Desktop Intelligence and Web Intelligence reports.
  • Conducted test case review and revision of surrogate key generation in DataStage to uniquely identify master data elements for newly inserted data (a sample uniqueness check follows this list).
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Checked naming standards, data integrity, and referential integrity (an orphan-row check is sketched after this list).
  • Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
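
A minimal sketch of the surrogate-key review described above; dim_account and its columns are placeholder names:

    -- The DataStage-generated surrogate key must be unique ...
    SELECT account_sk, COUNT(*) AS key_cnt
    FROM   dim_account
    GROUP  BY account_sk
    HAVING COUNT(*) > 1;

    -- ... and no natural key should map to two different surrogates.
    SELECT account_natural_key, COUNT(DISTINCT account_sk) AS sk_cnt
    FROM   dim_account
    GROUP  BY account_natural_key
    HAVING COUNT(DISTINCT account_sk) > 1;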
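
A sketch of the referential-integrity (orphan-row) check referenced above; fact_txn and dim_account are hypothetical tables:

    -- Fact rows whose dimension key has no parent row in the dimension.
    SELECT f.txn_id, f.account_sk
    FROM   fact_txn f
    LEFT JOIN dim_account d
           ON d.account_sk = f.account_sk
    WHERE  d.account_sk IS NULL;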

Environment: Ab Initio CO>OP 2.14, GDE 1.14, QTP 9.2, SSIS/SSAS/SSRS, SQL Server 2005, Quality Center 9.0, LoadRunner 7.0, PERL, Business Objects XIR2, Oracle 10g, Java, UNIX AIX 5.2, VBScript, T-SQL, Hyperion, SAS, XML, XSLT, XML Spy, SQL, PL/SQL, Stored Procedures, Shell Scripting, XPath, OLAP, MVS, Teradata V2R5, MLOAD, FLOAD, FEXPORT, TPUMP, BTEQ, COBOL II, VSAM, JCL

Confidential, Manhattan, New York

DWH Tester (Global Wealth Management - EDWH - Master Data Management)

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Developed test plans based on the test strategy; created and executed test cases based on the test strategy, test plans, and the ETL mapping document.
  • Wrote complex SQL queries for querying data against different databases for the data verification process.
  • Designed the data flow diagrams using MS VISIO.
  • Prepared the Test Plan and Testing Strategies for Data Warehousing Application.
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in back-end and front-end functions, procedures, and packages to implement business rules and security
  • Wrote test cases to test the application manually in Quality Center and automated them using QuickTest Pro.
  • Worked with SSIS system variables, passing variables between packages.
  • Created cascading prompts at the universe level. These cascading prompts were used within full client and thin client reports to narrow down the selection parameters.
  • Tested different types of reports, like Master/Detail, Cross Tab and Charts (for trend analysis).
  • Developed automated test scripts from manual test cases for Regression testing based on the requirement documents using Quick Test Professional.
  • Wrote test plans and test cases in Mercury's Test Director tool.
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director.
  • Developed scripts, utilities, simulators, data sets, and other programmatic test tools as required to execute test plans.
  • Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
  • Designed and supported SQL Server 2000 Reporting Services, Integration Services, and Analysis Services.
  • Creating test cases for ETL mappings and design documents for production support.
  • Setting up, monitoring and using Job Control System in Development/QA/Prod.
  • Extensively worked with flat file and Excel sheet data sources. Wrote scripts to convert Excel files to flat files.
  • Scheduling and automating jobs to be run in a batch process.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Test Director 6.5
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Worked with ETL group for understating mappings for dimensions and facts.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Worked on issues with migration from development to testing.
  • Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetics.
  • Compared actual results with expected results. Validated the data by a reverse engineering methodology, i.e., backward navigation from target to source (a sample check is sketched below).
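
A minimal sketch of the backward-navigation check described in the last bullet; dw_customer, src_customer, and the matching columns are assumptions for illustration:

    -- Navigate from the warehouse target row back to its source record
    -- and compare an attribute the mapping document says must match.
    SELECT t.customer_sk,
           t.customer_name AS target_name,
           s.cust_nm       AS source_name,
           CASE WHEN t.customer_name = s.cust_nm
                THEN 'PASS' ELSE 'FAIL' END AS name_check
    FROM   dw_customer t
    JOIN   src_customer s
           ON s.cust_id = t.source_cust_id
    WHERE  t.load_dt = (SELECT MAX(load_dt) FROM dw_customer);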

Environment: SAS/Base 8.1, SAS/Macros, SAS/ETL, PERL, UNIX Shell Scripting, Informatica PowerCenter 7.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor), Hyperion, Mercury Test Director 6.5, QTP 7.2, SQL*Loader, Cognos 7.0, Oracle 8i, SQL Server 2000, Erwin 3.5, Windows 2000, TOAD 7, Business Objects 6.1, Teradata V2R4

Confidential, Atlanta, GA

ETL QA Tester

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Performed segmentation to extract data and create lists to support direct marketing mailings and marketing mailing campaigns.
  • Developed automation test scripts to test the database by retrieving the data from the tables using HP QTP and VB Scripting
  • Reviewed and approved database modifications
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mappings.
  • Troubleshooting, resolving and escalating data related issues and validating data to improve data quality.
  • Extensively tested the Business Objects reports by running SQL queries on the database and reviewing the report requirement documentation.
  • Validating the reporting objects in the reporter against the design specification document.
  • Validated the data files from the source to make sure the correct data was captured and loaded to the target tables
  • Created XML request files for test scenarios and test cases for the claims control web services
  • Validated the ETL load process to make sure the target tables were populated according to the data mapping provided and satisfied the transformation rules.
  • Tracked the progress of the defects till end of the defect life cycle, used HP Quality Center for defect tracking and maintained the workflows and trail history of bugs using HP Quality Center.
  • Validated the archive process to purge data that met the defined business rules.
  • Wrote complex SQL queries using CASE logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle (an example combining these constructs is sketched below).
  • Performed testing on the Informatica ETL process for various data loading scenarios.
  • Tested Canned/Ad-hoc reports using Business Objects Reporter functionalities like Cross Tab, Master Detail and Formulas, Slice and Dice, Drill Down, variables, filters, conditions, breaks, sorting, @Functions, Alerts, Cascading Prompts and User Defined Objects.
  • Validated the ETL Scripts in the Target Database (Oracle) for creating the cube before migrating it to SQL Server.
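
An illustrative Oracle query combining the constructs listed above (CASE logic, an inline view, and a MINUS set comparison); claim_stage, claim_target, and their columns are placeholders:

    -- Compare today's staged claims against the target using the same
    -- derived status on both sides; any rows returned are mismatches.
    SELECT claim_id,
           CASE WHEN paid_amt > billed_amt THEN 'OVERPAID'
                ELSE 'OK' END AS pay_status
    FROM  (SELECT claim_id, billed_amt, paid_amt
           FROM   claim_stage
           WHERE  load_dt = TRUNC(SYSDATE))  -- inline view: today's load
    MINUS
    SELECT claim_id,
           CASE WHEN paid_amt > billed_amt THEN 'OVERPAID'
                ELSE 'OK' END AS pay_status
    FROM   claim_target;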

Environment: Informatica PowerCenter 8.6, QTP, Business Objects, Quality Center 10, Oracle 10g/9i, Flat files, SQL, PL/SQL, TOAD 7.x, Java, Windows NT, XML, Agile, Shell Scripting, Sun Solaris
