Data Analyst / Tester Resume
SUMMARY:
- 7 years of IT industry experience in Business Intelligence, Data Warehousing, ETL, and disparate heterogeneous databases, with strong analytical and problem-solving abilities.
- Worked across various domains, including Financial, Pharma, and Telecommunications.
- Excellent understanding of Data warehousing Concepts - Star and Snowflake schema, SCD Type1/Type2, CDC, Surrogate Keys, Normalization/De-Normalization, Dimension & Fact tables.
- Strong working experience in Data Analysis, Data Verification, and Validation of ETL applications using backend/database testing.
- Worked with heterogeneous relational sources such as Oracle, MS Access, and SQL Server.
- Experience working with third-party tools such as WinSQL and TOAD.
- Experienced in the complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and Bug Life Cycle (BLC).
- Extensive experience in Black Box, Functional, Integration, System, Regression, and Performance testing of data warehouse applications.
- Involved in preparing Test Cases based on high-level requirements and wrote Test Scripts to ensure coverage at the detailed/system level.
- Involved in preparing test data to verify the functionality of ETL sources, mappings, and targets.
- Knowledge of SDLC, the V-Model, and Agile methodology.
- Hands-on experience in testing ETL (Informatica) mappings and Business Objects (BO) reports.
- Expertise in defect management, bug tracking, bug reporting, and generating graphs using test management tools such as Quality Center and Test Director.
- Involved in preparing Traceability Matrices mapping high-level/detailed requirements to Test Cases.
- Experience in maintaining support documents, Test Plans, QA sign-off documents, and weekly status reports.
- Experienced in handling concurrent projects and providing expected results in the given timeline.
- Excellent communication, documentation, and presentation skills.
TECHNICAL SKILLS:
Testing Tools:
Quality Center 9.0, Test Director 8.0, Rational Requisite Pro, Rational Test Manager, Rational Clear Quest.
Data Modeling:
Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical/Logical Data Modeling, Erwin 4/3
Databases:
Oracle 9i/8i, SQL Server 2000, DB2, Teradata, MS Access 2000.
BI:
Business Objects
ETL:
Informatica Power Center 8.x/7.x/6.2
Tools:
WinSQL, TOAD, Autosys
Environment:
Windows 95/98/2000/XP/2003, UNIX 5.2/4.3, Sun Solaris 7, Windows NT 4.0
EDUCATION:
Bachelor's in Engineering
PROFESSIONAL EXPERIENCE:
Confidential, LA, CA    Jun 2010 – Till Date
Data analyst / Tester (BASEL II-LMT project)
Confidential is a full-service commercial bank providing a broad mix of financial services to businesses and individuals; it also offers investment and financial management, trust services, and private banking.
I am working on several projects:
LMT: (Limit Management System) ALGO Credit Limit (ACL) is a credit exposure and limits management system that measures and consolidates credit risk data used in the banking & financial sector. Consolidated credit risk data provides an enterprise-wide view of limits, exposures, capital excesses, and capital availability.
RWA: (Risk Weighted Assets) The scope and objectives of this project are to develop end-to-end business, functional, and data requirements for the economic capital migration portion of the bank's overall Basel II solution. These include Economic Capital (EC) calculations for the credit, market, and operational risk portfolios, RAROC calculations, ongoing monitoring needs, and other framework requirements.
OWE: (Other Wholesale Exposures) The scope and objective of this project is to integrate the other wholesale exposure data and create flat-file reports using Informatica, which are used as sources by the RWA team.
Responsibilities:
- Involved in discussions to gather source data information from different source systems.
- Involved in requirements gathering with the Business team.
- Involved in creating the test strategy.
- Prepared Test Plans and Test Cases according to the source-to-target mapping document.
- Tested the ETL mappings that were developed to load data from different source systems into the Oracle Staging/Target areas.
- Validated the data between Staging and Target using the source-to-target mapping document as a reference (a sample reconciliation query is sketched after this list).
- Helped the Infrastructure team load the XMLs generated by Informatica into the ALGO Limit Management system.
- Tested the data using the logs generated after loading the data into LMT.
- Involved in Functional Testing of the LMT project.
- Tested the BO reports generated by the BO team using ACR (ALGO) as a source.
- Prepared Traceability Matrix with requirements versus test cases.
- Identified defects and developed Defect Tracking reports using Mercury Quality Center, following the Bug Life Cycle.
- Supported leading vendor solutions for financial risk management, such as Algorithmics LMS and Moody's Fermat RWA applications.
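The staging-to-target validation above typically relied on set-difference and row-count checks; a minimal sketch is shown below, with placeholder table and column names (stg_credit_limit, tgt_credit_limit, facility_id, etc.) rather than the actual bank schema.
    -- Rows present in staging but missing or different in the target (placeholder names)
    SELECT facility_id, counterparty_id, limit_amount
      FROM stg_credit_limit
    MINUS
    SELECT facility_id, counterparty_id, limit_amount
      FROM tgt_credit_limit;

    -- Row counts in staging and target should match after the load completes
    SELECT (SELECT COUNT(*) FROM stg_credit_limit) AS stg_cnt,
           (SELECT COUNT(*) FROM tgt_credit_limit) AS tgt_cnt
      FROM dual;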
Environment: Informatica Power Center 8.6, Business Objects XI R3.1, Oracle, TOAD, Microsoft Outlook, Windows XP, SharePoint, Quality Center, UNIX Shell Scripting, Altova XMLSpy, XML
Confidential, Maine    May 2009 – June 2010
Data analyst / Tester (Informatica)
Confidential is becoming TD Bank. Worked on the integration of both banking platforms. A complex ETL process developed in Informatica Power Center extracts data from many disparate sources (DB2, Teradata, and Oracle databases, and flat files), identifies erroneous data, and transforms and loads the clean data into an Oracle 9i data warehouse that is optimized for query and reporting.
Responsibilities:
- Extensively tested the ETL mappings which were developed to load data from Oracle and SQL Server sources into the Oracle Staging/Target Area
- Verified and validated the data model structure and E-R model, with all related entities and the relationships between them, using Erwin, per the business requirements.
- Worked on the implementation of loan and risk management system applications.
- Wrote and executed Test Cases with test data.
- Used SQL queries to verify the data in the Oracle and MS SQL Server databases (a sample check is sketched after this list).
- Performed backend testing using SQL queries and analyzed server performance on UNIX.
- Tested mappings and SQL queries in transformations such as Expression, Filter, Lookup, Joiner, XML, Aggregator, and Update Strategy transformations.
- Extensively used the Informatica Debugger to validate mappings and to gain troubleshooting information about data and error conditions.
- Interacted with client user personnel to ensure continuing adherence to requirements and user standards.
- Involved in formal review meetings with project teams and developers to report, demonstrate, prioritize and suggest resolution of issues discovered during testing
- Tested formulas for various loan, interest calculations, and currency conversions required for reporting purposes.
- Actively participated in System testing and Integration testing, and coordinated UAT.
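A minimal sketch of the backend SQL checks referenced above; the dw_loan_fact and dw_customer_dim table names and keys are assumptions used for illustration only, not the actual bank schema.
    -- Duplicate business keys that should not exist in the target (assumed names)
    SELECT loan_id, COUNT(*) AS dup_cnt
      FROM dw_loan_fact
     GROUP BY loan_id
    HAVING COUNT(*) > 1;

    -- Orphan facts: loan rows pointing to a customer key missing from the dimension
    SELECT f.loan_id
      FROM dw_loan_fact f
      LEFT JOIN dw_customer_dim d
        ON d.customer_key = f.customer_key
     WHERE d.customer_key IS NULL;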
Environment: Informatica Power Center 8.6/8.1.1, Workflow Designer, PowerCenter Designer, Repository Manager, COGNOS PowerPlay 6.6, XML, SQL Server, DB2, Oracle 9i, SQL, PL/SQL, SQL*Loader, UNIX Shell Scripting, Windows NT/2000/XP, Visio, Erwin 3.5.2, Quality Center
Confidential, Massachusetts    Oct 2007 – May 2009
ETL/BI/QA Analyst
Confidential is one of the largest pharmaceutical distributors in North America and provides decision support software to help physicians determine the best possible clinical diagnoses and treatment plans for patients. This project aimed to build a data warehousing system for health care customers holding different health care policies, and involved designing business logic to identify policy extensions for existing customers and to perform fraudulent claims analysis.
Responsibilities:
- Reviewed the business requirements and worked with the business and requirements teams on gaps found during the review.
- Created and executed SQL queries in TOAD to perform data integrity testing on an Oracle database and validate the data.
- Identified and tracked slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
- Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
- Developed and executed various manual testing scenarios and thoroughly documented the process for functional testing of the application.
- Tested Informatica mappings and performed extensive data validations against the Data Warehouse.
- Extensively used Informatica PowerCenter client components such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor.
- Wrote test scenarios based on business requirements and business use cases.
- Developed a detailed Test Plan and Test Strategy based on the Business Requirements Specification.
- Wrote and executed Test Cases.
- Documented Test Cases corresponding to business rules and other operating conditions.
- Carried out ETL testing using Informatica.
- Wrote SQL queries against data staging tables and data warehouse tables to validate the data results (see the sketch after this list).
- Executed sessions and batches in Informatica and tracked the log file for failed sessions.
- Compared actual results with expected results and validated the data by reverse engineering, i.e., navigating backward from target to source.
- Extensively worked on Backend using SQL Queries to validate the data in the database.
- Responsible for creating test data for testing environment
- Reported bugs using Test Director.
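A sketch of the staging-versus-warehouse and slowly changing dimension checks referenced above; the stg_member_policy and dw_member_policy_dim tables, the current_flag column, and the compared attributes are assumed names, not the client's actual schema.
    -- Staging rows whose attributes differ from the active (current) warehouse row
    SELECT s.member_id, s.policy_type, s.coverage_amount
      FROM stg_member_policy s
      JOIN dw_member_policy_dim d
        ON d.member_id = s.member_id
       AND d.current_flag = 'Y'
     WHERE s.policy_type <> d.policy_type
        OR s.coverage_amount <> d.coverage_amount;

    -- SCD Type 2 sanity check: each member should have exactly one current row
    SELECT member_id
      FROM dw_member_policy_dim
     WHERE current_flag = 'Y'
     GROUP BY member_id
    HAVING COUNT(*) > 1;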
Environment: Test Director, Informatica 8.1/7.1.4, SQL, PL/SQL, UNIX Shell Scripting, Autosys, MS SQL Server 2005, TOAD, XML, XSLT, XSD
Confidential, CA    Jan 2007 – Sep 2007
Informatica Developer/Tester
Confidential is the largest data warehouse developed and maintained by Verizon Wireless Services. As the name implies, it contains the metrics of the various services offered by Verizon to customers throughout the U.S.A. These metrics provide useful information to business users for analyzing market trends and making effective decisions about offering new facilities to customers. It contains data on customers using services such as DSL, FTTP, PPV, VASIP, and FIOS. The data for the warehouse comes from various source systems and is populated into the warehouse through an internal staging system whenever necessary. The warehouse holds data from the granular level to the aggregate level, which is used for cube creation and report generation.
Responsibilities:
- Involved in requirement gathering, analysis and study of existing systems.
- Extensively worked on transformations like Lookup, Filter, Expression, Aggregator, Router, Source Qualifier, Sorter, Sequence Generator, Rank, etc.
- Designed and developed new mappings using Connected and Unconnected Lookup and Update Strategy transformations.
- Developed Joiner transformations for extracting data from multiple sources.
- Extensively used the Workflow Manager for creating and scheduling various sessions.
- Actively involved in creating Test Plans based on the requirements submitted by the Business Analysts.
- Extensively used ETL to load data from flat files (Excel/Access) into the Oracle database.
- Created Test plans and Test Scenarios.
- Unit tested the mappings before including the developed sessions into the already existing batch.
- Created Informatica Mappings with PL/SQL procedures/functions to build business rules to load data.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
- Involved in improving performance of the Server Sessions by identifying and eliminating the Performance Bottlenecks.
- Experience in the Data Warehouse Lifecycle (Requirement Analysis, Design, Development and Testing).
- Also involved in moving the mappings from Test Repository to Production after duly testing all transformations.
- Experience in using SAS to analyze trends and create reports based on business needs
- Experience working with SAS cubes for OLAP purposes
- Extracted data from the database using SAS/ACCESS and SAS SQL procedures, and created SAS data sets from database tables.
- Wrote PL/SQL Packages and Stored procedures to implement business rules and validations.
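A minimal PL/SQL sketch of the kind of validation procedure called from the load; the stg_service_usage table, usage_minutes column, and the negative-value rule are hypothetical examples, not the actual business rules.
    CREATE OR REPLACE PROCEDURE validate_daily_usage (p_load_date IN DATE) AS
      v_bad_rows NUMBER;
    BEGIN
      -- Count usage rows that violate a simple business rule (hypothetical rule)
      SELECT COUNT(*)
        INTO v_bad_rows
        FROM stg_service_usage
       WHERE load_date = p_load_date
         AND usage_minutes < 0;

      -- Fail the load with a clear message if any bad rows are found
      IF v_bad_rows > 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Load rejected: ' || v_bad_rows || ' negative usage rows for ' ||
          TO_CHAR(p_load_date, 'YYYY-MM-DD'));
      END IF;
    END validate_daily_usage;
    /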
Environment: Informatica PowerCenter 7.1.1/6.1, Quality Center, DB2, Oracle 9i, SQL Server 2000, SQL, PL/SQL, UNIX Shell Scripting, SQL*Loader, Microsoft Excel, Microsoft Access, Windows NT/2000, Erwin 4.0, Business Objects, SAS
Confidential, INDIA    May 2005 – Dec 2006
Database Developer
Project Description:
Confidential is a worldwide information technology services and solutions company that has established itself in more than 100 countries. This project involved working with financial data fed periodically by source systems.
Responsibilities:
- Involved in creating schema objects such as indexes, views, stored procedures, functions, packages, and synonyms.
- Involved in the analysis of how the purchase order process is organized.
- Involved in developing triggers that internally call procedures and functions.
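An illustrative sketch of such a trigger delegating to a packaged procedure; the purchase_order table and the audit_pkg.log_po_change procedure are assumed names used only to show the pattern.
    CREATE OR REPLACE TRIGGER trg_purchase_order_audit
      AFTER INSERT OR UPDATE ON purchase_order
      FOR EACH ROW
    BEGIN
      -- Delegate audit logging to a packaged procedure (assumed name)
      audit_pkg.log_po_change(:NEW.po_id, :NEW.status, SYSDATE);
    END;
    /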
Environment: Oracle 7.X, SQL, PL/SQL, Windows