
ETL Quality Analyst Resume


SUMMARY:

  • Over seven years of IT experience across various phases of IT projects, including testing, deployment and application support.
  • Expertise in creating Test Plan documents, developing test strategy documents and preparing Traceability Matrices.
  • Expertise in designing test scenarios and scripting test cases to test the application.
  • Expertise in the QA process and different levels of testing, such as Unit, Functional, Regression and Integration testing, against the specified business requirements.
  • Strong knowledge of Business Intelligence, Data Warehousing and Business Architecture.
  • Extensively used ETL methodology for data extraction, transformation and loading in a corporate-wide ETL solution using Ab Initio and Informatica PowerCenter.
  • Experience in Dimensional Data Modeling using Star and Snowflake schemas.
  • Strong working experience in data analysis and testing of data warehouses using data conversion, data extraction, data transformation and data loading (ETL).
  • Extensive experience in testing within the financial services domain, including banking, mortgage and equities trading models.
  • Have the ability to elicit, understand and articulate business requirements and perform detailed analysis to map them to technical and functional requirements.
  • Experienced with the V-Model, Agile and Waterfall approaches.
  • Experienced in Test and Defect Management using Mercury Test Director, HP (Formerly Mercury) Quality Center and Bugzilla.
  • Experience in analyzing defect severity and priority, documenting defects and scheduling fixes.
  • Experience in release management and defect control management.
  • End-to-end responsibility for ensuring that defects are closed and fixes have been implemented.
  • Hands on experience with all phases of Software Development Life Cycle (SDLC)
  • Strong experience in testing tools such as Test Director 7.6/8, Quality Center and QuickTest Pro.
  • Experience in testing and writing SQL and PL/SQL statements.
  • Proficient in working with UNIX shell scripts.
  • Experience with integration testing of applications: worked with external vendors to synchronize the timing of process runs in the QA environment, ran the whole process end to end by replicating the production environment, and made sure everything worked as expected from start to finish.
  • Excellent communication skills; able to work in groups as well as independently with minimal supervision, with the initiative to learn new technologies and tools quickly.
  • Good team player with a positive attitude towards work, comfortable working independently, and flexible about working overtime when needed.

TECHNICAL EXPERTISE:

ETL Tools: Business Objects Data Services 3.1, Informatica PowerCenter 7.1.2/6.2, Ab Initio, DataStage, Oracle Data Integrator (ODI), TIDAL
OLAP Tools: Business Objects 6.0/6.5, Cognos
Databases: Oracle 10g/9i/8i, DB2 8.2.2, MS SQL Server, Teradata 12
Reporting Tools: Crystal Reports, Business Objects WebI, SAS
Languages: SQL, PL/SQL, C, C++, Visual Basic, Java, .NET, HTML
Testing Tools: Test Director 6/7.5/8.0, Quality Center, ClearQuest, MS Visual Studio Test Professional 2010
Operating Systems: Windows 95/98/2000/NT/XP, UNIX
Other Tools: DB2 Control Center, WinSQL, TOAD, PuTTY, UltraEdit, VSS

EDUCATION:
Bachelor of Computer Science, JNTU, India.

WORK EXPERIENCE:

Confidential, ETL Quality Analyst, Oct 2010 – April 2012
Confidential, ETL Quality Analyst, Nov 2009 – Sept 2010
Confidential, Quality Analyst, July 2008 – Oct 2009
Confidential, Quality Analyst, Aug 2007 – Jun 2008
Confidential, Junior Quality Analyst, Jun 2005 – Aug 2007

Confidential, Richmond, VA Oct 2010 – April 2012
Position: ETL Quality Analyst

Project: The US Card Data Sourcing & Gap Remediation project ensures the availability of the historical data required for future model development and the ongoing data required for Basel II compliance within the US Card line of business. The project focuses on sourcing all of the economic data used within Capital One for modeling, forecasting, R&D and credit decision purposes, and consolidating that data in a single production environment. Economic data is defined as data that deals with the entire economy rather than individual companies, and it changes on a constant basis. The project automates the sourcing and loading of the economic data into the appropriate data stores, including the sourcing of data from providers such as Moody's and Lexis-Nexis.
The Basel II need for this data results from the requirement to have up-to-date economic data available in a production environment for model development and execution for both the Retail and Commercial work streams. The other groups within Capital One that need this data number in the hundreds, and include US Card Decision Services, Home Loans, and Recoveries.

Responsibilities:

  • Review of Business Functional Specifications, Functional Design Specifications and Detailed Design Specifications.
  • Involved in the SDLC from inception through execution, including design, development and implementation.
  • Developed Project test plan, test cases, test scenarios and test conditions based on Mapping Document and technical specification documents.
  • Tested incremental iterative application releases using an Agile Methodology (Scrum)
  • Created test cases and SQL scripts for validating the landing, relational, dimensional and fact tables, referencing the corresponding workflows and sessions in the Informatica folder.
  • Analyzed the requirements and helped prepare the test environments and test data.
  • Tested ETL (Informatica) mappings for each extract in the Informatica Designer and verified the session logs through Informatica Workflow Manager
  • Extensively tested the ETL (Informatica) layer, using SQL queries to check for consistency and errors while extracting, transforming and loading the data (see the SQL sketch after this list).
  • Validated the data at each transformation, from the source tables to the staging tables and finally to the destination tables.
  • Validated complex mappings in Informatica that load the data from various sources using different transformations: Union, Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, XML Source Qualifier, Rank and Router.
  • Verified the data flow from staging (STG) to Load, Load to Store, Store to the data warehouse, and DWH to the metadata view.
  • Executed scripts in the UNIX environment to load data from the source database to the target data warehouse.
  • Developed parameter-driven ETL processes in DataStage to map source systems to the target data warehouse, with complete source-system profiling.
  • Coordinated the workload of offshore resources to ensure on-time delivery.
  • Developed SQL queries to validate data in ODS, Data warehouse, Data mart and reports
  • Performed data auditing for data warehouse integrity and data validation to ensure accuracy.
  • Tested the functionality by writing complex SQL queries and matching the results against the fact tables.
  • Validated the Informatica workflows and session hierarchies.
  • Monitored the Informatica workflow and session parameter files from the UNIX box.
  • Validated ETL mappings by running the Session, Workflows from the Informatica Monitor and verified the log files
  • Verified the target table inserts and updates that occur based on the different conditions mentioned in the mapping specification document.
  • Used the DataStage Director and its run-time engine to schedule jobs, validate and debug their components, and monitor the resulting executable versions.
  • Verified the data loads to critical and non-critical columns based on the ETL updates or inserts
  • Validated the nightly Informatica batch process against the schedule defined in TIDAL.
  • Performed Integration, System and Regression Testing. Assisted the users with User Acceptance Testing (UAT) and logged defects using HP Quality Center.
  • Summarized test results in formal test analysis reports according to the documentation standards.
  • Involved in testing web services with the help of XML.
  • Extensively used SQL (DDL, DML) to ensure database integrity and the data consistency of the reports.
  • Ran Linux/Unix scripts, Informatica workflows and sessions for validation of data mappings and corresponding parameter files.
  • Performed event validations in the ABC automated event-handling process and verified error email notifications.
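
As an illustration of the SQL-based ETL validation described in this list, here is a minimal sketch of a source-to-target check; the table names STG_ACCOUNT and DW_ACCOUNT and their columns are hypothetical, not taken from the project.

    -- Row-count reconciliation between staging and warehouse (hypothetical tables).
    SELECT (SELECT COUNT(*) FROM stg_account) AS stg_cnt,
           (SELECT COUNT(*) FROM dw_account)  AS dw_cnt
    FROM   dual;

    -- Content check: source rows that never reached the target after the ETL load.
    SELECT account_id, account_type, open_dt
    FROM   stg_account
    MINUS
    SELECT account_id, account_type, open_dt
    FROM   dw_account;

A non-empty result from the MINUS query points at dropped or mis-transformed rows; the same query with the operands swapped surfaces unexpected rows in the target.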

Environment: Oracle 10g, Microsoft SQL Server 2008, ODI, Windows XP, UNIX, PuTTY, HP Quality Center 10, Informatica 8.1.1, DataStage.
Document management tools: Livelink, SharePoint
Reporting Tools: Business Objects

Confidential, Chicago, IL Nov 2009 – Sept 2010
Position: ETL Quality Analyst

Project: The Mortgage Application deals with the tax calculations involved with Retained Mortgage Assets. During execution of its day-to-day financial transactions, the Mortgage Application pulls the retained portfolio records out of the Definitive Sources on a daily basis, gathers additional required data, errors out incomplete data, performs the required daily calculations, and makes the records ready for amortization. Apart from overriding the data, tax users are also able to enter data obtained from third parties.

Responsibilities:

  • Created the Test Plan and developed a test strategy for Day 2.
  • Understood the specifications for the Data Warehouse ETL processes and interacted with the data analysts and end users on informational requirements.
  • Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards.
  • Used DataStage as the ETL tool for developing the data warehouse.
  • Used Query Studio to test ad hoc reports
  • Created Manual Test Suites for various modules.
  • Tested stored procedures & views in Oracle 10g and Sybase 12.3
  • Compared and Tested Source data with XML Output flow.
  • Deliver testing assignments within the established commitments, ensuring quality, performance and conformance with established specifications
  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data warehouse database.
  • Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted
  • Wrote several UNIX scripts to invoke data reconciliation.
  • Involved in extensive data validation using SQL queries and back-end testing (see the SQL sketch after this list).
  • Reported bugs and tracked defects using Quality Center 8.0 (Test Director)
  • Extensively used SQL to verify and validate the data load.
  • Performed all types of testing, including functional testing (black box), back-end database testing (white box), data integration testing and system testing.
  • Performed System and Integration testing in both the QA and the contingency/backup environments.
  • Used the DataStage Director and its run-time engine to schedule jobs, validate and debug their components, and monitor the resulting executable versions.
  • Developed parameter-driven ETL processes in DataStage to map source systems to the target data warehouse, with complete source-system profiling.
  • Wrote test scripts for manual testing.
  • Submitted weekly bug or issue report updates to the Project Manager in the form of the QA Error Log.
  • Involved in Functional testing, End-to-End testing and Regression testing.
  • Involved in front-to-back testing for all European and Asia Pacific regions.
  • Prepared and supported the QA and UAT test environments.
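
As a sketch of the data reconciliation performed with SQL in this role, the query below compares monthly totals between a source and a target table; SRC_TAX_TXN, DW_TAX_TXN and their columns are hypothetical names used only for illustration.

    -- Aggregate reconciliation: monthly totals must match between source and target
    -- (all table and column names are hypothetical).
    SELECT NVL(s.mth, t.mth) AS mth, s.src_amt, t.tgt_amt
    FROM  (SELECT TRUNC(txn_dt, 'MM') AS mth, SUM(tax_amt) AS src_amt
           FROM   src_tax_txn
           GROUP BY TRUNC(txn_dt, 'MM')) s
    FULL OUTER JOIN
          (SELECT TRUNC(txn_dt, 'MM') AS mth, SUM(tax_amt) AS tgt_amt
           FROM   dw_tax_txn
           GROUP BY TRUNC(txn_dt, 'MM')) t
      ON  s.mth = t.mth
    WHERE s.src_amt IS NULL
       OR t.tgt_amt IS NULL
       OR s.src_amt <> t.tgt_amt;

Any row returned is a month whose totals disagree, or a month that exists on only one side of the load.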

Environment: Rational ClearQuest, Oracle 10g, DB2, TOAD, PL/SQL, UNIX, Rapid SQL Navigator, Windows XP, Informatica, DataStage 7.5

Confidential, CT July 2008 – Oct 2009
Position: Quality Analyst

Project: Aetna offers a broad range of traditional and consumer-directed health insurance products and related services, including medical, pharmacy, dental, behavioral health, group life and disability plans, and medical management capabilities and health care management services for Medicaid plans.

Responsibilities:

  • Supervised the automation testing activities, tracked progress and provided daily status to the management.
  • Developed Project test plan, test cases, test scenarios and test conditions based on Mapping Document and technical specification documents.
  • Used Mercury QTP in collaboration with an in-house developed tool to debug and create scripts using VBScript.
  • Tested the EDI transactions (270,271,275,276,277,834,835 and 837) along with the XML schemas and backend database under the HIPAA compliances.
  • Validated all the incoming/outgoing data on the EDI 837/835 interfaces.
  • Prepared trading partners for conversion from proprietary file formats to the 837 HIPAA EDI X12 standard.
  • Converted claims from X12 to XML and XML to X12 format using BizTalk Server.
  • Reviewed EDI 837 claims and flagged HIPAA non-compliant claims received from the payer side.
  • Identified the data in the mainframe databases and transported data to the test databases.
  • Worked closely with business representative, Quality control analysts, customer service representatives from different plans.
  • Configured web services for manual testing and performance testing environment.
  • Tested web services manually.
  • Assisted operations in configuring QA environment and MQ clustering environment.
  • Used ClearQuest to create defects, issues, risks and action items for traceability and transparency.
  • Verified the SOAP message delivery to the web services and verified the XML formatted response using the SOAP UI.
  • Worked on testing the real time and batch transactions and verified at the backend database.
  • Used HP Quality Center (QC) as the test case repository and for requirements traceability to the test cases.
  • Recorded and ran tests using QTP to automate the testing efforts.
  • Used regular expressions, synchronization points and checkpoints in automating the QTP scripts.
  • Created virtual objects.
  • Created test cases based on the system requirements, business requirements and design documents for different integrated components.
  • Used XMLSpy and TextPad to edit XML files for configuration purposes.
  • Performed automated smoke tests and automated regression tests after every new release to verify the high-level functionality.
  • Updated Team Foundation Server and tracked the code using TFS.
  • Involved in writing complex SQL queries for database testing (see the SQL sketch after this list).
  • Involved in logging and tracking defects in Dev Track (defect management tool) with proper severity and priority.
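
As an example of the kind of back-end check behind the database testing above, this sketch verifies that each accepted claim from the EDI feed landed in the claims table exactly once with the expected amount; STG_EDI_837, CLAIM and every column here are hypothetical stand-ins, not the project's actual schema.

    -- Each accepted 837 claim should reach the claims table exactly once,
    -- with the total charge unchanged (all names hypothetical).
    SELECT e.claim_ctl_no,
           e.total_charge_amt AS edi_amt,
           c.total_charge_amt AS db_amt,
           COUNT(c.claim_id)  AS load_cnt
    FROM   stg_edi_837 e
    LEFT JOIN claim c
           ON c.claim_ctl_no = e.claim_ctl_no
    WHERE  e.claim_status = 'ACCEPTED'
    GROUP  BY e.claim_ctl_no, e.total_charge_amt, c.total_charge_amt
    HAVING COUNT(c.claim_id) <> 1
        OR e.total_charge_amt <> c.total_charge_amt;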

Environment: DbVisualizer for DB2 and Oracle, BizTalk Server, VSTS, IBM MQ, Java-based applications, Rational Tools, XMLSpy, TextPad, SOAP UI, and Web Services.

Confidential, Auburn Hills, MI Aug 2007 – Jun 2008
Position: Quality Analyst

Project: CN Financial Reporting System. This project was developed for reporting the costs associated with change notices (CNs) for several auto parts. At a high level, the Cost Continuum data warehouse contains Part, CN, Model Year, Family and Cost, and the reporting AutoProvisions data mart contains Part, CN, Model Year, Family, Plant and Cost. The project also involves the development of Cognos reports for Cost Continuum and AutoProvisions.

Responsibilities:

  • Created the test plan and test cases for manual and automated testing from the business requirements to match the project's initiatives.
  • Created and executed detailed test cases with step by step procedure and expected result.
  • Maintained the test logs, test reports, test issues and defect tracking using Test Director.
  • Involved in preparation of the Requirements Traceability Matrix (RTM), software metrics, defect reports, weekly status reports and the SQA report using Test Director.
  • Performed black box testing by designing and constructing test cases, test data and test execution.
  • Verified defects and performed database functional, integration and regression testing as needed to minimize defects.
  • In-depth knowledge of Test Planning, Test Case Specification, and Test Procedure Development.
  • Responsible for gathering requirements from the Subject Matter Experts (SMEs).
  • Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.
  • Tested data migration to ensure that integrity of data was not compromised.
  • Executed test cases, identified products issues, wrote detailed bug reports and participated in bug review during the product development stages.
  • Provided updates to management on testing status and other issues.
  • Performed database validation according to the business logic by comparing the source to the target.
  • Tested Data Warehousing application on Oracle Database.
  • Responsible for the requirements: ETL analysis, ETL testing and designing the flow and logic for the data warehouse project.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database, validating and testing data using TOAD (see the SQL sketch after this list).
  • Ran workflows created in Informatica by developers then compared before and after transformation of data generated to ensure successful transformation.
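
A minimal sketch of the kind of data integrity query run through TOAD against the warehouse; COST_FACT, PART_DIM and their columns are hypothetical stand-ins for the project's tables.

    -- Orphan check: fact rows whose part key has no match in the Part dimension
    -- (table and column names hypothetical).
    SELECT f.part_key, COUNT(*) AS orphan_rows
    FROM   cost_fact f
    WHERE  NOT EXISTS (SELECT 1 FROM part_dim d WHERE d.part_key = f.part_key)
    GROUP  BY f.part_key;

    -- Duplicate check on the dimension's natural key.
    SELECT part_no, model_year, COUNT(*) AS dup_cnt
    FROM   part_dim
    GROUP  BY part_no, model_year
    HAVING COUNT(*) > 1;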

Environment: Oracle 8i, Quality Center, QTP, Windows 9x/NT/2000/XP, UNIX, RPG, AS/400, Java, Visual Basic.

Confidential, Bangalore, India Jun 2005 – Aug 2007

Position: Junior Quality Analyst

Responsibilities:

  • Supported production environment, reported on going bugs and escalated issues.
  • Created test plans and test cases from the business requirements for the iGen3 digital printers.
  • Executed the test cases and logged defects in Test Director.
  • Performed backend testing in Oracle database.
  • Involved in GUI and functional testing of windows and web based applications.
  • Created GUI maps to enable WinRunner to identify the various objects in the application.

Environment: Mercury Interactive tools, Oracle, UNIX, Windows NT.
