Data Quality Assurance Analyst / ETL Analyst Resume

Boston, MA

PROFESSIONAL SUMMARY:

  • Data Quality Assurance Analyst with extensive and diverse Data Warehousing Quality Assurance and Analysis experience. Expert in ETL and SQL for database and data warehouse builds and testing. Highly analytical, with strong testing, delivery, and support skills, and capable of working with large onshore and offshore teams. Seeking a fast-paced, dynamic environment in which to apply my Data Warehousing skills.
  • 7+ years of IT experience in database design, development, and support across Microsoft SQL Server Data Warehouse, Development, Quality Assurance/ETL Analyst, and Production environments.
  • Strong Data Warehousing experience in Application development & Quality Assurance testing using Informatica Power Center (Designer, Workflow Manager, Workflow Monitor).
  • Experience in creating complex Informatica Mappings using Source Qualifier, Expression, Router, Aggregator, Lookup, and other transformations, and well versed in debugging Informatica mappings using the Debugger.
  • Extensive experience in Microsoft BI studio products like SSIS, SSAS, SSRS for implementation of ETL Methodology in data extraction, transformation, and loading.
  • Strong Understanding and experience in Software Testing Life Cycle.
  • Expertise in building Test Strategies, Test Plans, Test Cases, Manual and Automated Scripts and creating and maintaining Traceability Matrices.
  • Expertise in OLTP/OLAP system study, Analysis and developing Database Schemas like star schema and snowflake schema used in relational, dimensional, and multidimensional modeling.
  • Experience in transforming and validating data using SSIS transformations such as Conditional Split, Lookup, Merge Join, Sort, and Derived Column for unstructured and redundant data.
  • Prepared dashboards using advanced table calculations, parameters, calculated fields, groups, sets, forecasting, and hierarchies in Tableau.
  • Expert Knowledge of Integration Services (SSIS), Reporting Services (SSRS) and Analysis Services (SSAS).
  • Supported user acceptance testing (UAT) by preparing test scripts and executing tests.
  • Created status reports.
  • Strong experience in testing Data Warehouse (ETL) and Business Intelligence (BI) applications.
  • Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading.
  • Experience in Dimensional Data Modeling using Star and Snowflake Schema
  • Strong working experience in the Data Analysis and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL)
  • Experience in maintaining Full and Incremental loads through ETL tools on different environments.
  • Experienced in Defect Management using ALM, HP Quality Center
  • Good experience in writing SQL to perform data validation in migration as part of backend testing
  • Good exposure with databases like Oracle 10g, DB2, SQL Server 2012
  • Experience in Database Development, modeling, Data Warehousing, Design and Technical Management.
  • Well experienced in data Extraction, Transformation, and Loading (ETL) using tools such as SQL Server Integration Services (SSIS), DTS, Bulk Insert, and the Bulk Copy Program (bcp) utility.
  • Strong experience in Installing and Configuring of various Informatica MDM components
  • Experience in debugging the issues by analyzing the SQL queries.
  • Good experience working with UNIX scripts to execute jobs and validate logs for issues.
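The backend data-validation work summarized above typically boils down to source-to-target comparison queries. A minimal sketch of the kind of check involved, using SQLite in place of Oracle/Teradata and invented table and column names for illustration only:

```python
import sqlite3

# In-memory database standing in for source and target systems
# (table and column names are hypothetical).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customer (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cal")])
cur.executemany("INSERT INTO tgt_customer VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])

# Check 1: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]
print("count match:", src_count == tgt_count)

# Check 2: a MINUS/EXCEPT-style query surfaces rows dropped by the load.
missing = cur.execute(
    "SELECT id, name FROM src_customer EXCEPT SELECT id, name FROM tgt_customer"
).fetchall()
print("rows missing from target:", missing)
```

The same pattern (COUNT comparison plus a MINUS/EXCEPT diff of source versus target) applies unchanged on Oracle or Teradata.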

TECHNICAL SKILLS:

ETL Tools: Informatica 5.x/6.x/7.x/8.x (PowerMart & PowerCenter), MSBI, SSIS, SSRS

BI Tools: Tableau, MicroStrategy

Testing Tools: HP Quality Center 10.0, ALM

Utilities: Toad, Oracle SQL Developer, PuTTY

Databases: SQL Server 2005/2008, Oracle 11g/10g/9i, Teradata, DB2

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting

Operating System: MS-Windows 9x/NT/2000/XP, Red Hat Linux, UNIX (Sun Solaris 5.8)

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Data Quality Assurance Analyst / ETL Analyst

Responsibilities:

  • Created test strategies, test plans, and test scenarios based on user stories and enabler stories and reviewed them with developers and product owners.
  • Performed functional, integration, regression, and end-to-end testing and participated in user acceptance testing.
  • Wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Exclusively involved in execution of Autosys jobs, PL/SQL batch programs and responsible for reporting the defects to development team.
  • Actively participated in Sprint planning, User story refinement, Daily huddles, Sprint review and Retrospective meetings.
  • Developed Power BI reports using query logic based on business rules, with visuals selected by stakeholders.
  • Created Power Pivot models and reports, published the reports to SharePoint, and deployed the models to a SQL Server SSAS instance.
  • Merged and transformed data according to requirements for ease of analysis in Excel using Power Query in Power BI.
  • Involved in implementing the landing process of loading customer/product data sets coming from various source systems into the Informatica MDM Hub using an ETL tool.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Worked in Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Developed packages to handle data cleansing, loading multiple files and executing tasks multiple times (using For Each Loop).
  • Identified and worked with Parameters for parameterized reports in SSRS.
  • Used SSIS and Azure Data Factory to import data, Azure SQL Server to store it, and SSRS to present it.
  • Created an Azure DevOps Git repository and branching strategy. Implemented policies to enforce branching strategy so Master would never contain code that did not build or pass unit tests.
  • Experience in testing and writing SQL statements
  • Worked in an Agile environment.
  • Wrote Test cases in ALM and Defect report generation using ALM (Application Life cycle Management)
  • Worked with batch jobs, monitoring and scheduling of jobs as part of the interface testing-validation of data flow between two systems as per predefined fields mapping sheet and mode of communication.
  • Involved in backend testing for the front end of the application using SQL queries against the Teradata database.
  • Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
  • Performed data analysis and data profiling using SQL and Informatica Data Explorer on various sources systems including Oracle and Teradata
  • Demoed sprint stories to product owner and the team for story acceptance.
  • Strong understanding of Batch files and XML request/ Response validation
  • Wrote SQL queries to compare extract file / XML response data with the database.
  • Strong understanding of Core Banking Run calendar (End of Day, End of Quarter, End of Year etc.)
  • Held frequent meetings with all stakeholders for better clarity and understanding.
  • Strong background in testing in an Agile/Scrum methodology.
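The XML response-to-database comparison mentioned above can be sketched as follows. SQLite again stands in for the Teradata database of record, and the XML fields and table names are hypothetical:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML response from a downstream interface.
response = """<accounts>
  <account><id>101</id><balance>250.00</balance></account>
  <account><id>102</id><balance>75.50</balance></account>
</accounts>"""

# Database of record (SQLite stands in for Teradata here).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE account (id INTEGER, balance REAL)")
cur.executemany("INSERT INTO account VALUES (?, ?)",
                [(101, 250.00), (102, 75.50)])

# Compare each field in the XML response against the corresponding row.
mismatches = []
for node in ET.fromstring(response).iter("account"):
    acct_id = int(node.findtext("id"))
    xml_bal = float(node.findtext("balance"))
    db_bal = cur.execute("SELECT balance FROM account WHERE id = ?",
                         (acct_id,)).fetchone()[0]
    if abs(db_bal - xml_bal) > 1e-9:
        mismatches.append(acct_id)
print("mismatched accounts:", mismatches)
```

An empty mismatch list means the interface returned exactly what the database holds; any entries are reported as defects against the field-mapping sheet.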

Environment: Informatica 9.1, Teradata, Quality Center & ALM 11.0, SQL, Autosys, UNIX, Oracle.

Confidential, Cleveland, OH

ETL/ QA Analyst

Responsibilities:

  • Involved in Design Review, Test Case Review meetings with business analysts and developers.
  • Developed the SQL scripts required for reports and extracts generated at end of day (EOD), and wrote SQL queries to extract data from tables.
  • Executed the Test Scripts on different releases and validated the actual results against the expected results.
  • Supported functional testers in executing test cases using QTP/UFT as part of regression testing.
  • Involved in the complete SDLC (System Development Life Cycle), from requirement gathering to delivery of the extracts, which included all loan origination and servicing data.
  • Generated multiple enterprise reports using SSRS from OLTP and OLAP sources, including reporting features such as group by, drill-down, drill-through, sub-reports, and navigation reports.
  • Developed reports, dashboards, and SQL queries using Tableau.
  • Generated SSRS Report through SSIS Package using script component as per business requirement.
  • Worked on SSAS storage and partitions, and Aggregations, calculation of queries with DMX, Data Mining Models, developing reports using MDX and SQL.
  • Installed and configured the MDM Hub Console, Hub Store, Hub Server, Cleanse and Match Server, Resource Kit, and Address Doctor applications following the Informatica Product Availability Matrix (PAM) document.
  • Involved in implementing the landing process of loading customer/product data sets coming from various source systems into the Informatica MDM Hub using an ETL tool.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM
  • Built a data warehouse in Azure SQL Server that allowed stakeholders to get the data they needed to drive their business.
  • Designed and implemented complex SQL queries for QA testing and report/ data validation.
  • Worked in Agile (Scrum) and Waterfall models.
  • Experienced in using the Assure data standardizer tool to transform data according to predefined rules as part of the transformation process.
  • Provided support for the technical integration and production controls teams for execution of ETL mappings.
  • Tested and developed mappings for extracting, cleansing, transforming, integrating, and loading data, scheduled through Autosys.
  • Wrote test plan documents and test cases and reported defects using HP ALM.
  • Used Toad for Data Analysts to compare Oracle and Teradata databases
  • Extracted data from Teradata to the target database using Informatica PowerCenter ETL and DTS packages; defect tracking and defect report generation were done using ALM and Quality Center.
  • Involved in manual and automated testing using Informatica Data Validation Option (DVO) and performed source and target table validation.
  • Written several complex SQL queries for validating business object reports.
  • Extensively tested Business Objects reports for static/cosmetic errors (headers, footers, logos, fonts, etc.) that may appear on all pages of the reports.
  • Performed smoke testing of new builds for QA acceptance.
  • Responsible for defect management, driving defects to resolution, and retesting.
  • Performed negative testing using Informatica to determine how workflows behave when they encounter invalid and unexpected values.
  • Created SQL compare statements to validate actual results with the expected results.
  • Tracked and executed user acceptance test (UAT) cases against the requirements to determine feature coverage.
  • Ensured quality products delivered and installed as specified in the statement of work (SOW).
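The "SQL compare statements" bullet above usually means re-applying a transformation rule in SQL and diffing the result against the target. A hypothetical sketch, assuming a mapping rule that the target name column should hold a trimmed, upper-cased copy of the source field (SQLite stands in for Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical mapping rule: tgt.name = UPPER(TRIM(src.name)).
cur.execute("CREATE TABLE src (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, " ann "), (2, "bob")])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "ANN"), (2, "bob")])

# Re-apply the business rule to the source and diff against the target:
# any row returned is a defect candidate.
failures = cur.execute("""
    SELECT s.id
    FROM src s JOIN tgt t ON s.id = t.id
    WHERE t.name <> UPPER(TRIM(s.name))
""").fetchall()
print("rows violating the rule:", failures)
```

Here row 2 is flagged because the target kept the source value un-transformed; that is exactly the kind of defect the actual-versus-expected compare queries are written to catch.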

Environment: Informatica 9.1, Oracle 11g, SQL, HP Quality Center 11, Autosys, UNIX, Agile, SSIS, SSRS, Azure

Confidential, St. Paul, MN

ETL/ QA Tester

Responsibilities:

  • Created Test Plan and developed test strategies.
  • Created Manual Test Suites for various modules.
  • Tested stored procedures & views.
  • Wrote Teradata SQL queries following Teradata best practices.
  • Wrote SQL queries to perform data validation during migration as part of backend testing.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Tested newly developed features to ensure proper functioning prior to release to QA for multiple processes.
  • Worked on environment issues and compatibility checks after migration of applications from UNIX to Linux.
  • Worked with UNIX scripts to execute jobs and validate logs for issues.
  • Reported bugs and tracked defects using Quality Center.
  • Used FTP and Telnet protocols to migrate files across heterogeneous operating systems such as UNIX, Linux, and Windows.
  • Responsible for Functional, Regression and User Acceptance testing - coordinated UAT testing
  • Performed SIT/functional and UAT testing on a web-based application.
  • Submitted weekly bug or issue report updates to the Project Manager in the form of the QA Error Log.
  • Involved in functional testing, end-to-end testing, and regression testing.
  • Understanding of functional requirement specifications and system requirement specifications.
  • Extensively tested several MicroStrategy reports for data quality.
  • Responsible for ETL batch jobs execution using UNIX shell scripting to load data from core database to Staging and Data mart tables.
  • Wrote Complex SQL queries to verify report data with the source and target system (data warehouse) data.
  • Assisted business users to execute UAT test scenarios and test data as part of Validation Testing
  • Experience on Data validation, Data merging, Data cleansing and Data aggregation activities.
  • Validating the data against staging tables and target warehouse
  • Found report defects, validated the fixes, and repeated the process until resolution.
  • Performed sanity testing, data-driven testing, and ad-hoc testing when required.
  • Performed system testing to ensure the validity of report requirements and mitigate risks prior to formal acceptance.
  • Extensively used SQL queries for data validation and backend testing.
  • Worked on database testing; involved in data migration testing, document preparation, and functionality, interface, and regression testing.
  • Prepared SQL queries to fetch data from databases.

Environment: Informatica PowerCenter 8.6, IBM InfoSphere MDM, MicroStrategy, Oracle 9i/10g, Teradata SQL Assistant, Teradata V2R6, TOAD for Oracle, Quality Center, Windows XP, UNIX

Confidential, Auburn Hills, MI

ETL/ QA Tester

Responsibilities:

  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Created test data for testing specific functionality.
  • Promoted Unix/Informatica application releases from development to QA and to UAT environments as required.
  • Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
  • Responsible for the data validation for the Data Completeness, Data Transformation, Data Quality, Integration Testing, UAT & Regression Testing.
  • Involved in user sessions and assisting in UAT (User Acceptance Testing).
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Assisted in System Test and UAT testing scenarios as required.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Designed and built efficient SQL queries, analyzed query cost comparisons, and used indexes efficiently.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Tested the reports using Business Objects functionalities like Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and Formulae etc.
  • Used Quality Center for Bug reporting, tracking & generation of Test Metrics.
  • Tested the ETL process for both before data validation and after data validation process.
  • Tested the messages published by ETL tool and data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Validated the data passed to downstream systems.
  • Worked with Data Extraction, Transformation and Loading (ETL).
  • Involved in testing XML files and checking whether data was parsed and loaded to staging tables.
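The last bullet, checking that an XML file parses and lands fully in a staging table, amounts to comparing the parsed-record count with the loaded-row count. A hypothetical sketch (feed format, staging table, and column names are invented; SQLite stands in for the staging database):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical inbound XML feed.
feed = "<orders><order id='1'/><order id='2'/><order id='3'/></orders>"

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_order (order_id INTEGER)")

# Parse the feed and load each record into the staging table.
records = [int(n.get("id")) for n in ET.fromstring(feed).iter("order")]
cur.executemany("INSERT INTO stg_order VALUES (?)", [(r,) for r in records])

# Validation: every parsed record must be present in staging.
loaded = cur.execute("SELECT COUNT(*) FROM stg_order").fetchone()[0]
print("parsed:", len(records), "loaded:", loaded, "ok:", len(records) == loaded)
```

A count mismatch here points at a parse failure or a rejected row, which is then traced through the load logs.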

Environment: Informatica 7.x, UNIX (K-Shell Script), Agile, Quality Center, Oracle, SQL Server 2008, DB2.
