
Sr. ETL QA Tester Resume


Chicago, IL

SUMMARY

  • Over 8 years of experience as a Sr. ETL Tester in data warehousing, building and managing various data warehouses and data marts using Informatica products (PowerCenter 9.1/8.6/8.0/7.1, PowerExchange) and SQL Server (SSIS, SSRS, and SSAS).
  • Worked extensively on analyzing project requirements and developing detailed specifications for the technical build.
  • Experienced with ETL tool, Informatica, in designing and developing complex mappings, mapplets, transformations, workflows, worklets, and scheduled sessions.
  • Hands on experience in ETL development processes using DTS and SSIS.
  • Utilized workflow manager, workflow monitor and scheduled ETL jobs in Informatica.
  • Proficient in developing, debugging, and deploying SSIS packages using the SSIS Designer.
  • Experience in Extraction, Transformation and Loading of data from different heterogeneous source systems like Flat files (Fixed width & Delimited), XML Files, COBOL files, Excel, Oracle, Sybase, MS SQL Server and Teradata.
  • Performance-tuned targets, sources, mappings, and sessions; coordinated with DBAs and UNIX administrators for ETL tuning.
  • Extensive experience in writing Python scripts and UNIX shell scripts.
  • Migrated code from Sagent to Informatica 9.x, tested all migrated objects, and checked their end-to-end consistency in the new environment.
  • Used stored procedures, functions, triggers, joins, views, and packages in PL/SQL on Oracle 10g/9i/8i; wrote SQL queries, stored procedures, and cursors; migrated PL/SQL code to Informatica mappings.
  • Developed interface for data transfer/data loading using PL/SQL programs, SQL*Loader & Informatica.
  • Used TOAD to analyze, view, and alter tables in various databases.
  • Prepared star and snowflake data models and was involved in designing enterprise data warehouses (EDWs).
  • Configured Informatica Data Quality (IDQ), defined rules, built scorecards, applied metrics and grouped into various dimensions in IDQ.
  • Implemented data cleansing, tuning, profiling and reports, dashboards to display DQ results using IDQ.
  • Developed data cleanup procedures, transformations, scripts, stored procedures and executed test plans for loading data successfully into targets.
  • Knowledge of the Jaspersoft reporting tool and its integration with the IDQ tool.
  • Involved in unit, integration & system testing and code migration between various environments.
  • Exceptional problem-solving and decision-making capabilities; recognized by associates for data quality, alternative solutions, and confident, accurate decision making.
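
A staple check behind the ETL testing work summarized above is a source-to-target row-count reconciliation. The sketch below is illustrative only: sqlite3 stands in for the real Oracle/Teradata connections, and the table and column names are hypothetical.

```python
import sqlite3

# Minimal sketch of a source-to-target row-count reconciliation,
# a common first ETL smoke test. sqlite3 is a stand-in for the
# actual source/target database connections.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile_counts(conn, src_table, tgt_table):
    """Return (source_count, target_count, counts_match) for two tables."""
    src = conn.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {tgt_table}").fetchone()[0]
    return src, tgt, src == tgt

print(reconcile_counts(conn, "src_orders", "tgt_orders"))  # (3, 3, True)
```

In practice the same comparison would run against the real source and warehouse connections, with a mismatch logged as a defect before any row-level validation.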

PROFESSIONAL EXPERIENCE

Sr. ETL QA Tester

Confidential, Chicago, IL

Responsibilities:

  • Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.
  • Worked on Informatica PowerCenter 9.1 client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Involved in design and development of complex ETL mappings.
  • Developed Python scripts to automate the test cases.
  • Designed and developed SSIS packages to extract data from various data sources such as Access databases, Excel spreadsheets, and flat files into SQL Server for further data analysis and reporting, using multiple transformations provided by SSIS such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
  • Involved in requirement gathering and analysis, design, and successful implementation. Researched Hadoop on AWS for high performance, writing Pig and MapReduce jobs.
  • Implemented partitioning and bulk loads for loading large volume of data.
  • Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
  • Developed the Informatica mappings using various transformations, sessions, and workflows. Teradata was the target database; the sources were a combination of flat files, Oracle tables, Excel files, and a Teradata database.
  • Used VIM and PyDev (Eclipse binding with Python) as a script editor.
  • Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
  • Performed performance tuning using session partitions, dynamic cache memory, and index caches.
  • Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.
  • Developed Informatica SCD Type-1 and Type-2 mappings. Extensively used nearly all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, mapplets, and others.
  • Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
  • Developed workflow tasks like Email, Event wait, Event Raise, Timer, Command and Decision.
  • Created Stored Procedures in PL/SQL.
  • Created customized MultiLoad (MLoad) scripts on the UNIX platform for Teradata loads.
  • Used SSIS 2008/2012 to create ETL packages (.dtsx files) to validate, extract, transform and load data to Data warehouse databases, data mart databases, and process SSAS cubes to store data to OLAP databases.
  • Developed documentation for all the routines (mappings, sessions, and workflows).
  • Involved in scheduling the workflows using UNIX scripts.
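
The SCD Type-2 mappings noted above have a characteristic testable invariant: each business key should carry exactly one current row. A hedged sketch of that validation, with sqlite3 standing in for the Teradata target and hypothetical table/column names:

```python
import sqlite3

# Illustrative SCD Type-2 validation: every business key must have
# exactly one row flagged is_current = 1. Names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        cust_key INTEGER, cust_id TEXT, city TEXT, is_current INTEGER
    );
    INSERT INTO dim_customer VALUES
        (1, 'C100', 'Chicago', 0),   -- expired version
        (2, 'C100', 'Austin',  1),   -- current version
        (3, 'C200', 'Boston',  1);
""")

def bad_scd2_keys(conn):
    """Return business keys that do not have exactly one current row."""
    rows = conn.execute("""
        SELECT cust_id
        FROM dim_customer
        GROUP BY cust_id
        HAVING SUM(is_current) <> 1
    """).fetchall()
    return [r[0] for r in rows]

print(bad_scd2_keys(conn))  # [] -> every key has exactly one current row
```

A non-empty result (a key with zero or multiple current rows) would indicate a defect in the Type-2 update logic.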

ETL QA Tester

Confidential, Stuart, FL

Responsibilities:

  • Involved in requirement analysis, ETL design and development for extracting data from source systems: Oracle, flat files, XML files and data mart loading.
  • Worked with analysts and data source systems experts to map requirements to ETL code.
  • Created packages in SSIS Designer using Control Flow and Data Flow Transformations to implement business rules.
  • Developed mapping logic using transformations: Expression, Lookups (Connected, Unconnected), Joiner, Filter, Sorter, Router, Update strategy & Sequence generator.
  • Used different transformations in IDQ, including Address Validator, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Rank, and Update Strategy; exported them to PowerCenter as mapplets and used them in mappings.
  • Used Python code to handle textbox, dropdown, checkbox, button, and popup controls.
  • Worked extensively with Teradata Queryman to interface with Teradata.
  • Used IDQ for data cleansing, tuning & profiling; implemented reports, dashboards to display DQ results.
  • Used Informatica Data Quality to improve and standardize data quality on a single platform, providing a centralized set of reusable rules and tools for managing data quality across any project.
  • Migrated code from Sagent to Informatica PowerCenter, tested all objects developed, and checked their end-to-end consistency in the new environment.
  • Developed an exception-handling process for each SSIS package.
  • Translated functional specifications into technical specifications (design of mapping documents)
  • Implemented Type-1 SCD and Type-2 SCD mappings to update slowly changing dimension tables.
  • Created mapping using multiple source systems: Oracle, flat files, and XML files into data mart in Oracle database.
  • Used load scripts to load tables in SQL.
  • Used Teradata utilities (FastLoad, MultiLoad, FastExport). Queried the target database using Teradata SQL and BTEQ for validation.
  • Responsible for design and implementation for loading ODS and updating warehouse.
  • Created utility scripts to manage and scrub incoming source files, to move files between directories, report and trend file metadata and frequency.
  • Wrote execution report scripts to broadcast e-mails to respective people on failure/success, error handling, and control /audit critical processes.
  • Handled errors using session and workflow logs in the dev and test environments.
  • Performance tuned existing & new mappings.
  • Developed SSIS Packages and SQL scripts to extract the data from OLTP system to flat files and for pre-calculated aggregates, summaries, and user specific calculations.
  • Wrote SQL scripts to replace Informatica Joiners and Lookups, improving performance on large data volumes.
  • Used Control-M to schedule jobs for maintaining database objects and UNIX scripts.
  • Created deployment groups to deploy objects to migrate code to higher environments.
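
The file-management utility scripts described above typically move incoming source files from a landing area to an archive while capturing basic metadata for reporting. A minimal sketch, with hypothetical directory names and CSV sources assumed:

```python
import shutil
from pathlib import Path

# Illustrative utility: sweep incoming source files from a landing
# directory into an archive, returning (name, size) pairs so a
# separate step can report and trend file metadata.
def archive_incoming(landing: Path, archive: Path, pattern: str = "*.csv"):
    """Move files matching `pattern` from landing to archive;
    return a list of (filename, size_in_bytes) for reporting."""
    archive.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(landing.glob(pattern)):
        size = f.stat().st_size
        shutil.move(str(f), str(archive / f.name))
        moved.append((f.name, size))
    return moved
```

A scheduler (e.g., Control-M, as used above) would invoke a script like this before the load, so the ETL always reads from a clean, known directory.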

ETL QA Tester

Confidential, Philadelphia, PA

Responsibilities:

  • Designed and developed the logical and physical data models to support the data marts and the data warehouse.
  • Involved in the requirement definition and analysis in support of Data Warehouse and ODS efforts.
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • Worked as an ETL Tester responsible for requirements/ETL analysis, ETL testing, and design of the flow and logic for the data warehouse project. Involved in extensive data validation using SQL queries and back-end testing.
  • Managed the Metadata associated with ETL processes used to populate the data warehouse.
  • Worked with analysts and data source systems experts to map requirements to ETL code.
  • Involved in validating SSIS and SSRS packages according to functional requirements.
  • Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Responsible to tune ETL procedures and schemas to optimize load and query Performance.
  • Designed Data Stage ETL jobs for extracting data from heterogeneous source systems, transform and finally load into the Data Marts.
  • Prepared Logical Data Models that contains set of Entity Relationship Diagrams and Data Flow Diagrams and supporting documents and descriptions of the Relationships between the data elements to analyze and document the Business Data Requirements.
  • Understood the business requirements based on functional specifications to design the ETL methodology in technical specifications.
  • Created Profiles and Roles based on Organizational role hierarchy, implemented Record-Level and Field-level security and configured their sharing settings.
  • Validated several SSIS and SSRS packages to verify that they worked according to the BRS.
  • Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.
  • Tested the application by writing SQL queries and creating pivot views to perform back-end testing.
  • Created low-level documents for building maps to load data from the ODS through the warehouse.
  • Used UML to produce Use Case models, Activity and Sequence diagrams, as part of the detailed design of interfaces.
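
The duplicate-record identification mentioned above usually reduces to grouping on the natural key and flagging keys that occur more than once. A hedged sketch, with sqlite3 as a stand-in for the warehouse database and hypothetical table/column names:

```python
import sqlite3

# Illustrative duplicate-record check: GROUP BY the natural key and
# report any key that appears more than once in the target table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fact_sales (sale_id INTEGER, cust_id TEXT, amt REAL);
    INSERT INTO fact_sales VALUES
        (1, 'C1', 5.0),
        (2, 'C2', 7.5),
        (2, 'C2', 7.5);  -- duplicated sale_id 2
""")

dupes = conn.execute("""
    SELECT sale_id, COUNT(*) AS n
    FROM fact_sales
    GROUP BY sale_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [(2, 2)] -> sale_id 2 was loaded twice
```

Each row returned is a defect candidate: either the source contained the duplicate or the ETL loaded it twice, and the mapping/session logs decide which.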

ETL QA Tester

Confidential, New York, NY

Responsibilities:

  • Designed and developed Test Plans, Test Scripts and Test Cases in HP Quality Center and executed them.
  • Involved in writing the Test Plans based on Business Requirement and Functional Requirement documents.
  • Involved in implementation of the Test plans, Test cases and Test Scripts.
  • Created Test sets in Test Lab to move all the test cases from Test plan to execute the test cases.
  • Tested the data and data integrity among various sources and targets.
  • Developed and executed both manual tests and automation test scripts based on the use cases developed.
  • Tested to verify that all data was synchronized after troubleshooting, and used SQL to verify/validate test cases.
  • Tested Business Objects reports and Web Intelligence reports.
  • Managed user accounts and security using Business Objects Supervisor.
  • Tested the universes and reports in Business Objects XI R3.
  • Extensively used Informatica PowerCenter for the ETL process.
  • Prepared ETL mapping specifications.
  • Worked with QTP for Regression Testing.
  • Configured Quick Test Professional with Quality center.
  • Executed QTP scripts for automation testing and analyzed the automation results.
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Used all Teradata utilities, including FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ, and TPump.
  • Trained and coordinated with the offshore team members in understanding the requirements and test cases for UAT.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Involved in ETL Mapping Design and Performance tuning to Load mapping process.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Reviewed the test cases written based on the Change Request document.
  • Performed testing based on Change Requests and Defect Requests.
  • Involved in preparation of System Test Results after Test case execution.
  • Experienced in writing UNIX script.
  • Effectively coordinated with the development team.
  • Created critical scenarios for each change request and defect request.
  • Worked on HP Quality Center, updating defect status with appropriate information; test cases and test plans were managed and controlled.
  • Involved in various testing phases like Unit Testing, System Integration Testing, Validation Testing, User Acceptance Testing, Parallel Testing, Performance Testing and Regression Testing.
  • Used various SQL queries to validate the test case results for back-end test cases.
  • Well exposed to Software Development Life Cycle and Test methodologies.
  • Hands-on experience working with Autosys jobs and MQs; MQs were used as intermediate storage where messages are held temporarily.
  • Hands-on experience working with SQL Server DTS packages.
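
The back-end SQL validation described in these bullets is often a row-level set difference: rows present in the source/staging table but missing or altered in the target. A hedged sketch using `EXCEPT`, with sqlite3 standing in for the actual databases and illustrative names:

```python
import sqlite3

# Illustrative row-level validation: EXCEPT returns source rows that
# have no exact match in the target, exposing dropped or altered rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_accounts (acct_id INTEGER, balance REAL);
    CREATE TABLE tgt_accounts (acct_id INTEGER, balance REAL);
    INSERT INTO stg_accounts VALUES (1, 100.0), (2, 250.0);
    INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 999.0);  -- mismatch
""")

missing = conn.execute("""
    SELECT acct_id, balance FROM stg_accounts
    EXCEPT
    SELECT acct_id, balance FROM tgt_accounts
""").fetchall()
print(missing)  # [(2, 250.0)] -> account 2 was loaded with the wrong balance
```

Running the query in both directions (source EXCEPT target, then target EXCEPT source) distinguishes dropped rows from spurious extra rows in the load.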
