
ETL/DWH Tester Resume


Newbury Park, CA

PROFESSIONAL SUMMARY:

  • 7+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and Client/Server applications.
  • Experience using query tools for Oracle, DB2 and MS SQL Server to validate reports and troubleshoot data quality issues.
  • Solid back-end testing experience writing and executing SQL queries (a sample validation query follows this list).
  • Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Expertise in test case design, test tool usage, test execution, and defect management.
  • Experience in UNIX shell scripting and configuring cron jobs for Informatica session scheduling.
  • Good working experience with SoapUI for testing and validating the various web services used in the application.
  • Extensively used ETL methodology to support data extraction, transformation, and load processing in corporate-wide ETL solutions using Informatica, SSIS, DataStage, and Ab Initio.
  • Expertise in testing complex business rules by creating mappings and various transformations.
  • Experience in testing XML files and validating the data loaded to staging tables.
  • Experience in testing and writing SQL and PL/SQL statements.
  • Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.
  • Good experience testing MicroStrategy, Cognos, and Business Objects reports.
  • Experience in the Financial, Banking, Brokerage, and Securities industries.
  • Extensive ETL testing experience using Informatica (PowerCenter/PowerMart): Designer, Workflow Manager, Workflow Monitor, and Server Manager.
  • Experience in Dimensional Data Modeling using Star and Snowflake schemas.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various SAS procedures and handling large databases to perform complex data manipulations.
  • Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse using Informatica.
  • Proficient with multiple databases, including Oracle, SQL Server, DB2, and Teradata.
  • Expertise in developing PL/SQL packages, stored procedures/functions, and triggers.
  • Expertise in the Oracle SQL*Loader utility and in TOAD for developing Oracle applications.
  • Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.
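
The back-end validation work referenced above typically comes down to queries of the following shape. This is a minimal sketch only; the staging and target table names (stg_customer, dw_customer) are hypothetical and not drawn from any specific project.

```sql
-- Minimal back-end validation sketch (hypothetical table names).
-- 1) Row-count reconciliation between a staging table and its target.
SELECT (SELECT COUNT(*) FROM stg_customer) AS source_count,
       (SELECT COUNT(*) FROM dw_customer)  AS target_count
  FROM dual;

-- 2) Column-level comparison: rows present in staging but missing or
--    transformed incorrectly in the target (Oracle MINUS syntax).
SELECT customer_id, first_name, last_name
  FROM stg_customer
MINUS
SELECT customer_id, first_name, last_name
  FROM dw_customer;
```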

TECHNICAL SKILLS:

Testing Tools: Quality Center, Test Director, WinRunner, QuickTest Pro, LoadRunner, Rational ClearQuest, Rational Robot

Languages: Java, .NET, XML, HTML, SQL, TSL, JavaScript, VBScript

Operating Systems: Windows, UNIX, LINUX

Databases: MS Access, Teradata, SQL Server, Oracle, Sybase, Informix, DB2

ETL Tools: Informatica PowerCenter, SSIS, DataStage, Ab Initio

Browsers: Internet Explorer, Netscape Navigator, Firefox

Bug Tracking Tools: Rational ClearQuest, PVCS Tracker, Bugzilla, Quality Center

GUI Tools: SoapUI, Visual Basic

BI Tools: MicroStrategy, Cognos, Business Objects, Crystal Reports, SSRS, SSAS

PROFESSIONAL EXPERIENCE:

Confidential, Newbury Park, CA

ETL/DWH Tester

Responsibilities:

  • Responsible for gathering test requirements and scoping out the test plan.
  • Wrote complex SQL queries.
  • Designed and tested UNIX shell scripts to automate loading and pulling of data for testing ETL loads.
  • Tuned ETL jobs/procedures/scripts, SQL queries, and PL/SQL procedures to improve system performance.
  • Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
  • Tested both conditional formatting and threshold levels for several reports developed in MicroStrategy.
  • Reported bugs and tracked defects using ALM/Quality Center.
  • Performed extensive data validation by writing complex SQL queries, carried out back-end testing, and worked through data quality issues.
  • Involved in data warehouse testing by checking ETL procedures/mappings.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Worked with web service testing tools such as SoapUI; developed and built test cases and executed and maintained integration test harnesses and test scripts as assigned.
  • Used SoapUI for API testing of both SOAP and REST web services and wrote XPath/XQuery assertions for the tests.
  • Wrote complex SQL queries to verify that data flowed from source to target and that the MicroStrategy reports generated from the target matched it (see the sample queries after this list).
  • Participated in Integration, System, Smoke and User Acceptance Testing and production testing using QTP/UFT.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Worked on data flow diagrams using Visio.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning.
  • Automated functional, GUI and Regression testing by creating scripts in QTP/UFT.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Extracted test data from tables and loaded it into SQL tables.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
  • Tested the ETL Informatica mappings and other ETL processes (Data Warehouse testing).
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Tested the data load process using the Informatica Power Center ETL tool.
  • Worked with Data Conversion projects for testing data quality after migration.
  • Used database query tools for Oracle, Teradata, and UNIX, such as TOAD, Teradata SQL Assistant, and SQL*Plus.
  • Involved in testing data mapping and conversion in a server-based data warehouse.
  • Experienced in data analysis using SQL, PL/SQL, and many other query-based applications.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Validated business reporting requirements with internal and external customers.
  • Validated the reports to make sure all the data is populated as per requirements.
  • Responsible for leading and coordinating with the offshore team.
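
The source-to-target and report reconciliation mentioned above typically relies on queries along these lines. This is a sketch only; the Teradata schema, table, and column names (stg.account_balance, dw.fact_account_balance, balance_amt) are hypothetical.

```sql
-- Duplicate check on the target's natural key: any row returned is a defect.
SELECT account_id, snapshot_dt, COUNT(*) AS dup_cnt
  FROM dw.fact_account_balance
 GROUP BY account_id, snapshot_dt
HAVING COUNT(*) > 1;

-- Aggregate reconciliation: the total loaded to the target should match the
-- staging total that feeds the downstream MicroStrategy report.
SELECT s.source_total, t.target_total
  FROM (SELECT SUM(balance_amt) AS source_total FROM stg.account_balance) s
 CROSS JOIN
       (SELECT SUM(balance_amt) AS target_total FROM dw.fact_account_balance) t;
```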

Environment: Informatica PowerCenter 9.1, MicroStrategy, SQL, PL/SQL, Agile, SoapUI, Teradata 13, Visio, Teradata SQL Assistant, ALM/HP Quality Center 11, QTP/UFT, Oracle, UNIX, TOAD, T-SQL, SQL Server, XML Files, XSD, XML Spy, Flat Files

Confidential, Chicago, IL

ETL/BI Tester

Responsibilities:

  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Worked on data completeness, data transformation, data quality, integration, UAT, and regression testing for the ETL and BI group.
  • Wrote SQL queries to validate that actual test results matched expected results.
  • Checked naming standards, data integrity, and referential integrity.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Checked the reports for any naming inconsistencies and to improve user readability.
  • Compared the actual result with expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source.
  • Used ClearQuest for defect tracking and reporting, updating bug status and working with developers to resolve the bugs.
  • Created data-driven automation scripts for testing API web services using SoapUI.
  • Involved in Unit testing and integration testing of Informatica mappings.
  • Created UNIX scripts for file transfer and file manipulation.
  • Prepared defect reports and test summary report documents.
  • Checked the status of ETL jobs in UNIX and tailed the log files while loads were in progress.
  • Developed the Test Plan and Testing Strategy for Data Warehousing projects.
  • Written Test Scripts based on the Business requirements and executed Functional testing and data validation, data reconciliation with defect correction and retesting, followed by regression and performance testing.
  • Prepared documentation for all the testing methodologies used to validate the Data Warehouse.
  • Created and executed data-driven test scripts in QTP.
  • Conducted functionality testing during various phases of the application using QTP.
  • Created Microsoft Visio diagrams representing the workflow and execution of all the processes.
  • Automated and scheduled the Informatica jobs using UNIX shell scripting.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL to query the DB2 database in a UNIX environment.
  • Experienced using query tools for Oracle to validate reports and troubleshoot data quality issues.
  • Validated format of the reports and feeds.
  • Effectively communicated testing activities and findings in oral and written formats.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Designed and created complex SCD Type II mappings involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy, and Filter (a sample validation query follows this list).
  • Optimized and tuned several complex SQL queries for better performance and efficiency.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Worked on issues with migration from development to testing for different data-loading jobs.
  • Extracted certain columns of data from a number of files using Perl.
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Validated cube and query data from the reporting system back to the source system.
  • Used Rational ClearCase for version control.
  • Extensively involved with backend testing by writing complex SQL queries.
  • Worked closely with the development teams to replicate end user issues.
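
Validating the SCD Type II behavior mentioned above usually reduces to a couple of queries like these. This is a sketch only; the dimension and column names (dim_customer, customer_key_nat, current_flag, eff_start_dt, eff_end_dt) are hypothetical.

```sql
-- Each business key must have exactly one current row.
SELECT customer_key_nat, COUNT(*) AS current_rows
  FROM dim_customer
 WHERE current_flag = 'Y'
 GROUP BY customer_key_nat
HAVING COUNT(*) <> 1;

-- History rows must not overlap: no later version may start before the
-- earlier version's end date for the same business key.
SELECT a.customer_key_nat, a.eff_start_dt, a.eff_end_dt,
       b.eff_start_dt AS next_start
  FROM dim_customer a
  JOIN dim_customer b
    ON a.customer_key_nat = b.customer_key_nat
   AND b.eff_start_dt > a.eff_start_dt
   AND b.eff_start_dt < a.eff_end_dt;
```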

Environment: Oracle, SQL, PL/SQL, Informatica, QuickTest Pro (QTP), Agile, SoapUI, HP Quality Center, Visio, Business Objects, TOAD, Rational ClearQuest, Rational ClearCase, XML Files, XSD, XML Spy

Confidential, Hopkins, MN

Data Warehouse Tester

Responsibilities:

  • Applied relational and dimensional modeling techniques to design ER data models and Star Schema designs for testing the data flow between source and target.
  • Documented the validation rules, error handling and test strategy of the mapping.
  • Tuned the complex SQL queries for better performance by eliminating various performance bottlenecks.
  • Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL statements.
  • Performed functional testing and back-end testing by manually comparing database results.
  • Interacted with the users to ensure meaningful development of the scripts and simulated real time business scenarios.
  • Compared the actual result with the expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source.
  • Created temporary tables for validations after understanding the transformation logic.
  • Tested several UNIX shell scripts for file validation as well as PL/SQL programs.
  • Tested complex mappings using mapplets and reusable transformations.
  • Executed automation scripts in HP QTP for validating Replica DB. Tested the Delta load/Batch jobs in Informatica.
  • Performed data-centric testing to validate that the data loaded into the data warehouse was correct per the defined business rules (see the sample queries after this list).
  • Established and documented ETL QA standards, procedures, and QA methodologies.
  • Verified development of rule-set test scenarios, test cases, and expected results, and analyzed XML test scripts.
  • Created test case scenarios, executed test cases, and maintained defects in internal bug-tracking systems.
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Performed extensive data validations against the Data Warehouse.
  • Loaded flat file data into Teradata tables using UNIX shell scripts.
  • Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
  • Extensively tested several ETL Mappings developed using Informatica.
  • Involved in testing data reports; wrote several SQL queries to validate front-end data against the back end.
  • Conducted back-to-back meetings with developers to discuss all open issues.
  • Conducted test case reviews to ensure scenarios accurately capture business functionality.
  • Worked with ETL Source System and Target System (ETL Mapping Document) for writing test cases and test scripts.
  • Wrote several SQL statements to confirm successful data transfers between source and target.
  • Worked with data profiling, data scrubbing, and data cleansing.
  • Tested Standard Reports, Cross-tab Reports, Charts, Master-Child Drill-through Reports, and Multilingual Reports.
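
The data-centric checks referenced above are typically written as "rule violation" queries that should return zero rows. A minimal sketch follows, assuming a hypothetical business rule and hypothetical tables (dw_orders, dw_order_fact, dim_customer).

```sql
-- Hypothetical rule: order_status must be 'CLOSED' when ship_dt is populated,
-- otherwise 'OPEN'. Any row returned violates the rule.
SELECT order_id, ship_dt, order_status
  FROM dw_orders
 WHERE (ship_dt IS NOT NULL AND order_status <> 'CLOSED')
    OR (ship_dt IS NULL     AND order_status <> 'OPEN');

-- Referential integrity: fact rows pointing at dimension keys that do not exist.
SELECT f.order_id, f.customer_key
  FROM dw_order_fact f
  LEFT JOIN dim_customer d
    ON f.customer_key = d.customer_key
 WHERE d.customer_key IS NULL;
```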

Environment: Informatica PowerCenter, Data Warehouse, Windows NT, Agile, MicroStrategy, HP QTP, Oracle, SQL, PL/SQL, T-SQL, SQL Server, SQL*Plus, Toad, XML, XSD, VSAM Files, MVS, ISPF, Flat Files

Confidential

SQL/ETL/BI Tester

Responsibilities:

  • Prepared and scheduled backups and recovery; maintained backup and recovery plans for system and user databases.
  • Created roles and set permissions to ensure that the company's security policies were enforced and in effect.
  • Created test case scenarios, executed test cases and maintained defects in Remedy
  • Tested reports and ETL mappings from Source to Target
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Tested Informatica mappings. Performed data validations against Data Warehouse
  • Extracted and transformed data from one server to other servers using tools like Bulk Copy Program (BCP), BULK INSERT.
  • Assisted Developers to design databases, tables, and table objects in support of development activities and to achieve optimal performance.
  • Understood business requirements, helped redesign the development plan, and designed the data warehouse.
  • Created Database Objects - Tables, Indexes, Views, and Roles.
  • Created Customized Stored Procedures, Triggers, and Cursors.
  • Involved in job scheduling using MS SQL Server Agent.
  • Configured database authentication modes, created users, configured permissions, and assigned roles to users.
  • Responsible for gathering test requirements and scoping out the test plan.
  • Responsible for creating Test Plan and scope of testing.
  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Responsible for extensive data validations on Data warehouse and reports.
  • Responsible for testing Business Objects universe and universe enhancements.
  • As an ETL/BI Tester, was responsible for business requirements, ETL analysis, and ETL test design of the flow and logic for the Data Warehouse project.
  • Responsible for leading the team and coordinating with the offshore team.
  • Prepared and set up the environment for disaster recovery.
  • Transferred data from one server to other servers using tools such as Bulk Copy Program (BCP) and Data Transformation Services (DTS).
  • Performed data conversions from legacy systems to SQL Server using DTS.
  • Created multiple complex queries for profiling and auditing purposes (a sample profiling query follows this list).
  • Optimized and partitioned cubes to reduce report latency and improve performance.
  • Assisted in Troubleshooting of Report deployment on multiple servers.
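
Profiling and auditing queries of the kind mentioned above often look like the following T-SQL sketch; the table and column names (dbo.stg_customer, customer_id, email) are hypothetical.

```sql
-- Profile a staging table before sign-off: row count, null counts, and
-- distinct counts for the columns being audited.
SELECT COUNT(*)                                             AS total_rows,
       SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_customer_id,
       SUM(CASE WHEN email       IS NULL THEN 1 ELSE 0 END) AS null_email,
       COUNT(DISTINCT customer_id)                          AS distinct_customer_id
  FROM dbo.stg_customer;

-- Audit duplicates on the business key before the load is accepted.
SELECT customer_id, COUNT(*) AS dup_cnt
  FROM dbo.stg_customer
 GROUP BY customer_id
HAVING COUNT(*) > 1;
```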

Environment: Informatica, Microsoft SQL Server 2008, Enterprise Manager, Data Transformation Services (DTS), VBScript, T-SQL, SQL Profiler, MS Access, XML, XSLT, XSD
