
ETL Tester Resume


Providence, RI

SUMMARY:

  • 5+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, and client/server applications.
  • Experience using query tools for Oracle, DB2 and MS SQL Server to validate reports and troubleshoot data quality issues
  • Solid back-end testing experience writing and executing SQL queries (a sample validation query follows this summary).
  • Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
  • Expertise in Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
  • Experience in UNIX shell scripting and configuring cron jobs for Informatica session scheduling.
  • Good working experience on SOAP UI for testing and validating various web services used in the application.
  • Extensively used ETL methodology to support data extraction, transformation, and load processing in a corporate-wide ETL solution using Informatica, SSIS, DataStage, and Ab Initio.
  • Expertise in testing complex business rules by creating mappings and various transformations.
  • Experience in testing XML files and validating the data loaded to staging tables.
  • Experience in testing and writing SQL and PL/SQL statements.
  • Strong working experience with DSS (Decision Support Systems) applications and Extraction, Transformation and Load (ETL) of data from legacy systems using Informatica.
  • Extensive ETL testing experience using Informatica PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Server Manager).
  • Experience in Dimensional Data Modeling using Star and Snowflake schemas.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various SAS procedures and handling large databases to perform complex data manipulations.
  • Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse using Informatica.
  • Proficient with databases including Oracle, SQL Server, DB2, and Teradata.
  • Expertise in developing PL/SQL packages, stored procedures/functions, and triggers.
  • Expertise with the Oracle SQL*Loader utility and with TOAD for developing Oracle applications.
  • Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.
  • Authorized to work in the US for any employer
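
A minimal sketch of the kind of back-end validation query referenced above, assuming hypothetical STG_CUSTOMER (staging) and DW_CUSTOMER (warehouse) tables; actual table and column names varied by project:

    -- Row-count reconciliation between the staging table and the warehouse table
    SELECT 'STG_CUSTOMER' AS table_name, COUNT(*) AS row_count FROM STG_CUSTOMER
    UNION ALL
    SELECT 'DW_CUSTOMER', COUNT(*) FROM DW_CUSTOMER;

    -- Duplicate check on the warehouse business key
    SELECT customer_id, COUNT(*) AS dup_count
    FROM DW_CUSTOMER
    GROUP BY customer_id
    HAVING COUNT(*) > 1;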

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter/PowerMart, SSIS, DataStage, Ab Initio
Databases: Oracle, MS SQL Server, DB2, Teradata, Netezza, MySQL
Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Java, XML/XSD
Testing Tools: ALM/HP Quality Center, QTP/UFT, MTM, TFS, SOAP UI, Rational ClearQuest, Rational ClearCase
BI/Reporting: Cognos, SAS
Big Data: Hadoop, Hive, Cloudera Manager
Other Tools: TOAD, SQL*Loader, Teradata SQL Assistant, SQL*Plus, Visio, XML Spy

ETL Tester

Confidential - Providence, RI

Responsibilities:

  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Performed Data Completeness, Data Transformation, Data Quality, Integration, UAT, and Regression testing for the ETL and BI group.
  • Wrote SQL queries to validate that actual test results matched expected results.
  • Checked naming standards, data integrity, and referential integrity.
  • Reviewed reports for naming inconsistencies and improved user readability.
  • Compared actual results with expected results and validated the data using a reverse-engineering methodology, i.e., backward navigation from target to source (see the sample query after this list).
  • Used SQL queries for database testing and validation.
  • Tested and developed mappings for extracting, cleansing, transforming, integrating, and loading data using Informatica.
  • Wrote test cases to test the application manually in ALM/HP Quality Center and automated them using QuickTest Pro.
  • Performed database testing using TOAD and checked data integrity and constraints with SQL queries.
  • Worked on Team Foundation Server (TFS) and Microsoft Test Manager (MTM) to effectively manage the product life cycle
  • Wrote several Netezza SQL scripts to load data between Netezza tables.
  • Worked with the Netezza Workbench tool to test and debug SQL queries.
  • Monitored and managed the Hadoop cluster using Cloudera Manager.
  • Created data-driven automation scripts for testing API Web Services using SOAP UI.
  • Created UNIX scripts for file transfer and file manipulation.
  • Prepared defect reports and test summary report documents.
  • Checked the status of ETL jobs in UNIX and tailed the log files while loads were in progress.
  • Tested the Netezza database for data validation and data verification using NZSQL.
  • Analyzed, designed and developed test cases for Big Data analytics platform for processing Member, Claim and providers data using Hadoop and Hive.
  • Reported bugs using MTM, created and monitored bug resolution efforts, and tracked them to closure.
  • Used Quality Center to track and report system defects
  • Developed the Test Plan and Testing Strategy for Data Warehousing projects.
  • Tested the data load process using the Informatica Power Center ETL tool.
  • Wrote test scripts based on business requirements; executed functional testing, data validation, and data reconciliation with defect correction and retesting, followed by regression and performance testing.
  • Participated in defining and executing test strategies using agile methodology.
  • Prepared documentation for all the testing methodologies used to validate the Data Warehouse.
  • Created and executed data-driven test scripts in QTP.
  • Loaded flat-file data into Teradata tables using UNIX shell scripts.
  • Conducted functionality testing during various phases of the application using QTP.
  • Created Microsoft Visio diagrams representing the workflow and execution of all processes.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for querying the DB2 database in a UNIX environment.
  • Used query tools for Oracle to validate reports and troubleshoot data quality issues.
  • Validated format of the reports and feeds.
  • Performed extensive Database Testing by writing complex SQL Queries in DB2 to test the different scenarios in the application.
  • Analyzed test cases for Big Data analytics platform for processing Member, Claim and providers data using Hadoop and Hive.
  • Automated the process to copy files in Hadoop system for testing purpose at regular intervals.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Wrote Teradata SQL queries and created tables and views following Teradata best practices.
  • Used Rational ClearCase for version control.
  • Extensively involved with backend testing by writing complex SQL queries.
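
A sketch of the target-to-source ("backward navigation") check mentioned above, assuming hypothetical TGT_CLAIM (warehouse) and SRC_CLAIM (source) tables; Oracle and Teradata accept MINUS, while EXCEPT is the equivalent on other platforms:

    -- Rows present in the target but missing or altered in the source
    SELECT claim_id, member_id, claim_amount
    FROM TGT_CLAIM
    MINUS
    SELECT claim_id, member_id, claim_amount
    FROM SRC_CLAIM;

    -- The reverse direction catches source rows dropped by the load
    SELECT claim_id, member_id, claim_amount
    FROM SRC_CLAIM
    MINUS
    SELECT claim_id, member_id, claim_amount
    FROM TGT_CLAIM;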

Environment: Oracle, SQL, PL/SQL, Informatica, QTP, MTM, TFS, UNIX, Agile, SOAP UI, Teradata, ALM/HP Quality Center, Visio, TOAD, Rational ClearQuest, Rational ClearCase, Netezza, Java, XML Files, XSD, XML Spy

ETL Tester

Confidential - Cleveland, OH

Responsibilities:

  • Responsible for gathering test requirements and scoping out the test plan.
  • Wrote complex SQL queries.
  • Designed and tested UNIX shell scripts as part of the ETL testing process to automate loading and pulling data for testing ETL loads.
  • Tuned ETL jobs/procedures/scripts, SQL queries, PL/SQL procedures to improve the system performance
  • Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
  • Reported bugs and tracked defects using ALM/HP Quality Center
  • Performed extensive data validation by writing several complex SQL queries; involved in back-end testing and resolving data quality issues (see the sample checks after this list).
  • Involved in data warehouse testing by checking ETL procedures/mappings.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
  • Performed end-to-end maintenance of the Netezza appliance's hardware and software.
  • Worked with systems engineering team to deploy and test new Hadoop environments and expand existing Hadoop clusters
  • Involved in developing detailed test plan, test cases and test scripts using TFS (Team Foundation Server) for Functional and Regression Testing.
  • Involved in Unit testing and integration testing of Informatica mappings.
  • Worked with web service testing tools such as SOAP UI; developed and built test cases and executed and maintained integration test harnesses and test scripts as assigned.
  • Tested and wrote complex SQL, T-SQL, and PL/SQL statements to validate the database systems and for back-end database testing.
  • Maintained the test matrix, test cases, expected results, and prioritized tests for various application modules using ALM/HP Quality Center.
  • Extensively used MTM (Microsoft Test Manager) for test planning, bug tracking and reporting
  • Participated in Integration, System, Smoke and User Acceptance Testing and production testing using QTP/UFT.
  • Worked closely with IBM Netezza support and the UNIX team on replacements and various client-related issues.
  • Managed and executed the test process, using Agile Methodology.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Wrote test cases for the automation process in MTM.
  • Performed ETL data warehousing and database testing using Informatica workflow processes.
  • Worked on data flow diagrams using Visio.
  • Performed integration testing of Hadoop packages for ingestion, transformation, and loading of massive structured and unstructured data into the benchmark cube.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning.
  • Automated functional, GUI and Regression testing by creating scripts in QTP/UFT.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Testing the ETL data movement from Oracle Data mart to Netezza Data mart on an Incremental and full load basis.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Used Quality Center to state requirements, test plan, test cases, and update test run status for each iteration and to report the defects
  • Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Worked with Data Conversion projects for testing data quality after migration.
  • Experienced using database query tools for Oracle and UNIX, such as TOAD, Teradata SQL Assistant, and SQL*Plus.
  • Responsible for leading and coordinating with offshore team.
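
A hedged example of the data-validation checks described above, assuming a hypothetical fact table FACT_SALES and dimension table DIM_PRODUCT; actual schemas varied by project:

    -- Orphaned foreign keys: fact rows whose product key has no matching dimension row
    SELECT f.product_key, COUNT(*) AS orphan_rows
    FROM FACT_SALES f
    LEFT JOIN DIM_PRODUCT d
      ON f.product_key = d.product_key
    WHERE d.product_key IS NULL
    GROUP BY f.product_key;

    -- Null check on a mandatory measure column
    SELECT COUNT(*) AS null_amounts
    FROM FACT_SALES
    WHERE sale_amount IS NULL;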

Environment: Informatica Power Center, SQL, PL/SQL, Agile, Soap UI, MTM, TFS, Teradata, Visio, Netezza, ALM/HP Quality Center, Java, QTP/UFT, Oracle, UNIX, TOAD, T-SQL, SQL Server, XML Files, XSD, XML Spy, Flat Files

ETL Tester

Confidential - Chicago, IL

Responsibilities:

  • Prepared and scheduled backup and recovery; maintained backup and recovery plans for system and user databases.
  • Created roles and set permissions to ensure that the company's security policies were enforced and in effect (see the sketch after this list).
  • Created test case scenarios, executed test cases and maintained defects in Remedy
  • Tested reports and ETL mappings from Source to Target
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Tested Informatica mappings. Performed data validations against Data Warehouse
  • Extracted and transformed data from one server to other servers using tools like Bulk Copy Program (BCP), BULK INSERT.
  • Assisted Developers to design databases, tables, and table objects in support of development activities and to achieve optimal performance.
  • Understood business requirements, helped redesign the development plan, and designed the data warehouse.
  • Created Database Objects - Tables, Indexes, Views, and Roles.
  • Created Customized Stored Procedures, Triggers, and Cursors.
  • Involved in job scheduling using MS SQL Server Agent.
  • Configured database authentication modes, created users, configured permissions, and assigned roles to users.
  • Responsible for Gathering test requirements and scoping out the test plan.
  • Responsible for creating Test Plan and scope of testing.
  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Responsible for extensive data validations on Data warehouse and reports
  • Optimized and partitioned cubes to reduce report latency and improve performance.
  • Assisted in Troubleshooting of Report deployment on multiple servers.
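
A minimal T-SQL sketch of the role and permission setup described above; the role, schema, and user names are hypothetical:

    -- Create a read-only role and grant it SELECT on the dbo schema
    CREATE ROLE reporting_reader;
    GRANT SELECT ON SCHEMA::dbo TO reporting_reader;

    -- Add an existing database user to the role (SQL Server 2008 syntax)
    EXEC sp_addrolemember 'reporting_reader', 'report_user';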

Environment: Informatica, Microsoft SQL Server 2008, Enterprise Manager, Database Transformation Services (DTS), VB Script, T-SQL, SQL Profiler, MS Access, XML, XSLT

ETL Tester

Confidential - Silver Spring, MD

Responsibilities:

  • Involved in Relational modeling and Dimensional Modeling Techniques to design ER data models and Star Schema designs for testing the data flow between source and target
  • Documented the validation rules, error handling and test strategy of the mapping.
  • Tuned the complex SQL queries for better performance by eliminating various performance bottlenecks.
  • Performed the Back-end Integration Testing to ensure data consistency on front-end by writing and executing SQL statements
  • Tested data warehouse ETL process using SSIS (Integration Service).
  • Performed functional testing and back-end testing by manually comparing database results.
  • Wrote SQL queries in Netezza to validate the Cognos reports based on the report specification document.
  • Interacted with the users to ensure meaningful development of the scripts and simulated real time business scenarios.
  • Compared the actual result with the expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source.
  • Created ETL test data for all ETL mapping rules to test the functionality of the SSIS Packages.
  • Wrote SQL scripts for Backend Testing of the data warehouse application and Cognos Reports testing
  • Created temporary tables for validations after understanding the transformation logic (see the sketch after this list).
  • Tested several UNIX shell scripts for file validation as well as PL/SQL programs.
  • Tested complex mappings using mapplets and reusable transformations.
  • Worked with large Netezza/SQL Server data warehouses, mapping and extracting data from legacy systems.
  • Performed data-centric testing to validate that the data loaded into the data warehouse was correct per the defined business rules.
  • Developed software for SQL databases including MySQL, Oracle, MS SQL Server, and Netezza, as well as ETL and ELT tools.
  • Established and documented ETL QA standards, procedures and QA methodologies.
  • Verify development of rule-set test scenarios, test cases, expected results and analysis of XML test scripts
  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Reviewed database test cases and performed database testing by writing SQL queries.
  • Experienced in creating Cognos reports such as List, Crosstab, Chart, and Repeater.
  • Performed extensive data validations against Data Warehouse
  • Involved in testing data reports. Written several SQL queries for validating the front-end data with backend.
  • Conducted back-to-back meetings with developers and discussed all open issues.
  • Worked with SSIS system variable, passing variables between packages.
  • Conducted test case reviews to ensure scenarios accurately capture business functionality.
  • Worked with the ETL source system and target system (ETL Mapping Document) to write test cases and test scripts.
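
A sketch of the temporary-table validation approach noted above, shown in a Netezza-style dialect and assuming hypothetical SRC_CUSTOMER and TGT_CUSTOMER tables with a derived full_name column produced by the SSIS package:

    -- Rebuild the expected transformation result directly from the source
    CREATE TEMPORARY TABLE expected_customer AS
    SELECT customer_id,
           TRIM(first_name) || ' ' || TRIM(last_name) AS full_name
    FROM SRC_CUSTOMER;

    -- Any rows returned indicate a mismatch between expected and loaded values
    SELECT customer_id, full_name FROM expected_customer
    EXCEPT
    SELECT customer_id, full_name FROM TGT_CUSTOMER;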
