ETL QA Tester Resume
Boston, MA
PROFESSIONAL SUMMARY:
- 7+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design and ETL.
- Experience using query tools for Oracle, DB2 to validate reports and troubleshoot data quality issues.
- Involved in planning, designing, building, maintaining, and executing tests and test environments at each point in the SDLC.
- Solid Back End Testing experience by writing and executing SQL Queries.
- Experience in Software QA and Testing Methodologies, verification and validations in all phases of the SDLC.
- Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Experience in UNIX shell scripting and configuring cron jobs for Informatica session scheduling.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica, SSIS, DataStage, and Ab Initio.
- Expertise in testing complex business rules by creating mappings and various transformations.
- Experience in data migration, including identifying the various databases across which information/data lay scattered.
- Experience in testing and writing SQL and PL/SQL statements.
- Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.
- Extensive ETL testing experience using Informatica PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Server Manager).
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various SAS procedures and handling large databases to perform complex data manipulations.
- Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse using Informatica.
- Proficient experience in different Databases like Oracle, SQL Server, DB2 and Teradata.
- Expertise in Developing PL/SQL Packages, Stored Procedures/Functions, triggers.
- Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.
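Much of the back-end testing summarized above reduces to source-vs-target reconciliation queries. A minimal sketch, using Python's sqlite3 as a stand-in engine and hypothetical table names (no real project schema is implied):

```python
import sqlite3

# Hypothetical source and target tables; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])  # row 3 missing: a load defect

# Check 1: row counts must reconcile between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Check 2: a MINUS-style query (EXCEPT in SQLite) surfaces dropped rows.
missing = cur.execute(
    "SELECT order_id, amount FROM src_orders "
    "EXCEPT SELECT order_id, amount FROM tgt_orders").fetchall()

print(src_count, tgt_count, missing)
```

In Oracle the second check would use MINUS rather than EXCEPT; the reconciliation logic is otherwise the same.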
TECHNICAL SKILLS:
Testing Tools: Quality Center, TestDirector, WinRunner, QuickTest Pro
Languages: PL/SQL, Win SQL, MS SQL, VB Script
Operating Systems: Windows, UNIX, LINUX
Databases: Teradata, Netezza, Oracle, Sybase, DB2
ETL Tools: Informatica, SSIS, DataStage, Ab Initio
BI Tools: MicroStrategy, Cognos, Business Objects, SSRS, SSAS
WORK EXPERIENCE:
Confidential, Boston, MA
ETL QA Tester
Responsibilities:
- Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
- Created public folders, private folders, personalized pages and custom views on the Cognos connection portal.
- Involved in testing batch jobs using UNIX and Autosys.
- Solid Back End Testing experience by writing and executing SQL Queries.
- Managed test sets and status reports for each release using HP Quality Center/ALM.
- Involved in testing of Universes from different data sources like Oracle/SQL Server.
- Extensively wrote Teradata SQL queries and created tables and views following Teradata best practices.
- Worked in an Agile environment, attending daily stand-ups, sprint planning, backlog grooming/refinement, and retrospective meetings.
- Monitoring and managing the Hadoop/Big Data cluster using Cloudera Manager.
- Extensively used Informatica client tools.
- Checked the status of ETL jobs in UNIX and tailed log files while loads were in progress.
- Distributed the reports to the users via Cognos Connection
- Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity
- Reported bugs and tracked defects using Quality Center/ALM
- Extracted data from various sources like Oracle, flat files and SQL Server.
- Extensively used the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load data into Teradata.
- Managed and executed the test process, using Agile Methodology.
- Developed advanced SQL queries to extract, manipulate, and calculate data.
- Involved in error checking and testing of ETL procedures and programs using Informatica session logs.
- Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetic issues.
- Extensively worked in the Unix Environment using Shell Scripts.
- Analyzed business requirements, system requirements, and data mapping requirement specifications, and was responsible for documenting functional and supplementary requirements in Quality Center.
- Writing complex SQL queries using Case Logic, Intersect, Minus, Sub Queries, Inline Views, and Union in Oracle
- Used TOAD for querying Oracle, SQL Assistant for Teradata, and SQL Server Management Studio for SQL Server.
- Worked with UNIX for file manipulation, text processing, and data parsing, and converted SQL query results into UNIX variables.
- Monitored Informatica workflows using the PowerCenter Monitor and checked session logs in case of aborted/failed sessions.
- Performed integration testing of Hadoop/Big Data packages for ingestion, transformation, and loading of massive structured and unstructured data into a benchmark cube.
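The session-log checks above (tailing logs, investigating aborted/failed sessions) amount to scanning for failure markers. A minimal sketch in Python; the log format below is made up for illustration and is not actual Informatica output:

```python
import re

# Illustrative session log text (hypothetical session and table names).
sample_log = """\
INFO  2024-01-01 01:00:00 Session s_m_load_orders started
INFO  2024-01-01 01:02:11 Reader: 15000 rows read from source
ERROR 2024-01-01 01:02:45 Writer: unique constraint violated on TGT_ORDERS
INFO  2024-01-01 01:02:46 Session s_m_load_orders failed
"""

def failed_lines(log_text):
    """Return the lines that indicate a session problem."""
    pattern = re.compile(r"^ERROR\b|\bfailed\b")
    return [line for line in log_text.splitlines()
            if pattern.search(line)]

problems = failed_lines(sample_log)
print(len(problems))  # the ERROR line plus the "failed" status line
```

On UNIX the equivalent one-liner is a `tail -f` piped through `grep -E 'ERROR|failed'` against the live session log.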
Environment: Informatica, Cognos, SQL, PL/SQL, Agile, Oracle, UNIX, TOAD, XML, Hadoop/Big Data, Teradata, Quality Center/HP ALM, XSLT.
Confidential, Chicago, IL
ETL BI SQL Tester
Responsibilities:
- Tested Informatica ETL mappings that transfer data from source systems to the Data Mart
- Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetic issues.
- Performed back-end testing and table manipulation of relational systems by manually writing complex SQL queries.
- Wrote test cases to test the application manually in Quality Center and automated them using QuickTest Pro.
- Written complex SQL queries and queried the Oracle database to test ETL code
- Used Teradata utilities (MLOAD & FLOAD) to load the source files in to test region for data validation.
- Reviewed the test activities through daily Agile Software development stand-up meetings.
- Worked on the Informatica PowerCenter repository server and analyzed the workflows.
- Involved in testing Cognos Reports and closely worked with operations, and Release teams to resolve the production issues.
- Performed backend testing using SQL Queries and UNIX scripting
- Utilized new technologies/tools/frameworks centered around Hadoop/Big Data and other elements in the Big Data space.
- Wrote extensive SQL and PL/SQL Scripts for creating stored procedures in Oracle
- Tracked and reviewed defects; wrote test cases to test the application manually in Quality Center and automated them using QuickTest Pro.
- Extracted data from various sources like Oracle, flat files and SQL Server
- Wrote complex queries in Teradata SQL Assistant to check the data from source and target.
- Worked in AGILE Methodology and used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
- Written SQL scripts to test the mappings and extensively used Cognos for report generation
- Performed Informatica ETL testing to validate data end-to-end from the MS SQL Server source system to the target environment.
- Tested several UNIX shell scripts used for connecting to databases and for file manipulation.
- Written complex SQL queries for each level of data validation.
- Used HP Quality Center to perform manual testing and logged and tracked defects in ClearQuest until each bug was fixed.
- Pushed data to the Hadoop/Big Data lake using Hadoop stack tools for export to downstream business systems for analysis.
- Extensively used Oracle Loader to load test data from flat files into tables.
- Involved heavily in writing complex SQL queries based on the given requirements such as complex Teradata Joins, Stored Procedures, and Macros.
- Tuned SQL queries using execution plans for better performance.
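The level-by-level validation queries above often embed the ETL business rule in a CASE expression and compare the result to what the mapping produced. A sketch with sqlite3 and a hypothetical discount rule (table, columns, and rule are illustrative, not a real mapping):

```python
import sqlite3

# Hypothetical staging table; the "EAST rows are halved" rule is made up
# to stand in for a real business rule under test.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_sales (id INTEGER, region TEXT, amount REAL)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", [
    (1, "EAST", 100.0), (2, "WEST", 200.0), (3, "EAST", 50.0)])

# CASE logic recomputes the expected target value from the source row;
# the tester then diffs this against the actual target column.
rows = cur.execute("""
    SELECT id,
           CASE WHEN region = 'EAST' THEN amount * 0.5
                ELSE amount
           END AS expected_amount
    FROM stg_sales
    ORDER BY id
""").fetchall()
print(rows)
```

The same shape works in Oracle or Teradata; only the client tool (TOAD, SQL Assistant) changes.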
Environment: Informatica, Cognos, SQL, PL/SQL, Agile, UNIX, Oracle, TOAD, XML, Hadoop/Big Data, Teradata, Quality Center/HP ALM, XSLT.
Confidential, San Antonio, TX
ETL Tester
Responsibilities:
- Monitored Informatica workflows using the PowerCenter Monitor and checked session logs in case of aborted/failed sessions.
- Validated various MicroStrategy reports such as ad hoc, grids, graphs.
- Extracted test data from tables and loaded it into SQL tables.
- Tracked the defects with application teams and assisted in closure using Jira.
- Involved heavily in writing complex SQL queries based on the given requirements such as complex Teradata Joins, Stored Procedures, and Macros
- Performed Informatica ETL testing to validate data end-to-end from the MS SQL Server source system to the target environment.
- Involved in testing all the reports that were generated through MicroStrategy.
- Used Jira for maintaining the Test Cases and Test Scripts for the application.
- Wrote complex queries in Teradata SQL Assistant to check the data from source and target.
- Used TOAD for querying Oracle.
- Involved in testing the MicroStrategy Reports, Report Service Documents and dashboards by writing complex SQL queries
- Performed back-end testing, table manipulations of relational systems by writing complex SQL queries manually.
- Used JIRA to state requirements, test plan, test cases, and update test run status for each iteration and to report the defect
- Wrote Teradata SQL queries and created tables and views following Teradata best practices.
- Extensively used Oracle Loader to load test data from flat files into tables.
- Used SQL tools to run SQL queries and validate the data loaded into the target tables.
- Tested the migration of reports from MicroStrategy.
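Validating the MicroStrategy reports above typically means recomputing each displayed figure from the base fact table and diffing. A minimal sketch with sqlite3; the fact table, regions, and report figures are hypothetical:

```python
import sqlite3

# Hypothetical warehouse detail table backing the report under test.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_claims (region TEXT, paid REAL)")
cur.executemany("INSERT INTO fact_claims VALUES (?, ?)", [
    ("NORTH", 120.0), ("NORTH", 80.0), ("SOUTH", 300.0)])

# Totals as displayed on the report (illustrative figures).
report_totals = {"NORTH": 200.0, "SOUTH": 300.0}

# Recompute the totals from base data and compare cell by cell.
db_totals = dict(cur.execute(
    "SELECT region, SUM(paid) FROM fact_claims GROUP BY region").fetchall())

mismatches = {r: (report_totals[r], db_totals.get(r))
              for r in report_totals if report_totals[r] != db_totals.get(r)}
print(mismatches)  # empty dict: report agrees with the base data
```

Drill-down and slice-and-dice checks follow the same pattern with extra GROUP BY columns.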
Environment: Informatica, MicroStrategy, SQL, Oracle, Teradata, TOAD, XML, Jira, PL/SQL, XSLT.
Confidential
DWH Tester
Responsibilities:
- Involved in testing the Cognos reports by writing complex SQL queries; used SQL tools to run SQL queries and validate the data loaded into the target tables.
- Involved in full life-cycle testing of the ETL data warehouse and reporting environment.
- Tested MicroStrategy reports for different scenarios like slice and dice, drill down and drill up.
- Supported the extraction, transformation and load process (ETL) for a Data Warehouse from their legacy systems using Informatica.
- Used JIRA for adding issues on what is being worked, creating test cases and executing test case
- Completed several in-house training courses (Elasticsearch, Hadoop/Big Data, Hive, Spark) to learn about applications that source data to the Warranty Cost Recovery system.
- Validated Informatica mappings to verify data from source to target databases and recorded test results.
- Involved in testing data mapping and conversion in a server-based data warehouse.
- Performed backend testing using SQL Queries and UNIX scripting
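Source-to-target mapping validation, as described above, applies each documented transformation to the source value and compares it with what landed in the target. A sketch with hypothetical records and a made-up mapping rule:

```python
# Hypothetical source extract and target rows; no real mapping document
# is implied.
source_rows = [
    {"cust_id": 1, "first": " Ann ", "last": "Lee"},
    {"cust_id": 2, "first": "Bob", "last": "Ray"},
]
target_rows = [
    {"cust_id": 1, "full_name": "ANN LEE"},
    {"cust_id": 2, "full_name": "BOB RAY"},
]

# Illustrative documented rule: full_name = UPPER(TRIM(first) || ' ' || TRIM(last)).
def expected_full_name(row):
    return f"{row['first'].strip()} {row['last'].strip()}".upper()

# Compare expected vs actual per key and collect any failing records.
target_by_id = {r["cust_id"]: r for r in target_rows}
failures = [s["cust_id"] for s in source_rows
            if target_by_id[s["cust_id"]]["full_name"] != expected_full_name(s)]
print(failures)  # [] when every target value matches the mapping rule
```

Each failing key becomes a defect with the source row, expected value, and actual target value attached as evidence.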
Environment: Informatica, Hadoop/Big Data, SQL, MicroStrategy, Unix, Toad, XML, XSLT.