
Sr. ETL/QA/BI Report Tester Resume

Charlotte, NC

SUMMARY

  • 7+ years of professional experience in Software Quality Assurance (QA) and testing across environments and platforms including Data Warehousing, Business Intelligence, Client/Server and Web-based applications.
  • Experience in Data Warehouse application testing using Informatica, Ab Initio, DataStage and SSIS on multiple platforms.
  • Proficient with databases including Oracle, SQL Server, DB2 and Teradata.
  • Involved in all phases of the QA life cycle and SDLC, delivering against aggressive deadlines under methodologies such as Waterfall and Agile.
  • Extensive experience in writing SQL and PL/SQL scripts to validate database systems and perform backend database testing (see the sketch after this list).
  • Experienced with test management and bug reporting tools including HP Quality Center, JIRA, Pivotal Tracker, BugHerd and FogBugz.
  • Extensive working knowledge in UNIX/Linux operating systems.
  • Experience in User Acceptance Testing, Performance Testing, GUI Testing, System and Functional Testing, Integration Testing, Regression Testing, and Data-Driven and Keyword-Driven Testing, performed both manually and with automated testing tools including WinRunner, LoadRunner and QTP.
  • Experience in testing Business Intelligence reports generated by various BI tools such as MicroStrategy, Cognos and Business Objects.
  • Involved in front-end and back-end testing; strong knowledge of RDBMS concepts.
  • Good knowledge of Oracle Data Integrator; worked with data files and data file label parameters; strong UNIX Korn shell scripting skills.
  • Worked with XML feeds from multiple source systems and loaded them into the enterprise data warehouse (EDWH).
  • Solid experience in Black box & White box Testing techniques.
  • Excellent understanding of the System Development Life Cycle. Involved in analysis, design, development, testing, implementation, and maintenance of various applications.
  • Performed Manual and Automated Testing on Client-Server and Web-based Applications.
  • Extensive experience in drafting Test Flows, Test Plans, Test Strategies, Test Scenarios, Test Scripts, Test Specifications, Test Summaries, Test Procedures, Test cases & Test Status Reports.
  • Strong knowledge of test methodologies: object-oriented testing, Service-Oriented Architecture testing, top-down and bottom-up testing, and QA validations and compliance to ensure quality assurance control.
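
A minimal sketch of the kind of backend SQL validation referred to above; the SRC_CUSTOMER and DW_CUSTOMER tables and their columns are hypothetical stand-ins for a source/target pair:

    -- Row-count reconciliation between the hypothetical source table and its warehouse target
    SELECT 'SRC_CUSTOMER' AS table_name, COUNT(*) AS row_count FROM src_customer
    UNION ALL
    SELECT 'DW_CUSTOMER', COUNT(*) FROM dw_customer;

    -- Source rows that never reached the target (completeness check)
    SELECT s.customer_id
    FROM   src_customer s
    LEFT JOIN dw_customer d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;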

TECHNICAL SKILLS

ETL Tools: Informatica, SSIS, DataStage, Ab Initio (GDE 1.14, Co>Operating System)

Programming: SQL, PL/SQL, UNIX Shell Scripting, Perl, XML, XSLT

Operating Systems: Windows 95/98/NT/2000, UNIX (Sun Solaris 2.6/2.8, Linux 2.4, HP-UX, IBM AIX 5.5)

Databases: Oracle 9i/10g/11g, IBM DB2 9.x, Sybase 12.5, SQL Server 2008, Teradata V2R6 and Informix

Testing Tools: HP Quality Center 10/11, Rational ClearQuest

Version Control Tools: ClearCase, Ab Initio EME, CVS, PVCS 7.0

Tools: XMLSpy 2010, TOAD, PuTTY, WinSCP 3, Scorex, AppEdit v8.2, UltraEdit, Control-M, TWS, FileZilla

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Sr. ETL/QA/BI Report Tester

Responsibilities:

  • Responsible for performing various types of process evaluations during each phase of the Software Development Life Cycle, including reviews, walkthroughs and hands-on system testing.
  • Identified and recorded defects with the information required for the development team to reproduce each issue.
  • Used the Netezza data warehousing tool for collection, storage and staging of all metadata used for different client websites.
  • Identified, logged, tracked and escalated bugs using JIRA.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
  • Tested the ETL Informatica mappings and other ETL processes (Data Warehouse Testing).
  • Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into the various databases.
  • Responsible for data mapping testing by writing complex SQL queries using TOAD (see the sketch after this list).
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Validated the data passed to downstream systems.
  • Involved in creating dashboards and reports; created report schedules on Tableau Server.
  • Customized complex reports in Excel using intricate formulas.
  • Made recommendations on potential functional and technical improvements to planned or existing system components and applications.
  • Involved in the Agile Scrum process.
  • Added new features for compliance with and maintenance of quality standards.
  • Implemented SDLC and QA methodologies and concepts in the project.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Involved in testing of Universes from different data sources like Oracle/SQL Server.
  • Worked with Data Extraction, Transformation and Loading (ETL).
  • Reviewed test activities through daily Agile software development stand-up meetings.
  • Used HP Quality Center for writing the test cases and logging the defects.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
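
A representative column-level comparison of the kind run in TOAD for the data mapping testing above, sketched under the assumption of a hypothetical stg_orders-to-dw_orders mapping in which the status column is trimmed and upper-cased; any rows returned indicate a transformation or load defect:

    -- Apply the mapping rule on the source side and diff it against the target
    SELECT order_id, UPPER(TRIM(status)) AS status, amount
    FROM   stg_orders
    MINUS
    SELECT order_id, status, amount
    FROM   dw_orders;

    -- And in the other direction, to catch rows present only in the target
    SELECT order_id, status, amount
    FROM   dw_orders
    MINUS
    SELECT order_id, UPPER(TRIM(status)) AS status, amount
    FROM   stg_orders;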

Environment: Agile, Informatica, E2E testing, JIRA, DB2, TOAD, Oracle 10g, Tableau, SQL, PL/SQL, SQL Server 2008, XML, XSLT.

Confidential, Woonsocket, RI

ETL/QA/BI Report Tester

Responsibilities:

  • Created test cases and test plans for user acceptance testing and system testing based on functional specifications.
  • Tested all the ETL processes developed for fetching data from OLTP systems to the target Market Data warehouse using complex SQL queries.
  • Tested PL/SQL procedures that were developed to load the data from temporary tables in staging to target tables in the data warehouse (see the sketch after this list).
  • Extensively used HP ALM to upload requirements and write test cases.
  • Worked in an Agile environment; attended daily stand-up and Scrum meetings.
  • Provided support to offshore QA team by giving them knowledge transfer and helping them with closure of the defects.
  • Produced ETL detailed designs and documentation for Informatica Power Center.
  • Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.
  • Distributed the reports to the users via Cognos Connection.
  • Reported defects using HP ALM; verified fixes and closed bugs during regression testing.
  • Used HP ALM to report, track and monitor defects.
  • Tested the XML feeds received from a third-party source for data consistency.
  • Tested the ETL with XML as source and tables in the data warehouse as target.
  • Tracked defects to closure by coordinating with the dev team.
  • Defined testing criteria, planned, created and executed test plans in a mainframe environment.
  • Tested source data for data completeness, data correctness and data integrity.
  • Performed End to end testing starting from the source to the report.
  • Conducted and coordinated integration testing and regression testing.
  • Participated in business requirements gathering and in modifications of the requirements based on the scope.
  • Involved in testing the Cognos reports by writing complex SQL queries.
  • Prepared UNIX scripts to run the Informatica ETL jobs from the command line.
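
A sketch of the kind of checks used to test the staging-to-target PL/SQL loads noted above; the dw_claims table and its columns are hypothetical:

    -- Duplicate-key check on the warehouse target after the PL/SQL load
    SELECT claim_id, COUNT(*) AS dup_count
    FROM   dw_claims
    GROUP  BY claim_id
    HAVING COUNT(*) > 1;

    -- Mandatory-column and simple domain checks
    SELECT COUNT(*) AS bad_rows
    FROM   dw_claims
    WHERE  member_id IS NULL
       OR  claim_amount < 0;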

Environment: Agile, SQL, PL/SQL, HP ALM, XML, E2E testing, Oracle 10g, Informatica, Cognos, TOAD for Oracle

Confidential, Wilmington, DE

ETL/QA/BI Report Tester

Responsibilities:

  • Responsible for Business analysis and requirements gathering.
  • Performed complex defect reporting in environments such as UAT, SIT and QA to ensure proper delivery of the application into the production environment.
  • Wrote SQL queries for cross-verification of data.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for querying the DB2 database in a UNIX environment.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Used Ab Initio to extract, transform and load data from multiple input sources like flat files, Oracle 10g database to the Teradata database.
  • Developed/modified test graphs based on business requirements using various Ab Initio components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup, Normalize, Denormalize, Replicate, etc.
  • Involved in testing data mapping and conversion in a server-based data warehouse.
  • Involved in testing the UI applications.
  • Involved in testing Cognos reports and closely worked with the operations and release teams to resolve production issues.
  • Created SQL queries using SQL*Plus for dropping indexes prior to the load graph execution to be able to perform direct loads on the database tables (see the sketch after this list).
  • Involved in preparing the documentation and manuals for User Acceptance Testing.
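
The pre-load index maintenance described above amounts to simple DDL run through SQL*Plus around the Ab Initio load graph; the index and table names below are hypothetical:

    -- Run before the load graph so the table can be direct-loaded
    DROP INDEX idx_acct_txn_date;

    -- Recreate the index once the load graph has finished
    CREATE INDEX idx_acct_txn_date
        ON acct_transactions (txn_date);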

Environment: Ab Initio, EME, UNIX, Shell Scripting, XML files, XSD, XMLSpy 2010, Oracle 10g, HP ALM, BEA WebLogic 8.1, Cognos.

Confidential

ETL/QA/BI Tester

Responsibilities:

  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center
  • Performed Data Analysis of existing data through SQL.
  • Prepared project reports for management and others. Assisted project managers in the development of weekly and monthly status reports.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning
  • Extraction of test data from tables and loading of data into SQL tables.
  • Written several shell scripts using UNIX Korn shell for file transfers, error log creations and log file cleanup process.
  • Maintained all the test cases in HP Quality Center and logged all the defects into the defects module.
  • Wrote several complex SQL queries for validating business reports (see the sketch after this list).
  • Validated cube and query data from the reporting system back to the source system.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Validated the data passed to downstream systems.
  • Involved in testing data mapping and conversion in a server-based data warehouse.
  • Involved in testing the UI applications.
  • Worked with the business team to test the reports developed in Cognos.
  • Tested whether the reports developed in Cognos met company standards.
  • Used TestDirector to track and report system defects.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Worked on data quality, data organization and delivery.
  • Organized cross training of team members and users to become more responsive.
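
A typical report-validation query of the kind referenced above, recomputing a report aggregate directly from the warehouse so it can be compared against the Cognos output; the fact and dimension names are hypothetical:

    -- Recompute monthly sales totals from the warehouse detail tables;
    -- the result set is compared against the figures on the Cognos report
    SELECT t.fiscal_month,
           p.product_line,
           SUM(f.sales_amount) AS total_sales
    FROM   fact_sales f
    JOIN   dim_time    t ON t.time_key    = f.time_key
    JOIN   dim_product p ON p.product_key = f.product_key
    GROUP  BY t.fiscal_month, p.product_line
    ORDER  BY t.fiscal_month, p.product_line;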

Environment: Oracle 9i, Cognos, Quality Center, XML, XSLT, XSD, UNIX, Shell Scripting, IBM AIX 5.3, SQL, PL/SQL, Teradata V2R5
