
Sr. ETL QA Resume

St. Paul, MN


  • Over 8 years of experience in implementing and testing data warehousing and business intelligence solutions across Retail, Telecom, Claims and Service industries
  • Proficiency in unit testing, backend testing, integration testing, regression testing and system testing by writing and executing SQL queries
  • Involved in requirements and user-story discussions, story pointing/test estimation, and documenting test plans, test scenarios and test cases based on business requirements and use cases with a data-driven approach
  • Experience in preparing source to target mapping documents based on the Technical Design Documents and Functional Specifications Documents
  • Built and tested ETL code using various components of Informatica PowerCenter 10.x/9.x and IBM InfoSphere DataStage 8.x/11.x for integrating and ingesting data into the EDW and data marts
  • Experience in analyzing data models, ETL code and data flow/lineage; performing end-to-end data validation of the ETL and BI layers
  • Knowledge of Big Data concepts: Hadoop, Databricks, Apache Spark, Delta Lake/data lake concepts, and file formats such as JSON, Parquet, etc.
  • Experience testing SAP BOBJ Universe and Web Intelligence reports (ad-hoc and standard)
  • Experience in creating and enhancing SAS, Unix shell, Perl, batch scripts for automation of data validations
  • Experience in preparing and reviewing QA artifacts: test strategy, test plan documents, test scenarios, test cases and test scripts
  • Experience in troubleshooting issues, debugging, defect detection, root cause analysis, bug fixing, preparing Defect Status Reports
  • Experience in preparing test data against user’s test scenarios for User Acceptance Testing (UAT)
  • Experience querying large data sets, testing robust ETL frameworks for migrating huge volumes of data, and integrating data from disparate systems (flat/XML files, RDBMSs, Mainframes)
  • Fluent understanding of all stages of the Data Warehouse Project Lifecycle; worked at various phases of the lifecycle in both Agile (SCRUM) and Waterfall methodologies
  • Detail-oriented team player with strong learning and communication skills, having worked in an offshore-onsite model


ETL Tools: Informatica PowerCenter 10.x/9.x, IBM InfoSphere DataStage 8.x/11.x

RDBMS: Oracle, SQL Server, DB2, Teradata v15, Netezza

Programming: Unix shell (ksh), SQL, Oracle PL/SQL, SAS, Python, Spark SQL

Reporting: SAP Business Objects, OBIEE, Tableau

Test Management: HP ALM, Rally, JIRA, qTest, Trello

Testing Tools: Selenium, Cucumber

Other Tools: AS/400 iSeries, Control-M, Robot Scheduler, FileZilla, PuTTY, TortoiseSVN, Autosys, BMC Remedy, Windows/Unix, XML, Microsoft Excel

Methodologies: Agile (Scrum), Waterfall


Confidential, St. Paul, MN



  • Work with System Analysts and the Product Owner on requirements and source-to-target mapping document walkthroughs
  • QA effort estimation, test planning activities: prepare test plans, identify test scenarios, document both business validation and non-functional test cases
  • Scripting: write SQL scripts to validate table data accuracy (RxClaims), Unix scripts for file-based tasks, and SAS scripts to automate QA testing activities
  • Views testing: verify that views are accessible only to the corresponding roles and grantees, and that view data matches the specifications in the data model
  • Collaborate with DataStage and SAP BO development teams to understand the build functionality
  • Report testing: verify that the Universe is built per the SRS, validate data using ad-hoc queries, and corroborate standard report formats as specified in the user stories
  • Perform all aspects of verification, including functional, integrated, regression and system testing
  • Use HP-ALM to log defects and gather quality assurance metrics (Total Planned Test Cases, Total Executed, Defect Counts, etc.) as outlined in the Test Plan, and share them with stakeholders periodically
  • Provide demo/walkthroughs for testing results to the project team before QA sign off
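
The SQL-driven validations above (record counts, NULL audits, source-vs-target comparisons) can be sketched as follows. This is a minimal, hypothetical example using Python's built-in sqlite3 as a stand-in for the actual Oracle/Teradata/DB2 sources; the claims table and its columns are illustrative, not the real RxClaims model:

```python
import sqlite3

# In-memory database stands in for the real source and target schemas;
# table and column names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER, member_id INTEGER, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER, member_id INTEGER, amount REAL);
    INSERT INTO src_claims VALUES (1, 100, 25.0), (2, 101, 40.0), (3, 102, NULL);
    INSERT INTO tgt_claims VALUES (1, 100, 25.0), (2, 101, 40.0), (3, 102, NULL);
""")

def validate(con):
    """Basic backend checks: row-count match, NULL audit, row-level diff."""
    src_count = con.execute("SELECT COUNT(*) FROM src_claims").fetchone()[0]
    tgt_count = con.execute("SELECT COUNT(*) FROM tgt_claims").fetchone()[0]
    null_amounts = con.execute(
        "SELECT COUNT(*) FROM tgt_claims WHERE amount IS NULL").fetchone()[0]
    # Rows present in the source but missing (or different) in the target
    missing = con.execute(
        "SELECT * FROM src_claims EXCEPT SELECT * FROM tgt_claims").fetchall()
    return {"src": src_count, "tgt": tgt_count,
            "nulls": null_amounts, "missing": missing}

result = validate(con)
print(result)
```

A matching pair of counts and an empty `missing` list indicate a clean load; a non-zero NULL count is flagged for review against the mapping document.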

Environment: Oracle, Teradata, DB2, AS/400 iSeries, Flat Files, Unix, Datastage 11.x, SAP Business Objects, SAS Enterprise Guide, HP-ALM

Confidential, Tampa, FL



  • Work with Business Analysts, Data Modeler and BI team during Sprint planning sessions
  • Participate in the meetings for effort estimation and develop comprehensive test plans and detailed test scenarios
  • Document both business validation and non-functional test cases based on the requirements and discussions
  • Collaborate with Informatica coding team to understand the build functionality and architecture
  • Run ETL code via Informatica workflows or a Unix master trigger script, verify workflow and session logs, and document test results appropriately
  • Prepare SQL queries for backend testing to validate data accuracy (record counts, NULLS, etc.) and integrity between dimensions and fact data
  • Mock up data as per scenarios (business logic) mentioned in the ETL Specification documents and test conditions to create expected results
  • Perform all aspects of verification, including functional, integrated, regression and system testing
  • Use qTest to record test cases and test scripts, trace user stories to test cases sprint-wise, and log and track defects
  • Provide root cause analysis to developers and investigate defects with them during defect triage meetings
  • Gather quality assurance metrics such as Total Planned Test Cases, Total Executed and Defect Counts, as outlined in the Test Plan, and share them with stakeholders periodically
  • Support the client during the UAT phase by creating the right test data for their test scenarios and recording the corresponding results
  • Oracle BI report testing: perform UI testing and data validation testing to make sure reports pull correct data from the EDW
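
The dimension-to-fact integrity check mentioned above is essentially an orphan-key query. A small, hypothetical sketch, with sqlite3 standing in for the actual Oracle/Netezza EDW and an illustrative star-schema fragment:

```python
import sqlite3

# Hypothetical star-schema fragment; the real EDW tables live in Oracle/Netezza.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER, product_key INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (10, 1, 5), (11, 2, 3), (12, 9, 1);
""")

# Orphan check: fact rows whose foreign key has no matching dimension row.
orphans = con.execute("""
    SELECT f.sale_id, f.product_key
    FROM fact_sales f
    LEFT JOIN dim_product d ON d.product_key = f.product_key
    WHERE d.product_key IS NULL
""").fetchall()

print(orphans)  # fact row 12 references product_key 9, absent from the dimension
```

Any row returned is a referential-integrity defect to log against the load.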

Environment: Oracle 12c, Netezza, Flat Files, Unix, Informatica 10.1, Rally, qTest, OBIEE

Confidential, Boston, MA



  • Interact with Business Systems Analysts, Data Analysts and Data Architects to understand the impact on downstream systems and data mapping
  • Create test scenarios, test cases and SQL scripts based on the ETL mapping sheets to compare data outputs
  • Execute test scripts, record test results, track defects using HP-ALM during various stages such as integration testing, performance testing, end-to-end testing, regression testing, etc.
  • Run DataStage jobs in Director as well as from the Unix command line, and re-test until all defects are closed
  • Run AutoSys jobs to load into target tables and files
  • Work with the Production Support team to ensure modifications are properly tested by identifying key test conditions/scenarios, and assist with troubleshooting failed ETL processes in a cross-functional team setup
  • Write Unix scripts to build an in-house tool for comparing data in source vs. target files
  • Deliver quality artifacts per the client's guidelines during both short and long sprint cycles (2 and 6 weeks)
  • Support performance test execution of ETL data loading for each sprint; develop performance test cases, test data requirements and test reports
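
The in-house source-vs-target file comparison above was built with Unix scripts; as a rough illustration of the same idea, here is a hypothetical Python stand-in that diffs two pipe-delimited extracts keyed on an id column (the file layout and column names are assumptions, not the actual tool's format):

```python
import csv, io

# Inline strings stand in for the actual source/target extract files.
src = "id|name|amount\n1|alice|10\n2|bob|20\n3|carol|30\n"
tgt = "id|name|amount\n1|alice|10\n2|bob|25\n"

def load(text):
    """Read a pipe-delimited extract into {id: row}."""
    return {r["id"]: r for r in csv.DictReader(io.StringIO(text), delimiter="|")}

def diff(src_rows, tgt_rows):
    missing = sorted(set(src_rows) - set(tgt_rows))          # keys in source only
    mismatched = sorted(k for k in src_rows.keys() & tgt_rows.keys()
                        if src_rows[k] != tgt_rows[k])       # same key, different data
    return missing, mismatched

missing, mismatched = diff(load(src), load(tgt))
print(missing, mismatched)
```

Keying on the id column lets the report distinguish dropped records from corrupted ones, which is the useful split when triaging a failed load.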

Environment: MySQL, SQL Server, SQuirreL SQL, Oracle 12c, Unix, Oracle Data Integrator 10.x, IBM DataStage 11.x, SAP Business Objects, HP-ALM


Sr. ETL Developer


  • Worked independently offshore, from design through SIT phases of the assignment, in a hybrid waterfall-agile setup
  • Analyzed the Order Management OLTP system schemas and designed the ETL.
  • Validated the DataStage partitioning techniques (Hash, Entire, Round Robin) used to integrate data coming from Oracle databases and flat files
  • Documented and executed test plans, test cases based on the business requirements.
  • Documented all the test scenarios, test scripts, data, and constraint validation rules for Functional testing
  • Customized the existing Perl script that triggers the nightly batch, and Unix scripts that e-mail the target result set as flat files to the SEO application users daily

Environment: Oracle 10g, Flat Files, XML, Unix, DataStage 8.7


ETL Developer


  • Collected and understood information about source systems and processes from subject matter experts
  • Validated the DataStage sequences that migrate data from 6 servers and load the GFCS data warehouse spread across 3 servers, using multi-instance jobs with their respective invocation IDs and parameter sets
  • Documented and executed test plans for System testing. Prepared test data to check the error handling capabilities of the code
  • Prepared a Requirement Traceability Matrix to ensure all requirements were tested, and comprehensive defect reports in HP-QC to track all bugs found during integration and regression testing
  • Managed and maintained issue and bug tracker and shared weekly status reports during periodic review meetings
  • Analyzed Change Requests (CRs) to create, design and execute test cases for each CR
  • Assisted team lead in drafting the release plan for Production changes rollout as per the dependencies and criticalities

Environment: Informix, MS Access, SQL Server 2008, Oracle 10g, Mainframes, Teradata v13, Unix, DataStage 8.1, OBIEE


ETL Developer


  • Worked as a DataStage developer in the ETL project that involved phased migration from a legacy billing system, EhPT Progressor to a strategic billing system, Comverse Kenan FX
  • Worked in an offshore/onsite model with project manager and team lead to do a fit-gap analysis
  • Analyzed the existing PL/SQL code (procedures, functions, triggers) and system functionality that needed to be migrated into DataStage to load the InSight data warehouse
  • Developed DataStage jobs per the source-to-target mapping document, reusing the PL/SQL code as required
  • Developed Unix scripts for file handling operations and automated job runs and data loading procedures
  • Documented and executed test cases for unit and system testing, performed peer reviews, and logged defects in the tracker

Environment: Oracle 9i, Netezza, Unix, DataStage 8.0, SAP Business Objects
