
Sr. QA/ETL Backend Tester Resume

Detroit, MI

SUMMARY

  • Over 7 years of Software Quality Assurance (QA) experience testing Data Warehouse, Database (ETL), Web, and Client-Server systems and applications for various industries.
  • Experience in defining testing methodologies, creating Test Plans and Test Cases, and verifying and validating application software and documentation based on standards for software development and effective QA implementation in all phases of the Software Development Life Cycle (SDLC).
  • Strong in Software Analysis, Planning, Design, Development, Testing, Maintenance and Augmentation for various applications in data warehousing, metadata repositories, data migration, data mining and Enterprise Business Intelligence.
  • Expert in ETL, Data Warehousing, Backend testing.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • Extensive experience in writing SQL to validate the database systems and for backend database testing.
  • Extensive experience with IBM Mainframe for analyzing the data in designing the ETL Process.
  • Extensively worked on design and implementation of Database Management Systems such as Oracle 11g/10g/9i/8i/7.x, DB2 UDB, MS SQL Server, and MS Access.
  • Implemented all stages of the development process, including extracting, transforming, and loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter: the Designer (Source Qualifier, Warehouse Analyzer, Transformation Developer, Mapping and Mapplet Designer), Repository Manager, Workflow Manager, and Workflow Monitor.
  • Experienced in working with Business Users, Business Analysts, IT leads and Developers in identifying and gathering Business requirements to further translate requirements into functional and technical design specifications.
  • Worked extensively in UNIX (AIX) and Linux environment, used Shell Scripts for automating batch transfers, table space management, automated backup, user group maintenance, security and custom report generation.
  • Comprehensive knowledge of Ralph Kimball’s data modeling concepts including Dimensional Data Modeling and Star/Snowflake Schema Modeling.
  • Extensive experience in performing Black Box, Regression, Integration, and User Acceptance testing.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Effective independently or in a team; worked as a Team Leader. Excellent communication and interpersonal skills, with the ability to convey technical information at all levels. Excels in research, analysis, and problem solving.
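
As an illustration of the backend SQL validation work summarized above, the sketch below reconciles a row count and flags duplicate records between a staging table and its target dimension. All table and column names are invented for the example; SQLite stands in for the actual databases.

```python
import sqlite3

# Hypothetical source (staging) and target (warehouse) tables for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_customer (customer_id INTEGER, name TEXT);
CREATE TABLE dim_customer (customer_id INTEGER, name TEXT);
INSERT INTO stg_customer VALUES (1, 'Ann'), (2, 'Bob'), (2, 'Bob');
INSERT INTO dim_customer VALUES (1, 'Ann'), (2, 'Bob');
""")

# Row-count reconciliation: distinct source keys should match the target count.
src = cur.execute("SELECT COUNT(DISTINCT customer_id) FROM stg_customer").fetchone()[0]
tgt = cur.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0]
print(src == tgt)  # True

# Duplicate identification in the source feed.
dups = cur.execute("""
    SELECT customer_id, COUNT(*) AS cnt
    FROM stg_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dups)  # [(2, 2)]
```

The same GROUP BY / HAVING pattern scales to production tables; only the connection and table names change.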

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.1/8.6.1/8.1/7.1/6.1

Databases: Oracle 11g/10g/9i/8i/7.x, IBM Mainframe (via web3270), DB2 UDB, MS SQL Server 2008/2005/2000/7.0, MS Access, Teradata V2R6

Development Languages: SQL*Plus, T-SQL, PL/SQL 2.2/8.x, UNIX Shell Scripting, TOAD 7/7.6, SQL*Loader, VBA

Operating Systems: UNIX (AIX), Linux, MS-DOS, Windows Vista/XP/2000/NT/98/95

Methodologies: Ralph Kimball’s Data modeling, Star and Snowflake Schema Modeling

Testing Tools: Mercury Quality Center 11.0/10.0/9.0/8.0, Test Director 7.6/7.0, Quick Test Professional 6.5/5.6, WinRunner 7.5/7.0

Workflow Tools: Putty, WinSCP3, MS-Project, MS-Excel, MS-PowerPoint, MS-Word.

Management Tools: Peregrine Service Center 5.1, HP Project and Portfolio Management Center (ITG) 7.1

PROFESSIONAL EXPERIENCE

Confidential, DETROIT, MI

Sr. QA/ETL BACKEND TESTER

Responsibilities:

  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Performed full life-cycle (SDLC) testing and development of both claim submission and remittance retrieval processes.
  • Worked closely through all stages of the SDLC for this project; designed and executed Functional, Integration, Regression, System (End-to-End), UI, Browser Compatibility, and Backend (Database) testing.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.
  • Worked on a Business Intelligence reporting system that ran primarily on an Oracle Applications OLTP environment, with Business Objects for BI reporting.
  • Tested the reports using Business Objects functionality such as Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail, and Formulas.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling data for testing ETL loads.
  • Wrote several UNIX shell scripts for cleanup, logging, file manipulation, and file transfers.
  • Modified the UNIX scripts for Testing purpose to check the Rollback mechanism, Restatement window logic, wrapper scripts etc.
  • Extensive experience in creating automation environment using Quick Test Professional, Selenium RC, Selenium Web driver, SOAPUI.
  • Performed cross-browser and UI compatibility testing for the application in different versions of IE, Firefox, Chrome, and Safari using Selenium WebDriver.
  • Responsible for developing a Performance Testing Plan and Performance Testing strategy based on the business specification requirements and user requirements
  • Involved in LR scripting and performance testing along with teams from IBM and SAIC creating a complete platform for performance testing.
  • Worked as an independent consultant for performance testing and coordinated with multiple vendors.
  • Worked on automation of two modules using QTP; tested VBScripts in QTP that were later used to automate the Check and Deposit processing modules.
  • Involved in writing test scripts and functions in Test Script Language using QTP for automated testing.
  • Automated detailed test cases using QTP (Quick Test Professional).
  • Analyzed massive and highly complex data sets, performing ad-hoc analysis and data manipulation.
  • Focal point for making sound decisions related to data collection, data analysis, data security, methodologies and designs.
  • Conducted research to collect and assemble data for databases - Was responsible for design/development of relational databases for collecting data.
  • Developed test data, both manually and through SQL processes, for several processes required by the data-validation effort; assisted the ETL team in a data anonymization task by supplying rules.
  • Experience in Data Validation, Data Modeling, Data Flows, Data Profiling, Data Mining, Data Quality, Data Integration, Data Verification, and Data Loading; involved in extensive data validation using SQL queries and back-end testing.
  • Used T-SQL for Querying the SQL Server database for data validation.
  • Performed UAT (User Acceptance Testing) and executed to verify requirements, look and feel of the applications.
  • Responsible for creating manual test scripts to include Functional Test, Regression Test, UAT, Migration Test and Study Configuration Test.
  • Tested (unit, integration, regression, and UAT) a new .NET application built with Microsoft Visual Studio under the Scrum (Agile) methodology.
  • Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation.
  • Verified that all data were synchronized after troubleshooting, and used SQL to verify and validate test cases.
  • Supported the extraction, transformation, and load (ETL) process for a Data Warehouse from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica for testing.
  • Extensively used Informatica to load data from flat files to Teradata, Teradata to flat files, and Teradata to Teradata.
  • This work included loading historical data, reference data, and metadata onto the Teradata platform.
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
  • Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Written several UNIX scripts for invoking data reconciliation.
  • Solid Back End Testing experience by writing and executing SQL Queries.
  • Creation of test scripts in SQL based on published design documents
  • Exported the results through SQL and documented the test results for test cases, suites executed.
  • Worked on Various SQL Server procedures for various Bugs, executed queries in test databases using SQL queries.
  • Executed SQL Queries for testing integrity of data in database (Backend Testing).
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Testing has been done based on Change Requests and Defect Requests.
  • Preparation of System Test Results after Test case execution.
  • Performed Functional, Regression, Data Integrity, System, Compatibility testing
  • Written several complex SQL queries for validating Business Object Reports.
  • Fine-tuned for performance and incorporated changes to complex PL/SQL procedures and packages that update the existing dimension tables, using PL/SQL Developer on Oracle 8i RDBMS.
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Developed SQL Stored Procedures and Queries for Back end testing.
  • Extensively executed T-SQL queries to view successful data transactions and to validate data in the SQL Server database.
  • Extensively used Informatica power center for extraction, transformation and loading process.
  • Used TOAD to perform manual tests on a regular basis; UNIX and Oracle were used in this project to write shell scripts and SQL queries.
  • Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.
  • Experienced in writing test cases, test scripts, test plans and execution of test cases reporting and documenting the test results using Mercury Quality Center
  • Prepared Test status reports for each stage and logged any unresolved issues into Issues log.
  • Writing the test scripts for manual testing.
  • Involved with ETL test data creation for all the ETL mapping rules.
  • Preparing and supporting the QA and UAT test environments.
  • Tested different detail, summary reports and on demand reports.
  • Communicated discrepancies determined in testing to impacted areas and monitored resolution.
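
The source-versus-target validation listed above is commonly expressed as a minus (EXCEPT) query: apply the mapping rule to the source and subtract the target, so any surviving row is a mismatch. Below is a minimal sketch using SQLite; the table names and the upper-casing transformation rule are assumptions for illustration only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical source and target tables; names are illustrative only.
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL, status TEXT);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL, status TEXT);
INSERT INTO src_orders VALUES (1, 10.0, 'new'), (2, 25.5, 'shipped');
INSERT INTO tgt_orders VALUES (1, 10.0, 'NEW'), (2, 25.5, 'SHIPPED');
""")

# The mapping rule under test: status is upper-cased on load.
# A minus (EXCEPT) query surfaces any transformed source row missing
# from the target -- an empty result means the rule holds.
mismatches = cur.execute("""
    SELECT order_id, amount, UPPER(status) FROM src_orders
    EXCEPT
    SELECT order_id, amount, status FROM tgt_orders
""").fetchall()
print(mismatches)  # []
```

In practice the query is run in both directions (source-minus-target and target-minus-source) to catch missing as well as extra rows.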

Environment: Informatica PowerCenter 9.1, Business Objects Enterprise XI R2/R3, SQL, PL/SQL, UNIX, Agile, Quality Center 11.0, IBM AIX 5.5, DB2, Teradata V2R6, Sybase 12.5, Shell Scripting, XML files, VSAM COBOL files, IBM, AutoSys, XML Spy 2010, Oracle 11g, TOAD 10, Word, Excel, Outlook.

Confidential

ETL SQL /BACKEND TESTER

Responsibilities:

  • Reviewed the Business Requirement Documents and the Functional Specification.
  • Prepared Test Plan from the Business Requirements and Functional Specification.
  • Carried out data profiling for multiple loan feeds.
  • Performance Testing - Load testing, stress testing and soak testing of the application.
  • Assisted senior staff in designing and developing performance-testing procedures.
  • Responsible for preparing data in support of the full performance-testing life cycle.
  • Extensive knowledge of performance testing, database testing, and structured software testing
  • Wrote several UNIX scripts for invoking data reconciliation.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Executed UNIX Shell Scripts for Batch Job Execution.
  • Wrote several UNIX scripts for running test loads for regression testing purposes.
  • Trained the users before UAT on how to test and document the test results. Also, assisted the users during UAT. Took part in Triage Meetings with the required parties after defect analysis to prioritize defect resolution
  • Involved in user training sessions and assisting in UAT (User Acceptance Testing)
  • Performed validation tests to ensure that the developed functionality meets the specifications prior to UAT testing
  • Migrated scripts from Selenium IDE to Selenium Web driver and created framework scripts from scratch.
  • Used Selenium Web driver to test browser compatibility and correct functionality of the website.
  • Experienced with Selenium IDE and creating scripts in Selenium RC using Java; used Selenium IDE for open-source web testing.
  • Maintained automated regression test cases in Selenium Web Driver using Java programming language
  • Maintained the data integrity during extraction, manipulation, processing, analysis and storage.
  • Built data input and designed data collection screens - Managed database design and maintenance, administration and security for the company.
  • Discussed intelligence and information requirements with internal and external personnel
  • Used Quick Test pro to write the automated test scripts by using various actions and reusable actions.
  • Manually tested the application functionality and developed automation test scripts to perform functional and regression testing; prepared automation scripts, methods, and functions in QTP.
  • Automated test cases, for data driven tests and linked test scripts using QTP to perform Regression testing of different application versions.
  • Extensively used SQL statements to query the Oracle database for data validation and data integrity; worked with data validation, constraints, and source-to-target row counts.
  • Performed Functional, Data Validation, Integration, Regression, and User Acceptance testing.
  • Ensured data integrity and data validation throughout the multiple environments across the application.
  • Involved in the complete Software Development Life Cycle (SDLC).
  • Wrote, prepared and executed Manual Test Cases based on Requirements and Use Cases, and Automated Test Scripts throughout Software Development Life Cycle (SDLC).
  • Performed ETL testing based on ETL mapping document for data movement from source to target.
  • Tested ad hoc and canned reports for Business objects.
  • Tested Business Objects reports and Web Intelligence reports.
  • Managed user accounts and security using Business Objects Supervisor
  • Tested the universes and reports in Business Objects 6.0
  • Strong Quality Assurance testing experience within an agile environment.
  • Good understanding of agile software development lifecycle (iterative and incremental).
  • Performed tests on various features of agile development process.
  • Performed defect reporting and defect tracking using Quality Center.
  • Tested complex objects to the universe to enhance the report functionality.
  • Written several complex SQL queries to validate the Data Transformation Rules for ETL testing.
  • Wrote extensive UNIX shell scripts for data-parsing and text-parsing needs, including archiving old data, running backend jobs, and setting up job dependencies.
  • Performed extensive data validations against Data Warehouse
  • Loaded flat file data into Teradata tables using UNIX shell scripts.
  • Responsible for verifying business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse using Informatica and Shell Scripting
  • Tested several Informatica Mappings to validate the business conditions.
  • Conditional testing of constraints based on the business rules
  • Designed and executed the test cases on the application as per company standards and tracked the defects using HP Quality Center.
  • Used Informatica PowerCenter to extract, transform, and load data from different operational data sources, such as Oracle and SQL Server, into the target SQL database.
  • Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.
  • Developed and Executed SQL statements in Toad to retrieve data and to validate data.
  • Implemented Database Checkpoints for Back-end Testing.
  • Performed the Back-end Integration Testing to ensure data consistency on front-end by writing and executing SQL statements.
  • Worked with SQL tools such as TOAD and SQL Server Management Studio to run SQL queries, perform manual tests on a regular basis, and validate the data.
  • Involved extensively in doing back end testing of the data quality by writing complex SQL.
  • Designed and prepared scripts to monitor uptime/downtime of different system components
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Monitored the data movement process through Data Extract to flat files through Informatica execution flows and Loading data to Data mart through NZLOAD utility.
  • Testing the ETL data movement from Oracle Data mart to Teradata on an Incremental and full load basis.
  • Developed the ETL process to automate the testing process and also to load the test data into the testing tables.
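
The data profiling mentioned above (e.g., for the loan feeds) typically starts with per-column row, null, and distinct counts. The sketch below shows the idea against SQLite; the loan_feed table and its columns are hypothetical stand-ins for a real feed.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical loan feed table; column names are illustrative.
cur.executescript("""
CREATE TABLE loan_feed (loan_id INTEGER, balance REAL, state TEXT);
INSERT INTO loan_feed VALUES
    (1, 1000.0, 'MI'), (2, 2500.0, 'AZ'), (3, NULL, 'MI'), (4, 800.0, NULL);
""")

# Basic profile per column: total rows, null count, distinct values.
profile = {}
for col in ("loan_id", "balance", "state"):
    total, nulls, distinct = cur.execute(f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {col})
        FROM loan_feed
    """).fetchone()
    profile[col] = {"rows": total, "nulls": nulls, "distinct": distinct}

print(profile["balance"])  # {'rows': 4, 'nulls': 1, 'distinct': 3}
```

Profiles like this flag unexpected nulls or cardinality before the feed enters the ETL pipeline.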

Environment: VBA, SQL, PL/SQL, XML, Excel Pivot, Informatica PowerCenter 8.6.1, Business Objects Enterprise XI R2/XI R3, Agile, Oracle 10g/9i, QTP, DB2 UDB, WinSQL, UNIX (AIX), Linux, PuTTY, WinSCP3, Mercury Quality Center 10.0, Word, Excel, Outlook, AutoSys, Teradata, TOAD

Confidential, Phoenix, AZ

Data/SQL/ETL Tester

Responsibilities:

  • Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, Relational Data (Oracle, DB2 UDB, MS SQL Server) from various heterogeneous data sources.
  • Delivered files in various formats (e.g., Excel files and tab-delimited, comma-separated, and pipe-delimited text).
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) that maintains the data marts (loading data and analyzing it using OLAP tools).
  • Performed performance monitoring and tuning on UNIX systems; developed a step-by-step method using predesigned worksheets to eliminate much of the guesswork in performance monitoring and tuning.
  • Assisted in analyzing incoming equipment and developing the necessary control applications in Linux and Unix.
  • Performed manual tests on a regular basis.
  • Developed inline view queries and complex SQL queries and improved the query performance for the same.
  • UNIX and Oracle were used in this project to write shell scripts and SQL queries.
  • Identified unused test cases in the Test Plan in Quality Center and prepared a consolidated document to delete them.
  • Data output - Made data chart presentations and coded variables from original data, conducted statistical analysis as and when required and provided summaries of analysis.
  • Trained data analysis beginners to improve overall efficiency of department.
  • Cooperated with external outsourcing partners on the design and development of analyses.
  • Automated test cases and performed functional testing using QTP.
  • Did extensive work with ETL testing including Data Completeness, Data Transformation & Data Quality for various data feeds coming from source.
  • Executed campaign based on customer requirements
  • Followed company code standardization rule
  • Developed several VBA macros.
  • Identified issues, information, and behaviors during the adoption of a proprietary information management system.
  • Accelerated the rate of adoption of the system, improved the quality of the data being input and generated, and promoted accountability among staff and users.
  • Developed code to introduce additional reports, reverse-engineered data models to their business meaning, and instructed users on the account management and implementation advantages derived from the system.
  • Identified and documented deficiencies in the proprietary information management system during initial implementation.
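
The multi-format file delivery described in this role (pipe-, tab-, and comma-delimited text) can be sketched with Python's csv module. The rows and column names below are illustrative only.

```python
import csv
import io

# Hypothetical extract rows to be delivered in several delimited formats.
rows = [("loan_id", "balance"), ("1", "1000.00"), ("2", "2500.00")]

def write_delimited(rows, delimiter):
    """Render rows as delimiter-separated text (comma, tab, or pipe)."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

pipe_text = write_delimited(rows, "|")
tab_text = write_delimited(rows, "\t")
print(pipe_text.splitlines()[0])  # loan_id|balance
```

Using the csv module rather than manual string joins handles quoting automatically when a field happens to contain the delimiter.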

Environment: Informatica PowerCenter 8.1, VBA, Excel, Quick Test Pro, Oracle 9i, DB2 UDB, SQL Server 2000, SQL, PL/SQL, PuTTY, WinSCP3, TOAD, SQL Developer, UNIX (AIX), Shell Scripts, Mercury Quality Center 9.0, Microsoft Office 2003, Word, Excel, Outlook, Windows XP/2000, Teradata V2R6, MLOAD, FLOAD, Teradata SQL Assistant, BTEQ
