ETL (Informatica DVO)/Data Warehouse Tester Resume
Vienna, VA
PROFESSIONAL SUMMARY:
- Experienced Quality Assurance Analyst with 9+ years of experience in the Information Technology industry, involving analysis and both Manual and Automated testing of Client/Server and Web applications.
- Possess specific experience in the Banking and Financial domains.
- Proficient in writing Test Strategies, Test Plans, Test Cases, Test Scripts, Test Scenarios and Test Summary Reports for both Manual and Automated Testing.
- Hands-on expertise in Data Warehousing concepts and tools. Involved in ETL processes in organizations using DataStage, Informatica, Ab Initio, Control-M and Autosys.
- Technical expertise in ETL methodologies, Informatica 8.x/9.x - Power Center, Client tools - Mapping Designer, Mapplet Designer, Transformations Developer, Workflow Manager/Monitor and Server tools.
- Proficiency in utilizing ETL tool Informatica Power Center 9.x for developing the Data warehouse loads with work experience focused in Data Acquisition and Data Integration.
- Extensive ETL testing experience using Ab-Initio, DataStage, Informatica (Power Center/ Power Mart, DVO, designer, Workflow Manager, Workflow Monitor and Server Manager).
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a corporate-wide-ETL Solution using Informatica and DataStage.
- Experience in Dimensional Data Modeling using Star and Snowflake Schemas, Fact and Dimension Tables, and Physical and Logical Data Modeling.
- Well versed in Kimball’s methodology - Dimensional Modeling with Star and Snowflake Schemas in implementing decision support systems.
- Extensive experience in testing and reviewing dimensional models (Star and Snowflake) of the data warehouse.
- Experience in CDC, daily load strategies of Data warehouse and Data marts, slowly changing dimensions (Type1, Type2, and Type3), Surrogate Keys and Data warehouse concepts.
- Experience in debugging, troubleshooting, bug fixing and Performance tuning of Informatica mappings and Data Warehouse loads to resolve any data inconsistencies across loads.
- Worked on various heterogeneous databases: Oracle, SQL Server, Teradata, Flat files and XML in creating reports, transactions and conversions.
- Possess specific experience performing testing including Backend, Frontend, Regression, Functional, System, Interface, Usability, Black Box, Integration, Verification and Validation, End-to-End, and User Acceptance Testing.
- In-depth knowledge of business process development using Software Development Life Cycle (SDLC) methodologies such as Waterfall and Agile, and of testing using the Software Testing Life Cycle (STLC).
- Proficient in Black Box and White Box Testing methodologies. Possess specific experience performing Regression and Functional Testing using Quick Test Professional. Proficient in Mercury’s Test Director and Quality Center for Test Designing, Requirement Mapping, Reports, Test Execution and Defect Tracking.
- Proficient in Rational ClearQuest and Quality Center for Defect Tracking. Defined and implemented processes and procedures for QA departments.
- Ability to work independently, in a team environment, or in a rapid-paced environment. Possess strong interpersonal, communication and presentation skills with the ability to interact with people at all levels.
- Experienced in the Business Requirements Analysis, System Design, Data Modeling, and development, Testing, Implementation and Maintenance.
- Proficient in using Mercury Interactive and other automated testing tools such as WinRunner, LoadRunner, Rational ClearQuest and QTP, as well as Total System (TSYS), Confidential Simulator (VS), MasterCard Simulator (MCS) and Versa Test Simulator (VTS) for credit card processing.
- Solid experience in back-end testing manually on the Oracle SQL Developer, DB2 and Teradata databases by writing complex SQL queries.
- Excellent documentation skills and experience coordinating with project manager, business analyst, Architects, DBA’s and Developers.
- Good team player with strong communication and problem solving skills.
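The slowly changing dimension work summarized above (Type 1/2/3, surrogate keys) is typically validated by checking that each natural key carries exactly one current row. A minimal sketch of that check, using Python's sqlite3 as a stand-in warehouse and purely illustrative table/column names:

```python
import sqlite3

def scd2_violations(conn):
    """Return natural keys with more than one open (current) row,
    which would violate SCD Type 2 semantics."""
    return conn.execute("""
        SELECT customer_id, COUNT(*) AS open_rows
        FROM dim_customer
        WHERE current_flag = 'Y'
        GROUP BY customer_id
        HAVING COUNT(*) > 1
    """).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_sk INTEGER PRIMARY KEY,  -- surrogate key
    customer_id TEXT,                 -- natural key
    city TEXT,
    current_flag TEXT)""")
# C1 is versioned correctly; C2 has two 'current' rows (a defect)
conn.executemany(
    "INSERT INTO dim_customer (customer_id, city, current_flag) VALUES (?, ?, ?)",
    [("C1", "Vienna", "N"), ("C1", "Reston", "Y"),
     ("C2", "Newark", "Y"), ("C2", "Dover", "Y")])

print(scd2_violations(conn))   # -> [('C2', 2)]
```

In practice the same HAVING-clause query runs directly against the Oracle or Teradata dimension table; sqlite3 here only makes the sketch self-contained.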
TECHNICAL SKILLS:
Testing Tools: WinRunner, LoadRunner, QTP, TSYS, VTS, MasterCard Simulator, Versa Test Simulator
BI Reporting Tools: MicroStrategy 9 & 8, Crystal Reports, Business Objects XIR2/6.5
Bug Reporting Tools: Test Director, Mercury Quality Center (DVO), Application Lifecycle Management (ALM), Rational Clear Quest
Programming Languages: SQL, PL/SQL, SQL Plus, UNIX Shell Scripting, C++
Tools and Packages: MS Office, MS Project, Flex, flat files, parsing
Database: Oracle 11/10g, MS SQL Server, DB2, Teradata
Platforms: UNIX, Windows XP/ NT/ 2000/98, AS/400, Mainframe
ETL Tools/Job Scheduling Tools: Informatica Power Center, Ab-Initio, Data Stage, Control M, TWS, Autosys
PROFESSIONAL EXPERIENCE:
Confidential, Vienna, VA
ETL (Informatica DVO)/Data Warehouse Tester
Responsibilities:
- Responsible for testing and reviewing ETL mapping and transformation specifications based on requirements from the various technical and business teams.
- Testing of ETL jobs that are scheduled for file transfers from Operational Data Stores to designated file systems/directories.
- Tested various Reusable ETL Transformations which facilitate Daily, Weekly & Monthly Loading of Data.
- Tested for performance bottlenecks at sources, targets, mappings and sessions, and employed the required corrective measures.
- Used Toad and SQL*Plus to test execution of the ETL processes' PL/SQL procedures and packages for business rules.
- Tested Triggers which were enforcing Integrity constraints, stored procedures for complex business logic complementing the Informatica sessions.
- Reviewed the Business Requirement Document to understand the process and write Test Plan and Test Cases.
- Formulate methods to perform Positive and Negative testing against requirements.
- Conducted Smoke testing, Functional testing, Regression testing, Integration testing and User Acceptance Testing (UAT).
- Used Agile testing methodology for achieving deadlines in UAT.
- Experience creating basic aggregate tests, which include COUNT, COUNT DISTINCT, COUNT ROWS, SUM, AVG, MIN, MAX.
- Validated the archived jobs using DVO (data validation tool).
- Recommended and implemented best practices for Quality Center implementation and usage.
- Managed QA tasks, requirements and defects in HP Quality Center.
- Developed and implemented Quality Center Standards.
- Identified data sources and transformation rules required to populate and maintain Data Warehouse content.
- Involved in development of Logical and Physical data models that capture the current state. Developed and tested all the Informatica data mappings, sessions and workflows, involving several tasks.
- Extracted data from the Oracle database and other data sources, staged it in a single location, and applied business logic to load it into the central Oracle database.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy, Transaction control, Sequence generator and Stored Procedure.
- Developed complex mappings in Informatica to load the data from various sources.
- Developed incremental load mappings for daily, monthly jobs using Change Data capture technique (CDC).
- Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Parameterized the mappings and increased the re-usability.
- Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
- Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
- Implemented various file control checks, process controls and file sending to other team for fraud checks.
- Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
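The basic aggregate tests listed above (COUNT, COUNT DISTINCT, SUM, MIN, MAX) can be sketched outside DVO as a direct source-vs-target profile comparison. A minimal sketch, with sqlite3 standing in for both databases and hypothetical table names:

```python
import sqlite3

AGGS = "COUNT(*), COUNT(DISTINCT acct_id), SUM(amount), MIN(amount), MAX(amount)"

def aggregate_snapshot(conn, table):
    """Compute the aggregate profile used to compare source and target."""
    return conn.execute(f"SELECT {AGGS} FROM {table}").fetchone()

conn = sqlite3.connect(":memory:")
for t in ("src_txn", "tgt_txn"):
    conn.execute(f"CREATE TABLE {t} (acct_id TEXT, amount REAL)")
rows = [("A1", 100.0), ("A1", 250.0), ("A2", -40.0)]
conn.executemany("INSERT INTO src_txn VALUES (?, ?)", rows)
conn.executemany("INSERT INTO tgt_txn VALUES (?, ?)", rows)

src = aggregate_snapshot(conn, "src_txn")
tgt = aggregate_snapshot(conn, "tgt_txn")
assert src == tgt, f"aggregate mismatch: {src} vs {tgt}"
print("aggregates match:", src)   # (3, 2, 310.0, -40.0, 250.0)
```

DVO automates exactly this kind of paired-query comparison; the sketch simply makes the underlying technique explicit.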
Environment: TSYS, Informatica Power Center 9.6.1, SQL Developer, Teradata, Oracle EBS, Oracle 11g, Linux, Flat files, UNIX Shell Scripting, Informatica Data Validation Option (DVO), HP Quality Center 11.0
Confidential, Washington, DC
Sr. ETL/Data Analyst
Responsibilities:
- Worked with Business Analysts to define testing requirements to satisfy the business objectives.
- Involved in ETL process testing using Informatica ETL tool.
- Created source to target mapping documents listing efficient data transformation business rules by database elements, with clear supporting documentation and example scenarios.
- Analyzed large amounts of data with limited documentation with SME support to produce technical specifications and Data Quality improvement plans; migration & conversion requirements by analyzing source and target schema.
- Wrote SQL queries for data analysis and worked extensively on SQL, PL/SQL.
- Wrote complex SQL queries to verify data from Source to Target.
- Used Quality Center for creating and documenting Test Plans and Test Cases and register the expected results.
- Used HP Quality Center for storing, maintaining the test repository, bug tracking and reporting.
- Used Data validation Option (DVO) for complete and effective data validation and testing.
- Performed database queries and analyzed large data sets in support of requirements development.
- Developed and validated requirements for data conversion and migration.
- Participated in the technical design process to ensure that business requirements are being satisfied.
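Source-to-target verification queries of the kind described above are commonly built on set difference (MINUS in Oracle, EXCEPT in most other engines). A hedged sketch, with sqlite3 in place of the actual source/target databases and illustrative table names:

```python
import sqlite3

def source_minus_target(conn):
    """Rows present in the source but missing (or altered) in the target.
    Oracle spells this set operator MINUS; SQLite spells it EXCEPT."""
    return conn.execute("""
        SELECT cust_id, balance FROM src_accounts
        EXCEPT
        SELECT cust_id, balance FROM tgt_accounts
    """).fetchall()

conn = sqlite3.connect(":memory:")
for t in ("src_accounts", "tgt_accounts"):
    conn.execute(f"CREATE TABLE {t} (cust_id TEXT, balance REAL)")
conn.executemany("INSERT INTO src_accounts VALUES (?, ?)",
                 [("C1", 500.0), ("C2", 75.5)])
# The load dropped C2 -- the difference query should surface it
conn.execute("INSERT INTO tgt_accounts VALUES ('C1', 500.0)")

print(source_minus_target(conn))   # -> [('C2', 75.5)]
```

A clean load returns an empty result set; any row returned is a candidate defect to log in Quality Center.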
Environment: UNIX, Oracle, Netezza, MS SQL, TWS, Quality Center, SAS, Java, Informatica, SSRS, MicroStrategy, MS Office
Confidential, Wilmington, DE
Sr. ETL/DWH Tester
Responsibilities:
- Reviewed Test Plans to ensure adequate testing coverage of requirements and design, and provided feedback.
- Created Test Cases and developed the Traceability Matrix and Test Coverage reports.
- Managed and conducted system testing, integration testing and functional testing.
- Tracked the defects using Quality Center tools and generated defect summary reports.
- Prepared status summary reports with details of executed, passed and failed test cases.
- Interacted with data warehouse developers, business & management teams and end users.
- Participated in regular project status meetings related to testing.
- Wrote Test Strategies and reviewed with stakeholders to obtain sign-off.
- Ensured that functional testing executed meets documented testing standards.
- Maintained regression and smoke test scripts in QTP for the various vendor interfaces that use SOAP UI calls.
- Performed campaign data validation for the effects of various promotional offers on cells, as well as end-to-end validation in TSYS.
- Advised regarding the removal of features from release relative to testing completion and operational risk.
- Coordinated with the developers regarding the defects and retested them against the application.
- Tracked and analyzed the defects and recorded the variation between the expected and actual results.
- Performed the tests in the SIT, QA and contingency/backup environments.
- Performed all aspects of verification and validation, including functional, structural, regression, load and system testing; collected requirements and tested several business reports.
- Used the Control M jobs extensively for the entire project.
- Validated Control-M and IBM Data Stage jobs for success of data load into tables residing in QWARE, CIF and each layer of the warehouse (staging to ADS).
- Used the Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data warehouse database.
- Wrote complex SQL queries to validate the data conditions for ETL and BI group.
- Worked on SQL scripts to load data in the tables.
- Tuned PL/SQL code against the warehouse for bottleneck analysis in the pre-prod environment.
- Designed and developed UNIX shell scripts as part of the ETL process, automated the process of loading and pulling the data.
- Tested ETL feeds and wrote complex SQL queries to validate the data based on ETL mapping rules.
- Prepared extensive set of validation test cases to verify the data.
- Consumed existing web services for the retrieval of business logic data from WebSphere.
- Developed automated test scripts from manual test cases for regression testing based on the requirement documents.
- Extensively tested the reports for data accuracy and universe related errors.
- Created and tested event based schedules of the reports and built an emailing mechanism to indicate success/failure of the schedule.
- Performed functional, integration, regression, volume and performance testing and prepared the test reports using Rational ClearQuest and ClearCase as repository to maintain new and modified test cases.
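Validating Control-M/DataStage loads through each warehouse layer (staging to ADS), as described above, usually starts with row-count reconciliation between layer pairs. A minimal sketch, with sqlite3 standing in for the warehouse and hypothetical table names:

```python
import sqlite3

def reconcile_counts(conn, pairs):
    """Compare row counts between each (staging, target) table pair and
    return the pairs that fail to reconcile."""
    mismatches = []
    for stg, tgt in pairs:
        n_stg = conn.execute(f"SELECT COUNT(*) FROM {stg}").fetchone()[0]
        n_tgt = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
        if n_stg != n_tgt:
            mismatches.append((stg, tgt, n_stg, n_tgt))
    return mismatches

conn = sqlite3.connect(":memory:")
for t in ("stg_orders", "ads_orders", "stg_items", "ads_items"):
    conn.execute(f"CREATE TABLE {t} (id INTEGER)")
conn.executemany("INSERT INTO stg_orders VALUES (?)", [(1,), (2,)])
conn.executemany("INSERT INTO ads_orders VALUES (?)", [(1,), (2,)])
conn.executemany("INSERT INTO stg_items VALUES (?)", [(1,), (2,), (3,)])
conn.execute("INSERT INTO ads_items VALUES (1)")  # short load: 3 vs 1

print(reconcile_counts(conn, [("stg_orders", "ads_orders"),
                              ("stg_items", "ads_items")]))
# -> [('stg_items', 'ads_items', 3, 1)]
```

Counts that reconcile are necessary but not sufficient, which is why the content-level difference and aggregate checks complement this first pass.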
Environment: Quality Center, Oracle, SQL Server, Teradata, TSYS, DataStage, Java, Control-M, WinSCP, MicroStrategy, Versa Test Simulator, PuTTY
Confidential, Newark, DE
Sr. QA Analyst
Responsibilities:
- Performed the testing of credit card authorization techniques and process, payments, refunds and settlements.
- Tested the whole system and workflow of payments, refunds and settlements from the issuer, acquirer and merchant perspectives.
- Worked on credit cards authorization process to make sure all transactions go through ADE (Auth Decision Engine).
- Used TSYS, VTS (Confidential Test Simulator), MasterCard Simulator and Tandem to run the credit card authorization process.
- Validated Credit Cards Authorizations by using TSYS screens and Teradata for backend testing.
- Tested the whole transaction processing facility to validate that the entire authorization system works correctly.
- Worked on the process that divides the control zones of credit card customer, and validated that the control zone application works as expected.
- Worked on the migration of all the processes, from ADS (previous DB) to Teradata (New Version).
- Worked on the Credit line Decrease (CLD) and Credit Line Increase (CLI), and Fraud, processes to ensure the correct migration.
- Worked on scheduling the batch processes in the new scheduler, TWS.
- Scripted the test cases for component and UAT using HP Quality Center.
- Logged the defects in HP Quality Center and verified defect cycles.
- Performed UAT, positive, negative and boundary testing using the approved test cases.
- Used complex SQL queries in Oracle and Teradata in data warehouse environment.
- Used SQL Developer, Teradata SQL Assistant and PSFTP during testing.
- Validated shell scripts and the related Java functions in all the processes.
- Manually ran the process using UNIX commands.
- Used UNIX navigation commands, file attribute commands such as chmod, and FTP commands.
- Validated the log files after the process run.
- Used Traceability Matrix to map the test plans/cases with the business requirements.
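Validating the log files after a process run, as noted above, can be automated with a small scan for failure markers. A sketch with a hypothetical log format (real job logs will differ; the markers shown are assumptions):

```python
def failed_lines(log_text, markers=("ERROR", "FATAL", "ORA-")):
    """Return the lines of a job log containing any failure marker,
    so a clean run yields an empty list."""
    return [line for line in log_text.splitlines()
            if any(m in line for m in markers)]

sample_log = """\
2016-05-01 02:00:01 INFO  job START
2016-05-01 02:03:14 ERROR ORA-00001: unique constraint violated
2016-05-01 02:03:15 INFO  job END rc=8
"""
print(failed_lines(sample_log))
# -> ['2016-05-01 02:03:14 ERROR ORA-00001: unique constraint violated']
```

The same scan can run over FTP-retrieved logs after each TWS batch cycle, flagging any run whose result list is non-empty.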
Environment: UNIX, Oracle, Teradata, ADS, ODS, HP Quality Center, MS SQL, TWS, Shell Scripts, SAS, Java, VTS (Confidential Simulator), Total Systems (TSYS), Ab Initio, MS Office
Confidential, Foster City, CA
BI/ETL Data Quality Analyst
Responsibilities:
- Performed validation of Data Translation from Source (HRFA) to Target (EDW, SDW & CDIMI) data mart using Data mapping/Transformation Rules.
- Validated the system end-to-end testing to meet the approved functional requirements.
- Created detailed test plans and test cases based on functional requirements and software design documents.
- Wrote test scenarios, generic tests cases, detailed positive and negative test cases for ETL.
- Prepared ETL and SQL routines/code for performing ETL testing (system and integration testing) and documented the test results.
- Used Informatica Data validation Option (DVO) to complete data testing quickly and easily by creating rules that test the data being transformed during the data integration process.
- Performed functional testing and prepared & executed regression test scripts for ETL processes.
- Created client profiles and performed data validation testing from the client perspective for the Issuer, Acquirer (ACQR) and Merchant sides.
- Analyzed Business Requirement Documents to get a better understanding of the system on both technical and business perspectives.
- Prepared the weekly DR Summary report and distributed it to the Test Coordinator, Director of the testing team, Test Team, Development Lead, Project Manager and Release Manager.
- Set up internal and external review meetings for test scenarios and test scripts.
- Led all the activities and work and conducted walk through meetings to coordinate with offshore teams in India and Singapore.
- Performed extensive manual testing of the application and executed test cases manually to verify the expected results.
- Performed configuration testing of the application and utilized Mercury Quality Center and Rational ClearQuest as the bug tracking tools.
- Worked closely with UAT team to perform UAT testing.
- Performed the complete functional as well as data validation testing (back end testing).
- Verified report layout (Attributes/Metrics positions), Naming conventions, Totals & Grand Totals and Validated the SQL query.
- Validated Drilling options - Simple & Advanced, Prompts, Prompt ordering, Prompt Defaults and Verified Metric Calculations, Drill Maps, Security filters.
- Tested Export/Print functionality, Formatting Properties such as alignment, scroll bar, decimal places.
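Verifying Totals and Grand Totals, as described above, amounts to recomputing the metric from the report's detail rows and comparing it to the displayed figure. A hedged sketch with illustrative report data (not MicroStrategy's API):

```python
def validate_grand_total(rows, reported_total, tolerance=0.005):
    """Recompute the metric total from the report's detail rows and compare
    it to the displayed grand total within a rounding tolerance."""
    recomputed = sum(amount for _region, amount in rows)
    return abs(recomputed - reported_total) <= tolerance

# Hypothetical detail rows as (attribute, metric) pairs
detail_rows = [("East", 120.50), ("West", 79.25), ("South", 0.25)]
print(validate_grand_total(detail_rows, 200.00))   # True: total reconciles
print(validate_grand_total(detail_rows, 210.00))   # False: log as a defect
```

The tolerance parameter absorbs display rounding; any mismatch beyond it points to a bad aggregation, filter, or metric definition in the report.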
Environment: Oracle, UNIX, ETL, Informatica Power Center 8.6.0, Power Center Workflow Monitor, Java, J2EE, HTML, IE, Windows, Data mart, Quality Center, Rational ClearQuest, MS Project, MS Office, MicroStrategy Reports, PL/SQL Developer, IBM DB2
Confidential, New York, NY
Report Analyst/Data Warehouse Tester
Responsibilities:
- Designed test case scenarios and test scripts; performed manual testing on the UNIX and Windows platforms.
- Found bugs and issues; developed and executed test plans, test strategies, scenarios and scripts.
- Used Toad to check the SQL of our source and destination databases (the backend was an Oracle database).
- Found and reported bugs in Quality Center and sent them back to the lead developer to resolve the issues.
- Responsible for testing the classes, objects and data in the tables, and checking the formatting of reports.
- Responsible for designing, building and testing Reports through Business Objects XIR2 and Perl.
- Migrated components from development to test and test to production.
- Manually wrote complex SQL queries to check the integrity of data.
- Created reports for various portfolios using the various data providers such as Free Hand SQL, Stored Procedures, and VB procedures.
- Created the reports using Business Objects functionality such as Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and Formulas, etc.
- Created reports using Info View and ZABO for distribution to users.
- Created Schedulers to monitor Broadcast Agent. Built full client and WEBI reports.
- Analyzed, developed, tested, and implemented segments of automated systems.
- Recommended and initiated system changes.
- Collected requirements from the IVR team, analyzed, Documented, Built & exported Universe to Repository, built & tested OLAP and Crystal Reports in Info view.
- Optimized, debugged and tested SQL queries, views and stored procedures.
- Constantly tested SQL for joins and contexts for new object creation.
- Used Rapid SQL to conduct functional testing of existing objects. Created and tested classes and objects to organize the universe.
- Tested reports for accuracy, and troubleshot report issues including poor performance and bad data.
- Worked with the IT team to bring in the data required into Business Objects Universe to make available for super users to create ad-hoc reports.
- Interacted with business users and owners to understand their business views in developing and modifying Universes and Reports accordingly.
- Participated in the designing and Building of Universes, Classes and objects.
- Participated in the troubleshooting of universe schema with loops, chasm and fan traps, and cardinality problems.
- Used SOAP request in XML to extract the data from CORE System to generate CLD letter to the customers.
Environment: Mercury Quality Center, Oracle 9i, Toad, Java, JavaScript, C++, Business Objects XI/6.5, WebFOCUS, PowerBuilder, Sybase, UNIX Shell Script & Perl, Central Management Console, Designer, Web Intelligence, Info View, Supervisor, BCA, ZABO, Informatica Power Center, Crystal Reports XI, Perl, XML, HTML
Confidential, Danbury, CT
QA Analyst/ Validation Analyst
Responsibilities:
- Wrote test scenarios, generic tests cases, detailed positive and negative test cases for ETL.
- Performed functional testing and prepared regression test scripts for ETL process.
- Executed test scripts per defined ETL testing processes.
- Manually performed integration and regression testing, documented bugs and worked with development team to resolve issues.
- Performed backend validation testing for Oracle database using Toad (test SQL of Source and destination using latest data mapping).
- Wrote shell scripts to back up daily test data.
- Participated in the design, development, implementation and maintenance of an on-going enterprise-wide Data Factory program.
- Participated in data analysis in support of different business data requirements definition, UAT and training activities.
- Assisted in the translation of other business data requirements into conceptual, logical, and physical data models.
- Participated in identification of data sources for the required data attributes.
- Developed data maps to document data attributes to data sources and data targets, including identification and documentation of data transformation algorithms.
- Assisted in knowledge transfer to other technical staff members on business data processes.
- Evaluated potential expansion projects and provided data requirements and design for new releases, as necessary.
- Conducted daily status meetings with the PMO & Project Leads during test execution.
Environment: UNIX, IIS, Java, Oracle, Test Director, Bugzilla, BEA WebLogic 8.2, Manual Testing, Windows 2000