
Retail - ETL Lead/Tester Resume


Columbia, MD

PROFESSIONAL SUMMARY:

  • Over 7 years of Software Quality Assurance (QA) experience testing Data Warehouse, Database (ETL & BI), Web, and Client-Server systems and applications for the Banking, Insurance, Financial, Healthcare, and Retail industries.
  • In depth technical knowledge and understanding of Data Warehousing, Data Validations, UNIX, SQL, PL/SQL, OLAP, XML, SQL Server, Oracle, and ETL.
  • Expert in managing/leading ETL and Business Intelligence Testing.
  • Proficient in developing procedures and packages in PL/SQL, with good knowledge of Oracle Developer 2000 and SQL*Loader.
  • Familiar with managing defects and change requests through testing tools such as Quality Center/Test Director, ClearQuest, and StarTeam.
  • Extensive experience in testing and implementing the extraction, transformation, and loading of data from multiple sources into a data warehouse using Informatica and Ab Initio.
  • Experience in dimensional data modeling using Star and Snowflake schemas.
  • Designed data models using Erwin.
  • Experience in using Oracle Databases, DB2 UDB, Sybase, SQL Server, Redbrick, Netezza, and Teradata.
  • Expertise in QA testing in distributed UNIX/Windows environments with Oracle databases as the back end; performed end-to-end testing.
  • Extensive experience with databases such as Oracle, Teradata, SQL Server, DB2, and MS Access.
  • 5+ years of experience with SQL/PL-SQL and expertise in writing SQL queries.
  • Worked extensively with Slowly Changing Dimensions.
  • Experienced SQL Data Analyst / Data Reporting Analyst with strong background in design, development, and support of online databases and information products as well as data analysis / reporting / processing.
  • Extensive experience in the ETL process, consisting of data transformation, sourcing, mapping, conversion, and loading.
  • Well versed with Manual and Automated Testing methodologies and principles.
  • Proficient in Oracle 10g/9i/8i/7.3, PL/SQL development, Toad, PL/SQL Developer, SQL Navigator, Perl, UNIX, and Korn shell scripting.
  • Strong experience in data maintenance and support using Oracle 10g/9i databases, SQL, PL/SQL, and SQL*Loader in Windows NT and UNIX environments.
  • Experience in Performance Tuning of SQL and Stored Procedures.
  • Automated and scheduled the Ab Initio & Informatica jobs using UNIX Shell Scripting.
  • Application data warehousing experience in the Financial, Banking, Healthcare, and Insurance industries.

TECHNICAL SKILLS:

Operating Systems: Windows 2000/NT/XP/Vista, UNIX (Linux, Sun Solaris, IBM AIX)

Databases: Oracle 8.1/9i/10g, PL/SQL, SQL Server 2000/2005, Siebel, Sybase

ETL Tools: Informatica 6.1/7.1/8.1 and DataStage 6.x/7.x/8.x

Reporting Tools: Cognos, Business Objects 5.0/6.5/XI R1/R2

Query Tools: PL/SQL, Toad 7.4/8.3.6, SQL Query Analyzer, SQL Plus

Scripting Languages: XML, HTML, XHTML, Shell Scripting

Languages: C/C++, SQL, PL/SQL

Application Development: Visual Basic 6.0, Oracle9i/10g

Packages: MS Office, Visual Studio 2005

Modeling Tools: Star-Schema & Snowflake-FACT and Dimension Tables, Erwin 4.0, ER studio

Scheduler Tools: Autosys, Appworx, Tidal

Others: Visual SourceSafe, FileZilla, MKS Toolkit, XML Spy 2008 (UNIX scripts)

PROFESSIONAL EXPERIENCE:

Confidential, Columbia, MD

Retail - ETL Lead/Tester

Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Developed ETL test plans based on the test strategy; created and executed test cases and test scripts based on the test plans and the ETL mapping document.
  • Worked with the ETL group to understand mappings for dimensions and facts.
  • Worked with Quality Center for executing test cases and reporting defects.
  • Involved in data validation testing.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mapping.
  • Involved in Designing and developing of Star Schema and created Fact and Dimension Tables for the Warehouse using Erwin.
  • Used Informatica PowerCenter Workflow Manager to run the data load jobs.
  • Monitored the workflows using Informatica workflow monitor and performed end to end testing of Informatica data load jobs.
  • Tested the ETL business logic for different feeds and checked the log files for data counts.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Analyzed test results, generated test reports, filed bugs, and followed up on resolution with the development and support teams.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing.
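
A typical source-to-target validation of the kind described above is a row-count reconciliation. The sketch below is illustrative only: the table names (`src_orders`, `tgt_orders`) are hypothetical stand-ins, and an in-memory SQLite database stands in for the actual Oracle source and target.

```python
import sqlite3

# Hypothetical source and target tables loaded with sample data; in practice
# the counts would come from the actual source system and warehouse target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])

# Compare record counts between source and target after the load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"
print("row counts match:", src_count)
```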

Environment: Informatica PowerCenter 8.6.1, SQL, PL/SQL, Oracle 11g, SQL Plus, Quality Center 9.2, Erwin 4.0

Confidential, Saint Louis, MO

ETL /QA Tester

Responsibilities:

  • Planning, designing, building, maintaining, and executing tests and test environments at each point in the SDLC.
  • Assisting in preparing the test plan and estimations.
  • Development of test cases based on functional specifications
  • Understood and analyzed the Requirements and Rules from DOORS.
  • Establish test environment requirements
  • Used HP Quality Center to manage requirements, business components, test cases, test runs, and defects for every iteration, and to link defects back to requirements.
  • Tested the reports functionally and non-functionally to meet OLAP standards.
  • Worked on MS Excel for Documentation with Requirements, Test Cases, Test Scripts and the Results.
  • Attended weekly TSR (Test Status Review) meeting with test team lead.
  • Attended daily status meetings with the Project Manager, Business Analyst, Developers, and QA team.
  • Participated in validating mappings, sessions, and workflows in Informatica.
  • Involved in heavy testing of the data transformation processes developed in Informatica.
  • Read the Functional Specifications and implemented their technical aspect in System and Integration testing (SIT).
  • Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning
  • Involved in preparing test data for different scenario validations specific to each process/mapping.
  • Validated the existence of database objects per the data model.
  • Participated in weekly meetings with SMEs and developers to discuss ongoing requirement changes in the iteration plan and validation of STM documentation.
  • Worked with Teradata SQL Assistant for data retrieval and data validation.
  • Understood the Data Model in the Data Warehouse that consists of Dimension and Fact tables.
  • Coordinated with the Release Management team to get our jobs executed in the test environment (Autosys).
  • Participated in bug analysis and DRB meetings.
  • Developed Test Scripts using complex SQL and PL/SQL queries, according to Mapping Document
  • Mapped the Requirements with Test Cases in Requirement Traceability Matrix (RTM).
  • Validated data between source, stage, and target.
  • Participated in Knowledge Transfer (KT).

Environment: Oracle SQL Developer 2.1.1, Informatica PowerCenter 8.6.1, SQL, PL/SQL, UNIX, Shell Scripts, Certify Tool, TDR Viewer, Oracle 11g, Teradata SQL Assistant 6.0, SQL Plus, Quality Center 9.2, MLOAD, FLOAD, TPUMP, IBM Mainframes, Access, SOAP, XML, XML Spy, TSO/ISPF, Autosys

Confidential, Wall Street, NY

ETL Tester - Ab Initio/SQL/UNIX

Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Defined data requirements and elements used in XML transactions.
  • Tested graphs for extracting, cleansing, transforming, integrating, and loading data using Ab Initio ETL Tool.
  • Mapped metadata from legacy source systems to target database fields and was involved in creating Ab Initio DMLs.
  • Tested several data validation graphs developed in Ab Initio environment
  • Reviewed the Test Cases and Test Scripts with Business Users and made some corrections.
  • Worked with QA team members on updated test cases and change requests to prepare the Final RTM.
  • Reported bugs and tracked defects using Quality Center
  • Led a team of 5-6 people and coordinated an offshore team of 12 people in India.
  • Managed multiple projects from different lines of business.
  • Worked with an EAI tool (TIBCO) to validate XML messages before sending them to the TIBCO queue and TIBCO topic; reviewed and approved database modifications.
  • Performed Teradata SQL Queries, creating Tables, and Views by following Teradata Best Practices.
  • Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the process of loading, pulling the data for testing ETL loads.
  • Written several shell scripts using UNIX Korn shell for file transfers, error log creations and log file cleanup process.
  • Imported data from tables in one schema into CSV files and loaded them into tables in another schema using SQL*Loader.
  • Tested several UNIX shell scripts for file validation as well as PL/SQL programs.
  • Used ClearQuest to track and report system defects and bug fixes; wrote modification requests for bugs in the application and helped developers track and resolve the problems.
  • Validated the ETL load process to make sure the target tables are populated according to the data mapping provided and satisfy the transformation rules.
  • Performed Track Record defect process flow in Quality Center.
  • Used Quality Center as a repository for Test Cases and for Defect Tracking.
  • Wrote complex SQL queries using CASE logic, INTERSECT, MINUS, subqueries, inline views, and UNION in Oracle.
  • Validated the ETL Scripts in the Target Database (Oracle) for creating the cube before migrating it to SQL Server.
  • Responsible for different Data mapping activities from Source systems to Teradata
  • Worked with Developers on Defects until the test case is passed.
  • Used Quality Center for bug tracking and reporting, also followed up with development team to verify bug fixes, and update bug status
  • Worked with ETL group for understanding Ab Initio graphs for dimensions and facts
  • Tested the application by writing SQL queries and creating pivot views to perform back-end testing.

Environment: Ab Initio (GDE 1.12, Co>Op 2.12), SQL, PL/SQL, QuickTest Pro 8.0, Business Objects XI R2, UNIX, Shell Scripts, Mercury Quality Center, Teradata V2R6, SQL Assistant 6.0, MLOAD, FLOAD, TPUMP, Rational Robot, Rational ClearQuest, Rational Test Manager, Rational Functional Tester, Rational Manual Tester, Rational ClearCase, IBM Mainframes, CICS, Sybase, Oracle, SQL Plus, Access, SOAP, XML, XML Spy, TSO/ISPF

Confidential, Newark, New Jersey

Data Warehousing Tester

Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Involved in rebuild testing of the BI Reports from Cognos to Crystal Reports.
  • Validated the data in the reports by writing simple to complex SQL queries in the transactional system
  • Led a team of 6 people onsite and 4 people offshore.
  • Managed multiple projects and scheduled timelines for the testing process across entire data warehousing projects.
  • Executed the Test cases for Crystal Reports and Cognos.
  • Performed segmentation to extract data and create lists to support direct marketing mailings and marketing mailing campaigns.
  • Oversaw the historical loads in UAT and PROD environments.
  • Also worked on Integration of all the processes in UAT/PROD.
  • Optimized/tuned several complex SQL queries for better performance and efficiency.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Generated test data (on UNIX box using Perl scripts) to support development.
  • Experience in data inspection/analysis of tables as well as outbound files (data files in EBCDIC and ASCII formats).
  • Involved in the error checking and testing of the ETL programs using Informatica session log and workflow logs.
  • Verified layout of data files and control file entries as per business requirement.
  • Based on the generated test data, wrote test cases demonstrating the testing approach, with detailed explanations of the cases for SORs (Systems of Record).
  • Did Unit Testing for all reports and packages.
  • Tested the entire data reconciliation process for multiple source and target systems.
  • Day-to-day Cognos administration activities like monitoring Scheduled jobs like Cube refresh, Impromptu scheduled reports, Backup and recovery maintenance.
  • Involved in creating the test data for generating sample test reports before releasing to production.
  • Wrote complex SQL scripts using joins, sub queries and correlated sub queries.
  • Provided input into project plans for database projects providing timelines and resources required.
  • Maintained the data integrity and security using integrity constraints and Data base triggers.
  • Involved in the application tuning of database by identifying the suitable Indexes.
  • Worked on UNIX shell wrapper scripts.
  • Worked on Autosys, including the creation and execution of Autosys jobs.
  • Worked on issues with migration from development to testing.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling data.
  • Validated cube and query data from the reporting system back to the source system.
  • Tested analytical reports using Analysis Studio
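
Data integrity checks on dimension tables like those described above often include verifying Type 2 slowly changing dimension history: each natural key should have exactly one current row. An illustrative sketch with hypothetical table and column names (`dim_customer`, `is_current`), again using SQLite as a stand-in database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    cust_key   INTEGER PRIMARY KEY,   -- surrogate key
    cust_id    INTEGER,               -- natural key from the source system
    name       TEXT,
    is_current INTEGER                -- 1 = active version of the row
);
INSERT INTO dim_customer VALUES
    (1, 100, 'Ann Old', 0),
    (2, 100, 'Ann New', 1),
    (3, 200, 'Bob',     1);
""")

# Natural keys whose number of current rows is not exactly one would
# indicate a broken SCD Type 2 load; an empty result means the check passes.
violations = cur.execute("""
    SELECT cust_id, SUM(is_current) AS current_rows
    FROM dim_customer
    GROUP BY cust_id
    HAVING SUM(is_current) <> 1
""").fetchall()
print("natural keys with bad current-row counts:", violations)
```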

Environment: SQL, PL/SQL, TOAD 7.0, Test Cases, Test Scripts, Test Plan, Traceability Matrix, Test Director, SAS/BASE 9.1.2, SAS/Access, SAS/Connect, IBM DB2, COBOL, Flat Files, Copy Books, MVS, TSO, JCL, IMS DB/DC, ISPF, Informatica PowerCenter 7.2 (PowerCenter Designer, Workflow Manager, Workflow Monitor), SQL*Loader, Cognos 7.0 Series, Oracle 8i, SQL Server 2000/2005, Erwin 3.5/4.0

Confidential, Minneapolis, MN

Sr. Backend/ETL Tester

Responsibilities:

  • Participated in and performed system integration testing by developing subgraphs and integrating them with extraction and load graphs.
  • Used TestDirector to report bugs and data quality issues for each checksum and field validation test completed.
  • Communicated and discussed the status of each data quality issue with developers.
  • Ran SQL statements manually to execute record counts on each table loaded into the system and compared them with the source tables.
  • Developed, and was involved in, both manual testing and automated test scripts based on use cases.
  • Prepared the Customer Templates to Upload the Data to the Application.
  • Assisted with user acceptance testing, developing and maintaining quality procedures and ensuring the appropriate documentation was in place.
  • Developed SQL procedures to ensure compliance with standards and absence of redundancy, and to incorporate business rules and functional requirements into ETL procedures.
  • Designed and executed Functionality and User Acceptance test cases.
  • Wrote various Autosys JIL scripts to activate the UNIX scripts in production, i.e., JIL scripts for boxes and commands.
  • Wrote UNIX AWK scripts to manipulate test data and input files.
  • Responsible for finding various DDL issues and reporting them to DBAs using TestDirector.
  • Participated in bug triage meetings with developers to validate the severity of the bug and responsible for tracking of the bug life cycle using TestDirector.
  • Designed test data using MS Excel sheets; created data-driven tests for testing the application with positive and negative inputs.
  • Performed data-driven testing and validated the test results.
  • Created various User Defined Functions for script enhancements and to verify the business logic.
  • Wrote complicated SQL queries in DB2 to update dimension tables, export data from target tables, etc.
  • Wrote various UNIX shell scripts for scheduling jobs in production, such as check-file, load, complete, and DQ scripts.
  • Wrote complex SQL scripts in DB2 for testing data quality and validation.
  • Executed UNIX shell scripts to check the status of back-end batch jobs.

Environment: Ab Initio (Co>Op 2.13, GDE 1.13), Data Profiler, Oracle 9i, Autosys, IBM DB2, Syncsort, Erwin 4.0, Business Objects 5.x, IBM AIX-UNIX 5.0, Windows NT, Shell Scripting, SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, COBOL, VSAM Files, TSO, MVS, ISPF, JCL, XML, XSLT, XSD, XML Spy
