ETL BI/Cognos Tester Resume
Jersey City, New Jersey
PROFESSIONAL SUMMARY:
- 7+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications.
- Experience using query tools for Teradata, Oracle, DB2 and MS SQL Server to validate reports and troubleshoot data quality issues.
- Solid Back End Testing experience by writing and executing SQL Queries.
- Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Expertise in test case design, test tool usage, test execution, and defect management.
- Experience in UNIX shell scripting and configuring cron jobs for scheduling Informatica sessions.
- Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.
- Proficient in SQL, PERL, Shell Scripting and PL/SQL coding and experience in Performance Tuning of SQL and Stored Procedures.
- Experience in testing Business Intelligence reports generated by BI tools such as Cognos 7.3 Series and Business Objects XI R2.
- Experience in Financial, Banking, Brokerage, and Securities industries
- Expertise in testing complex business rules by creating mappings and various transformations.
- Experience in testing XML files and validating the data loaded to staging tables.
- Experience in testing and writing SQL and PL/SQL statements.
- Experience in Dimensional Data Modeling using Star and Snow Flake Schema.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various SAS procedures and handling large databases to perform complex data manipulations.
- Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse using Informatica.
- Proficient with multiple databases, including Oracle, SQL Server, DB2, and Teradata.
- Expertise in developing PL/SQL packages, stored procedures/functions, and triggers.
- Expertise in the Oracle SQL*Loader utility and in TOAD for developing Oracle applications.
- Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.
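The back-end SQL testing described above usually begins with a row-count reconciliation between source and target. A minimal sketch in Python, with SQLite and hypothetical `src_orders`/`tgt_orders` tables standing in for the actual warehouse databases:

```python
import sqlite3

# Stand-in tables; in practice these would be the source system and warehouse target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 20.5), (3, 30.0)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", rows)

# Back-end check: the load is complete only if source and target row counts match.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
count_diff = src_count - tgt_count
```

A nonzero `count_diff` is the usual first signal of dropped or duplicated rows before column-level comparisons begin.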
Technical Skills:
Testing Tools: Quality Center 9.2, TestDirector 8.0/7.6/7.2, WinRunner 7.6/7.0/6.5, QuickTest Pro 8.0/6.5/6.0, LoadRunner 7.6/7.2, Rational ClearQuest, Rational Robot
Languages: Java, .NET, XML, HTML, SQL, TSL, JavaScript, VBScript
Hardware: HP 9000 Series, IBM-compatible Pentium PCs
Operating Systems: Windows NT/2000/XP, UNIX, Linux
Databases: MS Access, Teradata V2R6, SQL Server, Oracle, Sybase, Informix, DB2
ETL Tools: Informatica PowerCenter 7.1.3/8.1/8.5, SSIS, DataStage, Ab Initio
Browsers: Internet Explorer, Netscape Navigator, Firefox
Bug Tracking Tools: Rational ClearQuest, PVCS Tracker, Bugzilla, HP QC 9.2
GUI Tools: Visual Basic 5.0/ 6.0, Crystal Reports 4.6/6.0
BI Tools: Cognos 8.0 Series, Business Objects XI R3, Crystal Reports, SSRS, SSAS
PROFESSIONAL EXPERIENCE:
Confidential, Jersey City, New Jersey
ETL BI/Cognos Tester
Responsibilities:
- Responsible for gathering test requirements and scoping out the test plan.
- Wrote complex SQL queries.
- Designed and tested UNIX shell scripts as part of the ETL testing process to automate loading and pulling data for testing ETL loads.
- Tuned ETL jobs/procedures/scripts, SQL queries, and PL/SQL procedures to improve system performance.
- Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
- Tested various reports, including detail, summary, and on-demand reports developed in Cognos Report Studio.
- Wrote several complex SQL queries for validating Cognos reports.
- Involved in developing test cases to test Teradata scripts (BTEQ, MultiLoad, FastLoad).
- Reported bugs and tracked defects using Quality Center 9.2
- Performed extensive data validation by writing several complex SQL queries; performed back-end testing and worked through data quality issues.
- Performed Quality Center administration tasks, such as creating new users and granting them access to specific domains and projects.
- Extracted specific columns of data from a number of files using PERL.
- Worked with data mapping specifications to create and execute detailed system test plans; the data mapping specifies which data is extracted from an internal data warehouse, transformed, and sent to an external entity.
- Involved in Teradata SQL Development, Unit Testing and Performance Tuning
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Extracted test data from tables and loaded it into SQL tables.
- Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
- Tested the data load process using the Informatica PowerCenter ETL tool.
- Worked with Data Conversion projects for testing data quality after migration.
- Tested OLAP cubes and SSRS reports for various business needs.
- Involved in testing data mapping and conversion in a server based data warehouse.
- Performed data analysis using SQL, PL/SQL, and other query-based applications.
- Performed extensive data validation using SQL queries and back-end testing.
- Validated business reporting requirements with internal and external customers.
- Validated the reports to make sure all the data is populated as per requirements.
- Responsible for leading the team and coordinating with the offshore team.
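A common form of the ETL validation described in the bullets above is a minus (EXCEPT) query that surfaces rows present in the source but missing or altered in the target. A minimal sketch, with hypothetical `stg_customer`/`dw_customer` tables and SQLite standing in for Oracle/Teradata:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical staging (source) and warehouse (target) tables.
cur.execute("CREATE TABLE stg_customer (cust_id INTEGER, name TEXT, city TEXT)")
cur.execute("CREATE TABLE dw_customer (cust_id INTEGER, name TEXT, city TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                [(1, "Ann", "NYC"), (2, "Bob", "Jersey City")])
cur.executemany("INSERT INTO dw_customer VALUES (?, ?, ?)",
                [(1, "Ann", "NYC"), (2, "Bob", "Jersey City")])

# Minus query: any row returned here was dropped or altered by the ETL load.
missing = cur.execute("""
    SELECT cust_id, name, city FROM stg_customer
    EXCEPT
    SELECT cust_id, name, city FROM dw_customer
""").fetchall()
```

An empty result means every source row arrived in the target unchanged; running the query in the other direction catches rows the load fabricated.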
Environment: Quality Center 9.2, SQL, PL/SQL, Teradata V2R6, PERL, Teradata SQL Assistant 6.2, Oracle 10g, Cognos 8.0 Series, CA Erwin 4.0, UNIX, Windows 2000, TOAD, T-SQL, SQL Server 2008, XML files, XSD, XML Spy 2008, SSIS, SSRS, SSAS, flat files
Confidential, Quincy, MA
ETL BI/Cognos Tester
Responsibilities:
- Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
- Used HP Quality Center to write test cases, analyzed the results, tracked the defects and generated the reports.
- Wrote SQL queries to validate that actual test results matched expected results.
- Checked naming standards, data integrity, and referential integrity.
- Checked the reports for any naming inconsistencies and to improve user readability.
- Compared actual results with expected results. Validated the data through a reverse-engineering methodology, i.e., backward navigation from target to source.
- Used ClearQuest for defect tracking and reporting, updated bug statuses, and worked with developers to resolve bugs.
- Worked with the business team to test reports developed in Cognos.
- Created UNIX scripts for file transfer and file manipulation.
- Prepared defect reports and test summary report documents.
- Checked the status of ETL jobs in UNIX and tailed log files while loads were in progress.
- Wrote UNIX and PERL scripts for various business data needs.
- Developed the Test Plan and Testing Strategy for Data Warehousing projects.
- Wrote several complex SQL queries for validating Cognos reports.
- Converted SQL query results into PERL variables and parsed them for multiple purposes.
- Validated format of the reports and feeds.
- Effectively communicate testing activities and findings in oral and written formats.
- Extracted data from various sources like Oracle, flat files and SQL Server.
- Designed and created complex mappings using SCD Type II, involving transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy, and Filter.
- Optimized and tuned several complex SQL queries for better performance and efficiency.
- Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
- Queried the Teradata database and validated the data using SQL Assistant.
- Used Quality Center to track and report system defects
- Validated cube and query data from the reporting system back to the source system.
- Used Rational ClearCase for version control.
- Extensively involved with backend testing by writing complex SQL queries.
- Worked closely with the development teams to replicate end user issues.
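The reverse-engineering validation mentioned above (backward navigation from target to source) can be sketched as a left join that flags target rows with no matching source row. The `src_txn`/`tgt_txn` table names are hypothetical, and SQLite stands in for the actual databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical source and target tables for the target-to-source check.
cur.execute("CREATE TABLE src_txn (txn_id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE tgt_txn (txn_id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO src_txn VALUES (?, ?)", [(1, 5.0), (2, 7.5), (3, 9.0)])
cur.executemany("INSERT INTO tgt_txn VALUES (?, ?)", [(1, 5.0), (2, 7.5)])

# Navigate backward from target to source: every target row must trace to a
# matching source row; orphans indicate fabricated or corrupted data.
orphans = cur.execute("""
    SELECT t.txn_id
    FROM tgt_txn t
    LEFT JOIN src_txn s ON s.txn_id = t.txn_id AND s.amount = t.amount
    WHERE s.txn_id IS NULL
""").fetchall()
```

This complements the forward (source-to-target) check: source row 3 missing from the target would be caught by the forward direction, not this one.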
Environment: Oracle 9i/10g, Quality Center 8.0, SQL, Teradata V2R6, Teradata SQL Assistant, PL/SQL, QuickTest Pro 8.0, VSAM, Cognos 7.0, PERL, TOAD, Rational Performance Tester 7.0, Rational ClearQuest 7.0.1, Rational ClearCase 7.0, XML files, XSD, XML Spy
Confidential, Manhattan, New York
QA Tester - Global Consumer Group (GCG) - Data Warehousing
Responsibilities:
- Used relational and dimensional modeling techniques to design ER data models and star schema designs for testing the data flow between source and target.
- Documented the validation rules, error handling and test strategy of the mapping.
- Tuned the complex SQL queries for better performance by eliminating various performance bottlenecks.
- Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL statements.
- Performed functional testing and back-end testing by manually comparing database results.
- Used Quality Center 9.2 for test plan writing, test case development, test script writing, test case execution, and defect tracking.
- Interacted with the users to ensure meaningful development of the scripts and simulated real time business scenarios
- Compared actual results with expected results. Validated the data through a reverse-engineering methodology, i.e., backward navigation from target to source.
- Created temporary tables for validations after understanding the transformation logic.
- Tested several UNIX shell scripts for file validation, as well as PL/SQL programs.
- Tested complex mappings using mapplets and reusable transformations.
- Established and documented ETL QA standards, procedures and QA methodologies.
- Verified development of rule-set test scenarios, test cases, expected results, and analysis of XML test scripts.
- Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
- Validated data presented in Cognos reports.
- Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
- Performed extensive data validations against Data Warehouse
- Loaded flat-file data into Teradata tables using UNIX shell scripts.
- Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
- Extensively tested several ETL Mappings developed using Informatica.
- Involved in testing data reports; wrote several SQL queries to validate front-end data against the back end.
- Reported defects in Quality Center
- Conducted back-to-back meetings with developers to discuss all open issues.
- Conducted test case reviews to ensure scenarios accurately capture business functionality.
- Responsible for preparing the automation approach document and maintained the scripts in Quality Center 9.2.
- Wrote several SQL statements to confirm successful data transfers between source and target.
- Worked with data profiling, data scrubbing and data cleansing.
- Tested Standard Reports, Cross-tab Reports, Charts, Master-Child Drill through Reports and Multilingual Reports
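One standard check for SCD Type II mappings like those tested in this project is that each business key carries exactly one current row in the dimension. A minimal sketch with a hypothetical `dim_customer` table in SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical SCD Type II dimension: one row per version of each customer.
cur.execute("""CREATE TABLE dim_customer (
    cust_key INTEGER, cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER)""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?, ?, ?, ?)", [
    (1, 100, "Quincy",      "2001-01-01", "2002-06-30", 0),
    (2, 100, "Boston",      "2002-07-01", "9999-12-31", 1),
    (3, 200, "Jersey City", "2001-05-01", "9999-12-31", 1),
])

# SCD Type II rule: each business key must have exactly one current row;
# zero or multiple current rows means the update strategy misfired.
violations = cur.execute("""
    SELECT cust_id FROM dim_customer
    WHERE current_flag = 1
    GROUP BY cust_id
    HAVING COUNT(*) <> 1
""").fetchall()
```

Related checks (not shown) verify that effective-date ranges per key do not overlap or leave gaps.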
Environment: Informatica PowerCenter 6.1, Cognos, Windows NT, MicroStrategy, Oracle 8i, SQL, PL/SQL, T-SQL, SQL Server 2005/2000, SQL*Plus, TOAD, XML, XSD, VSAM files, MVS, ISPF, flat files
Confidential
SQL Developer/ETL/BI Tester
Responsibilities:
- Prepared and scheduled backups and recovery; maintained backup and recovery plans for system and user databases.
- Created roles and set permissions to ensure that the company's security policies were enforced and in effect.
- Created test case scenarios, executed test cases and maintained defects in Remedy
- Tested reports and ETL mappings from Source to Target
- Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
- Tested Informatica mappings. Performed data validations against Data Warehouse
- Extracted and transformed data from one server to others using tools such as Bulk Copy Program (BCP) and BULK INSERT.
- Assisted Developers to design databases, tables, and table objects in support of development activities and to achieve optimal performance.
- Understood business requirements, helped redesign the development plan, and designed the data warehouse.
- Created Database Objects - Tables, Indexes, Views, and Roles.
- Created Customized Stored Procedures, Triggers, and Cursors.
- Involved in job scheduling using MS SQL Server Agent.
- Configured database authentication modes, created users, configured permissions, and assigned roles to users.
- Responsible for gathering test requirements, creating the test plan, and defining the scope of testing.
- Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
- Responsible for extensive data validations on Data warehouse and reports.
- Responsible for testing Business Objects universe and universe enhancements.
- As an ETL/BI tester, was responsible for business requirements, ETL analysis, ETL testing, and the design of the flow and logic for the data warehouse project.
- Responsible for leading the team and coordinating with the offshore team.
- Prepared and set up the environment for disaster recovery.
- Transformed data from one server to other servers using tools such as Bulk Copy Program (BCP) and Data Transformation Services (DTS).
- Converted data from legacy systems to SQL Server using DTS.
- Created multiple complex queries for the profiling and auditing purposes.
- Optimized and partitioned cubes to reduce report latency and improve performance.
- Assisted in Troubleshooting of Report deployment on multiple servers.
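The profiling and auditing queries mentioned above typically report row counts, null counts, and distinct counts per column before and after a conversion. A minimal sketch against a hypothetical `accounts` table, with SQLite standing in for SQL Server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical table being profiled after a legacy-to-SQL-Server conversion.
cur.execute("CREATE TABLE accounts (acct_id INTEGER, branch TEXT)")
cur.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, "QCY"), (2, "NYC"), (3, None), (4, "NYC")])

# Profiling/audit query: total rows, nulls, and distinct values for a column.
total, null_branch, distinct_branch = cur.execute("""
    SELECT COUNT(*),
           SUM(CASE WHEN branch IS NULL THEN 1 ELSE 0 END),
           COUNT(DISTINCT branch)
    FROM accounts
""").fetchone()
```

Comparing these three figures between the legacy source and the converted target is a quick way to spot truncation, null inflation, or collapsed code values.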
Environment: Informatica 6.1, Microsoft SQL Server 2000, Enterprise Manager, Data Transformation Services (DTS), VBScript, T-SQL, SQL Profiler, MS Access, XML, XSLT, XSD