- 8+ years of IT experience in Quality Assurance and Software Testing of various business applications in Client/Server environments and Data Warehousing solutions
- Experience working with Big Data applications
- Expertise in Hadoop ecosystem components such as Hive, HDFS, and Sqoop
- Created Hive queries for data validation
- Loaded structured and semi-structured data into HDFS and Hive
- Transferred data from relational databases to HDFS using Sqoop
- Experience with test data management and with the data ingestion tool Sqoop
- Experienced in gathering reporting and analysis requirements, documenting report specifications, and implementing the metadata layers, including the Physical, Business Model and Mapping, and Presentation layers
- Strong knowledge of SDLC and the Software Test Life Cycle; highly proficient in SQA methodologies for Waterfall and Agile
- Experience in Data Analysis, Data Cleansing (Scrubbing), Data Validation and Verification, Data Conversions
- Expertise in developing, implementing, and maintaining Test Plans, Test Scripts, and Test Methodologies
- Strong experience in preparing documentation, setting up test environments, executing tests, and analyzing results
- Involved in testing the batch jobs, using UNIX and Autosys
- Expertise in testing reports built with Business Objects, front-end web user interface applications, and Client/Server and Web-based applications
- Tested several complex reports generated by Cognos and MicroStrategy, including dashboards, summary reports, master-detail reports, drill-downs, and scorecards
- Involved in preparation of Requirement Traceability Matrix (RTM), Defect Reports, and Weekly Status Reports
- Experienced in interacting with Clients, Business Analysts, UAT Users, and Developers
- Comfortable working with fixed-length and delimited flat files, and with large volumes of data in the database/data warehouse
- Experience in analyzing ETL load failures using the log files and reporting the errors to the corresponding team for resolution
- Familiar with Quality Center/ALM for storing Business Requirements, Test Cases, Test Runs, and Test Results for every iteration, and for storing defects and linking them back to the requirements
- Experience in End-to-End testing of applications and processes; performed backend testing using SQL queries and UNIX scripting
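The backend-testing bullets above rely on source-to-target SQL validation. A minimal sketch of that pattern, using Python's built-in sqlite3 in place of Oracle/Teradata; the table and column names (src_orders, tgt_orders) are made up for the example:

```python
import sqlite3

# Stand-ins for a source system and a warehouse target; contents are
# illustrative test data, not a real feed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);  -- row 3 not loaded
""")

# Count reconciliation: totals should match after the ETL load.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Content check: rows present in source but absent from target
# (EXCEPT here plays the role Oracle's MINUS would in a real test).
missing = conn.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
""").fetchall()

print(src_count, tgt_count, missing)
```

In a real engagement the same MINUS/EXCEPT query would run against the actual source and target schemas, with any non-empty result logged as a defect.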
Databases: Oracle 11g/10g, SQL Server, Teradata 14
Big Data Ecosystem: Hadoop, MapReduce, HDFS, HBase, Spark, Hive, Pig, Sqoop
Programming: SQL, PL/SQL
Testing Tools: HP ALM, JIRA
ETL Tools: Informatica 10.2/10/9.5, DataStage 8.7/8.1
Operating System: Microsoft Windows, UNIX, LINUX
Reporting tools: Cognos, MicroStrategy 10.11
Confidential, Durham, NC
Sr. ETL Tester
- Responsible for gathering test requirements and scoping out the test plan.
- Wrote complex SQL queries.
- Moved flat files generated from various databases to HDFS for further processing.
- Developed Hive HQL for processing data.
- Created and tested Hive tables to store the processed results in a tabular format.
- Created and tested Hive Managed and External tables with partitions in the ODS layer and loaded data into Hive.
- Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.
- Tested both conditional formatting and threshold levels for several reports developed in MicroStrategy.
- Reported bugs and tracked defects using ALM/Quality Center
- Involved in extensive data validation by writing complex SQL queries; involved in back-end testing and worked on data quality issues.
- Involved in data warehouse testing by checking ETL procedures/mappings.
- Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing.
- Wrote complex SQL queries to verify that data flows correctly from source to target and that the reports generated from the target in MicroStrategy are accurate.
- Participated in Integration, System, Smoke and User Acceptance Testing and production testing
- Worked with data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Extracted test data from tables and loaded it into SQL tables.
- Validated the reports to make sure all the data is populated as per requirements.
Environment: MicroStrategy, SQL, PL/SQL, Agile, SoapUI, ALM/HP Quality Center 11, Oracle, UNIX, TOAD, T-SQL, SQL Server, XML Files, Flat Files
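The flat-file-to-staging work in this role follows a load-and-reconcile pattern. A small sketch using Python's stdlib, with a made-up pipe-delimited layout and a hypothetical stg_orders staging table standing in for the real ODS:

```python
import csv
import io
import sqlite3

# A delimited flat file as it might arrive from a source system
# (contents and layout are illustrative).
flat_file = io.StringIO("order_id|amount\n1|10.0\n2|20.0\n3|30.0\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")

# Parse the file and load it into the staging table.
reader = csv.DictReader(flat_file, delimiter="|")
rows = [(int(r["order_id"]), float(r["amount"])) for r in reader]
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)

# Reconcile the file row count against the staging load -- the same
# check a tester would run after a Sqoop or ETL ingestion.
loaded = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
print(loaded, len(rows))
```

Against Hive, the equivalent check would compare the source extract count with `SELECT COUNT(*)` on the loaded (possibly partitioned) table.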
Confidential, Hartford, CT
ETL Tester/ Hadoop Tester
- Documented the business requirements and developed test plans and test cases for database backend testing and database functionality testing.
- Reviewed the test cases written based on the Change Request document.
- Used Informatica as an ETL tool for developing the data warehouse.
- Understood the application's business logic from the Business Requirements Specification documents and its functionality from the Functional Requirements Specification documents.
- Involved in modeling and reporting against a warehouse (OLAP) and OLTP systems as data sources.
- Involved in functional study of the application.
- Ran the jobs/workflows for the ETL process.
- Created ETL mappings, sessions, and workflows.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Extracted the data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Participated in daily stand-up meetings and weekly status meetings with offsite and offshore teams.
- Worked with DataStage as an ETL tool.
- Involved in Defect Triage Meetings to determine the priority and severity of bugs.
- Created Crystal Reports to facilitate decision-making. The Crystal Reports designed were complex and used formulas, parameters, selection criteria, sub-reports, etc.
- Ensured data integrity and verified all data modifications and calculations during database migration using ETL tools
- Extensively used UNIX commands to validate the Unix Script files and the parameter files.
- Extensively used the diff command in UNIX to find difference between files.
Environment: HP ALM 11.0, MS Excel, Oracle 10g, SQL Server 2010, Teradata utilities, TOAD, UNIX, Windows, SQL, PL/SQL, Jira, Informatica 9.5, HDFS, Hive, Sqoop
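This role leaned on the UNIX `diff` command to compare script and parameter files. A sketch of the same idea with Python's stdlib difflib; the file names and parameter contents are invented for the example:

```python
import difflib

# Two versions of a parameter file (contents are illustrative);
# unified_diff output mirrors what `diff -u` reports on UNIX.
expected = ["DB_USER=etl_user", "DB_SCHEMA=ods", "LOAD_DATE=2020-01-01"]
actual = ["DB_USER=etl_user", "DB_SCHEMA=stg", "LOAD_DATE=2020-01-01"]

diff = list(difflib.unified_diff(expected, actual,
                                 fromfile="expected.prm",
                                 tofile="actual.prm",
                                 lineterm=""))
print("\n".join(diff))
```

An empty diff means the files match; any `-`/`+` lines flag the exact parameters that drifted between environments.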
Confidential, Bothell, WA
- Created functional test cases, test conditions, and test data, and performed integration, system, regression, positive, negative, and E2E testing.
- Escalated testing issues, interacted with business users/analysts, developers, external teams to resolve the defects/issues.
- Tested all the ETL processes developed to load data into the data marts.
- Tested various reports, such as Dashboards, Summary, and Drill-through reports, developed in Business Objects.
- Executed SQL test scripts and recorded actual results.
- Interacted with Project team and users to get information about the application and decide various tests that can be performed for the specific application.
- Prepared and reported Daily/Weekly Status Reports.
- Extensively involved in Regression, Smoke, and Sanity testing.
- Tested various data feeds based on the data flow diagrams, from deposits and loans to the General Ledger.
- Written complex SQL queries to perform data validations.
- Maintained and contributed to the entire progress-reporting exercise.
- Took part in daily scrum and weekly progress calls for the move.
Environment: DataStage 8.1, Agile - Scrum, Oracle 11g, MS SQL Server 2008, MS Office, Jira, SQL, PL/SQL, Cognos
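Validating deposit and loan feeds against the General Ledger, as in this role, is essentially an aggregate tie-out. A sketch of that check with sqlite3 standing in for Oracle/SQL Server; the feed_txn and gl_balance tables and their figures are hypothetical:

```python
import sqlite3

# Hypothetical feed and General Ledger tables; the check verifies that
# per-account feed totals tie out to the GL balances.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE feed_txn (account TEXT, amount REAL);
    CREATE TABLE gl_balance (account TEXT, balance REAL);
    INSERT INTO feed_txn VALUES ('deposits', 100.0), ('deposits', 50.0),
                                ('loans', 200.0);
    INSERT INTO gl_balance VALUES ('deposits', 150.0), ('loans', 200.0);
""")

# Any account whose summed feed amount differs from its GL balance
# is returned as a mismatch to be raised as a defect.
mismatches = conn.execute("""
    SELECT f.account, SUM(f.amount) AS feed_total, g.balance
    FROM feed_txn f
    JOIN gl_balance g ON g.account = f.account
    GROUP BY f.account, g.balance
    HAVING SUM(f.amount) <> g.balance
""").fetchall()

print(mismatches)
```

An empty result means every feed reconciles; here both accounts tie out, so the query returns no rows.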