
Senior Big Data Hadoop Tester/ETL Tester Resume

Chicago, IL

SUMMARY:

  • Proficient in analyzing Business Requirements, System Requirement Specifications, Functional Requirement Specifications, and Design Documents to formulate Test Plans, Test Strategies, and Test Cases
  • 9+ years of experience working as a DWH Tester in domains such as Banking, Insurance, and Healthcare.
  • Over 4 years of strong working experience with Big Data and the Hadoop ecosystem
  • Strong experience with Hadoop components: HBase, Hive, Pig, Sqoop, and Zookeeper.
  • Strong working experience with ingestion, storage, querying, processing, and analysis of big data.
  • Strong working experience with Hadoop Architecture and the components of Hadoop - MapReduce, HDFS, JobTracker, TaskTracker, NameNode, and DataNode
  • Experience in full software development life cycle implementation, including Business Interaction, Requirement Analysis, Software Architecture, Design, Development, Testing, and Documentation phases.
  • Extensive knowledge in the design of Object-Oriented Applications using UML
  • Experience in the full life cycle of software projects, including system analysis, design, development, testing, implementation, and user acceptance testing
  • Experience in Data Analysis, Data Validation, Data Verification, Data Cleansing, Data Completeness checks, and identifying data mismatches.
  • Experience in conducting Integration, System, Functional, Regression, GUI, Stress, and Performance Testing
  • Experienced in writing test cases, test scripts, and test plans, and in executing test cases, reporting, and documenting the test results using Confidential Quality Center
  • Experience with Business Objects XI R2 and with dynamic dashboards, scorecards, and structured reports for operations and higher management.
  • Expertise in creating and integrating BO reports and objects with the data warehouse.
  • Worked with the ETL group on understanding mappings for dimensions and facts.
  • Optimized and tuned several complex SQL queries for better performance and efficiency.
  • Extensive ETL testing experience using Informatica … (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager); a sample source-to-target reconciliation query appears after this list
  • Experience in Dimensional Data Modeling using Star and Snowflake Schemas.
  • Proficient with databases such as Oracle, SQL Server, DB2, and Teradata
  • Developed Test Cases and Test Plans, produced Bug Reports; experienced in executing PL/SQL
  • Good communication and interpersonal skills; accustomed to working in a team environment with tight schedules; capable of working efficiently under pressure, managing multiple projects, and cross-training subordinates in other functional areas
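
To give a concrete picture of the source-to-target ETL checks summarized above, here is a minimal sketch in Oracle-style SQL; the schema, table, and column names (stg.customer_src, dwh.customer_dim) are hypothetical placeholders rather than objects from any actual engagement.

    -- Minimal source-to-target reconciliation sketch (Oracle-style SQL).
    -- All schema, table, and column names below are hypothetical.

    -- Row-count check: source and target counts should match after the load.
    SELECT (SELECT COUNT(*) FROM stg.customer_src) AS src_count,
           (SELECT COUNT(*) FROM dwh.customer_dim) AS tgt_count
    FROM   dual;

    -- Completeness check: rows present in the source but missing in the target.
    SELECT customer_id, first_name, last_name
    FROM   stg.customer_src
    MINUS
    SELECT customer_id, first_name, last_name
    FROM   dwh.customer_dim;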

WORK EXPERIENCE:

Senior Big Data Hadoop Tester/ETL Tester

Confidential, Chicago, IL

Responsibilities:

  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Developed and executed various manual testing scenarios
  • As an ETL Tester, responsible for understanding the business requirements, creating test data, and designing test cases.
  • Extensively used the MapReduce component of Hadoop; responsible for importing and exporting data into HDFS and Hive.
  • Analyzed data using the Hadoop components Hive and Pig.
  • Responsible for writing Pig scripts to process the data in the integration environment
  • Responsible for setting up HBase and storing data into HBase
  • Responsible for managing and reviewing Hadoop log files
  • Responsible for running Hadoop streaming jobs to process terabytes of XML data.
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop/Big Data concepts.
  • Tested HBase tables loaded with large sets of structured, semi-structured, and unstructured data coming from different sources such as UNIX file systems and NoSQL stores
  • Experience working with HBase as a NoSQL database within the MapReduce framework.
  • Verified importing and exporting of data into HDFS and Hive using Sqoop (a sample HiveQL sanity check appears after this list).
  • Worked with Linux shell scripting to move files to the Hadoop cluster.
  • Involved in loading data from UNIX file system to HDFS.
  • Part of daily stand up meetings to communicate any day-to-day issues.
  • Tested several Cognos reports of different types, including Dashboard, Drill-Down, Master-Detail, Aggregated, KPI, Grouped List, Cascade, and Web reports.
  • Tested and validated the Cognos reports on target databases for conformance to specifications
  • Designed and executed test cases on the application as per company standards and tracked the defects using Confidential Quality Center/ALM
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Interacted with senior peers and subject matter experts to learn more about the data
  • Extensively wrote test scripts and complex SQL queries for back-end validations (a sample back-end validation query appears after this list)
  • Tested key data metrics and reports that help determine mandatory and discretionary investments based on the prioritizations, and tested the expected returns using MS Access.
  • Created SQL queries to pull out data and metrics from the database.
  • Analyzed business and functional requirements to derive test plans, test cases, procedures, and expected results for testing Teradata based data warehouse applications.
  • Validated Extract/Transform/Load (ETL) processes that load Teradata test databases with data feeds from upstream applications and provided information deliveries of warehouse data to downstream users.
  • Involved in unit, performance and integration testing of DataStage jobs.
  • Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions.
  • Involved in testing the XML files and checked the quality of data that is parsed and loaded to staging tables.
  • Tested datasets using several stored procedures.
  • Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata
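
A minimal HiveQL sketch of the post-Sqoop sanity checks referenced in the list above; the staging table stg_transactions and its columns are assumed names for illustration only.

    -- Hypothetical HiveQL sanity checks on a Sqoop-imported staging table.

    -- Row count in Hive, to reconcile against the count in the source RDBMS.
    SELECT COUNT(*) AS hive_row_count
    FROM   stg_transactions;

    -- Duplicate check on the business key after the import.
    SELECT   txn_id,
             COUNT(*) AS dup_count
    FROM     stg_transactions
    GROUP BY txn_id
    HAVING   COUNT(*) > 1;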
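
And a sketch of the back-end SQL validation mentioned above, comparing a derived target column against its documented rule and checking a field-length limit; all object names and the 50-character limit are illustrative assumptions.

    -- Hypothetical back-end validation queries (Oracle-style SQL).

    -- Transformation-rule check: rows where the derived age-in-days column
    -- disagrees with the rule "days between today and the account open date".
    SELECT s.acct_id, s.open_dt, t.acct_age_days
    FROM   stg.account_src s
    JOIN   dwh.account_dim t ON t.acct_id = s.acct_id
    WHERE  t.acct_age_days <> TRUNC(SYSDATE) - TRUNC(s.open_dt);

    -- Field-size check: values longer than the length defined in the metadata.
    SELECT acct_id, LENGTH(acct_name) AS name_len
    FROM   dwh.account_dim
    WHERE  LENGTH(acct_name) > 50;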

Environment: Teradata 13, Hadoop 1/2, Pig, Java (JDK 1.6), Hive, MapReduce, Sqoop, Flume, Quality Center/ALM, Oracle 10g, UNIX (IBM AIX 5.5), Confidential - SQL, XML files, XSLT, XML Spy 2010, SQL, PL/SQL, Excel, DataStage 7.5.3, Java, and Eclipse

ETL QA Analyst

Confidential, Pennington, NJ

Responsibilities:

  • Interacted with Business Analysts for Business Requirements, testing scope reviews, inspections and test planning.
  • Prepared Traceability matrix and prioritized test cases.
  • Involved in creating Test Plans, Test Scenarios, and Test Scripts based on business requirements, Use Cases, and Functional Specifications in the Test Plan tab of the Quality Center tool
  • Create and maintain a Requirements Traceability Matrix
  • Involved in performing Functional, Integration, System, Regression, GUI, Sanity, Ad-hoc, and User Acceptance Testing (UAT) for the AIE2E application.
  • Performed extensive Data Integrity testing by executing SQL statements on Oracle and SQL Server databases (a sample integrity check appears after this list)
  • Coordinated with the Developers regarding the Defects raised and Retested them against the application.
  • Tracked and analyzed the defects and recorded the variation between the expected and actual results
  • Experience testing data conversions and migrations in cross-platform scenarios using Informatica PowerCenter.
  • Tested reports developed from Business Objects Universes for both ad hoc and canned reporting users of Business Objects XI R3.1
  • Experience in testing data conversions and transformations.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the loading and pulling of data.
  • Tested both the Web Service Architecture and Service Oriented Architecture
  • Prepared extensive set of validation test cases to verify the data
  • Modified test cases according to design document changes, executed the test cases, interpreted test results, and determined pass/fail
  • Performed Defect Management by prioritizing the issues and communicating them to the developers.
  • Planned, coordinated, and executed Business Objects deployment for end users and documented the entire project
  • Checked the integrity of UI data against the database using SQL queries and verified execution of stored procedures with input values taken from the database.
  • Executed the Test Cases and Test Scenarios in the Test Lab tab of Quality Center.
  • Communicated the issues/defects to the developer using Quality Center tool.
  • Participated in bug review meetings with the software development team throughout the testing phase.
  • Created Test metrics and participated in status meetings and reported the progress to the manager.
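
A minimal sketch of the data-integrity checks referenced in the list above, written in Oracle-style SQL; the fact and dimension tables (dwh.order_fact, dwh.product_dim) and their columns are hypothetical.

    -- Hypothetical data-integrity checks.

    -- Orphan-key check: fact rows whose product key has no matching dimension row.
    SELECT f.order_id, f.product_key
    FROM   dwh.order_fact f
    LEFT JOIN dwh.product_dim d ON d.product_key = f.product_key
    WHERE  d.product_key IS NULL;

    -- Mandatory-column check: rows that arrived with required fields empty.
    SELECT order_id
    FROM   dwh.order_fact
    WHERE  order_dt IS NULL
       OR  customer_key IS NULL;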

Environment: Informatica 9.1, Oracle 10g, Business Objects XI R3.1/3, Toad, DB2, Mainframe, AS/400, MS Office tools, SQL Query Analyzer, UNIX (Red Hat Linux, Sun Solaris, IBM AIX), SQL Server 10.5, ClearQuest, UNIX Scripting.

Senior ETL QA Tester

Confidential

Responsibilities:

  • Analyzed business requirements, system requirements, and data mapping requirement specifications; responsible for documenting functional requirements and supplementary requirements in Quality Center 9.2
  • Tested ETL jobs as per business rules using ETL design document
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Involved in the testing of Data Mart using Power Center.
  • Extensively used Informatica Power Center for Extraction, Transformation and Loading process.
  • Extensively tested several ETL Mappings developed using Informatica.
  • Used Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.
  • Worked in an Agile environment with Scrum.
  • Involved in automation of test cases using QTP.
  • Did functional testing using QTP
  • Effectively distributed responsibilities, arranged meetings and communicated with team members in all phases of the project.
  • Used the import and export facilities of the application to download/upload XMLs of failed test cases in order to re-verify them.
  • Wrote UNIX scripts to perform certain tasks and assisted developers with problems and SQL optimization.
  • Configured QuickTest Pro with Quality Center and maintained the project information in Quality Center.
  • Extensively used Autosys for automation of scheduling jobs on a daily, weekly, bi-weekly, and monthly basis with proper dependencies.
  • Wrote complex SQL queries using joins, subqueries, and correlated subqueries (a sample correlated subquery appears after this list)
  • Performed Unit testing and System Integration testing by developing and documenting test cases in Quality Center.
  • Did Unit testing for all reports and packages.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the loading and pulling of data.
  • Tested ad hoc and canned reports for Business Objects.
  • Responsible for migrating the code changes from development environment to SIT, UAT and Production environments.
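
A small sketch of the correlated-subquery style noted in the list above, used here to isolate each customer's most recent transaction; all object names are hypothetical.

    -- Hypothetical correlated subquery: latest transaction per customer.
    SELECT t.customer_id, t.txn_dt, t.txn_amt
    FROM   dwh.txn_fact t
    WHERE  t.txn_dt = (SELECT MAX(t2.txn_dt)
                       FROM   dwh.txn_fact t2
                       WHERE  t2.customer_id = t.customer_id);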

Environment: Informatica 9.1, Flat files, Perl, DTS, MS SQL Server 2008, Oracle 10g, SQL, PL/SQL, IBM DB2 10, Agile, Teradata V2R6, Teradata SQL Assistant, Business Objects XI R3, COBOL, Confidential QTP 10.0, Confidential Quality Center 10, Autosys, UTL_FILE, TSO, ISPF, z/OS, JCL, Mainframes, Toad, Unix Shell Scripting, Windows XP/2000

QA Tester

Confidential, San Antonio, TX

Responsibilities:

  • Analyzed and reviewed project documentation, business requirements to prepare detailed test schedules and plans.
  • Experienced in writing test cases, test scripts, and test plans; executing test cases; reporting and documenting test results using Quality Center/ALM; performing end-to-end testing; and creating reports.
  • Checked the data flow from one end to another using SQL queries.
  • Wrote complex SQL queries to validate the standard reports, dashboards, and BI Business Objects Universes (a sample report reconciliation query appears after this list).
  • Created Test input requirements and prepared the test data for Data Driven testing.
  • Extensively worked in an agile environment, with daily scrum and stand-up meetings, presentations, and reviews
  • Participated in defect review meetings with the team members. Used MS-Word for documentation.
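
A sketch of the report-validation approach referenced above, reconciling a dashboard's monthly totals against the underlying detail in Oracle-style SQL; rpt.monthly_summary and dwh.txn_fact are assumed names.

    -- Hypothetical report-to-detail reconciliation: months where the report
    -- total does not match the sum of the underlying detail rows.
    SELECT r.report_month,
           r.total_amt AS report_amt,
           d.total_amt AS detail_amt
    FROM   rpt.monthly_summary r
    JOIN  (SELECT TRUNC(txn_dt, 'MM') AS report_month,
                  SUM(txn_amt)        AS total_amt
           FROM   dwh.txn_fact
           GROUP BY TRUNC(txn_dt, 'MM')) d
           ON d.report_month = r.report_month
    WHERE  r.total_amt <> d.total_amt;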

Environment: Confidential ALM/Quality Center, UNIX, UAT, Agile, and Java.

Tester

Confidential, PA

Responsibilities:

  • Interacted with clients to determine User requirements and goals. Participated in analysis of Business and functional requirements.
  • Wrote SQL queries to test the application for data integrity and verified the contents of the data table.
  • Developed Test Plan and overall Test Strategy for the Application. Developed Test cases, Test plans, and Test procedures using MS Word and MS Excel.
  • Extensively used SQL queries to check the storage and accuracy of data in database tables. Performed basic manual testing of security features.
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted (a sample data-dictionary lookup appears after this list)
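
As a small illustration of identifying the tables and columns behind a report (see the last bullet above), here is a sketch against Oracle's standard data-dictionary views; the table name CLAIM_DETAIL is a hypothetical placeholder.

    -- Hypothetical data-dictionary lookups used while mapping a report's data.

    -- Columns, data types, and lengths for a candidate source table.
    SELECT column_name, data_type, data_length
    FROM   user_tab_columns
    WHERE  table_name = 'CLAIM_DETAIL';

    -- Check constraints defined on the same table, for negative test design.
    SELECT constraint_name, search_condition
    FROM   user_constraints
    WHERE  table_name      = 'CLAIM_DETAIL'
    AND    constraint_type = 'C';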

Environment: MS Word, MS Excel, SQL, Confidential Quality Center, Oracle, Agile, UAT, UNIX, and ETL
