
Test Engineer Resume


Richardson

SUMMARY:

  • Professional experience in IT and enterprise application development across multiple business domains, including Health Care and Insurance, with hands-on experience in the Big Data/Hadoop ecosystem and related technologies.
  • Strong knowledge and experience in architecting real-time streaming applications and batch-style, large-scale distributed computing applications using tools such as MapReduce and Hive.
  • Strong knowledge of NoSQL databases such as HBase.
  • Excellent understanding of Hadoop architecture and its components, including the Hadoop Distributed File System (HDFS), Job Tracker, Task Tracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience importing and exporting terabytes of data with Sqoop between HDFS and relational database systems.
  • Experienced in using NFS (Network File System) for NameNode metadata backup.
  • Experience in developing custom UDFs in Java to extend Hive and Pig Latin functionality (a minimal UDF sketch follows this list).
  • Experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.
  • Experience in working with offshore quality assurance teams.
  • Built AWS CloudFormation stacks with S3 buckets.
  • Implemented a snapshot backup tool for AWS instances in the AWS environment.
  • Implemented hardening scripts for AWS Linux instances in the AWS environment.
  • Implemented and set up NAT HA failover in the AWS environment.
  • Implemented and configured AWS security groups, which acted as virtual firewalls controlling the traffic allowed to reach one or more EC2 instances.
  • Implemented and set up Route 53 for AWS web instances, ELB, and CloudFront in the AWS environment.
  • Implemented VPCs in the AWS cloud environment and handled complete control of computing resources on EC2; acquired, tested, and installed patches to the systems through patch management.
  • Experience with Amazon S3, Redshift, and RDS.
  • Experience in implementing the Bootstrap framework.
  • Activated virtual machines using KMS.
  • Expertise in job workflow scheduling and monitoring tools such as Oozie and cron (crontab).
  • Expertise in continuous integration practices with JIRA.
  • Experience in installation, configuration, support, and management of Cloudera's Hadoop platform, including CDH3 and CDH4 clusters.
  • Experience in Software Development Life Cycle (SDLC) methodologies like Agile, Scrum, and Waterfall.
  • Extensive experience working with Oracle, DB2, SQL Server, PL/SQL, and MySQL databases.
  • Hands on experience in application development using Java, RDBMS, and UNIX shell scripting.
  • Experience working with BI teams to transform big data requirements into Hadoop-centric technologies.
  • Hands-on experience with development tools such as Eclipse, IntelliJ, RAD, and JDeveloper.
  • Solid understanding of OOP and RDBMS concepts.
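
A minimal sketch of a Hive UDF in Java, illustrating the kind of custom function mentioned above; the class name, column handling, and the classic org.apache.hadoop.hive.ql.exec.UDF API are assumptions for illustration only.

// Hypothetical Hive UDF: trims and upper-cases a string column.
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public final class NormalizeText extends UDF {

    // Hive calls evaluate() once per row; returning null propagates NULL values safely.
    public Text evaluate(final Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}

Once packaged into a JAR, a UDF like this would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in HiveQL queries.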

TECHNICAL SKILLS:

Languages: Java, Python, HTML, SQL.

Java Technologies: Core Java, JDBC

Big Data Technologies: Hadoop (HDFS, MapReduce), Oozie, Sqoop, Hive, Pig, Cloudera.

Web Technologies: HTML5, CSS3, JavaScript, PHP

RDBMS: Oracle, MS SQL Server, Teradata, and MySQL.

NoSQL: HBase

Data Visualization: Datameer

File Transfer: SecureFX, PuTTY

Tools: Eclipse, IntelliJ

Data Mining Tools: RapidMiner

Test Management Tools: HP ALM, IBM RTC

Code Repository: GitHub

Operating System: Windows, UNIX, Linux.

PROFESSIONAL EXPERIENCE:

Confidential, Richardson

Test Engineer

Responsibilities:

  • Prepared Test Plans from the Business Requirements and Functional Specification.
  • Developed the test strategy and test cases and reviewed them with all stakeholders, including the Development team, Infrastructure team, and other teams.
  • Coordinated and scheduled defect triage meetings with the development team and the business to discuss open issues reported in ALM/Quality Center.
  • Experience working in an offshore/onshore model.
  • Worked as the onsite SPOC for the offshore team during knowledge transfer and helped the offshore team make an easy transition into the project.
  • Experience working on multiple projects at the same time.
  • Working experience as a test lead on a few projects.
  • Successfully completed several projects working as the onshore test lead for the offshore team.
  • Performed system testing on different data warehouse and Hadoop components.
  • Performed functional and non-functional testing of the data warehouse and data lake.
  • Performed UAT and data quality testing for the business users.
  • Performed regression testing of the data warehouse and data lake.
  • Experienced in testing XML and JSON files.
  • Created an automated framework in Apache Pig and Python for testing Hadoop components such as Hive, HBase, and HDFS.
  • Created Hive and HBase tables based on the business requirements.
  • Used Hive queries to transform the data in Hive tables.
  • Performed source-to-target Hive database testing using Hive queries (a minimal count-check sketch follows this list).
  • Performed source-to-target Teradata database testing using SQL scripts.
  • Used Datameer macros to test HBase tables.
  • Used Datameer import and export to load data into multiple databases.
  • Used Linux commands to check the audit framework for Hive and HBase tables.
  • Used JIRA for tracking bugs and tasks.
  • Created test cases and test plan in HP ALM.
  • Created defects in HP ALM and assigned the defect ID to the corresponding bug in JIRA.
  • Provided support for production issues.
  • Used Pig to load different data formats into Hive and HBase.
  • Wrote Pig scripts for testing HBase, Hive, XML, and flat files.
  • Wrote custom Pig UDFs for loading and testing data.
  • Used filters on HBase tables to get the subset of rows with the latest timestamp (a minimal scan sketch follows this list).
  • Hands-on experience with ETL tools such as DataStage and Talend.
  • Worked with Teradata SQL Assistant for data retrieval and data validation.
  • Developed SQL queries to verify the number of records from source to target and validated referential integrity, time variance, missing records, and nulls/defaults/trim-spaces rules as per the design specifications.
  • Loaded flat file data into Teradata tables using UNIX shell scripts and MultiLoad (mload).
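
A minimal sketch of the kind of source-to-target count check run over Hive JDBC; the JDBC URL, credentials, and table names are placeholder assumptions for illustration, not the actual project values.

// Hypothetical source-to-target row count comparison over Hive JDBC.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SourceToTargetCountCheck {

    // Runs SELECT COUNT(*) against a single table and returns the count.
    private static long count(Statement stmt, String table) throws Exception {
        try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        // HiveServer2 endpoint, user, and table names below are assumptions.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver:10000/default", "tester", "");
             Statement stmt = conn.createStatement()) {
            long sourceRows = count(stmt, "staging.claims_src");
            long targetRows = count(stmt, "warehouse.claims_tgt");
            System.out.printf("source=%d target=%d match=%b%n",
                    sourceRows, targetRows, sourceRows == targetRows);
        }
    }
}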
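
Similarly, a minimal sketch of filtering an HBase table down to the latest cell versions with the Java client; the table name, cutoff timestamp, and HBase 1.x-style API calls are assumptions for illustration.

// Hypothetical scan that keeps only the newest version of each cell,
// restricted to cells written on or after an assumed cutoff timestamp.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class LatestVersionScan {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("claims_audit"))) {
            Scan scan = new Scan();
            scan.setMaxVersions(1);                            // only the latest version of each cell
            scan.setTimeRange(1546300800000L, Long.MAX_VALUE); // assumed cutoff timestamp
            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result row : scanner) {
                    System.out.println(Bytes.toString(row.getRow()));
                }
            }
        }
    }
}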

ENVIRONMENT: Hive, HBase, Pig, HDFS, Sqoop, Hue, JIRA, SecureFX, Datameer, Teradata, DataStage, PuTTY, SQL, shell scripts, DB2, HP ALM, RTC, GitHub, Talend.

Confidential

PROGRAMMER ANALYST / GRADUATE ASSISTANT

Responsibilities:

  • Worked with HTML, CSS, HTML5, XML, DHTML, Ajax, JavaScript, jQuery, and AngularJS.
  • Worked on cross-browser compatibility and tested every web application on popular browsers such as Internet Explorer, Firefox, Safari, Opera, and Chrome.
  • Responsible for transforming design mockups into web pages.
  • Developed web applications that are cross-browser compatible.
  • Performed new-website support research in the areas of the latest web technologies, usability, accessibility, and user experience.
  • Coordinated with the quality group on testing activities and with the Production Support team to resolve issues.

ENVIRONMENT: HTML, CSS, JavaScript, Python, PHP, MVC Framework, PHP Web Services.

Confidential

Web Developer

Responsibilities:

  • Worked with the business community to define business requirements and analyze the possible technical solutions.
  • Requirements gathering, business process flow, business process modeling, and business analysis.
  • Extensively used UML and Rational Rose for designing to develop various use cases, class diagrams, and sequence diagrams.
  • Used JavaScript for client-side validations.
  • Developed the application using the Spring MVC architecture (a minimal controller sketch follows this list).
  • Developed custom tags for table utility component.
  • Used various Java, J2EE APIs including JDBC, XML, Servlets, and JSP.
  • Designed and implemented the UI using HTML, JSP, and JavaScript.
  • Involved in Java application testing and maintenance in development and production.
  • Involved in developing the customer form data tables and maintaining customer support and customer data in MySQL database tables.
  • Involved in mentoring specific projects in the application of the new SDLC based on the Agile Unified Process, especially from the project management, requirements, and architecture perspectives.
  • Designed and developed View, Model, and Controller components implementing the MVC framework.
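
A minimal sketch of an annotation-style Spring MVC controller returning a JSP view, in the spirit of the work described above; the class, mapping, and view names are assumptions for illustration, and the original application may have used a different controller style for that Spring/JDK 1.5 era.

// Hypothetical Spring MVC controller; names are placeholders.
import org.springframework.stereotype.Controller;
import org.springframework.ui.ModelMap;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@Controller
public class CustomerController {

    // Maps GET /customers to the "customerList" JSP view via the configured ViewResolver,
    // e.g. /WEB-INF/jsp/customerList.jsp.
    @RequestMapping(value = "/customers", method = RequestMethod.GET)
    public String listCustomers(ModelMap model) {
        model.addAttribute("pageTitle", "Customer Support");
        return "customerList";
    }
}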

ENVIRONMENT: JDK 1.5, J2EE, JDBC, Servlets, Spring MVC, JSP, XML, XSL, CSS, HTML, DHTML, JavaScript, UML, Eclipse IDE, MySQL.
