Software Engineer - Testing Resume

Dallas, Texas

SUMMARY:

Technically proficient, results-driven Computer Science/Data Science student skilled in developing, testing, and implementing software initiatives that improve efficiency and increase productivity. Solid critical-thinking and problem-solving skills with the ability to multitask and work well in fast-paced environments. Effective communicator with the capacity to work independently or as part of a team. Extremely organized and accurate; able to follow through to the last detail.

  • Understanding of distributed systems, HDFS architecture, and the internal workings of the MapReduce and Spark processing frameworks.
  • Understanding of Hadoop big data architectures, data movement technologies, database partitioning, database optimization, and building communication channels between structured and unstructured databases.
  • Understanding of big data concepts and use of cloud technologies and tools.
  • Well-versed in installation, configuration, administration, and tuning of Hadoop clusters across major Hadoop distributions (Cloudera CDH 3/4/5, Hortonworks HDP 2.3/2.4, and Amazon Web Services (AWS)).
  • Mastered use of columnar file formats such as RCFile, ORC, and Parquet.
  • Experience with Pandas and with operational monitoring of clusters and applications.
  • Set up and configured Data Lakes on big data platforms such as Cloudera, Hortonworks, and MapR.
  • Skilled in using open-source software such as Spark, Flume, and Kafka.
  • Proficient in importing/exporting data between SQL (Oracle, MySQL) and NoSQL (Realm, MongoDB) databases and HDFS using Sqoop.
  • Proficient with build tools such as Apache Ant and Apache Maven.
  • Hands-on experience migrating complex MapReduce programs to Apache Spark RDD operations such as transformations and actions (a brief sketch follows this list).
  • Hands-on expertise in Hadoop components - HDFS, MapReduce, Hive, Impala, Pig, Flume, Sqoop and HBase.
  • Implemented, set up, and worked on various Hadoop distributions (Cloudera, Hortonworks, Amazon AWS).
  • Knowledgeable in deploying application JAR files to AWS instances.
  • Knowledgeable of Hadoop Architecture and Hadoop components (HDFS, MapReduce, JobTracker, TaskTracker, NameNode, DataNode, ResourceManager, NodeManager).
  • Knowledgeable in installation and configuration of Hive, Pig, Sqoop, Flume, and Oozie on Hadoop clusters.
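
The MapReduce-to-Spark migration mentioned above can be illustrated with a minimal PySpark sketch: a classic word count expressed as RDD transformations (flatMap, map, reduceByKey) followed by an action that triggers execution. The input path and application name are hypothetical placeholders.

    # Minimal PySpark sketch: a MapReduce-style word count as RDD operations.
    # The HDFS path and app name below are hypothetical placeholders.
    from pyspark import SparkContext

    sc = SparkContext(appName="wordcount-migration-sketch")

    counts = (
        sc.textFile("hdfs:///data/input/sample.txt")  # hypothetical input path
          .flatMap(lambda line: line.split())          # map side: emit words
          .map(lambda word: (word, 1))                 # emit (word, 1) pairs
          .reduceByKey(lambda a, b: a + b)             # reduce side: sum counts
    )

    for word, count in counts.take(10):                # action triggers the job
        print(word, count)

    sc.stop()

Transformations such as flatMap and reduceByKey are lazy; only the take action launches the distributed job.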

TECHNICAL SKILLS:

  • Database: Apache Cassandra, AWS DynamoDB, MongoDB, ArangoDB, AuroraDB, Redshift, Amazon RDS, SQL, MySQL, NoSQL, Oracle, DB2
  • Open Source Distributions: AWS, Kali, Cisco
  • Amazon Stack: AWS, EMR, EC2, EC3, SQS, S3, DynamoDB, Redshift, CloudFormation
  • Systems: Windows Active Directory, Windows Server 2003/2008/2008 R2/2012/2012 R2, Red Hat Linux 6-7, IBM AIX, HP, CentOS, Ubuntu
  • Virtualization: VMware, vSphere, Virtual Machine/Big Data VM, VirtualBox, Oracle Big Data Lite Virtual Machine v4.9
  • Data Pipelines/ETL: Apache Camel, Flume, Apache Kafka, Apatar, Atom, Fivetran, Heka, Logstash, Scriptella, Stitch, Talend, KETL, Pentaho Data Integration (Kettle), Jaspersoft, CloverETL
  • Distributions: Cloudera, Hortonworks, AWS, MapR
  • Hadoop Ecosystem: Hadoop, Hive, Spark, Maven, Ant, Kafka, HBase, YARN, Flume, ZooKeeper, Impala, HDFS, Pig, Oozie, Tez, Apache Airflow
  • Search Tools: Apache Solr/Lucene, Elasticsearch/Kibana
  • File Formats: Parquet, Avro
  • File Compression: Snappy, Gzip, PRC
  • Data Mining: RapidMiner, IBM SPSS Modeler, Oracle Data Mining
  • Data Cleansing: DataCleaner, WinPure Data Cleaning Tool, Patnab, OpenRefine, Drake
  • Software: Nessus, SET, API, Metasploit, Wireshark, VMware, vSphere, AppDynamics, Confluence, Jira, RabbitMQ, Minsoft, Nagios, Cloudclock

PROFESSIONAL EXPERIENCE:

SOFTWARE ENGINEER - TESTING (TEST ENGINEER)

Confidential, Dallas, Texas

Responsibilities:

  • Conduct software tests according to pre-defined test strategy and plan artifacts, including field tests, field trials, FUT tests, component- and service-level testing, data conversion and system integration testing, continuous build release testing, and version control.
  • Analyze business and technical requirements and specifications based on project scope.
  • Develop test plans to address areas such as database impacts, software scenarios, regression testing, negative testing, error/bug retests, and usability (a small regression/negative test sketch follows this list).
  • Perform wireless device field testing per established test plans: AGPS, Voice Quality, Call Performance, Data Performance, and Video Performance.
  • Perform feature tests and provide feature testing support, including performing tests, placing test calls, and setting up test scenarios.
  • Perform all carrier-specific tests, including call performance, system selection, system determination, AGPS, MIP, browser, SMS, MMS, video, camera, and user interface.
  • Prepare field test reports for carrier submission and evaluation.
  • Perform troubleshooting and diagnostic analysis of system defects, and conduct impact analysis.
  • Monitor the real-time performance of testing, ensuring collected data is accurate and the test runs smoothly.
  • Manage resulting log files, which can include reviewing for accuracy and content, post-processing, and uploading to a central server (see the log-processing sketch after this list).
  • Provide technical support for the integration of new software, hardware, or features.
  • Support testing and fix verification on software patches or correction loads.
  • Review compatibility of patches and updates with respect to the current solution and feature set.
  • Assist in preparing internal patch installation and software update MOPs.
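
To make the regression and negative testing bullet concrete, here is a minimal pytest-style sketch. The function under test (parse_msisdn) and its validation rule are hypothetical, invented only for illustration.

    # Minimal pytest sketch of a regression case and a negative case.
    # parse_msisdn is a hypothetical function invented for this example.
    import pytest

    def parse_msisdn(raw):
        """Strip formatting from a phone number; reject anything not 10 digits."""
        digits = "".join(ch for ch in raw if ch.isdigit())
        if len(digits) != 10:
            raise ValueError("expected a 10-digit number")
        return digits

    def test_regression_known_good_format():
        # Regression: a previously supported input format must keep working.
        assert parse_msisdn("(214) 555-0100") == "2145550100"

    def test_negative_rejects_short_input():
        # Negative: invalid input must raise an error, not pass silently.
        with pytest.raises(ValueError):
            parse_msisdn("555-0100")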
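
The log-management bullet could look roughly like the following sketch: check a field-test log for completed calls, compress it, and upload it to a central server. The directory, the "CALL_END" marker, and the upload URL are all hypothetical, and the requests package is assumed to be available.

    # Minimal sketch of post-processing field-test logs and uploading them
    # to a central server. Paths, log markers, and the URL are hypothetical.
    import gzip
    import shutil
    from pathlib import Path

    import requests  # assumes the requests package is installed

    LOG_DIR = Path("/var/field_tests/logs")          # hypothetical directory
    UPLOAD_URL = "https://central.example.com/logs"  # hypothetical endpoint

    def post_process_and_upload(log_path: Path) -> None:
        # Quick accuracy check: flag files that recorded no completed test calls.
        text = log_path.read_text(errors="replace")
        if "CALL_END" not in text:
            print(f"warning: {log_path.name} has no completed calls")

        # Compress before upload to keep transfers small.
        gz_path = log_path.with_suffix(log_path.suffix + ".gz")
        with log_path.open("rb") as src, gzip.open(gz_path, "wb") as dst:
            shutil.copyfileobj(src, dst)

        # Upload the compressed log to the central server.
        with gz_path.open("rb") as fh:
            resp = requests.post(UPLOAD_URL, files={"file": fh}, timeout=30)
        resp.raise_for_status()

    for path in sorted(LOG_DIR.glob("*.log")):
        post_process_and_upload(path)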

SENIOR BIG DATA ENGINEER

Confidential, Alpharetta, GA

Responsibilities:

  • Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities.
  • Implementing ETL processes.
  • Monitoring performance and advising on any necessary infrastructure changes.
  • Defining data retention policies.
  • Design and implement data architectures in production environments.
  • Implementation of data orchestration pipelines, data sourcing, cleansing, augmentation, and quality control processes (a minimal pipeline sketch follows this list).
  • Deployment of machine learning models in production.
  • Translation of business needs into data architecture solutions.
  • Contribution to overall solution, integration, and enterprise architectures.
  • Development of data landscape modernization architectures and roadmaps.
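
As a rough illustration of the sourcing, cleansing, and quality-control work above, here is a minimal PySpark ETL sketch. The paths, column names, and the 5% rejection threshold are hypothetical.

    # Minimal PySpark ETL sketch: extract, cleanse, quality-check, and load.
    # All paths, column names, and thresholds here are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: raw CSV landed in the data lake (hypothetical location).
    raw = spark.read.option("header", True).csv("s3://example-lake/raw/orders/")

    # Cleanse: trim keys, standardize dates, drop duplicates and null keys.
    clean = (
        raw.withColumn("customer_id", F.trim(F.col("customer_id")))
           .withColumn("order_date", F.to_date(F.col("order_date"), "yyyy-MM-dd"))
           .dropDuplicates(["order_id"])
           .na.drop(subset=["order_id", "customer_id"])
    )

    # Quality control: fail the run if too many rows were rejected.
    total, kept = raw.count(), clean.count()
    if total > 0 and (total - kept) / total > 0.05:
        raise RuntimeError(f"quality gate failed: {total - kept} rows rejected")

    # Load: write curated data as partitioned Parquet (hypothetical target).
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-lake/curated/orders/")
    spark.stop()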

SENIOR LINUX SYSTEMS ENGINEER/SAP/BWA/HANA ADMINISTRATOR

Confidential, Atlanta, GA

Responsibilities:

  • Develop bash scripts to automate file transfers between GDOL and various banks (a transfer-automation sketch follows this list).
  • Consult on mainframe-to-distributed-environment migration.
  • Apply understanding of IT system components and architecture, including mainframe HA architectures and migration strategies.
  • Work with SAN/NAS, VMware, networking, and firewall architectures.
  • Write JCL scripts to convert and transfer files using Sterling Gateway MET.
  • Perform hands-on system administration of z/OS and third-party software installs and troubleshooting activities.
  • Managed budget for Managed File Transfer Architecture.
  • Provide Level 3 support for PowerVM systems: troubleshoot application issues, build out new virtual machines, and resolve issues.
  • Implemented HA logical partitions, patched and updated virtual machine configurations, and troubleshot application and database performance issues by tuning PowerVM.
  • Build and maintain Linux servers running SAP BWA and HANA business analytics applications.
  • Install patch releases, OS software enhancements, and application software.
  • Provide Level 3 support; troubleshoot and resolve issues and failures.
  • Used Perl scripts to build out new turnkey systems for clients.
  • Technologies used: SUSE Linux, IBM SAP HANA & BW, IBM Blade, bash, Python, Azure, LDAP.
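
The bash-based transfer automation noted in the first bullet might look roughly like the sketch below, written in Python for consistency with the other sketches in this resume. The staging directory, remote host, and SSH key path are hypothetical, and the transfer is delegated to the standard scp client.

    # Minimal sketch of automated outbound file transfers to a bank endpoint.
    # Directory, remote target, and key path are hypothetical placeholders.
    import subprocess
    from pathlib import Path

    OUTBOUND_DIR = Path("/data/outbound/bank_files")   # hypothetical staging dir
    REMOTE = "transfer@bank.example.com:/inbound/"      # hypothetical SFTP target
    SSH_KEY = "/home/batch/.ssh/id_rsa"                 # hypothetical key file

    def push_files() -> None:
        for path in sorted(OUTBOUND_DIR.glob("*.dat")):
            # Copy over a key-authenticated channel; raises on non-zero exit.
            subprocess.run(
                ["scp", "-i", SSH_KEY, str(path), REMOTE],
                check=True,
            )
            # Move the file to an archive folder so it is not re-sent.
            archive = OUTBOUND_DIR / "archive" / path.name
            archive.parent.mkdir(exist_ok=True)
            path.rename(archive)

    if __name__ == "__main__":
        push_files()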
