Big Data Applications Developer Resume
Adon, TX
SUMMARY
- Three-plus years of hands-on expertise in big data, analytics and cloud computing (Hadoop ecosystems, Java, MapReduce, Hive, Impala, Pig, Oozie, HBase, Cassandra, Sqoop, ZooKeeper, Redshift, Zeppelin)
- Two years’ experience installing, configuring, deploying and testing Hadoop ecosystem components using Cloudera Manager.
- Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
- Excellent working knowledge of RDBMS and NoSQL databases: MS SQL, Mahout, Cassandra and MongoDB.
- Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements, and good working knowledge of data transformations and loading using export and import.
- Clear understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, NameNode, DataNode, Secondary NameNode and MapReduce programming.
- Familiar with Spark, Spark SQL, Storm and Flume.
- Strong leadership, excellent mentoring and superb interpersonal skills.
TECHNICAL SKILLS
Languages: Visual Studio .NET, VBA, VB, C#, C/C++, R, Java and MATLAB
Packages: Excel 2013, Access 2013, PowerPoint 2013, SQL and SAS
Scripts: JavaScript, HTML5, CSS3, XML, Unix Shell Scripting
Operating Systems: Windows (all versions), Linux (Ubuntu 16.04), Mac OS X
Databases: Microsoft SQL Server (SSMS), MySQL, MongoDB, HBase
PROFESSIONAL EXPERIENCE
Confidential, Adon, TX
Big Data Applications Developer
Responsibilities:
- Sqoop tables from MySQL and Oracle databases into Hadoop
- Pre-process data with Hive and Impala
- Develop data models using advanced big data visualization tools: Platfora and Arcadia
Confidential, Cincinnati, OH
Hadoop Developer/Administrator
Responsibilities:
- Develop data pipelines that work seamlessly on Hadoop clusters using Pig, Hive and Impala
- Implemented Hadoop based data warehouses, integrated Hadoop with Enterprise Data Warehouse systems
- Expertise in data ingestion using Sqoop or Hue
- Excellent data modeling skills using Pentaho, UDFs, Hive Scripts and Pig Scripts
- Design, develop, test, tune and build large-scale data processing systems for data ingestion and data products, serving both operational applications and analytical needs
- Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, Storm, Impala and Hadoop ETL
- Proficiency in Linux Scripting and regulation of Hadoop administration rights and access
- Expert knowledge of Hadoop/HDFS, MapReduce, HBase, Pig, Impala, Sqoop, Amazon Elastic MapReduce (EMR), Amazon EC2 and Cloudera Manager
Confidential, Flower Mound, TX and South Bend, IN
Java Developer/Engineer
Responsibilities:
- Designed, developed, tested, supported and deployed desktop, custom web, and mobile applications for departmental projects within an Agile environment.
- Developed software, architecture, specifications and technical platforms using Java (Spring Tool Suite and Eclipse)
- Developed automation scripts to support testing of telecommunications equipment
- Integrated developed code with database functions using Java REST API
- Designed and deployed an application implementing statistical/predictive models and cutting-edge algorithms to measure vehicle safety and depreciation.
- Led the migration of monthly statements from a UNIX platform to an MVC web-based Windows application using Java.
- Built sustainable, modular automation solutions using Agile and continuous-delivery best practices.
- Very good knowledge of OOP and OOAD concepts since 2008.
Confidential
Data Analyst/Repair Supervisor
Responsibilities:
- Turned loss-making projects into profitability and kept margins above 60%
- Developed custom financial and auditing reports using Crystal Reports with T-SQL, MS Excel and Access.
- Analyzed large datasets to provide strategic direction to the company.
- Performed quantitative analysis of product sales trends to recommend pricing decisions.
- Conducted cost-benefit analysis on new ideas.
- Scrutinized and tracked customer behavior to identify trends and unmet needs.
- Assisted in developing internal tools for data analysis.
- Implemented process improvements to increase efficiency with limited operative resources
- Effectively recruited, selected, trained and assigned personnel
- Efficiently scheduled projects and assignments
- Installed, tested, turned on, modified, upgraded and debugged telecommunication equipment
- Adapted to rapid change and possessed the determination and focus to get the job done while meeting assigned schedules and adhering to strict technical standards