Manager Big Data Resume

San Francisco, CA

SUMMARY

  • Software Engineer with 13 years of IT experience in software design and development
  • Developed Big Data solutions using Hadoop, Hive, HBase and Kafka
  • Web and App development experience using Java, J2EE, JSP, EJB, Servlets, C/C++/Objective C, SOA, DHTML, JDBC, Spring and Hibernate.
  • Expertise with J2EE-compliant application servers including Sun Application Server, Apache Tomcat, IBM WebSphere, and JBoss.
  • Experience in designing and implementing databases and data warehouses on SQL Server, DB2, Oracle, and MySQL.
  • Developed web reports and dashboards using Cognos 8 Suite, Microsoft Analysis Services and custom OLAP tools.
  • Able to identify DB bottlenecks and optimize SQL statements and stored procedures
  • Developed rich and interactive front end web applications using JavaScript, JQuery, JSP, CSS, HTML, and SVG
  • Formally trained in Software Engineering principles with experience in the Software Development Life Cycle (SDLC).
  • Knowledgeable in applying design patterns for large scale applications.
  • Possess excellent communication skills as well as technical skills

TECHNICAL SKILLS

  • Java, C/C++, Objective C, JavaScript, JQuery, TCL, PERL
  • J2EE, JSP, Java Beans, JDBC, Servlets, EJB, Apache Tomcat, WebLogic, WebSphere, IIS, JBoss
  • Xcode, Eclipse, WSAD, Visual Studio, Rational Software Architect
  • Hadoop (MapR, Cloudera, Hortonworks), Hive, HBase, Kafka, J2EE, Spring, Hibernate, Splunk
  • Rational Rose, WinRunner, JUnit, Ant
  • Git, Subversion, CVS, Perforce
  • Oracle, SQL Server, DB2, MySQL
  • DHTML, CSS, XML, JSP, JQuery, Ajax
  • TCP/IP, UDP, SMTP
  • UNIX, Windows, DOS, Mac OS X

PROFESSIONAL EXPERIENCE

Confidential, San Francisco, CA

Manager Big Data

Responsibilities:

  • Designed and Deployed Cloudera Hadoop Cluster
  • Designed and Deployed Kafka cluster
  • Led the Big Data team in building a system capable of ingesting 30 TB/day using the Lambda architecture
  • Hired and mentored Big Data team members
  • Led the design of the data pipeline and data models
  • Coordinated efforts with external teams to help them with their Big Data requirements

Technologies: Java, Hadoop, Kafka, Hive, PrestoDB, Impala, MySQL, Linux

Confidential, San Francisco, CA

Software Engineer

Responsibilities:

  • Designed and Deployed Hadoop MapR and Hortonworks Clusters
  • Developed ETL process to ingest data from CSV files into Hive partitioned tables
  • Created data dictionary of existing tables using SQL Server, Splunk
  • Setup Kafka cluster to ingest large amounts of log data into Hive tables
  • Developed proofs of concept for Datameer, Vertica, and ParAccel integration with the MapR cluster
  • Identified and troubleshot bugs within open source software such as Hadoop, Hive, and Kafka

Technologies: Java, Hadoop (MapR, Hortonworks), Splunk, Hive, SQL Server, Linux

Confidential, Walnut Creek, CA

Software Architect

Responsibilities:

  • Developed company homepage using CSS, JQuery, and DHTML.
  • Performed fraud detection using Mahout and Hadoop
  • Designed company data warehouse for analytics and reporting
  • Developed iPhone application for residents to easily make payments online.
  • Developed IFrame application which allows third party companies to seamlessly integrate with online payment site.
  • Created new features and maintained company J2EE web application
  • Reverse engineered sales commission reports, involving over 200 calculations, within one week
  • Developed pivot tables of monthly sales reports using OLAP
  • Improved and customized Rentpayment.com site
  • Performed custom integration of legacy client software with Rentpayment’s online processing web services.

Technologies: Objective C, Java, J2EE, JSP, HTML, Servlets, Tomcat, SQL Server, SOA, Eclipse, Xcode, JQuery

Confidential, Redwood City, CA

Sr. Technical Consultant

Responsibilities:

  • American Express Instant Account Verification Project
  • Team lead and project manager.
  • Managed offshore resources.
  • Ensured application met both business and technical requirements.
  • Developed technical requirement, project plans, and test plans.
  • ANZ Personal Finance Management Project
  • Ensured that the customer's business and technical requirements were met by validating the design and coordinating with offshore resources to confirm the application was built correctly.
  • Data warehouse and data mining project
  • Developed ETL solutions to extract and transform transactional data into a star schema
  • Developed categorization engine to extract dimensions from transactional data.
  • Capital One Batch feed project
  • Developed batch process to extract and encrypt alerts from SOA components into flat files for client.

Technologies: Java, J2EE, EJB, JSP, HTML, Servlets, JBoss, Resin, Oracle, SOA, Eclipse

Confidential

Analyst

Responsibilities:

  • Developed data access objects (DAO’s) for authentication and general data access using Hibernate.
  • Wrote the data access components as EJBs in Rational Software Architect using container-managed transactions, and exposed them as Web Services following SOA principles.
  • Generated JUnit tests to ensure that the components functioned as expected and to automate regression testing.
  • Worked with other developers and clients to identify business needs and built the data access components from those requirements.
  • Worked with IBM's WebSphere Integration Developer to build integration components connecting to legacy systems.

Technologies: Java, J2EE, EJB, JSP, HTML, Servlets, WebSphere, Oracle, SOA, Hibernate, Eclipse
