
Spark Developer Resume


SUMMARY:

  • Cloudera Certified Hadoop professional with 3+ years of experience in Hadoop/Big Data and 9+ years of overall IT industry experience.
  • Excellent knowledge of Hadoop architecture.
  • Excellent exposure to cluster benchmarking and performance tuning.
  • 2+ years of experience in Hadoop and writing MapReduce programs.
  • Good exposure to the MapReduce life cycle, Hadoop API concepts, and machine learning concepts.
  • Very good knowledge of writing MapReduce drivers, mappers, and reducers in Java (see the sketch after this list).
  • Exposure to the Hadoop ecosystem: Hadoop, MapReduce, HBase, Hive, Oozie, Sqoop, Mahout, R, Greenplum, and Pig.
  • Team player and self-starter with excellent interpersonal and communication skills.
  • 11 months of experience in Greenplum DB, including writing functions in Greenplum DB.
  • Involved in various web application development projects using ASP.NET, C#, ADO.NET, XML, HTML, CSS, and SQL Server.
  • Good knowledge of object-oriented programming (OOP) and database concepts.
  • More than 2 years of offshore responsibility for project analysis, design, and development phases, plus 1 month of onsite product implementation experience (product setup and patch scripts to correct data mismatches).
  • Basic understanding of .NET Framework 4.0 features and Microsoft Visual Studio 2010.
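
Below is a minimal, word-count style sketch of that driver/mapper/reducer pattern. It is illustrative only: the class names, tokenization, and I/O paths are assumptions, not code from any of the projects described later.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Word-count style job: the mapper tokenizes each input line, the reducer sums counts per token.
    public class WordCountJob {

        public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);   // emit (token, 1) for every token in the line
                    }
                }
            }
        }

        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();                 // add up all counts emitted for this token
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Driver: wires the mapper, reducer, and I/O paths together and submits the job.
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word-count");
            job.setJarByClass(WordCountJob.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);   // reuse the reducer map-side to cut shuffle volume
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The combiner reuses the reducer because summing is associative, a common way to reduce shuffle traffic.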

TECHNICAL SKILLS:

Technologies: Cloudera 5.4, MapReduce, Spark, Scala, Sqoop, SQL, Perl, Kafka, Storm, Hive, Pig, Oozie, .NET Framework, ASP.NET, C#.NET, ADO.NET, jQuery, Java, JavaScript, CSS, XML and HTML

Database: SQL Server, GreenPlum, Oracle, PL/SQL

IDE: Eclipse, Visual Studio 2005, 2008

Operating System: Windows, Linux

Tools: pgAdmin III, Toad, HP Unix, Secure CRT, Crystal Reports

Web Server: Internet Information Services (IIS), Tomcat 1.7

Scripts: Unix shell scripts, Perl scripts

PROFESSIONAL EXPERIENCE:

Spark Developer

Confidential

Environment: SparkSQL 5.5, Scala 2.10, Hive, Oracle, SQL Developer, Notepad++, PuTTY, WinSCP, Sublime Text Editor and Beyond Compare

Responsibilities:

  • Involved in requirement analysis and high-level and low-level design.
  • Followed internal processes to carry out all deliveries as defined in the project plan.
  • Involved in preparation of the unit analysis test plan.
  • Involved in system testing and defect fixing.
  • Migration of code to the user environment.
  • Prepared the high-level design and low-level design for the project based on the use case and sequence diagrams.
  • Conducted quality reviews of design documents, code and analysis
  • Conducting review meetings
  • Onsite coordination
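
As an illustration of the Spark SQL and Hive stack listed in the environment above, here is a minimal Java sketch that reads a Hive table, aggregates it, and writes the result back. It assumes the Spark 1.x HiveContext API (consistent with a Scala 2.10-era Spark); the table and column names (trades, trade_date, notional, daily_summary) are hypothetical, not project objects.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.hive.HiveContext;

    // Reads a Hive table with Spark SQL, aggregates it, and writes the result back to Hive.
    // All table and column names are placeholders.
    public class DailySummaryJob {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("daily-summary");
            JavaSparkContext jsc = new JavaSparkContext(conf);
            HiveContext hive = new HiveContext(jsc.sc());   // Spark 1.x entry point for Hive tables

            DataFrame summary = hive.sql(
                "SELECT trade_date, COUNT(*) AS trade_count, SUM(notional) AS total_notional "
              + "FROM trades GROUP BY trade_date");

            // Persist as a managed Hive table so downstream reports can query it directly.
            summary.write().mode(SaveMode.Overwrite).saveAsTable("daily_summary");

            jsc.stop();
        }
    }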

Hadoop Developer

Confidential

Environment: MapReduce, Hive, HDFS, Sqoop, Oozie, Java, Unix, Maven, REST service (Spark used in a PoC)

Responsibilities:

  • Requirements gathering
  • PoCs and technology feasibility analysis
  • Supported the architect on technology selection and system flow.
  • Carried out system design
  • Technical guidance to team members
  • Code reviews
  • Development and unit testing
  • QA team coordination
  • Supported the project manager with time and resource estimates, team acquisition, and team development
  • Updated the project manager on status

Hadoop Developer

Confidential

Environment: Kafka, Storm, HBase, Hive, HDFS, Java, Sqoop

Responsibilities:

  • On-time delivery of projects for new RFPs
  • Created system architectures for new proposals
  • Deck preparation for RFPs
  • Comparative study of technologies
  • Training new joiners
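
As an illustration of the Kafka portion of the environment above, here is a minimal producer sketch using the kafka-clients API; the broker address, topic name, key, and payload are placeholders, not project values.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    // Publishes a single event to a Kafka topic; a Storm topology (or another consumer)
    // would read from the same topic downstream. Broker, topic, and payload are placeholders.
    public class EventPublisher {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            Producer<String, String> producer = new KafkaProducer<String, String>(props);
            try {
                producer.send(new ProducerRecord<String, String>("events", "device-42", "{\"status\":\"up\"}"));
            } finally {
                producer.close();   // flushes buffered records before exiting
            }
        }
    }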

Greenplum Developer

Confidential

Environment: PostgreSQL, Oracle, Perl scripts, Secure CRT, WinSCP, pgAdmin and Toad

Responsibilities:

  • Built a new application that validates submitted trades and generates reports for participants.
  • Developed transformations in the Greenplum database based on the source-to-target mapping documents.
  • Developed functions in Greenplum to generate reports (see the sketch after this list).
  • Followed internal processes to carry out all deliveries as defined in the project plan.
  • Preparation of test case and test script documents.
  • Functional documentation of the scripts.
  • Analysis of requirements to find any ambiguity, incompleteness or incorrectness
  • Involved in the documentation of design specifications and mapping documents.
  • Development of Greenplum transformations based on the source-to-target mapping.
  • Involved in guiding activities, code quality assurance, performance tuning, defect prevention, etc.
  • Prepared and executed unit test cases.
  • Involved in the end-to-end system integration process.
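
The report functions themselves lived inside Greenplum; the sketch referenced above shows one way such a function might be invoked from client code over JDBC. Greenplum is PostgreSQL-based, so the standard PostgreSQL driver and URL format apply; the connection details, the function name generate_participant_report, its parameter, and the result columns are all hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Invokes a (hypothetical) set-returning report function in Greenplum over JDBC.
    // Host, database, credentials, function name, and columns are all placeholders.
    public class ParticipantReport {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:postgresql://gp-master:5432/reports";
            try (Connection conn = DriverManager.getConnection(url, "report_user", "secret");
                 PreparedStatement stmt = conn.prepareStatement(
                     "SELECT participant_id, trade_count, status FROM generate_participant_report(?)")) {
                stmt.setDate(1, java.sql.Date.valueOf("2015-06-30"));   // report date parameter
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %d %s%n",
                            rs.getString("participant_id"),
                            rs.getLong("trade_count"),
                            rs.getString("status"));
                    }
                }
            }
        }
    }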

Greenplum Developer

Confidential

Environment: SQL, PostgreSQL, DataStage, Perl and shell, pgAdmin III, Toad, Secure CRT, WinSCP and Beyond Compare

Responsibilities:

  • Involved in requirement analysis and high-level and low-level design.
  • Developed functions in the Greenplum database using PostgreSQL for report generation.
  • Followed internal processes to carry out all deliveries as defined in the project plan.
  • Involved in preparation of the unit analysis test plan.
  • Involved in system testing and defect fixing.
  • Migration of code to the user environment.
  • Prepared the high-level design and low-level design for the project based on the use case and sequence diagrams.
  • Prepared the unit analysis plan and analysis data
  • Conducted quality reviews of design documents, code and analysis
  • Used a data loader for migration and loading of the data
  • Conducting review meetings
  • Onsite coordination

Hadoop Developer

Confidential

Environment: Hadoop Distributed File System (HDFS), Hadoop (CDH4), R, Mahout, Oozie, Spring

Responsibilities:

  • Involved in requirements gathering
  • Comparative analysis of inputs of different data sizes.
  • Generating a huge volume of CDR files.
  • Understanding and designing the application architecture.
  • Performance tuning of Hadoop jobs over the cluster (see the sketch below).
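
A small sketch of the kind of job-level settings such tuning usually touches (reducer parallelism, map-output compression, sort buffer size). The property names are standard Hadoop 2 / MapReduce settings; the specific values are placeholders rather than project configuration.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.SnappyCodec;
    import org.apache.hadoop.mapreduce.Job;

    // Illustrative tuning knobs for a MapReduce job; values are examples only.
    public class TunedJobFactory {
        public static Job newJob() throws Exception {
            Configuration conf = new Configuration();
            conf.setBoolean("mapreduce.map.output.compress", true);        // shrink shuffle traffic
            conf.setClass("mapreduce.map.output.compress.codec",
                          SnappyCodec.class, CompressionCodec.class);
            conf.setInt("mapreduce.task.io.sort.mb", 256);                  // larger map-side sort buffer

            Job job = Job.getInstance(conf, "tuned-job");
            job.setNumReduceTasks(20);                                      // match reducer count to cluster capacity
            return job;
        }
    }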

Hadoop Developer

Confidential

Environment: Hadoop Distributed File System (HDFS), Hadoop (CDH4), Pig

Responsibilities:

  • Involved in requirements gathering
  • Comparative analysis of inputs of different data sizes.
  • Generating a huge volume of CDR files.
  • Understanding and designing the application architecture.
  • Performance tuning of Hadoop jobs over the cluster.

Hadoop Developer

Confidential

Environment: Linux, Hadoop 1.0.4, Java 1.6, Eclipse

Responsibilities:

  • Involved in requirements gathering and analysis.
  • Understanding and designing the architecture.
  • Worked with HDFS (distributed file system) for storing the data.
  • Worked with Pig (dataflow language) for processing the data.
  • Worked with HBase (NoSQL database) for storing and retrieving the processed results (see the sketch below).
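
A minimal sketch of storing a processed result in HBase and reading it back, assuming the HBase 1.x client API; the table name, column family, qualifier, and row key are placeholders, not project schema.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    // Stores one processed result row in HBase and reads it back.
    public class ResultStore {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("processed_results"))) {

                Put put = new Put(Bytes.toBytes("record-001"));
                put.addColumn(Bytes.toBytes("r"), Bytes.toBytes("score"), Bytes.toBytes("0.87"));
                table.put(put);                                   // write the processed value

                Result result = table.get(new Get(Bytes.toBytes("record-001")));
                String score = Bytes.toString(result.getValue(Bytes.toBytes("r"), Bytes.toBytes("score")));
                System.out.println("score = " + score);           // read it back
            }
        }
    }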
