
Big Data / Hadoop Developer Resume

PROFILE:

  • Dedicated and innovative professional with relevant education, practicum, and experience, prepared to contribute to long-range operational objectives in a Big Data Developer / Data Engineer role.
  • Software Solutions: Proven success in the design of challenging, enriching, and innovative applications addressing enterprise and end user needs.
  • Track record of matching business objectives to current and emerging technologies.
  • Project Participation: Talent for collaborating and building relationships with team members at every level. Outstanding communication skills.
  • Exceptionally organized and able to multitask. Proactive, self-directed, focused team member with dedication and passion to achieve all project goals.
  • Key Strengths: Excel at interfacing with customers, engineers, and management.
  • Comfortable working closely with analysts, designers, and staff. Able to convey complex technologies to audiences at a variety of skill levels.
  • Talent for quickly learning new information, procedures, and technologies. Bilingual; fluent in Turkish and progressing in English.

TECHNICAL SKILLS:

Core Technologies & Skills: Hadoop, Spark, Scala, MapReduce, Pig, Hive, Sqoop, Flume, Data Warehousing, ODI (ETL Tool), Business Objects, Oracle 9i/10g/11g/12c, Sybase IQ, MS SQL, SQL, PL-SQL, C#, Microsoft Visual Studio, .NET framework, HP Quality Center, HP Quick Test Professional, HP Project and Portfolio Management, HP LoadRunner, Compuware FileAid

EXPERIENCE HIGHLIGHTS:

Confidential

Big Data / Hadoop Developer

Technologies: Hadoop, YARN, MapReduce, Pig, Hive, Sqoop, Flume, Spark, Scala, Kafka, Zookeeper, MongoDB and HBase

Responsibilities:

  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, Spark, HDFS, Pig, Hive, Sqoop, Flume, and MongoDB.
  • Developed MapReduce and Scala programs to parse the raw data.
  • Implemented Spark jobs in Scala, utilizing DataFrames and Spark SQL for faster data processing.
  • Used Sqoop to transfer data between RDBMS and Hadoop Distributed File System.
  • Loaded data from different sources into HDFS using Flume.
  • Used Pig and Hive to analyze data.
  • Working knowledge of MongoDB, HBase, Kafka, and Zookeeper.
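The raw-data parsing work described above can be illustrated with a minimal Scala sketch. The record layout (`userId|eventType|timestamp`), the `Event` case class, and the function names are hypothetical stand-ins, not details from an actual project; in a MapReduce or Spark job, a function like `parseEvent` would run once per input line before any aggregation.

```scala
// Minimal sketch of raw-data parsing in Scala.
// Hypothetical line format: "userId|eventType|timestamp".
case class Event(userId: String, eventType: String, timestamp: Long)

object RawDataParser {
  // Returns None for malformed lines so they can be filtered out early,
  // which is the usual first step before handing records to Spark/MapReduce.
  def parseEvent(line: String): Option[Event] =
    line.split('|') match {
      case Array(user, kind, ts) if ts.nonEmpty && ts.forall(_.isDigit) =>
        Some(Event(user.trim, kind.trim, ts.toLong))
      case _ => None
    }

  def main(args: Array[String]): Unit = {
    val raw = Seq("u1|click|1700000000", "garbage line", "u2|view|1700000005")
    // flatMap drops the None results, keeping only well-formed events.
    val events = raw.flatMap(parseEvent)
    events.foreach(println)
  }
}
```

Keeping the parser a pure function of one line makes it trivial to unit-test and to reuse unchanged inside an RDD `flatMap` or a DataFrame-producing pipeline.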

Confidential

ETL Developer

Technologies: Oracle 11g, Sybase IQ 15.4, Oracle Data Integrator (ODI), SAP Business Objects (BO) 4.1

Responsibilities:

  • Coordinated data warehouse modeling, design, and development; utilized the ETL tool ODI for data extraction, transformation, and loading.
  • Performed ETL monitoring, performance tuning, and maintenance; developed SAP BO universes and published Web Intelligence reports.
  • Improved the ETL critical path and conducted data quality studies.

C# Software Developer

Confidential

Technologies: C#, Oracle, SQL, PL-SQL, CDM, CMD, Smart Message

Responsibilities:

  • Served as dedicated C# software developer for an in-house Customer Relationship Management (CRM) project, covering: integration of Pega CRM, Chordiant Marketing Director (CMD), and Chordiant Decision Management (CDM) with banking systems; distribution of CRM campaign data to channels so clients are reached correctly and on time; a real-time decision system; campaign, product, and sales-opportunity management; portfolio management; 360-degree customer display screens in the banking application based on personal/commercial/corporate segments; CRM integration with alternative distribution channels (Internet and mobile banking, e-mail, SMS, ATM); a channel-based CRM management system; marketing by customer permission; general and segment-based customer notifications; and user calendar, alarm, and screen-alert systems.
  • Developed personal, commercial, and corporate process screens, PL-SQL database jobs for the CRM data mart, and implementations of third-party tools such as CDM, CMD, and Smart Message.
  • Integrated CRM tool with banking systems: Core Banking, Internet Banking, Mobile Banking, and Call Center.
