
Software Engineer (Hadoop Developer) Resume


New York City, NY

EXPERIENCE SUMMARY:

  • Around 10 years of professional IT experience, including 2.5 years in the Hadoop/Big Data ecosystem.
  • Domains worked in include life and pension insurance and financial and banking applications.
  • Committed team player with strong analytical and problem-solving skills, a willingness to adapt quickly to new environments and technologies, dedication to successful project completion, and excellent communication and interpersonal skills.
  • 2.5 years of hands-on experience with Hadoop ecosystem technologies such as Pig, Hive, HBase, MapReduce, Oozie, Flume, Kafka and Sqoop.
  • Good understanding of Hadoop architecture and hands-on experience with Hadoop components such as JobTracker, TaskTracker, NameNode and DataNode, and with MapReduce concepts on the HDFS framework.
  • Handled importing and exporting data between HDFS and RDBMS using Sqoop.
  • Hands-on experience writing MapReduce programs in Java.
  • Experience with Apache Spark, using PySpark and Scala to process real-time data.
  • Imported and exported data into HDFS, Hive and HBase using Sqoop.
  • Hands-on experience with the Cloudera platform.
  • Hands-on experience installing Hadoop clusters.
  • Extracted data from databases with SQL and used Python for analytics and data cleaning (an illustrative sketch follows this list).
  • Analyzed data using Tableau's statistical features to develop trend analyses.
  • Consumed data from RESTful web services using the Spring framework.
  • Excellent knowledge of unit testing, regression testing, integration testing, user acceptance testing, production implementation and maintenance.
  • Hands-on experience with OS/390, z/OS, ISPF, COBOL, PL1, DB2, CICS, JCL, VSAM, SYNCSORT, Easytrieve, REXX, File Aid, Xpeditor, InterTest, IBM Debugger, Endevor, OPC, SPUFI, IBM utilities, RDz and the change configuration system.
  • Worked on annual statistics reporting used to generate detailed information about pension capital at Confidential Pension.
  • Brief exposure to Oracle 10g, TOAD, Informatica Power and Python.
  • Brief exposure to ETL transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Sorter, Sequence Generator and Normalizer.
  • Experience in development using J2EE technologies like JDBC, JSP, Spring.
  • Experience working with the Waterfall model and Agile methodology.
  • Performed impact analysis and provided solutions for users' change requests.
  • Carried out production implementation upon successful user acceptance testing.
  • Good experience in troubleshooting and system performance tuning.
  • Experience analyzing entire systems and their impact on other back-end and front-end systems.
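
An illustrative sketch of the SQL extraction and Python cleaning mentioned above; the SQLite file, table and column names are hypothetical, and pandas stands in for whichever analytics stack was actually used.

```python
# Hypothetical example: pull policy records with SQL, clean them in Python,
# and produce a small aggregate suitable for trend analysis.
import sqlite3
import pandas as pd

# Extract raw records from a relational source with SQL (hypothetical database).
conn = sqlite3.connect("policies.db")
df = pd.read_sql_query("SELECT policy_id, premium, region FROM policies", conn)
conn.close()

# Basic cleaning: remove duplicate policies, drop rows with no premium,
# and normalise the free-text region column.
df = df.drop_duplicates(subset="policy_id")
df = df.dropna(subset=["premium"])
df["region"] = df["region"].str.strip().str.upper()

# Aggregate that could feed a trend dashboard.
summary = df.groupby("region")["premium"].agg(["count", "mean"])
print(summary)
```

The resulting summary could then be written out (for example with summary.to_csv) as an extract for the Tableau trend dashboards mentioned above.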

TECHNICAL SKILLS:

Hadoop & Spark: HDFS, MapReduce v2.6.x, YARN, HBase v0.98.0, Pig 0.14.0, Pig Latin, Hive 0.14.0, Sqoop 1.4.4, Flume 1.4.0, Kafka 0.8.0, Impala, Oozie 4.0.1, Hue, ZooKeeper 3.4.6, Spark v1.4, Python API, Scala API.

Java & J2EE Technologies: Core Java, JSP, JDBC

IDE: Eclipse

Frameworks: MVC, Spring 3.x

Programming languages: COBOL, PL1, JCL, REXX, Java, Python, Linux shell scripts.

Databases: MySQL 5.0.x, DB2 v10

ETL tools: Informatica

BI tool: Tableau 9.0

Mainframe middleware: VSAM

Mainframe tools: Endevor, Xpeditor, InterTest, FileMaster, IBM Debugger, RDz

Mainframe Utilities: TSO, ISPF, Spufi, QMF, DataManager

PROFESSIONAL EXPERIENCE:

Confidential, New York City, NY

Software Engineer (Hadoop Developer)

Responsibilities:

  • Handled importing data from various sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Wrote MapReduce jobs for encryption and for converting text data into Avro format.
  • Implemented partitioning, dynamic partitioning and bucketing in Hive (a Hive partitioning sketch follows this list).
  • Created Hive tables and applied HiveQL on them for data validation.
  • Used Impala to pull data from Hive tables.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Extracted data from different databases and copied it into HDFS using Sqoop.
  • Wrote Sqoop incremental import jobs to move new/updated data from databases to HDFS.
  • Created an Oozie coordinator workflow to execute the Sqoop incremental job daily.
  • Loaded data from various sources into HDFS using Kafka.
  • Joined raw data with reference data using Pig scripts.
  • Developed custom MapReduce programs to analyze data and used Pig Latin to clean unwanted data.
  • Used different file formats such as text files, SequenceFiles and Avro.
  • Implemented Spark RDD transformations and actions to migrate MapReduce algorithms (an RDD migration sketch also follows this list).
  • Used ZooKeeper to coordinate servers in the cluster, maintain data consistency and monitor services.
  • Used the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Used Tableau to generate dashboards for product trend analysis.
  • Worked with clients on requirements based on their business needs.
  • Communicated deliverable status to users, stakeholders and the client, and drove periodic review meetings.
  • Completed tasks and the project on time and in line with quality goals.
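
A minimal sketch of the Hive partitioning and bucketing referenced above, driven from PySpark's HiveContext to stay consistent with the Spark 1.4/Hive 0.14 stack listed below; the table and column names are hypothetical, and whether bucketing is enforced on insert depends on Hive settings such as hive.enforce.bucketing.

```python
# Hypothetical example: create a partitioned, bucketed Hive table and load it
# with dynamic partitioning from an existing raw table.
# Requires a Hive-enabled Spark build.
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-partitioning-sketch")
hive = HiveContext(sc)

# Target table: one partition per product, bucketed by policy_id.
hive.sql("""
    CREATE TABLE IF NOT EXISTS policies_part (
        policy_id STRING,
        premium   DOUBLE
    )
    PARTITIONED BY (product STRING)
    CLUSTERED BY (policy_id) INTO 16 BUCKETS
    STORED AS ORC
""")

# Dynamic partitioning lets a single INSERT fan rows out to per-product partitions.
hive.sql("SET hive.exec.dynamic.partition=true")
hive.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
hive.sql("""
    INSERT OVERWRITE TABLE policies_part PARTITION (product)
    SELECT policy_id, premium, product FROM policies_raw
""")
```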
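
A small PySpark sketch of the MapReduce-to-RDD migration mentioned above: a map/filter/reduceByKey pipeline standing in for a mapper/reducer pair. The HDFS path and tab-separated record layout are hypothetical.

```python
# Hypothetical example: sum premiums per product, the RDD equivalent of a
# word-count-style MapReduce job.
from pyspark import SparkContext

sc = SparkContext(appName="rdd-migration-sketch")

lines = sc.textFile("hdfs:///data/policies/part-*")   # hypothetical HDFS path
records = lines.map(lambda line: line.split("\t"))    # map: parse each record
valid = records.filter(lambda f: len(f) >= 3)         # filter: drop malformed rows

premium_by_product = (
    valid.map(lambda f: (f[1], float(f[2])))           # emit (product, premium) pairs
         .reduceByKey(lambda a, b: a + b)              # reduce: sum per product
)

for product, total in premium_by_product.collect():
    print("{}\t{}".format(product, total))
```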

Technologies: Hadoop, HDFS, MR 2.5.x, Hive 0.14.0, Pig, Sqoop 1.4.4, HBase 0.98, Oozie 4.0.1, MySQL, PuTTY, Spark v1.4, Scala, Flume 1.4.0, Impala, ZooKeeper 3.4.6, Linux and shell scripting, Tableau 9.0.

Confidential

Software Engineer (Hadoop Developer)

Responsibilities:

  • Worked closely with business analysts to convert business requirements into technical requirements, and prepared low- and high-level documentation.
  • Collected log files and copied them into HDFS using Flume.
  • Wrote MapReduce code to turn unstructured data into structured data and to insert data into HBase from HDFS (two illustrative sketches follow this list).
  • Created integrations between Hive and HBase.
  • Created Hive tables, loaded them with data and wrote Hive queries that run internally as MapReduce jobs.
  • Handled importing data from various sources, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
  • Loaded data from the Linux file system into HDFS.
  • Joined raw data with reference data using Pig scripts.
  • Extracted data from different databases and copied it into HDFS using Sqoop.
  • Wrote Sqoop incremental import jobs to move new/updated data from databases to HDFS.
  • Created Tableau reports and dashboards for business users to show the number of policies falling under particular product categories.
  • Participated in the review process and, as a senior team member, helped new team members get up to speed on project assignments.
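
The unstructured-to-structured step above is described as Java MapReduce; the Hadoop-Streaming-style Python mapper below is an illustrative stand-in for the same idea, working on a hypothetical log layout.

```python
#!/usr/bin/env python
# Hypothetical example: parse semi-structured log lines from stdin and emit
# tab-separated records, as a Hadoop Streaming mapper would.
import re
import sys

# e.g. "2014-03-02 10:15:01 INFO policy=PX1234 product=pension premium=102.50"
LINE = re.compile(
    r"(?P<ts>\S+ \S+) \S+ "
    r"policy=(?P<policy>\S+) product=(?P<product>\S+) premium=(?P<premium>\S+)"
)

for raw in sys.stdin:
    m = LINE.search(raw)
    if not m:
        continue  # skip lines that do not parse
    # Key (policy id) first, so a downstream reducer could group by it.
    print("\t".join([m.group("policy"), m.group("ts"),
                     m.group("product"), m.group("premium")]))
```

It would be launched via the hadoop-streaming jar, typically with the number of reduce tasks set to zero when no aggregation is needed.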
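
For the HBase load mentioned in the same bullet, the sketch below uses the Thrift-based happybase client from Python rather than the Java MapReduce path the resume describes, purely for illustration; the Thrift host, table name and column family are hypothetical, and an HBase Thrift server must be running.

```python
# Hypothetical example: load structured, tab-separated records into an HBase
# table via happybase.
import happybase

connection = happybase.Connection("hbase-thrift-host")   # hypothetical Thrift host
table = connection.table("policies")                     # hypothetical table

with open("structured_policies.tsv") as fh:              # output of the mapper sketch
    for line in fh:
        policy_id, ts, product, premium = line.rstrip("\n").split("\t")
        table.put(policy_id.encode("utf-8"), {
            b"info:ts": ts.encode("utf-8"),
            b"info:product": product.encode("utf-8"),
            b"info:premium": premium.encode("utf-8"),
        })

connection.close()
```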

Technologies: Hadoop, HDFS, MR 2.3.x, Hive 0.12.0, Pig, Sqoop 1.4.1, HBase 0.98.0, DB2 v8, PuTTY, ZooKeeper 3.4.5, UNIX and shell scripting, Tableau 8.0.

Confidential

Software Engineer (Java - Mainframes)

Responsibilities:

  • Analyzed and understood the requirements and functional specifications from the client.
  • Prepared technical specifications based on the existing functionality and requirements, taking care to re-use most of the existing components/modules.
  • Estimated tasks once the high-level solution had been approved by the client.
  • Analyzed requirement changes to identify the affected components.
  • Implemented services using Core Java.
  • Developed and deployed the UI-layer logic of sites using JSP.
  • Used Struts MVC to implement business model logic.
  • Worked with Struts MVC objects such as Controllers, Validators, Web Application Context, Handler Mapping and Message Resource Bundles for look-up of J2EE components.
  • Business logic was implemented using COBOL and PL1 language. DB2 was used for data storage and retrieval.
  • Worked on screen changes using the Gemini and Netsyr tools in the EDI and DLIPS web applications; Gemini and Netsyr are built on top of HTML and JavaScript.
  • Worked on CICS screen maintenance to implement business changes.
  • Performed debugging using the IBM Debugger tool to understand and fix bugs.
  • Performed system testing, using the QC tool to track defects.
  • The change configuration management tool was used for version control.
  • FTP jobs were used to send reports to the client mailbox.
  • Participated in the review process and, as a senior team member, helped new team members get up to speed on project assignments.

Technologies: Java, Spring, Eclipse, Web Services, DB2, COBOL, PL1, JCL, CICS.

Confidential, Franklin Lakes, NJ

System Engineer

Responsibilities:

  • Analyzed functional specifications, identified the affected programs and ensured a consistent implementation.
  • Prepared technical specifications based on the existing functionality and requirements.
  • Developed programs and jobs using JCL, COBOL, DB2, CICS and REXX.
  • Used the Xpeditor tool for debugging and to understand program flow.
  • Created detailed technical design specifications for enhancing the batch programs, taking care to re-use most of the existing components/modules.
  • Responsible for correct versioning of code by creating and moving the package using Endevor.
  • Involved in preparing test plans for unit and system testing.
  • Followed coding standards to ensure code consistency.

Technologies: COBOL, REXX, JCL, DB2, CICS.

Confidential, Mellon, NYC

Mainframe Programmer

Responsibilities:

  • Coordinated with management to deliver tasks on time and with good quality.
  • Involved in the development and enhancement of applications using COBOL, JCL, VSAM and DB2.
  • Involved in production support activities, ensuring the batch cycle completed on time.
  • Fixed issues within the times specified in the Service Level Agreement (SLA).
  • Fixed abends such as space abends, file contention errors, VSAM space abends and DB2 abends.
  • Monitored the batch cycles.
  • As a value-add, created REXX tools to make routine tasks easier and faster.

Technologies: COBOL, REXX, JCL, DB2, VSAM.
