
Big Data Tech Lead Resume


Tampa, FL

SUMMARY

  • 15 years of overall experience in IT with over 6 years of rich development experience in Big Data projects
  • Strong experience in analyzing high volumes of data with skills in Hive, Spark, Java, Oozie, Shell Scripting, MapReduce, Pig and HBase
  • Experience in designing and developing multi-layer ETL data transformation pipelines using Hive, Spark, Java, Oozie and Shell Scripting
  • Hands-on experience in migrating different types of data sources into Hadoop data lake for analysis; handled Mainframes, Teradata, Oracle and JSON data sources
  • Developed reusable Java utilities applicable to multiple use cases across the Hadoop ecosystem
  • Strong experience in designing and migrating legacy RDBMS applications into Hadoop
  • Good exposure to Sqoop, Shell Scripting, Oozie and Autosys
  • Excellent skills in developing transformation logic and writing complex Hive queries
  • Strong in-depth knowledge of Hadoop ecosystem tools, MapReduce architecture and performance tuning
  • Rich experience in multiple domains including Banking, Confidential, Confidential and Manufacturing
  • Worked for clients: Confidential, Confidential, Confidential, World Bank, Travelers, Confidential and Confidential
  • Over 8 years of experience in development and maintenance of Java-enabled Lotus Notes applications using Lotus Notes, Java, Ajax, JavaScript, HTML, Lotus Script and Formula Language
  • Confidential Certified Associate Application Developer - Lotus Notes & Domino 8.5, with experience in designing XPages-based and Java/SQL-enabled Domino applications
  • Possess good communication skills and strong interpersonal skills

TECHNICAL SKILLS

  • Hadoop Ecosystem: Hive, MapReduce, Pig, Spark, HBase, Oozie, Flume and Sqoop
  • Languages: Java, Scala, Lotus Script and Formula Language (R5.x, R6.x and R7.x), JavaScript, XML, XSL Transformations, HTML & CSS, Unix Shell Scripting and XPages (R8.5.x)
  • DBMS: SQL Server
  • Operating Systems: Linux, Windows
  • Tools: Attunity, Eclipse, SQL Server Management Studio, AccuRev

PROFESSIONAL EXPERIENCE

Confidential, Tampa, FL

Big Data Tech Lead

Responsibilities:

  • Guiding and supporting team members and mentoring them on project architecture and low-level design
  • Understanding the current legacy platform by interacting with Business Analysts and converting the AML rules into technical designs
  • Designing the technology migration strategy based on the current EAP architecture adopted in the Confidential COE
  • Developing and customizing the HiveQL queries that apply complex transformation formulae to the transaction data at each layer of the system
  • Developing and customizing the ETL framework in Spark and Java (a minimal transformation sketch follows this list)
  • Designing the job sequence and automation strategy, and preparing and monitoring the Autosys and Oozie jobs to ensure the data transformations run as expected
  • Analyzing and validating the results to ensure the quality of the output at each layer meets the business requirements
  • Ensuring the technical artifacts are maintained and appropriately versioned in SVN and RTC
  • Performance tuning of the transformation monitoring system
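
For illustration, a minimal sketch of one layer of such a Spark/Java transformation over Hive-backed tables. The table and column names (txn_raw, txn_curated, txn_amt, txn_ts, txn_dt) are placeholders, not the actual project schema:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;

// Minimal sketch of one layer of a Hive-backed transformation in Spark/Java.
// All table and column names are illustrative placeholders.
public class LayerTransform {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("LayerTransform")
                .enableHiveSupport()   // read/write Hive tables
                .getOrCreate();

        // Read the raw layer, apply transformations, write the curated layer
        Dataset<Row> raw = spark.table("txn_raw");
        Dataset<Row> curated = raw
                .filter(col("txn_amt").gt(0))                   // drop invalid rows
                .withColumn("txn_dt", to_date(col("txn_ts")));  // derive partition column

        curated.write().mode("overwrite")
               .partitionBy("txn_dt")
               .saveAsTable("txn_curated");

        spark.stop();
    }
}
```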

Confidential, Dearborn, MI

Senior Hadoop Developer

Responsibilities:

  • Developing and customizing the Data Transformation Standards Framework (DTSF), an in-house framework developed in Java
  • Performing the delta and dedup dataset operations using DTSF, Spark DataFrames, Hive and HBase (see the sketch after this list)
  • Developing corresponding test methods for DTSF using JUnit
  • Developing reusable Java utilities for the Hadoop ecosystem, such as the Ranger Backup Utility (technologies used: Java, REST API, JSON, Apache Ranger)
  • The Ranger Backup Utility calls the Ranger REST APIs to back up Ranger policies (as JSON) at regular intervals; after a disaster, it exports the saved policies back into the DR environment (a hedged sketch also follows this list)
  • Writing and maintaining Pig scripts for ad hoc load and transformation requests
  • Writing and maintaining Hive queries to validate the datasets
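
A minimal sketch of the delta and dedup operations using the Spark DataFrame API. DTSF itself is in-house, so only the underlying Spark calls are shown; the table and key names (txn_incoming, txn_current, acct_id, txn_id) are placeholders:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Sketch of dedup and delta over Spark DataFrames with placeholder names.
public class DeltaDedup {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("DeltaDedup")
                .enableHiveSupport()   // source/target are Hive tables
                .getOrCreate();

        Dataset<Row> incoming = spark.table("txn_incoming");

        // Dedup: keep one row per business key (an arbitrary row is
        // retained per key unless the input is pre-ordered)
        Dataset<Row> deduped = incoming.dropDuplicates("acct_id", "txn_id");

        // Delta: rows in the deduped feed not yet in the current snapshot
        Dataset<Row> current = spark.table("txn_current");
        Dataset<Row> delta = deduped.join(current,
                deduped.col("txn_id").equalTo(current.col("txn_id")),
                "left_anti");

        delta.write().mode("append").saveAsTable("txn_current");
        spark.stop();
    }
}
```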
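A hedged sketch of the Ranger backup loop described above, using the Java 11 HTTP client. The Ranger endpoint path, credentials, host and interval are assumptions that vary by Ranger version and deployment:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Base64;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of a scheduled Ranger policy backup. Endpoint, host and
// credentials are placeholders; check the Ranger REST docs for your version.
public class RangerBackup {
    private static final String RANGER_URL = "http://ranger-host:6080"; // placeholder
    private static final String AUTH = Base64.getEncoder()
            .encodeToString("admin:password".getBytes(StandardCharsets.UTF_8)); // placeholder creds

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Back up every hour; the interval is illustrative
        scheduler.scheduleAtFixedRate(RangerBackup::backupPolicies, 0, 1, TimeUnit.HOURS);
    }

    static void backupPolicies() {
        try {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(RANGER_URL + "/service/public/v2/api/policy"))
                    .header("Authorization", "Basic " + AUTH)
                    .header("Accept", "application/json")
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // Write the JSON snapshot with a timestamped name for DR restore
            String stamp = LocalDateTime.now()
                    .format(DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss"));
            Files.writeString(Path.of("ranger-policies-" + stamp + ".json"),
                    response.body());
        } catch (Exception e) {
            e.printStackTrace(); // keep the scheduled loop alive on failure
        }
    }
}
```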

Confidential, St Paul, MN

Hadoop Developer

Responsibilities:

  • Ingesting, transforming and analyzing business-critical data sources using the Hadoop stack
  • Handled multiple source types including mainframes, RDBMS, JSON and CSV/flat files
  • Handled mainframe data sources arriving in EBCDIC format
  • Developed MapReduce programs to analyze the data (a minimal sketch follows this list)
  • Creating and maintaining Pig Scripts to Transform, Ingest and Analyze the data sources
  • Writing Pig UDFs to implement custom business logic (see the UDF sketch after this list)
  • Creating and maintaining Hive queries for analysis (used the COBOL and JSON SerDes)
  • Performance tuning by creating ORC tables and partitioning the data
  • Used HBase as back-end for data storage and retrieval
  • Creating and maintaining Hive tables built on top of HBase tables that contain audit and product-related data
  • Writing and maintaining Shell Scripts
  • Preparing Oozie workflows and scheduling Oozie jobs using Oozie coordinator
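
As an illustration of such an analysis job, a self-contained MapReduce sketch that counts records per product code. The CSV input layout and column position are assumptions:

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Counts records per product code from CSV input.
// The product-code column index is an assumption.
public class ProductCount {

    public static class ProductMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text product = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 1) {
                product.set(fields[1]);   // assumed product-code column
                ctx.write(product, ONE);
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context ctx)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) sum += v.get();
            ctx.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "product-count");
        job.setJarByClass(ProductCount.class);
        job.setMapperClass(ProductMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```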
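A minimal Pig UDF sketch; the business rule shown (masking an account number) is illustrative only. In Pig Latin the jar would be registered with REGISTER and the function invoked like any built-in:

```java
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Minimal Pig EvalFunc UDF applying a custom business rule.
// The masking rule itself is an illustrative placeholder.
public class MaskAccount extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // propagate nulls rather than failing the job
        }
        String acct = input.get(0).toString();
        // Keep the last four characters, mask the rest
        int keep = Math.min(4, acct.length());
        return "****" + acct.substring(acct.length() - keep);
    }
}
```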

Confidential, St Paul, MN

Hadoop Developer

Responsibilities:

  • Transforming and ingesting data sources into the Hadoop data lake
  • Handled multiple source types including Lotus Notes logs, RDBMS, JSON and CSV/flat files
  • Developing MapReduce programs to analyze the data
  • Creating and maintaining Pig scripts to transform and load the data sources into HBase/HDFS
  • Writing Pig UDFs and creating and maintaining Hive tables
  • Used the JSON SerDe and maintained the data in ORC format
  • Creating Shell Scripts
  • Preparing Oozie workflows and scheduling Oozie jobs using the Oozie coordinator (a client-API sketch follows this list)
  • Preparing Technical Design Documents and writing Unit Test cases
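
A hedged sketch of coordinator submission through the Oozie Java client API. The Oozie URL, HDFS paths and property values are placeholders; in practice these typically live in a job.properties file:

```java
import java.util.Properties;
import org.apache.oozie.client.OozieClient;

// Submits and starts a coordinator via the Oozie client API.
// All URLs, paths and property values are illustrative placeholders.
public class SubmitCoordinator {
    public static void main(String[] args) throws Exception {
        OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");

        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.COORDINATOR_APP_PATH,
                "hdfs://namenode/apps/ingest/coordinator.xml");
        conf.setProperty("nameNode", "hdfs://namenode");
        conf.setProperty("jobTracker", "resourcemanager:8032");
        conf.setProperty("user.name", "etl");

        // run() submits and starts the coordinator; materialized actions
        // then trigger the workflow on the schedule in coordinator.xml
        String jobId = oozie.run(conf);
        System.out.println("Coordinator job id: " + jobId);
    }
}
```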

Confidential, Washington, D.C.

Senior Developer

Responsibilities:

  • Client Description: Confidential is an international financial institution that provides loans to countries for capital programs
  • Skillset: Lotus Notes & Domino R8.5.3, Java, XPages, Lotus Script, Formula Language, HTML, JavaScript, XML
  • Prepare and evaluate the design of ICRA
  • Develop, unit test and deliver the application
  • Coordinate between offshore and onsite team members to ensure effective delivery and that the client's requirements are met
  • Provide application support for ICRA during the warranty period

Confidential

Senior Developer

Responsibilities:

  • Designed, developed and maintained Java-enabled Lotus Notes applications in the role of an offshore developer

Skillset: Lotus Notes & Domino R7, Java, Lotus Script, Formula Language, HTML, JavaScript, XML, LEI

Confidential

Developer

Responsibilities:

  • Developed, customized and maintained Announce.IT and Xplain.IT in the role of an offshore developer

Skillset: Lotus Notes & Domino 6.5, JavaScript, Lotus Script, Formula Language, HTML, XML, XSL Transformations, AJAX

Confidential

Developer

Responsibilities:

  • Developed RMS, a purely web-based workflow application that automates the product release lifecycle within the organization

Skillset: Lotus Notes & Domino 6.5, Java, Lotus Script, Formula Language, HTML, JavaScript and AJAX
