Big Data Developer/Team Lead Resume

Bentonville, AR

SUMMARY:

  • Over 10 years of experience in the full software development life cycle, including requirements definition, prototyping, design, implementation (coding), testing, maintenance, and documentation.
  • 4+ years of experience in Big Data analytics as a Hadoop Developer.
  • Experienced with major Hadoop ecosystem projects such as Pig, Hive, and HBase, as well as GreenPlum and Teradata, and with monitoring them through Cloudera Manager.
  • Hands-on experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, Pig, Hive, and GreenPlum.
  • Experienced in developing applications using HTML, DHTML, JavaScript, CSS, ColdFusion 5/MX, Java, JSP, and Servlets.
  • Experience in designing ER diagrams and developing SQL, stored procedures, and triggers with RDBMS packages such as SQL Server 2000/2005, MS Access, and Oracle.
  • Experience with editors such as Dreamweaver, HomeSite, and Eclipse.
  • Extensive experience developing Pig Latin scripts and using Hive Query Language for data analytics.
  • Hands-on experience with NoSQL databases, including HBase and its integration with a Hadoop cluster; also worked with mainframe systems and the CA7 scheduler.
  • Good knowledge of database connectivity (JDBC) for databases such as Oracle, SQL Server, and MySQL.
  • Extensive experience with both MapReduce MRv1 and MapReduce MRv2 (YARN).
  • Extensive experience with HDFS, Pig, Hive, Zookeeper, UNIX, and HBase.
  • Experience in collecting business requirements and writing functional requirements and test case documents.
  • Created technical design documents with UML: use case, class, sequence, and collaboration diagrams.
  • Well versed in object-oriented programming and the software development life cycle, from project definition to post-deployment.
  • Dedicated, hard-working individual with the ability to solve any type of assigned problem.
  • Excellent written and verbal communication skills; experienced in interacting with clients/users to gather requirements.
  • Works well individually and under pressure; self-motivated fast learner, consistently responsible and deadline-oriented.
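
As an illustration of the Pig Latin scripting mentioned above, a minimal analytics script might look like the following sketch. The file paths and field names are hypothetical, and the script is emitted by a shell wrapper so it can be inspected without a cluster:

```shell
#!/bin/sh
# Illustrative sketch only: a tiny Pig Latin analytics script of the
# kind used for data analysis, held in a shell variable. File paths
# and field names are assumptions, not from any actual project.

pig_script="logs = LOAD '/data/raw/logs' USING PigStorage('\t')
            AS (user:chararray, action:chararray);
grouped = GROUP logs BY action;
counts  = FOREACH grouped GENERATE group, COUNT(logs);
STORE counts INTO '/data/out/action_counts';"

# Print the script; actually running it would require `pig -f`
# against a Hadoop cluster.
printf '%s\n' "$pig_script"
```

Running the real thing would be a matter of saving the script to a file and submitting it with `pig -f`; the dry-run form here keeps the example self-contained.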

TECHNICAL SKILLS:

RDBMS/DBMS: MS SQL Server 2008R2/2008/2005, Oracle 11g/10g/9i/8i, MS Access, MS Excel.

Programming Languages: Hive, Pig, T-SQL, HTML, DHTML, Visual Basic, AJAX, ColdFusion, Java, SQL, PL/SQL, and Scala.

Software/Databases: MS SQL Server 2008/2005 (DTS), Oracle 8, ODBC, OLTP, OLAP, MS SQL Server 2000 Enterprise Manager, SQL Query Analyzer, Web Services, SQL Profiler, MySQL.

Operating Systems: Windows XP/Vista/7, Windows 2000, Windows 2003/2008/2012 Enterprise Server, UNIX. 

Tools: Business Objects, Crystal Reports, SAS, Erwin Data Modeler.

PROFESSIONAL EXPERIENCE:

Confidential, Bentonville, AR

Big Data Developer/Team Lead

Responsibilities: 

  • As a Hadoop Developer, responsible for designing, developing, testing, tuning, and building a large-scale data processing system for data ingestion and data products that allow the client to improve the quality, velocity, and monetization of enterprise-level data assets for both operational applications and analytical needs.
  • Setting up the Hadoop landing zone and security through Kerberos; created the process to load mainframe data into Hadoop and Hive tables.
  • Troubleshooting and developing on Hadoop technologies including HDFS, Hive, and Sqoop.
  • Working with key business stakeholders to understand use-case requirements for data analysis.
  • Loading events XML data into Hive using hivexmlserde: parsing the XML, loading the data into three different managed/external Hive tables, and, based on business requirements, exporting previous-load (partition) events for specific criteria.
  • Using Scala functional programming.
  • Creating Hive partitioned managed tables for each incremental load; implemented partitioning, dynamic partitions, and buckets in Hive.
  • Extracting weather forecast data using Sqoop to load into Hive tables.
  • Developing a generic Sqoop export utility to export data from Hive to different types of RDBMS, such as Teradata, DB2, PostgreSQL (GreenPlum), and SAP HANA.
  • Developing a generic Sqoop import utility to load data from various RDBMS sources, such as Teradata, DB2, and GreenPlum.
  • Importing Walmart.com data from Teradata to Hadoop using Teradata Parallel Transporter (TPT) and TDCH utilities; experience optimizing the CPU, core, memory, and disk utilization of applications that run on Hadoop clusters.
  • Developing Hive UDFs and .hql scripts that take dynamic parameters via hivevar.
  • Writing UNIX shell scripts that use SFTP to load data from external sources to a UNIX box and then into HDFS.
  • Writing UNIX scripts to load data from the Greenplum temp schema into production schemas.
  • Using the GPLoad utility to load data from Teradata, and from external files supplied by third-party vendors, into the Greenplum temp schema.
  • Using Zookeeper for various types of centralized configuration.
  • Optimizing the aggregation process with Spark; working experience with JSON-format files and REST web services.
  • Working on data validation and post-deployment support.
  • Migrating processes and data from Pivotal Hadoop to HDP 2.3.
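
A generic Sqoop export utility of the kind described above might be sketched as a shell wrapper along these lines. The function name, JDBC URL, and table names are illustrative assumptions, not the actual utility, and the sketch only prints the command (a dry run) rather than executing against a live cluster:

```shell
#!/bin/sh
# Hedged sketch of a generic Sqoop export wrapper: assembles a
# `sqoop export` command for a chosen target RDBMS and prints it.
# All names here (function, hosts, tables) are illustrative.

build_sqoop_export() {
  db_type="$1"      # teradata | db2 | greenplum
  jdbc_url="$2"     # JDBC connection string for the target
  table="$3"        # target table name
  export_dir="$4"   # HDFS directory backing the Hive table

  # Reject targets the utility does not know about.
  case "$db_type" in
    teradata|db2|greenplum) ;;
    *) echo "unsupported db type: $db_type" >&2; return 1 ;;
  esac

  cmd="sqoop export --connect $jdbc_url --table $table"
  cmd="$cmd --export-dir $export_dir --num-mappers 4"
  printf '%s\n' "$cmd"
}

# Dry run: print the command that would be submitted.
build_sqoop_export greenplum \
  "jdbc:postgresql://gp-host:5432/edw" sales_daily \
  /apps/hive/warehouse/edw.db/sales_daily
```

In a real utility, the printed command would be executed, and per-database options (connector jars, delimiter flags, and the like) would be appended inside the case branch for each target.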

Technology: Hadoop, Hive, Sqoop, Erwin, Zookeeper, MapReduce, HDFS, Scala, GreenPlum, Teradata, UNIX, mainframe and CA7, GitHub, Jira.

Confidential, Rockville, Maryland

Big Data Developer

Responsibilities: 

  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Developed Sqoop scripts to move data between HBase and a MySQL database.
  • Developed multiple MapReduce jobs in Java and defined their flow.
  • Analyzed data using HiveQL, Pig Latin, and MapReduce programs in Java.
  • Created Hive tables to store the processed results in tabular format.
  • Developed Hive Query Language queries and Pig Latin scripts.
  • Developed UNIX shell scripts for creating reports from Hive data.
  • Managed and reviewed Hadoop log files.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Responsible for writing queries to analyze data in the Hive warehouse using Hive Query Language (HQL).
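
A UNIX shell reporting script of the kind mentioned above could look roughly like this sketch. The database, table, and column names are assumptions for illustration, and by default the script only prints the query so it can run without a Hive cluster:

```shell
#!/bin/sh
# Hypothetical sketch of a shell script that builds a daily report
# query over a Hive warehouse table. Table and column names are
# illustrative assumptions; the real reports were project-specific.

report_date="${1:-2015-01-01}"

# Compose the HiveQL; quoting the date keeps the shell from
# word-splitting it into the query.
hql="SELECT event_type, COUNT(*) AS cnt
FROM edw.refined_events
WHERE load_date = '${report_date}'
GROUP BY event_type"

# DRY_RUN=1 (the default here) prints the query instead of running
# it, so the script can be inspected without a Hive installation.
if [ "${DRY_RUN:-1}" = "1" ]; then
  printf '%s\n' "$hql"
else
  hive -e "$hql" > "report_${report_date}.tsv"
fi
```

Invoked as `DRY_RUN=0 ./report.sh 2015-01-02`, the script would submit the query via `hive -e` and write the result to a dated TSV file.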

Technology: Hadoop, Hive, Zookeeper, MapReduce, Java, Pig 0.10, JDK 1.6, HDFS, HBase, PL/SQL, UNIX, SAS, and SQL.

Confidential, LA

ColdFusion Team Lead

Responsibilities:

  • Involved in the development, maintenance, and enhancement of the application.
  • Involved in software life cycle phases: analysis, design, coding, implementation, and testing.
  • Developed queries, stored procedures, and dynamic PL/SQL scripts to make the existing pages data-driven.
  • Responsible for coding and unit testing; involved in testing the application and fixing bugs.
  • Participated in user meetings to gather requirements.
  • Built data-driven pages using cfquery, cfstoredproc, and cfgrid.
  • Worked with ColdFusion Components (CFCs) and CF Report Builder; handled data CRs.
  • Served as the onsite coordination lead heading the onsite/offshore team.
  • Worked on weekly releases; provided 24/7 pager support and batch-process job support.
  • Worked on production support issues.

Technology: ColdFusion 9/10, PL/SQL, Oracle 11g, Crystal Reports, HTML5, XHTML, JavaScript, CSS, AJAX, Eclipse, Ext JS (Sencha), SVN, Mercury QC, Windows XP, SharePoint, SQL Server 2008.

Confidential, VA

ColdFusion/SQL Developer

Responsibilities:

  • Involved in the development, maintenance, and enhancement of the application in ColdFusion using Agile methodology.
  • Involved in building partner sites giving a particular client's users access; involved in fixing production tickets; worked with reporting tools such as SSIS and SSRS.
  • Involved in testing the application and fixing bugs.
  • Generated reports for the client based on the shipping/tracking of packages.
  • Redesigned the website from the job level to the package level.
  • Responsible for development and maintenance of the intranet website.
  • Responsible for creating a new website in ColdFusion where information about each package and client can be tracked.
  • Worked on production support issues.

Technology: ColdFusion MX, PL/SQL, SQL Server 2005/2008, HTML, JavaScript, CSS, Dreamweaver, VSS, Eclipse, Windows XP, SSIS, SSRS, Flash, AJAX, Drupal, DB2.
