Big Data/Hadoop Lead Developer Resume

SUMMARY:

  • Over 15 years of experience in software systems development, integration and implementation of core business applications using Java and .Net technologies, spanning multiple platforms within the Mortgage, Retail and Financial Services domains.
  • Over 2 years of experience in the design, development and implementation of Big Data solutions at BoFA on Hadoop ecosystem computing infrastructure (Cloudera).
  • Hands-on experience in developing solutions to process large data sets using HDFS, MapReduce, Pig, Hive, HBase, Hadoop Streaming, Sqoop, Flume and Oozie.
  • Experienced in all aspects of data management, data architecture, data integration and implementation of Business Intelligence/ETL applications.
  • Experienced in the design, development and implementation of scalable application and data integration solutions based on SOA, using Java/.Net technologies.
  • Experienced in object-oriented analysis, design and programming in Core Java and .Net.
  • Experience in managing highly critical and complex projects as Tech Lead/Project Lead.
  • Experienced in managing client, vendor and other stakeholder expectations.
  • Experienced in managing onshore/offshore delivery teams on several projects in global and matrixed environments.
  • Working knowledge of and experience with Agile methodology and the Scrum process.
  • Quick to adopt new tools and technologies; problem solver and effective team player.

SKILLS:

Programming Languages: Java, C#.Net, JavaScript, SQL

ETL Tools: DataStage 8.5, Informatica

Big Data Analytics: Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Sqoop, Flume and Tableau

Databases: Oracle, DB2, MySQL, SQL Server and Teradata

Tools: TOAD, AutoSys, Splunk, AppWatch

Technologies: J2EE, .NET

Integration Technologies: IBM WebSphere MQ, Message Broker, TIBCO Business Works, SOA and Web Services

Operating Systems: Windows and Linux

Web/Application Servers: IIS, Tomcat, JBoss and WebSphere

Methodologies: Waterfall, Agile

EXPERIENCE:

Confidential

Big Data/Hadoop Lead Developer

Responsibilities:

  • Involved in the design and development of a data pipeline, storage and processing platform (Hadoop data warehouse on Cloudera) to capture and process very large mortgage-related data sets from multiple sources/channels within the mortgage domain and from external vendors, using Hadoop ecosystem technologies.
  • Involved in gathering and analyzing current and future ingestion requirements from each source system, creating design documents, reviewing data sources and data formats, and recommending processes for loading data into Hadoop.
  • Prototyped solutions on large data sets to improve data processing performance and reviewed deliverables with stakeholders.
  • Hands-on experience in implementing data acquisition, storage, transformation and analysis solutions using Hadoop ecosystem components: HDFS, MapReduce, Pig, Hive, Flume and Sqoop.
  • Wrote Pig Latin scripts for data transformations and for loading data into stage and production Hive tables; analyzed data using Pig, MapReduce, HiveQL and HCatalog.
  • Created Hive tables to store data in HDFS, processed data using HiveQL, and exported Hive tables to the relational ODS data store (Teradata) using Sqoop for visualization/reporting. Developed UDFs and used them in Hive queries (see the UDF sketch after this list).
  • Hands-on experience with sequence files, RC files, combiners, dynamic partitions and bucketing for best practices and performance improvement.
  • Used Flume to collect and aggregate weblog data from sources such as web servers and network devices and push it to HDFS.
  • Analyzed web log data using HiveQL and identified opportunities for improving service levels.
  • Supported ad-hoc queries across large data sets.
  • Wrote workflows using Oozie and scheduled them with AutoSys.
  • Tracked source files in HBase tables and ZooKeeper using the Java APIs (a sketch of the HBase tracking write appears after this list).
  • Used Ant to build the environment for each source and SVN as the code repository.
  • Created UNIX shell scripts to automate ETL and data transfer jobs.
  • Good understanding of Spark, Kafka, Storm, NoSQL and HBase.
  • Working knowledge of the Tableau data visualization tool; designed and published interactive Tableau workbooks and dashboards.
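
The Hive UDF work above can be illustrated with a minimal Java sketch. This is not code from the project; the class name MaskLoanNumber and the masking rule are hypothetical, chosen only to show the classic org.apache.hadoop.hive.ql.exec.UDF extension point.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Minimal Hive UDF sketch: masks all but the last four characters of a value.
// Class name and masking logic are illustrative, not taken from the original project.
public final class MaskLoanNumber extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                       // Hive passes NULLs through
        }
        String value = input.toString();
        if (value.length() <= 4) {
            return new Text(value);            // too short to mask
        }
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < value.length() - 4; i++) {
            out.append('X');                   // mask leading characters
        }
        out.append(value.substring(value.length() - 4));
        return new Text(out.toString());
    }
}
```

Once packaged into a JAR, a UDF like this would typically be registered from HiveQL with ADD JAR and CREATE TEMPORARY FUNCTION before being referenced in queries.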
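
Similarly, a minimal sketch of the HBase-based source file tracking, assuming a hypothetical tracking table named source_file_tracking with an info column family (the resume does not give the actual table design). It uses the HBase 1.x Connection/Table client API; older client versions used HTable instead.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Sketch of recording a source file's load status in an HBase tracking table.
// Table, column family and qualifier names are illustrative only.
public final class SourceFileTracker {

    public static void markProcessed(String fileName, String status) throws IOException {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("source_file_tracking"))) {
            Put put = new Put(Bytes.toBytes(fileName));      // row key = source file name
            put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("status"), Bytes.toBytes(status));
            put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("processed_at"),
                          Bytes.toBytes(Long.toString(System.currentTimeMillis())));
            table.put(put);
        }
    }
}
```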

Lead Integration/Data Architect

Confidential

Responsibilities:

  • Key player in the Countrywide and Confidential merger and integration effort. Worked in the systems and data integration space and provided thought leadership and direction for developing SOA-based solutions for integrating mortgage servicing and core banking systems.
  • Involved in the design, development and delivery of the i-Share application, which uses IBM DataStage 8.5 to transform and load data across multiple source and destination nodes. It is a middleware layer between the mortgage servicing system (IBM i-Series) and 200+ upstream/downstream applications within the mortgage domain, and is used to support business-critical functionality within the home loans department.
  • As architect and tech lead, designed and developed DataStage ETL PX jobs that extract data from heterogeneous source systems (databases, sequential files, MQ feeds, compressed files), transform it and load it into the data warehouses.
  • Designed and developed DataStage ETL PX jobs that extract data from sequential files in different formats, apply transformations to the incoming data (including joins with other sources for additional data and lookups for reference information) and store the result in the target i-Series database.
  • Designed and developed the outbound processing jobs in DataStage, which extract data from the i-Series database, apply transformations to the extracted data and send the final result to vendors via the FTP stage, while also storing the data in an Oracle database.
  • Architected data storage jobs (parallel jobs, sequence jobs and shared containers) to read input files and load data into customer databases, resulting in improved performance.
  • Responsible for making sure any gaps in quality or performance are quickly identified and addressed.
  • Implemented process automation through auto scheduler and shell scripts to execute ETL jobs.
  • Paired with developers when needed to help them catch up on business knowledge.
  • Involved in technical discussions during planning and design sessions as part of every sprint.
  • Created ETL process flows and low-level design documents, and provided detailed instructions to team members.
  • Designed and delivered time-critical integration solutions for Foreclosure, Bankruptcy, Fees, CASH, Loan Modifications and Case Management using the i-Share framework.
  • Designed, delivered and managed an enterprise-level event handling framework (publish/subscribe model) for document management using IBM middleware technologies (a rough publish/subscribe sketch follows this list).
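
To illustrate the publish/subscribe model behind this framework, here is a minimal JMS publisher sketch in Java. It is not the actual implementation; the JNDI names ("jms/ConnectionFactory", "jms/DocumentEvents") and the event payload are hypothetical, and the real framework was built on IBM middleware rather than this generic JMS setup.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;
import javax.naming.InitialContext;

// Minimal JMS publish/subscribe sketch: publishes a document event to a topic
// so that any number of subscribers receive it independently.
// JNDI names are illustrative only.
public final class DocumentEventPublisher {

    public static void publish(String eventXml) throws Exception {
        InitialContext jndi = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
        Topic topic = (Topic) jndi.lookup("jms/DocumentEvents");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(topic);
            TextMessage message = session.createTextMessage(eventXml);
            producer.send(message);            // delivered to every active subscriber on the topic
        } finally {
            connection.close();
        }
    }
}
```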

Environment: Java/J2EE, JDBC, JMS, Spring, Hibernate, IBM i-Series (AS/400), DB2, Oracle 11g, IBM DataStage v8.5, Teradata, IBM Message Broker v7.0, Ant, SVN, AutoSys, Linux shell scripts, SoapUI, Splunk, Hadoop, MapReduce, Pig, Hive, Flume, Sqoop.

Confidential

Tech Lead

Responsibilities:

  • Involved in the migration of real estate lending servicing/default systems onto a single common platform and led two integration projects as part of this migration effort.
  • Involved in the application design, development and implementation of Confidential's Debt Restructure Strategic Initiative project and successfully delivered ETL applications and dialer projects under this program.
  • Delivered Confidential's DRI vendor projects, which helped replace many cumbersome mainframe screens with web-based solutions. DRI is a default system used to service defaulted loans.

Environment: Servicing and Default Platforms (DRI, CITILINK, MortgageServ), Java, J2EE, SQL Server 2000, Informatica and Cognos.

Tech Lead

Confidential

Responsibilities:

  • Enabled a multi-brand property search feature for various channels of Confidential using search and booking services.
  • The search and booking services were developed using TIBCO Business Works and Java/J2EE.
  • Led the project lifecycle from start-up through deployment using standard development and delivery methodologies, managed a team of 15 onshore/offshore resources, and was involved in the design, development and implementation of the search and booking services project.

Environment: Java, JSP, Servlets, EJB, TIBCO Business Works, XML, Oracle 9i, SQL Server, LoadRunner

Confidential

Tech Lead

Responsibilities:

  • The application provides client account information to Confidential financial advisors and allows them to perform business transactions on OST for their clients and service customer accounts directly.
  • Responsibilities included interacting with business for requirements gathering, design, application development, testing and implementation.

Environment: Java, JSP, Servlets, EJB, TIBCO Business Works, XML, Oracle, IBM MQSeries 5.0, SQL Server

Confidential

Senior Systems Analyst

Responsibilities:

  • Led requirements gathering, analysis, design, development and delivery activities for the Confidential Corporate Information Portal.
  • The project goal was to automate corporate processes and provide centralized information access to all employees.
  • Implemented the Coreport portal within multiple operating companies of a large insurance holding company, Confidential.
  • The Coreport portal provides a single, open, strategic framework for the deployment, integration and management of all enterprise assets.

Environment: Java, JSP, HTML, JavaScript, SQL server
