Hadoop Architect/Developer Resume
SUMMARY
- 12 years of experience in enterprise application development, web applications, client-server technologies, and web programming.
- More than 5 years of experience in the design, development, analysis, maintenance, and support of Big Data analytics using enterprise Hadoop distributions.
- Worked as a Big Data architect to design Big Data/Hadoop solutions, validate compatibility with third-party vendor products, and design the business use cases.
- Involved in designing, building, and establishing an enterprise Big Data hub across five container database channels to preserve decades of historical and valuable data for long-term retention.
- Extensively involved in building enterprise-level cluster setups for both Hortonworks and Cloudera.
- Extensively involved in planning and executing POCs to determine how Big Data/Hadoop can address the pain points and performance issues of processing, storing, analyzing, and securing sensitive data for applications struggling with the velocity, variety, veracity, and value of their data.
- Involved in technical POCs to understand data integration from Tableau, SAS, Spotfire, and Informatica to Hadoop clusters, processing data in either direction.
- Experienced in distributed systems leveraging CDH 5.2.0, HDFS, and MapReduce; analyzed performance bottlenecks and recommended tuning optimizations.
- Proficient in the Apache Hadoop ecosystem (Pig, Flume, HBase, ZooKeeper, Hive, Impala, Sqoop, Solr, Spark, Kafka, Apache Tika), with a strong understanding of HDFS architecture.
- Strong working experience with the integration, warehousing, querying, processing, and analysis of large datasets, including sensitive and raw data.
- Experience with Hadoop architecture and the Hadoop daemons: MapReduce, HDFS, JobTracker, TaskTracker, NameNode, and DataNode.
- Implemented proofs of concept on the Hadoop stack and various Big Data analytic tools, including migration from different databases (i.e., Teradata, Oracle, SAS data, and MySQL) to Hadoop.
- Hands-on experience in developing MapReduce jobs using the Hadoop ecosystem.
- Experienced in job workflow scheduling and monitoring tools like Oozie and ZooKeeper.
- Experience in configuring Hadoop clusters and HDFS.
- Successfully loaded files into Hive and HDFS from MongoDB and HBase, and loaded the datasets into Hive for ETL operations.
- In-depth understanding of data structures and algorithms.
- Experience in developing and deploying applications using WebLogic, Tomcat, and JBoss.
- Experience with backend databases such as Oracle, DB2, MySQL, mainframe databases, and SQL Server.
- Team player, quick learner, and self-starter with effective communication, motivation, and organizational skills, combined with attention to detail and business process improvement.
- Hands-on experience in setting up, configuring, and using Apache Hadoop ecosystem components such as HDFS, Hadoop MapReduce, ZooKeeper, Oozie, Hive, Sqoop, Pig, and Flume.
- Expertise in writing Hadoop jobs for analyzing data using Hive and Pig.
- Experience in working with MapReduce programs using Apache Hadoop to work with Big Data.
- Experience in importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS), in both directions (see the Sqoop sketch after this list).
- Broad experience with SQL, PL/SQL, and database concepts.
- Experience in optimizing MapReduce jobs using combiners and partitioners to produce the best results (see the MapReduce sketch after this list).
- Excellent programming skills in Java, JSP, JDBC, XML, HTML, XHTML, and JavaScript, and in developing client-server, web, and distributed applications.
- Experience in database design, entity relationships, database analysis, programming, SQL, packages, and triggers in Oracle and SQL Server on Windows and UNIX.
- Excellent written and oral communication, interpersonal, and presentation skills. Proven ability to work efficiently in both independent and team environments.
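For illustration, a minimal Java sketch of the kind of Sqoop import described above, driven through Sqoop 1.x's programmatic entry point; the MySQL connection string, credentials, table name, and HDFS target directory are hypothetical placeholders, not details from any engagement.

    import org.apache.sqoop.Sqoop;

    public class CustomerImport {
        public static void main(String[] args) {
            // Hypothetical source database and HDFS target directory.
            String[] importArgs = {
                "import",
                "--connect", "jdbc:mysql://dbhost:3306/sales",
                "--username", "etl_user",
                "--password", "secret",            // plain text for brevity only
                "--table", "customers",
                "--target-dir", "/data/raw/customers",
                "--num-mappers", "4"
            };
            // Sqoop.runTool parses the array exactly as the command-line
            // client would and returns a shell-style exit code.
            System.exit(Sqoop.runTool(importArgs));
        }
    }

The reverse direction works the same way: swap "import" for "export" and point --export-dir at the HDFS source.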
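Likewise, a minimal sketch of the combiner/partitioner tuning mentioned above, assuming a word-count-style job; the first-letter partitioning rule is illustrative only.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCountTuned {

        public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                for (String tok : value.toString().split("\\s+")) {
                    if (!tok.isEmpty()) {
                        word.set(tok);
                        ctx.write(word, ONE);
                    }
                }
            }
        }

        // The reducer doubles as the combiner: it pre-aggregates counts on
        // the map side, shrinking the data shuffled across the network.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) sum += v.get();
                ctx.write(key, new IntWritable(sum));
            }
        }

        // Custom partitioner: routes keys by first letter so related keys
        // land on the same reducer.
        public static class FirstLetterPartitioner extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                if (key.getLength() == 0) return 0;
                return (Character.toLowerCase(key.charAt(0)) & Integer.MAX_VALUE) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count tuned");
            job.setJarByClass(WordCountTuned.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);  // map-side pre-aggregation
            job.setPartitionerClass(FirstLetterPartitioner.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }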
TECHNICAL SKILLS
Big Data Ecosystem: CDH 5.2.0, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie and Flume
Programming Languages: Java, SQL, PL/SQL, UNIX/Linux Shell Scripts
Web Technologies: HTML, XML, JavaScript, JSON
Framework: JUnit, log4j, Spring, Hibernate
Database: Oracle, DB2, MySQL, HBase, MongoDB
Application Server: Apache Tomcat 5.5.0
BI Tools: Tableau
IDEs, Utilities & Web: Eclipse, HTML, CSS, JavaScript
Operating Systems: Linux, Windows 7, UNIX
Methodologies: Agile, UML, OOP
Protocols: TCP/IP, HTTP, SOAP and HTTPS
PROFESSIONAL EXPERIENCE
Confidential
Hadoop Architect/Developer
Responsibilities:
- Determine feasibility requirements, compatibility with current system, and system capabilities to integrate new acquisitions and new business functionalities.
- Fannie Mae’s current data constituent base can be grouped into three principal categories:
- Research, exploratory modeling & predictive analysis.
- Industrialized analysis, such as risk and modeling analytics like PRIMA, PRISM, etc.
- Established system-of-record-based consumption patterns (RDW marts and formal reporting functions, including financial, regulatory, and others).
- Independently formulate detailed program specifications using structured data analysis and design methodology. Prepare project documentation when needed.
- Independently code new programs and design tables to load and test programs effectively for the given POCs, using Big Data/Hadoop technologies such as Hive, HDFS, Impala, Hue, Solr, JSON scripts, Spark, Cloudera Manager, and Cloudera Navigator to resolve complex system issues and deliver changes.
- Develop detailed application designs and specifications for computer applications. Assist in a technical lead capacity during the POC phase of project development.
- Write documentation describing program modifications, logic, and corrections. Oversee development of user manuals and operating procedures. Provide technical assistance to resolve operating issues.
- Extract, transform, and load data from different sources to build the right solutions for Hadoop projects.
- Designed and analyzed business use cases to provide the right solutions for all POCs used in Hadoop projects.
- The above responsibilities are complex and involve theoretical and practical application of highly specialized knowledge.
- Replaced Hive's default Derby metastore with MySQL. Executed queries using Hive and developed MapReduce jobs to analyze data.
- Developed Pig Latin scripts to pull information from web server output files and load it into HDFS. Built Pig UDFs to preprocess the data for analysis (see the Pig UDF sketch after this list).
- Developed Hive queries for the analysts. Involved in loading data from Linux and UNIX file systems into HDFS.
- Supported setting up the QA environment and updating configurations for implementing Pig scripts.
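For illustration, a minimal sketch of the kind of Pig UDF described above for preprocessing web server log fields; the class name and the normalization rule are hypothetical.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Normalizes one raw log field before analysis (trim + lowercase).
    public class NormalizeField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;  // Pig convention: null in, null out
            }
            return input.get(0).toString().trim().toLowerCase();
        }
    }

In a Pig Latin script, the packaged jar would be REGISTERed and the function applied inside a FOREACH ... GENERATE over the loaded log relation.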
Environment: CDH 5.2.0, HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.13.0, Spark, Impala, Solr 1.4, Apache Tika, Python, Ubuntu (Linux), Eclipse Juno, XML, JSON, enterprise data hub.
Confidential
Hadoop Developer
Responsibilities:
- Converted the existing relational database model to the Hadoop ecosystem.
- Generated datasets and loaded them into the Hadoop ecosystem.
- Designed the technical architectural workflow.
- Optimized Hive queries with proper algorithms and built customer attributes using Hive.
- Integrated Hive queries with Oozie.
- Compared Hive query outputs with the outputs of the existing data model.
- Conducted POCs on data ingestion with different tools.
- Followed agile methodology for the entire project.
- Designed and developed MapReduce programs in Java.
- Developed Hive UDFs and UDAFs in Java (see the UDF sketch after this list).
- Orchestrated hundreds of Hive queries using Oozie workflows (see the Oozie sketch after this list).
- Analyzed customer patterns based on the attributes.
- Conducted daily scrum calls.
- Prepared technical design documents and detailed design documents.
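For illustration, a minimal sketch of a Java Hive UDF of the kind described above, using the classic UDF base class; the attribute (an age bucket) and the class name are hypothetical.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Buckets a numeric age into a coarse customer attribute.
    public final class AgeBucket extends UDF {
        public Text evaluate(final Integer age) {
            if (age == null) return null;
            if (age < 25) return new Text("under-25");
            if (age < 45) return new Text("25-44");
            return new Text("45+");
        }
    }

Once packaged, the jar would be loaded with ADD JAR and the function registered via CREATE TEMPORARY FUNCTION before use in HiveQL; a UDAF follows the same packaging path with an aggregation-oriented base class.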
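And a minimal sketch of kicking off one such workflow through the Oozie Java client API; the Oozie server URL and the HDFS application path are placeholders.

    import java.util.Properties;
    import org.apache.oozie.client.OozieClient;

    public class SubmitWorkflow {
        public static void main(String[] args) throws Exception {
            // Placeholder server URL and workflow application path.
            OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");
            Properties conf = oozie.createConfiguration();
            conf.setProperty(OozieClient.APP_PATH, "hdfs://namenode/apps/hive-chain");
            conf.setProperty("oozie.use.system.libpath", "true");
            String jobId = oozie.run(conf);  // submit and start the workflow
            System.out.println("Workflow started: " + jobId);
        }
    }

In practice, the workflow.xml under the application path would chain the Hive actions; the client above only submits and starts the job.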
Environment: Hadoop 0.20 (MR1), HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.7.1, Java 1.6, Linux, Spring 3.2.3, Eclipse Juno, XML, JSON.
Confidential
Technical Lead
Responsibilities:
- Participated in drafting the Technical Business Design Documents based on the Conceptual System Design.
- Met with the major office to manage the critical application during the knowledge transition, shadowing, and reverse-shadowing periods.
- Responsible for drawing up the detailed planning and system appreciation documents for several applications and all their subsets.
- Performed business area analysis, including study of business plans, to apply performance improvement principles and rules for customer offices.
- Recognized significant problems and opportunities in clients' operations and developed an understanding of clients' systems and procedures, overall business operations, and their current industries.
- Responsible for utilizing technical analysis and design principles to formulate detailed application plans and procedures to implement clients' requests for new or modified functionality.
- Analyzed clients' requests for new or modified applications through interviews and design sessions.
- Designed, produced, and tested proposed enhancements with client interaction for verification.
- Developed software programs using JCL, COBOL, and DB2.
Environment: Z/OS, DB2, COBOL, JCL, CICS, VSAM, SPUFI, QMF, SQL, ACF2, SHAREPOINT, XPEDITOR
Confidential
Senior Technical Analyst
Responsibilities:
- Participated in drafting the Technical Business Design Documents based on the Conceptual System Design. Coded complex COBOL/DB2 modules that perform dynamic file allocation.
- Provided escalated support by investigating and resolving complicated business application issues with the associated organizations.
- Developed an expert understanding of the technical and business process flow of applications and provided recommendations for improvement.
- Provided appropriate communication, facilitated bringing other parties together, and completed post-resolution reporting for high-severity incidents.
- Participated in development efforts, provided input for future requirements, and informed the Service Desk and others of the release notes and any known events.
- Extensively worked on problem investigation, analysis, and development of certification for existing and new modules.
- Set up, configured, maintained, and monitored assigned business applications and related systems; worked with mainframe transaction facility protocols.
- Worked with clients to obtain business requirements and enhancements for the organization.
- Involved in production batch support: scheduling jobs, restarting, fixing abends, and bypassing cases; fixed bugs and ensured timely, defect-free delivery.
- Coordinated with interface teams to resolve technical and business questions.
Environment: OS/390, TSO/ISPF, VSAM, COBOL, JCL, DB2, PLATINUM, CMTS, SPUFI, Toad, JIRA, SQL Explorer, MS-Office, MS Project.
Confidential
Program Analyst
Responsibilities:
- Responsible for implementing, customizing, and integrating components of the client (Confidential) application.
- Designed, produced, and tested proposed enhancements with client interaction for verification.
- Monitored operation and functionality throughout the execution process by testing applications to ensure optimum user benefit; designed and configured application modifications and enhancements as necessary.
- Planned and created web front-end applications to integrate with host-side operations.
- Implemented the integration and customization of customer-specific system packages.
- Provided first-level production support for post-go-live operations.
- Integrated and programmed middleware components.
Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL, ACF2, SHAREPOINT, MICROSOFT OFFICE (WORD, EXCEL, ACCESS).
Confidential
Mainframe Analyst
Responsibilities:
- Researched, updated, and maintained quality testing measures and routines.
- Assisted in the planning, creation, and control of the test environment(s).
- Identified, collected, and created test data.
- Facilitated and participated in structured walkthroughs and peer reviews.
- Took part in the coordination and implementation of system and assembly testing.
- Informed the Test Architect and Team Lead of any events that might affect the schedule, budget, or quality of the software and the testing process.
- Validated fixes, executed test scripts, and recorded problems and events in accordance with the project's problem and issue management plans.
- Performed project management, testing, and reporting of outcomes.
- Documented all testing results and maintained abend logs.
- Assisted and coordinated with new resources in understanding the nature of the work and the testing procedures.
- Created conceptual designs, test approaches, test plans, and test scripts, both manually and in Mercury Quality Center and CMTS. Set up and managed test environments and test data based on the test requirements. Coordinated onshore and client communication.
Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL.