
Hadoop Consultant Resume


Atlanta, GA

SUMMARY

  • Over 7 years of experience as an IT consultant in the analysis, design, development, testing and implementation of application software, deploying business solutions using Hadoop/Big Data and TIBCO technologies.
  • Hands-on experience working with Hadoop ecosystem components such as Pig, Hive, Sqoop, HBase and MapReduce; strong knowledge of Pig and Hive analytical functions.
  • Experience with Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Good exposure to Apache Hadoop MapReduce programming and Hive and Pig scripting.
  • Hands-on experience importing/exporting data to and from RDBMSs using the Hadoop data-integration tool Sqoop.
  • Experience managing and reviewing Hadoop log files and performing data analysis using Hive and Pig.
  • Extensive experience designing and developing EAI, A2A and nationwide integrations using TIBCO tools and technologies, with deep technical expertise in TIBCO integration.
  • Proficient in all aspects of TIBCO ActiveMatrix BusinessWorks (BW), TIBCO Enterprise Message Service (EMS), Hawk, Rendezvous, Administrator, BusinessEvents and the Database & File adapters. Experience in user management and in system and domain monitoring and management using TIBCO Hawk and TIBCO Administrator.
  • Experience in implementing SOA concepts by designing and developing Web Services using WSDL, SOAP and SERVICE palettes using SOAP/HTTP and SOAP/JMS with TIBCO Business Works. Expertise in XML related technologies including XML, XSD, XSLT, XPATH and HTML.
  • Insight of Enterprise Service Bus (ESB), Service Oriented Architecture (SOA), Business Process Management (BPM) and Complex Event Processing (CEP).
  • Strong knowledge of Windows, UNIX, Linux and Macintosh operating systems.
  • Worked in the healthcare, retail, gaming and auction domains.
  • Extensive experience working with business analysts, architects, developers and testing teams.
  • Excellent understanding of Business Process Management. Involved in all stages of the Software Development Life Cycle, with a strong understanding of existing IT environments.

TECHNICAL SKILLS

Operating Systems: Windows 2000/NT/XP/7, Linux (Fedora, Redhat), Mac OS (10.x)

BigData/Hadoop Ecosystem: HDFS, YARN, MapReduce, Hive, Pig, HBase, Sqoop, ZooKeeper

Middleware tools: TIBCO ActiveMatrix BusinessWorks (5.x), BusinessEvents (3.x, 5.x), Patterns (4.x), Administrator (5.x), Hawk (4.x), Active Database adapter (5.x), File adapter (5.x), IBM i (6.x), iProcess Workspace (Browser) (11.x)

EAI Messaging: Enterprise Message Service (6.x), Rendezvous (8.x), UDDI, WSDL, SOAP, WSIL, MOM, MTOM

Programming Languages: C, C++, Java, C# .NET, ADO.NET, ASP.NET, Python

Databases: Oracle 8i/9i/10g/11g, MS SQL Server 05/08, PL/SQL

Directory Service: LDAP, JNDI, MS Active Directory

Case Tools: Visio 10, iWork 09

Version Control System: Subversion (Tortoise SVN), Visual Source Safe (VSS), CVS, ClearCase, Perforce, PVCS, XML Canon, GitHub, BitBucket

Web Server: Apache Tomcat

Web languages: XML Schema, DTD, XSLT, XPath, PHP, HTML, JSP

Testing tool: SoapUI Pro 4.0.1, CA (ITKO) Lisa Virtualize

PROFESSIONAL EXPERIENCE

Hadoop Consultant

Confidential, Atlanta, GA

Responsibilities:

  • Engineered, developed and maintained complex interfaces and code base using BusinessEvents and BusinessWorks.
  • Worked extensively on varied BE Concepts, Rules, Rule-functions, engine performance tuning, EMS destinations and servers.
  • Worked on BW COBOL Copybook Plug-in, web-services, MQ series connection, iAPI based logging and exception handling, and automated deployments.
  • Convinced and led the team to adopt Hadoop-based solutions to handle the large volumes of data being received and processed.
  • Designed, developed and deployed big-data processing applications capable of handling 25 TB of data daily on a 50-node Cloudera cluster.
  • Used Hive to refine data for further analysis and to transform files from different analytical formats to text formats. Wrote Pig Latin scripts for data processing.
  • Performed data migration from RDBMS to HDFS using Sqoop, analyzed the data using HQL, Pig and MapReduce, and loaded the results back into the RDBMS.
  • Coordinated with various teams on technical requirements and implementations, subsequently testing and deploying multiple applications across environments.
  • Led and assisted client teams in debugging and troubleshooting current applications.
  • Created and maintained technical documentation for launching Hadoop jobs and for executing Hive/Pig queries and scripts.
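
As an illustration of the MapReduce model behind the jobs described above, the following is a minimal pure-Python sketch of the map, shuffle and reduce phases of a word count. The sample input and all names are illustrative only and are not taken from the project:

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle step: group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Toy input; a real job reads input splits of HDFS files instead.
lines = ["big data needs big clusters", "data drives decisions"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"], counts["data"])  # 2 2
```

In Hadoop, the shuffle is handled by the framework between the mapper and reducer tasks; only the map and reduce logic is user code.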

Environment: HADOOP 1.2.1, PIG 0.13.0, HIVE 0.13.1, SQOOP 1.4.6, HBASE 1.0.1, CDH 5.x, TIBCO BE 5.1.2, BW 5.11, TRA 5.7, EMS 7.0, Admin 5.7, Gems 3.4, Oracle 11g, GoldenGate, Linux, Sun Solaris.

Hadoop Consultant

Confidential, Atlanta, GA

Responsibilities:

  • Gathered requirements, worked with BA and developed new functionality using Hadoop for buyer and seller side data.
  • Coordinated with database, AS/400, Informatica, deployment and testing teams for the successful implementation of the project.
  • Responsible for adding and removing cluster nodes, cluster monitoring and troubleshooting, manage and review data backups, performance tuning, manage and review Hadoop log files.
  • Handled importing of structured and semi-structured data from various data sources; performed transformations using Hive and MapReduce; loaded data into HDFS; and exported data to the RDBMS and onto the ESB using Avro and HP Vertica.
  • Published data out from files and database using EMS messaging and BW orchestration capabilities.
  • Diagnosed and resolved issues faced by internal teams pertaining to individual auction records.
  • Researched the usability of a REST service using HBase and the ActiveMatrix BW plug-in for REST and JSON per our requirements.
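
The semi-structured-to-structured transformation mentioned above can be sketched in Python as flattening nested JSON into tab-delimited rows of the kind Hive reads from HDFS. The auction record fields below are hypothetical, not the project's actual schema:

```python
import json

# Hypothetical auction records; the field names are illustrative only.
raw = [
    '{"auction_id": 101, "seller": {"id": "S9", "region": "SE"}, "price": 2500.0}',
    '{"auction_id": 102, "seller": {"id": "S3"}, "price": 1800.0}',
]

def flatten(line):
    """Flatten one JSON record into a tab-delimited row for loading into HDFS/Hive."""
    rec = json.loads(line)
    seller = rec.get("seller", {})
    return "\t".join([
        str(rec["auction_id"]),
        seller.get("id", "\\N"),      # \N is Hive's default null marker
        seller.get("region", "\\N"),
        "%.2f" % rec["price"],
    ])

rows = [flatten(line) for line in raw]
```

In practice this kind of per-record logic would run inside a Hive UDF or a MapReduce/streaming mapper rather than as a standalone script.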

Environment: HADOOP 1.2.1, PIG 0.13.0, HIVE 0.13.1, SQOOP 1.4.6, AVRO 1.7.6, CDH 4.x, TIBCO BW 5.10, EMS 7.0, Admin 5.7, Oracle 11g, Informatica, Jenkins, AS/400

Hadoop Consultant

Confidential, San Diego, CA

Responsibilities:

  • Worked with technology and business groups at Snaptracs to recommend a suitable Hadoop technology stack for the migration strategy.
  • Assessed the currently planned phases of the Hadoop/Big Data implementation, the existing architecture, the roadmap and technology fit, taking data growth into account.
  • Handled infrastructure setup, capacity planning and administration of a 38-node Hortonworks HDP 1.3 data cluster.
  • Troubleshot and addressed production-cutover challenges. Provided production support.

Environment: HADOOP 1.2.1, PIG 0.11.1, HIVE 0.13.1, JAVA 6, Oracle 11g, Atlassian, Splunk

Hadoop Consultant

Confidential, Las Vegas, NV

Responsibilities:

  • Analyzed and implemented best practices on existing code, along with logging and exception handling, with the aim of optimizing operational efficiency for data capturing.
  • Researched various Hadoop technology stack for data access, data storage, data serialization, data intelligence, data integration, management, monitoring and orchestration.
  • Processed streaming inputs and imported data into HDFS from heterogeneous technologies.
  • Set up and performed capacity planning for the CDH 4.3 Hadoop cluster.
  • Helped release management with installations, configurations, and deployments of the improved and enhanced solutions.
  • Documented detailed operational guides explaining designs, interfaces, process flows, methodologies, architecture diagrams and deployment activities.
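
Ingesting from heterogeneous sources as described above typically means normalizing each feed to a common schema before it lands in HDFS. The sketch below, with entirely hypothetical feeds and field names, shows the idea of streaming records one at a time through a delimiter-aware normalizer:

```python
import csv
import io

# Two illustrative feeds with different delimiters (sources are hypothetical).
feeds = {
    ",": "dev42,2013-07-01,OK\ndev17,2013-07-01,FAIL\n",
    "|": "dev99|2013-07-02|OK\n",
}

def normalize(stream, delimiter):
    """Stream records one line at a time and emit a common (id, date, status) schema."""
    for row in csv.reader(io.StringIO(stream), delimiter=delimiter):
        yield {"id": row[0], "date": row[1], "status": row[2]}

records = [rec for delim, feed in feeds.items() for rec in normalize(feed, delim)]
print(len(records))  # 3
```

Because `normalize` is a generator, it never holds a whole feed in memory, which is the property that matters when the input is a continuous stream rather than a file.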

Environment: CDH 4.3, HADOOP 0.23.6, PIG 0.11.0, HIVE 0.10.0, JAVA 6, TIBCO BW 5.9, EMS 6.1, Admin 5.7

Lead Consultant

Confidential, Richfield, MN

Responsibilities:

  • Created synergies within cross-cultural, cross-location and cross time-zone teams (positioned in China & India).
  • Coordinated and mediated with the client and the offshore team on issues and timely deliverables.
  • Involved with front-end, back-end, security, infrastructure and TIBCO PSG teams for design and engineering.
  • Presided over and assessed technical implementation of schema designs, BW & BE implementations, CLE alerts, Hawk rule constructs, security reviews, and deployments.
  • Created web-services using SOAP over HTTPS to expose Oracle DB organized data; created web-service client to retrieve and subsequently expose data from Siebel database.
  • Created simple events and rules in BusinessEvents to validate incoming requests from client application.
  • Consulted TIBCO Professional Services Group (PSG), putting TIBCO CoE standards and best practices into effect for PROD deployments.
  • Learned CA (ITKO) Lisa service virtualization (Virtual Service Environment/Image tools), testing and automation during rigorous training session by CA technologies.
  • Conducted training and a POC for Big Data and Hadoop implementations.
  • Researched Chef and Vagrant as part of an automation project to enable continuous integration and equip the IT division with on-demand virtual clients.

Environment: TIBCO BW 5.9, BE 5.0, TRA 5.7, EMS 6.0, Oracle 10g, MakeDoc for TIBCO, CA (ITKO) Lisa Virtualize, Jenkins, GitHub, Test Maker (PushToTest), HP SOA Systinet, Layer7, MS Infopath 10, Rally Software

TIBCO Lead

Confidential, Menomonee Falls, WI

Responsibilities:

  • Coordinated and guided technical design sessions with Confidential’s teams and Responsys.
  • Configured JMS connections to connect to and retrieve messages through IBM WebSphere MQ MOM.
  • Integrated certificates & keys to establish connection with outside web-service while encrypting & decrypting messages over secure network.
  • Created web-service client to integrate with external web-service using SOAP over HTTPS.
  • Implemented session management, message prioritization, logging and tracking framework to handle messages.
  • Worked on performance testing, load balancing and stress testing principles of deployment to handle the upcoming holiday season transactions.

Environment: TIBCO BW 5.6, TRA 5.6, MQ Series 7.0, Admin 5.6, SoapUI 4.5

Sr. TIBCO Consultant

Confidential, Centennial, CO

Responsibilities:

  • Engineered broad spectrum of TIBCO BW processes, synchronizing with EMS, BusinessEvents, General Interface forms, iProcess integrator and MS SQL Server.
  • Designed, configured and monitored EMS messaging server and tools.
  • Designed and effectuated logging & exception handling using TIBCO CLE.
  • Expanded the JMS-based interface to TIBCO iProcess Workspace (Browser).
  • Created web-services using WSDL and SOAP call to interact with data analysts using TIBCO GI.
  • Performed QA activities in staging environment.
  • Monitored and administered technical and business processes using TIBCO Administrator and Hawk tools to provide production support.
  • Implemented a matching engine using TIBCO Patterns to de-duplicate incoming data.
  • Generated unit tests for each operation using SoapUI Pro.
  • Wrote several rule sets, rules, events and concepts to validate input files while interfacing between BE and BW.
  • Migrated and deployed the project from TIBCO BusinessEvents 3.0 to 5.0.

Environment: TIBCO BW 5.8, TRA 5.6, EMS 5.1, Admin 5.6, BE 3.0, 5.0, GI 3.9, Business Studio (iProcess) 3.4, iProcess workspace browser 11.3, Patterns 4.5, MS SQL 08, SoapUI Pro 4.0, Rally Software

TIBCO Developer

Confidential, NYC, NY

Responsibilities:

  • Involved in the analysis, design, development and implementation cycles of the project.
  • Gathered the requirements and detailed the specifications for the EAI implementation.
  • Installed Administration Server and configured BusinessWorks components to communicate with Administration Server.
  • Configured TIBCO ADB adapter to interact with Oracle database.
  • Used both Certified (RVCM) and Reliable message delivery to transport messages.
  • Configured design-time & run-time SAP R/3, Oracle ADB adapters to publish & subscribe business data real-time.
  • Implemented Error handling in business process and conducted Unit testing, Component testing and supported system testing.
  • Monitored and controlled the adapters and process engines using the TIBCO Administrator.
  • Used JMS Queues for queuing incoming orders that are processed by the Order Entry System.
  • Documented the design of the workflow using Class, Message flow and Process diagrams.
  • Created deployment documents, deployment scripts & further supported the integration testing.

Environment: TIBCO BW 5.1, Designer 5.1, Rendezvous 7.1, ADB Adapter 5.0.1, SAP R/3 Adapter, TRA 5.1, Hawk 4.2.0, Oracle 8i.

JAVA Developer

Confidential

Responsibilities:

  • Involved in Application analysis, design, and identifying required tools necessary to build the application.
  • Designed and developed the user interface using JSP, JSTL, Servlets, Struts, JDBC and HTML per use-case requirements.
  • Developed and implemented business logic in Data Access Objects.
  • Integrated Struts with the business logic component in WSAD.
  • Performed form input validation using JavaScript.
  • Used JUnit for unit testing.
