Sr. Hadoop Developer/architect Resume
O'Fallon, MO
SUMMARY
- 16 years of IT experience as a J2EE professional and Big Data/Hadoop developer across industry domains.
- 4 years of experience in Big Data and the Hadoop ecosystem.
- Strong experience with Spark, MapReduce, HDFS, Hive, Pig, Sqoop.
- Developed streaming data pipelines using Kafka and Storm.
- Strong experience with Linux, Python and Scala scripting.
- Strong experience with scheduling Hadoop jobs using the Zena tool and Oozie.
- Strong experience with Cloudera Distribution for Hadoop (CDH) and Hortonworks Data Platform (HDP).
- Strong experience in optimization/performance tuning of Spark, MapReduce, Sqoop, Pig and Hive.
- Strong experience in NoSQL Databases: HBase, MongoDB, Cassandra.
- Strong experience in design, development and implementation of Object-Oriented, component-based N-tier, Client/Server, B2E, B2B, B2C and E-commerce applications.
- Strong experience in Jenkins, BitBucket, GitHub, ANT and Maven tools.
- Good knowledge of Amazon Web Services components like EC2, EMR, S3 etc.
- Strong experience in Eclipse, IntelliJ and NetBeans IDEs.
- Extensively worked on web technologies such as J2EE architectural and design patterns, Struts, EJB, Servlets, JSP, JMS, JDBC, JAXB, HyperJAXB, JAX-RPC, XML, AJAX, JavaScript and HTML.
- Experience in setup, configuration and deployment with WebLogic, tc Server, JBoss and IBM WebSphere.
- Aggressive problem diagnosis and creative problem solving skills.
- Proficient in development methodologies such as Waterfall, ALM, SAFe, Scrum and Agile.
- Strong experience with high-availability, high-traffic applications.
- Strong experience in understanding key business and product objectives, how they relate to the needs of end users, and how they translate into technical requirements.
- Ability to clearly communicate and present ideas verbally and in writing, to various disciplines including management, developers, QA, and product management.
- Database experience with Oracle, Teradata, IBM DB2, Sybase, MS SQL Server, MySQL, PostgreSQL and HSQL.
- Excellent experience with the HP Diagnostics tool for monitoring application performance.
- Excellent experience with Splunk for error-trend analysis of high-availability applications.
- Knowledge of Salesforce and cloud computing.
TECHNICAL SKILLS
Operating Systems: MS-DOS, WINDOWS, UNIX, Linux
Languages: Java, C, C++, VC++, Python, Scala, COBOL, Shell Scripting
Hadoop Core: HDFS, MapReduce, MR1/MR2(YARN)
Hadoop ecosystem: Hive, Pig, Sqoop, Flume, Oozie, Falcon, Kafka, Storm, Spark, Solr, Kibana, Elasticsearch
Hadoop Cluster: Setup, installation, monitoring, maintenance; Cloudera CDH, Hortonworks, Apache Hadoop
IDEs/Tools: IntelliJ, JBuilderX, Eclipse, JUnit, log4j, Apache Ant, Maven, TOAD, WinMerge, Beyond Compare, NetBeans, HPDiag, Splunk, BSM, THS, IBM Tealeaf, HttpWatch, HPSM, Zena, Hue.
Content Management tools: SDL Tridion, Web Content Management System(WCMS)
Databases: Oracle, SQL Server, MySQL, HSQL, Sybase, DB2, NoSQL, HBase, MongoDB, MarkLogic, Cassandra, Pgql
GUI: Applet, AWT, Java Swing
Web Technologies: HTML, DHTML, CSS, XML, HTTP, JavaScript, AJAX, Servlets, JSP
Enterprise Technologies: J2EE (EJB, JMS), Web Services, REST
Web/Application Servers: Apache Tomcat, JBoss, WebLogic, IBM WSA, TC Server, IIS
Frameworks: Struts, JSF, Hibernate, Spring
PROFESSIONAL EXPERIENCE
Confidential, O'Fallon, MO
Sr. Hadoop Developer/Architect
Responsibilities:
- Implemented Python and Shell scripts for classification of Confidential jobs.
- Created Hive scripts and integrated them with the Shell scripts.
- Worked on regex filters, developed in Python, to process raw data sets.
- Involved in migrating MapReduce code into Spark to improve performance of application.
- Developed Spark scripts in Scala language.
- Involved in performance tuning by implementing partitioning, bucketing in Hive.
- Developed Oozie workflows for deals and offer upload jobs.
- Involved in automating Data warehouse feeds to run daily.
- Involved in automating the test cases for Confidential process flows.
- Participated in PI planning, reviews and upcoming sprint breakdown tasks.
- Participated in Sprint retrospective and backlog grooming sessions.
- Automated classification jobs in Jenkins environment for continuous builds.
- Implemented Shell and Python code coverages by using Kcov utility.
- Participated in daily scrum, status updates, code reviews, bug fixing.
- Participated in planning and design of features, and mentored team members.
- Developed Hadoop jobs for FI-level configurations.
- Involved in production migrations, coordinating with Release Management groups.
- Involved in post-production activities and provided production support during the warranty period.
- Worked closely with business partners and generated many ad hoc reports.
- Developed integration of Sqoop and Hive workflows.
- Performed source-code check-ins to Git, cloned repositories, and created pull requests so code could be reviewed before merging to the development branch.
- Worked on Jenkins, JaCoCo and Sonar configurations for code coverage.
- Participated in SAFe and Agile methodologies of Confidential and EOP integrated applications.
Environment: HDFS, Cloudera distribution, Scala, MapReduce, Spark, Hive, Sqoop, Oozie, Git, BitBucket, Jenkins, SAFe, Agile and Scrum model, RHEL and Linux Shell, Pgql, Impala, Python.
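The Python regex filtering over raw data sets mentioned above can be sketched minimally as follows; the delimiter, field names and pattern are hypothetical stand-ins, since the actual Confidential record layout is not shown here:

```python
import re

# Hypothetical pipe-delimited record layout -- illustrative only.
RECORD_RE = re.compile(r"^(?P<job_id>\d+)\|(?P<name>[A-Za-z_]+)\|(?P<status>OK|FAIL)$")

def filter_records(lines):
    """Keep only well-formed records, returning each as a field dict."""
    out = []
    for line in lines:
        m = RECORD_RE.match(line.strip())
        if m:  # malformed rows are silently dropped
            out.append(m.groupdict())
    return out

raw = ["101|ingest_daily|OK", "garbage row", "102|load_hive|FAIL"]
clean = filter_records(raw)
```

In the actual pipeline such a filter would typically read from stdin or HDFS-streamed files rather than an in-memory list.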
Confidential, Chicago IL
Sr. Hadoop Developer/Architect
Responsibilities:
- Worked with the business team to understand Confidential business goals and requirements.
- Designed and documented the Confidential architecture diagram using Visio.
- Participated in project configuration and folder creation on the edge-node server and in the data lake (Hadoop) environment.
- Designed workflow diagrams covering Sqoop import from Confidential IBM DB2 into HDFS and data load from HDFS into the Hive database for consumption.
- Developed automated Sqoop and Hive scripts using Linux shell programming.
- Scheduled Sqoop, Hive, audit-framework and history jobs with the Zena tool.
- Developed historical-load scripts to load the data in Avro format.
- Implemented the Confidential project in two variants, initial load and daily load, for all 63 tables.
- Analyzed input and output file formats (Avro, ORC and text) for storing data in HDFS and Hive.
- Wrote fourteen Linux shell scripts, covering both initial and daily loads, to Sqoop data from DB2 into the Hadoop layer.
- Scheduled Hadoop jobs in the Zena tool, configured with Talend.
- Worked on various clusters (ETL, HIRES and default) to submit and execute development jobs.
- Wrote unit test cases for all 63 tables, implemented code snippets for each, and uploaded them to GitHub and SharePoint.
- Worked closely with the testing team across all environments (dev, SIT, UAT and production) to fix raised defects.
- Worked in an Agile/Scrum model; created, updated, assigned and closed user stories.
- Worked on GitHub to check in and check out source code.
- Worked with the NoSQL database HBase to store BLOB data for a big table.
- Worked on the HDP cluster to improve sizing and storage capacity.
- Validated data stored in the Hadoop layer using the Linux command line and Hue.
- Participated in the audit framework to reconcile row counts between DB2 tables and Hadoop data sets.
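The audit-framework balance check in the last bullet boils down to comparing per-table row counts between the source and the landed data. A minimal Python sketch of that reconciliation (table names and counts are invented for illustration):

```python
def reconcile(db2_counts, hadoop_counts):
    """Compare per-table row counts; return tables that are out of balance
    as {table: (expected, actual)}."""
    mismatches = {}
    for table, expected in db2_counts.items():
        actual = hadoop_counts.get(table, 0)  # missing table counts as 0 rows
        if actual != expected:
            mismatches[table] = (expected, actual)
    return mismatches

db2 = {"claims": 1000, "policies": 250}
hdfs = {"claims": 1000, "policies": 249}
bad = reconcile(db2, hdfs)
```

In practice the counts would come from a DB2 `SELECT COUNT(*)` and a Hive/Impala count over the Sqooped data set; only the comparison step is shown.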
Confidential, Bloomington IL
Sr. Big Data Developer
Responsibilities:
- Writing MapReduce jobs in Java for data cleaning and processing.
- Working on the CDH cluster.
- Working on data ingestion with Sqoop and Flume.
- Working with the HBase NoSQL database.
- End to end development using Hive, Pig, HBase.
- Analyzing and transforming data wif Hive and Pig.
- Designing workflows for job scheduling and batch processing
- Providing analytical dashboards and reports to Service Managers and stakeholders, and presenting the work progress of the system.
- Handling multi-line application support (horizontal support) for high-availability apps.
- Defining work schedules for the remote team and performing code review.
- MapReduce performance tuning.
- Involved in setup and maintenance of Hadoop clusters for distributed dev/staging/production
- Java coding, unit testing and analyzing application health.
- Supporting code developed by development teams
- Onsite & Offshore team management and call rotation
- Worked on integration of multiple applications.
- Implemented POC using Kafka and Storm.
- Implemented POC using Scala and Spark.
- Working with vendor and client support teams to resolve critical production issues within SLAs.
- Collaborating closely with BAs and team members to define business-specific goals and create solutions that meet business directives.
Environment: Hadoop, Cloudera, MapReduce, Hive, Impala, Pig, HBase, Sqoop, Solr, Flume, Oozie, Java, Maven, Splunk, RHEL and UNIX Shell, Eclipse, Kafka, Storm, Talend.
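The data-cleaning MapReduce jobs above were written in Java for Hadoop; the same map/shuffle/reduce shape can be sketched in a few lines of plain Python. The record format and keys below are hypothetical, purely to show the pattern:

```python
from itertools import groupby

def map_phase(record):
    """Emit (key, count) pairs for valid records; drop malformed ones."""
    parts = record.split(",")
    if len(parts) == 2 and parts[1].strip().isdigit():
        yield parts[0].strip().lower(), int(parts[1])

def reduce_phase(pairs):
    """Sum values per key; sorting stands in for Hadoop's shuffle/sort."""
    pairs = sorted(pairs)
    return {k: sum(v for _, v in grp)
            for k, grp in groupby(pairs, key=lambda kv: kv[0])}

records = ["Auto, 3", "home,2", "auto,1", "bad record"]
mapped = [kv for r in records for kv in map_phase(r)]
totals = reduce_phase(mapped)
```

On a real cluster the mapper and reducer run as separate distributed tasks; this sketch only mirrors the dataflow for a single process.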
Confidential, Bloomington IL
Sr. Big Data Developer
Responsibilities:
- Involved in architecture design, development and implementation of Hadoop.
- Involved in multiple POCs at the inception stage of the project.
- Experience in defining job flows.
- Coordinated with customers to understand deep insights of the data.
- Used Sqoop to move the data from relational databases to HDFS
- Wrote MapReduce programs to identify fraudulent activities, such as suspicious insurance purchases, frequent accident zones and individuals involved in multiple claims, and to analyze patterns that point to possible fraud.
- Automated the routine process with the Oozie scheduler to load data into the Hadoop Distributed File System on a schedule, using Pig to massage the data and make it ready for processing.
- Performed text-analytics searches over unstructured data to find phrases that frequently appear in PVC and QEC reports.
- Worked on evaluating and increasing application performance and effectiveness.
- Worked on Cloudera distributions.
- Worked on java performance tuning.
- Working with the Cassandra NoSQL database.
- Status reporting to the client management
- Working with testers to trace bugs to the code and fix the issues.
Environment: Hadoop, MapReduce, Cloudera, Hive, Cassandra, Pig, Sqoop, Oozie, Java, Maven, Splunk, RHEL, HTML, CSS.
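One of the fraud patterns above, a person involved in multiple claims, reduces to counting distinct claims per party after deduplication. A minimal Python sketch of that idea (claim and party identifiers are invented):

```python
from collections import Counter

def multi_claim_parties(claims, threshold=2):
    """Flag parties that appear in `threshold` or more distinct claims."""
    seen = {(c["claim_id"], c["party"]) for c in claims}  # dedupe within a claim
    counts = Counter(party for _, party in seen)
    return sorted(p for p, n in counts.items() if n >= threshold)

claims = [
    {"claim_id": "C1", "party": "p42"},
    {"claim_id": "C2", "party": "p42"},
    {"claim_id": "C2", "party": "p7"},
    {"claim_id": "C3", "party": "p42"},
]
suspects = multi_claim_parties(claims)
```

In the MapReduce version this is one job: map emits (party, claim_id), the shuffle groups by party, and the reducer counts distinct claims and filters by the threshold.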
Confidential, Bloomington IL
Lead Big Data Developer
Responsibilities:
- Developed Java and J2EE applications for Auto Insurance business lines, deployed on WebSphere and tc Server platforms.
- Designed and developed the dedup logic for different source systems using Pig scripts.
- Developed a POC (Proof of Concept) with Talend Big Data Integration.
- Developed multiple MapReduce jobs for data cleansing and preprocessing of huge volumes of data.
- Involved in Planning and Design by following approved life-cycle methodologies.
- Involved in customer impact and gap analysis.
- Involved in Peer to Peer Code reviews and code optimization project meetings.
- Involved in fixing of defects, problem records, incidents and requests.
- Created a collaborative knowledge repository.
- Involved in coding, design and unit test plans to verify that the program functions correctly per the requirements.
- Resolved technical issues through debugging, research and investigation; performed reactive research to reproduce errors/problems, identified the fix (high level) and handed off to the triage team to correct.
- Request fulfillments for different service areas.
- Monitored production application availability and performance through dashboards and other monitoring tools; helped coordinate root-cause identification for application unavailability.
- Extracted 500+ RDBMS tables to Hadoop using Sqoop.
- Supported high-availability ("four nines") applications in production; also worked on service tickets in the production environment.
- Worked with Splunk to track application error logs.
- Worked with HP Service Manager for application-related incidents.
- Worked with the HP Diagnostics tool to monitor application performance.
- Updated project-related documents using SharePoint.
- Integrated a pop-up comment card into the AQP application, developed with JavaScript, Ajax calls, CSS and HTML.
- Interacted with management, providing daily and weekly status reports, and was involved in monthly releases.
- Involved in integration of multi-line application service support deployed in the J2EE environment and on tc Servers.
Environment: JDK 1.6 and JDK 1.7, JSP, Spring MVC Framework, IBM WebSphere 6.1 Application Server, tc Server, IBM DB2, IBM RSA 8.0 IDE, XML, HTML, CSS, SVN 2.0, Splunk, HPDiag, Windows 7, Linux, Hadoop, HDFS, Hive, Pig, MapReduce, Sqoop.
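The dedup logic above was implemented with Pig in the project; the keep-latest-record-per-key idea behind it can be sketched in Python as follows (field names and timestamps are hypothetical):

```python
def dedup_latest(records, key="policy_id", ts="updated"):
    """Keep one record per key: the one with the greatest timestamp."""
    best = {}
    for rec in records:
        k = rec[key]
        # ISO-8601 date strings compare correctly as plain strings
        if k not in best or rec[ts] > best[k][ts]:
            best[k] = rec
    return list(best.values())

rows = [
    {"policy_id": "P1", "updated": "2014-01-01", "premium": 100},
    {"policy_id": "P1", "updated": "2014-06-01", "premium": 120},
    {"policy_id": "P2", "updated": "2014-03-01", "premium": 90},
]
unique = dedup_latest(rows)
```

The Pig equivalent is a GROUP BY on the key followed by taking the TOP record ordered by the timestamp within each group.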
Confidential, Iowa
Senior Java and J2EE Consultant
Responsibilities:
- Worked on R411, R112, R212 and wholesale development and JIRAs.
- Involved in Designing and Preparing the Technical Specification document from the functional Specifications.
- Involved in Analysis and Design of the Project, which is based on MVC (Model-View-Controller) Architecture and Design Patterns.
- Developed Use Case Diagrams, Class Diagrams, Object Diagrams and Activity Diagrams using MyEclipse.
- Involved in understanding functional requirements and attending specification meetings.
- Developed Servlets, JSPs, Beans and tag libs using the MVC Architecture.
- Developed Session Beans and Entity Beans using J2EE Design Patterns.
- Developed test cases using JUnit for unit testing of EJBs and Servlets.
- Developed several database structures, complex SQL queries, stored procedures, views, triggers and user-defined functions.
- Interacted with management, providing weekly status reports, and was involved in biweekly releases.
- Worked on the Spring MVC framework to implement server-side business logic.
- Worked on Apache Axis 2.0 framework to implement web services.
- Worked on Java 1.5 and Java 1.6 to develop core programming.
- Worked on Eclipse 3.3 IDE tool to develop the core application.
- Worked on JBoss 7.0 application server to deploy core project.
- Worked on XMI to implement front end functionality.
- Worked on Oracle 10g and Oracle 11g to maintain and manage the mortgage data.
- Implemented Object-Relational mapping using Hibernate 3.0.
- Worked closely wif team members from across all functions.
- Executed Maven 2.2.1 scripts to manage and deploy the application.
- Mentored team members on development functionality and build configuration management.
- Supported business users for the Change Requests.
- Involved in peer reviews and formal code reviews.
Environment: JDK 1.5, JDK 1.6, JSP, Spring MVC Framework, JBoss 7.0 Application Server, Oracle 11g, Eclipse 3.3, XMI, SVN 1.5, WinMerge, JIRA, Windows 7.
Confidential, Sacramento CA
Project Lead/Programmer Analyst
Responsibilities:
- Worked closely with the business team, gathering requirements and performing risk analysis.
- Designing and implementing technical solutions for new products and features
- Worked on business process management to identify real time business rules.
- Worked on BRE business logic and wrote the code in Java.
- Implemented Spring MVC framework.
- Extensively worked with the Jakarta Struts Framework and Hibernate for data access.
- UI designed and developed using JSP, Tiles and Tag Libraries
- Working closely with team members from across all functions.
- Effectively collaborated with other engineers, architects, managers and product managers to solve complex problems spanning multiple projects.
- Technical guidance to the team.
- Conducting team meetings.
- Team monitoring and Control
- UI developed using GWT
- Functional testing and Internal Code review
- Writing build scripts using Apache Ant
- Work on the defects assigned in Bugzilla.
Environment: JDK 1.6, JSP, Struts Framework (MVC model), Apache Tomcat 6.0, MySQL 5.x, Eclipse 3.3, AJAX, Axis Framework, WinCvs, WinMerge, Bugzilla, MS SQL, Linux.
Confidential, CA
Technical Lead
Responsibilities:
- Working closely with team members from across all functions.
- UI implemented in JSP, Tiles and the Struts tag library.
- Implemented server-side coding using action and helper classes.
- Responsible for ongoing maintenance, bug fixing, and code quality improvement.
- Participated actively in Agile methodologies (discussions, reviews, demos & scrums)
- Identify and Solve performance issues.
- Technical Guidance to team
- Unit testing and Code review
- Controlling build release and source merging
Environment: Java and J2EE, Struts, JavaScript, WebLogic 10.0, Eclipse 3.3, Oracle, WinCvs, Solaris and Linux.
Confidential, Sacramento CA
System Analyst
Responsibilities:
- Worked with higher-level management to understand all the services and provide estimates.
- Participated actively in Agile methodologies (discussions, reviews, demos & scrums)
- Worked on BI tools to update the data.
- Analysis of complex problems.
- Worked on BI batch process.
- Worked with clients to solve complex issues.
- Participated in regular meetings to update weekly status.
Environment: CRD tools, BI tools, Oracle 9.2, batch update tools and Windows.
Confidential, New York
System Analyst (Team Lead)
Responsibilities:
- Milestone 4 monitoring and control.
- Task analysis and assignment
- Designing and implementing technical solutions for new products and features
- Worked closely with team members from across all functions.
- Implementing features in a manner which is robust, efficient, and readable
- Responsible for ongoing maintenance, bug fixing, and code quality improvement.
- Effectively collaborated with other engineers, architects, managers and product managers to solve complex problems spanning multiple projects.
- Build management
- Java Coding in Eclipse IDE
- Implemented Spring IoC framework.
- Implemented Apache Ant scripts
- On site team, customer communication and offshore project support team communication.
- RTM updating and internal task scheduler preparations.
- Bug Analysis and Tracking.
- Code review.
Environment: JDK 1.5/Swing, Forte 5.2.1.19, WebLogic 9.2, Oracle 9.2, Eclipse 3.1, WinCvs, WinMerge, Spring Framework, Beyond Compare, f2jwrapper tool, Linux.
m2o web v 3.5 (Metaminds, Hyderabad, India) Dec ‘
Confidential
Sr. Software Engineer
Responsibilities:
- Client Interaction
- Coding in Socket Program
- Technical guidance to the team
- Coding and code review
- Controlling build release and source merging
Environment: Windows XP, Eclipse 3.0, Java 5.0, Java Socket Programming, JSF, JSP, Sybase, HTML, DHTML, WinCvs and Tomcat 5.0.
Confidential
Senior Software Engineer
Responsibilities:
- Client Interaction
- Working closely with team members from across all functions.
- Work allocation
- Technical guidance to the team
- R&D on the HyperJaxb tool and implemented it.
- Implemented socket programming to extract data
- Handling HyperJaxb tool and Coding
- Code review
Environment: Windows XP, Eclipse3.0, J2sdk1.4, Java Socket Programming, HyperJaxb, XML, XML Spy, Spring, Hibernate3, Quartz and WinCVS.