
Sr. Big Data Consultant Resume


SUMMARY

  • 14+ years of software development experience in object-oriented programming and in the design and development of multi-tier, distributed enterprise applications using Java and J2EE technologies across the full Software Development Life Cycle, plus 4+ years of integration experience using MuleSoft ESB.
  • Highly skilled and experienced in Agile (Scrum) and Waterfall development processes for diverse requirements.
  • Extensively worked on both the Enterprise and Community editions of Mule ESB; experienced with Mule API Manager and RAML.
  • Developed integration workflows using an ESB framework; experience with MuleSoft MMC and enterprise releases.
  • Developed and maintained production-level cloud-based architecture in AWS, including creating machine images (AMIs), working with S3, and VMware virtualization.
  • Developed software to interface with partners and external B2B systems using Java Servlets, JSP, Perl, and Oracle on Solaris.
  • Involved in managing and deploying applications on CloudHub.
  • Utilized MuleSoft features such as DataWeave, API Designer, and various connectors to ensure a robust and loosely coupled integration layer.
  • Good hands-on experience using Mule connectors such as DB, FTP, File, SFTP, Salesforce, Workday, and SAP as part of integration work.
  • Hands-on experience in software design and development using Java (Core Java, Collections Framework, JDBC, Servlets, JSP, Spring, Hibernate, JavaScript).
  • Expertise in the Struts, Spring, JSF, and Hibernate web frameworks.
  • Involved in the development of SOAP-based web services using WSDL, SOAP, JAXB, CXF, Axis, and JAX-WS, and of RESTful web services using the JAX-RS, CXF, and Jersey APIs (a minimal JAX-RS sketch follows this list).
  • Experience with XML technologies including XML, DTD, XSD, XSLT, JAXP (DOM and SAX), and JAXB.
  • Experience working with Business Rule Management Systems (BRMS) using ILOG Rule Studio, and used JBoss Drools to define the rules that drive the application.
  • Expertise in writing SQL queries and PL/SQL stored procedures, functions, sequences, cursors, triggers, indexes, etc. against different databases: Oracle, DB2, and SQL Server.
  • Experience in configuring and deploying applications on the Tomcat web server and on WebSphere, WebLogic, and JBoss application servers. Experience using IDEs such as Eclipse, MyEclipse, and RAD.
  • Experience in using different version control/tracking systems: Git, StarTeam, Rational ClearCase, and VSS (Visual SourceSafe).
  • Experience with the AIX/UNIX and HP-UX operating systems and with shell scripting.
  • Experience using tools such as Log4j, Ant, SoapUI, FileZilla, and PuTTY.
  • Strong application integration experience using Mule ESB with connectors, transformations, routing, ActiveMQ, JMS, and RabbitMQ, including data transformations.
  • Experience with onsite/offshore delivery models; led an offshore team of more than five people.
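
A minimal sketch of a JAX-RS resource of the kind mentioned above (hostable under CXF or Jersey); the resource path, class, and payload are illustrative assumptions, not taken from any project on this resume:

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical resource: returns a small JSON document for an account id.
@Path("/accounts")
public class AccountResource {

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getAccount(@PathParam("id") String id) {
        // A real service would delegate to a service/DAO layer here.
        String json = String.format("{\"id\":\"%s\",\"status\":\"ACTIVE\"}", id);
        return Response.ok(json).build();
    }
}
```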

TECHNICAL SKILLS

  • Java, J2EE, SOAP, REST, MuleSoft, B2B, Cloud, UNIX, DB2, Hadoop, XML, HTML, JavaScript, CSS
  • RESTful web services, Apache Tomcat, Scrum methodology, Bootstrap, CXF, SoapUI 5.2.0, LISA 7.5, GUI interfaces, Swing and AWT, threads, networking, Ant, JUnit, Solaris, Adobe Flex 3, Cairngorm framework, Oracle 10g/9i

PROFESSIONAL EXPERIENCE

Confidential

Sr. Big Data Consultant

Responsibilities:

  • Designed and implemented historical and incremental data ingestion from multiple external systems.
  • Designed physical data models for structuring raw data in HDFS.
  • Designed MapReduce logic and Hive queries for generating aggregated metrics.
  • Designed and developed data migration logic for exporting data from HDFS/Hive to Vertica using WebHDFS.
  • Designed and developed complex Oozie workflows for recurrent job execution.
  • Involved in gathering functionality of different products from BRD, HLAD and FDN.
  • Created automated Python scripts to convert data from different sources and to generate the ETL pipelines.
  • Involved in preparing the TSD design document with sequence diagrams and class diagrams using Microsoft Visio.
  • Used the Spring Framework to inject services, entity services, transaction management, and cross-cutting concerns via factory classes corresponding to the use-case operation being executed.
  • Involved in using Spring concepts: DI/IoC, AOP, batch implementation, and Spring MVC.
  • Involved in declarative transaction management using Spring AOP.
  • Created WSDLs per wireframes and UI pages, and generated client JARs using JAX-WS.
  • Used Apache CXF to create SOAP-based and RESTful web services. Gained exposure to building and deploying changes in the production environment by executing the build script and deploying the compiled GWT client-side and server-side code to the production server. Extensively used JSP in the view layer of the MVC architecture.
  • Involved in creating internal & external services, Entity services and handlers.
  • Involved in defining JRules for resources to show details on UI pages.
  • Involved in writing SQL queries and PL/SQL stored procedures, functions, triggers, cursors, object types, sequences, and indexes.
  • Involved in WebSphere server configuration, data sources, connection pooling, and MQ Series queue setup for messaging, and in deploying the apps to different servers in different environments (QA/IST/Production).
  • Involved in creating JUnit test cases and ran the test suite using the EMMA coverage tool.
  • Ran Checkstyle, PMD, and FindBugs, and fixed the reported defects. Involved in fixing defects identified in the QA and IST phases, and tracked QC status per the guidelines.
  • Involved in creating HTTP inbound and outbound flows and orchestration using XPath in Mule ESB.
  • Worked on transformers, exception handling, testing, and securing Mule ESB endpoints through WSSR.
  • Wrote application code for large projects, especially in an SOA environment with Mule ESB 3.5.4.
  • Extensively used Mule ESB components like File Transport, SMTP Transport, FTP/SFTP Transport, JDBC Connector, and Transaction Manager.
  • Utilized partner WSDL for ESB to Salesforce & SAP integrations.
  • Integrated web services including SOAP as well as REST using Mule ESB.
  • Involved in unit testing using Mockito, as well as integration testing (a minimal Mockito sketch follows this list).
  • Involved in peer level design & code reviews.
  • Supported IST, QA and UAT builds and identified issues in Smoke testing and tracked them efficiently.
  • Involved in deploying the application on UNIX and in reading server logs to fix UAT/Production defects.
  • Involved in building the code using Ant and deploying it to the server.
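
A minimal sketch of the Mockito-style unit test mentioned above; the service and DAO types are hypothetical stand-ins, not classes from the actual project:

```java
import static org.mockito.Mockito.*;
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical collaborators used only to illustrate the mocking style.
interface AccountDao {
    String findStatus(String accountId);
}

class AccountService {
    private final AccountDao dao;
    AccountService(AccountDao dao) { this.dao = dao; }
    String statusFor(String accountId) { return dao.findStatus(accountId); }
}

public class AccountServiceTest {

    @Test
    public void returnsStatusFromDao() {
        // Mock the DAO so the test exercises only the service logic.
        AccountDao dao = mock(AccountDao.class);
        when(dao.findStatus("42")).thenReturn("ACTIVE");

        AccountService service = new AccountService(dao);

        assertEquals("ACTIVE", service.statusFor("42"));
        verify(dao).findStatus("42");
    }
}
```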

Environment: Java 1.6, J2EE, Mule ESB 3.5.4, Hadoop, HBase, Scala, Cassandra, NoSQL, MongoDB, DAO, HTML, JavaScript, XML, CSS, Ajax, WebSphere Application Server, LDAP, Oracle 10g, Log4j, Eclipse, CVS, Dojo, Ant, SOA, SOAP, DB2, PL/SQL, SQL, PostgreSQL, Web Services (WSDL, SOAP, UDDI), SoapUI, JAX-RS, Jersey, Windows XP.

Confidential, Chicago, IL

Big Data Developer / Architect

Responsibilities:

  • Delivered a proof of concept for web log analytics on Hadoop to customers.
  • Worked with business users to ensure the technology implemented meets the expectations of a demanding business data-mining community.
  • Maintained all system configuration documentation by collecting, storing, and updating the documentation.
  • Implemented Hadoop, Flume, Pig, Hive, and Sqoop.
  • Installed and configured the Cassandra database and generated graphs using Titan and Gremlin queries.
  • Created automated Python scripts to validate the data flow through Elasticsearch.
  • Implemented and configured Flume to stream the log data from various data sources.
  • Experience in writing Pig Latin scripts to parse the log files using regexes.
  • Created Hive external tables with partitions for loading the parsed data.
  • Analyzed the data using HiveQL and Pig Latin scripts.
  • Wrote MapReduce jobs to cleanse and parse data in HDFS obtained from various data sources and migrated it to MPP databases such as Teradata (a minimal mapper sketch follows this list).
  • Provide ad hoc reports for data extraction and aggregation.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Pig scripts and Hive queries.
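
A minimal sketch of the log-cleansing/parsing MapReduce work referenced above, written against the Hadoop mapper API; the regex and the status-code metric are illustrative assumptions:

```java
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Emits (statusCode, 1) for each well-formed web log line; malformed
// lines are dropped, which is the "cleansing" step. The regex assumes a
// combined-log-style format and is purely illustrative.
public class LogStatusMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final Pattern LOG_LINE =
            Pattern.compile("^\\S+ \\S+ \\S+ \\[[^\\]]+\\] \"[^\"]*\" (\\d{3}) \\d+");

    private static final IntWritable ONE = new IntWritable(1);
    private final Text status = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        Matcher m = LOG_LINE.matcher(value.toString());
        if (m.find()) {
            status.set(m.group(1));
            context.write(status, ONE);
        }
    }
}
```
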
Environment: Hadoop, Pig, Hive, Flume, Core Java, Eclipse

Confidential, IL

Lead Hadoop Developer

Responsibilities:

  • Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting.
  • Good experience in developing Hive DDL to create, alter, and drop Hive tables.
  • Involved in developing Hive UDFs for needed functionality that is not available out of the box in Apache Hive (a minimal UDF sketch follows this list).
  • Used HCatalog to access Hive table metadata from MapReduce or Pig code.
  • Used Java MapReduce to compute various metrics that define user experience, revenue, etc.
  • Responsible for developing a data pipeline using Flume, Sqoop, and Pig to extract the data from weblogs and store it in HDFS. Designed and implemented various metrics that can statistically signify the success of an experiment.
  • Worked on AWS to create EC2 instances and installed Java, ZooKeeper, and Kafka on those instances.
  • Involved in using Sqoop for importing and exporting data into HDFS and Hive.
  • Developed Pig scripts for change data capture and delta record processing between newly arrived data and data already existing in HDFS.
  • Involved in pivoting the HDFS data from rows to columns and from columns to rows.
  • Involved in emitting processed data from Hadoop to relational databases or external file systems using Sqoop, HDFS get, or copyToLocal.
  • Involved in developing shell scripts to orchestrate execution of all the other scripts (Hive and MapReduce) and to move the data files within and outside of HDFS.
  • Attended a couple of workshops on Spark, RDDs, and Spark Streaming.
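
A minimal sketch of a Hive UDF as mentioned above, using the classic UDF API; the masking behavior and function name are illustrative assumptions, not taken from the actual project:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative UDF: masks all but the last four characters of a string.
// Registered in Hive with (hypothetical jar/function names):
//   ADD JAR mask-udf.jar;
//   CREATE TEMPORARY FUNCTION mask_tail AS 'MaskTailUDF';
public class MaskTailUDF extends UDF {

    public Text evaluate(Text input) {
        if (input == null) {
            return null;  // Hive semantics: NULL in, NULL out.
        }
        String s = input.toString();
        int keep = Math.min(4, s.length());
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(s.substring(s.length() - keep));
        return new Text(masked.toString());
    }
}
```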

Environment: Java, J2EE, Hadoop, MapReduce, YARN, Hive, HBase, Oozie, Sqoop, Storm, Flume, AWS, Oracle 11g, Core Java, Cloudera HDFS, Eclipse.

Confidential, Charlotte, NC

Lead Hadoop Developer

Responsibilities:

  • Imported data from Teradata systems to AWS S3 using data transfer jobs and Spark with Scala on distributed systems.
  • Experience in Hadoop data ingestion using the ETL tools Talend and DataStage, and in Hadoop transformations (including MapReduce and Scala).
  • Worked on the in-built Quantum application used to run our workflow with Spark applications.
  • Experienced with the Linux operating system and shell scripting.
  • Supported data analysis projects using Elastic MapReduce on the Amazon Web Services (AWS) cloud, exporting and importing data into S3.
  • Supported setting up the QA and Production environments and updating configurations for implementing scripts with Spark and Scala.
  • Worked on RunCukes Cucumber test reports and have a good understanding of the Gherkin language for running ATDD.
  • Used Apache Spark for real time and batch processing.
  • Developed Spark code using Scala and Spark SQL/Streaming for faster testing and processing of data.
  • Integrated Apache Storm with Kafka to perform web analytics, uploading clickstream data from Kafka to HDFS, HBase, and Hive via Storm.
  • Used Kibana and Elasticsearch for handling log messages produced by multiple systems.
  • Implemented Chef servers for scheduling the cron jobs for the Spark applications.
  • Worked on the Jenkins server to build the Scala projects for the Spark applications, with Nexus build repositories where all build artifacts are stored.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala; good experience in using the Spark shell and Spark Streaming.
  • Implemented MongoDB and set up Mongo components to write data to Mongo and S3 simultaneously and to read the data back from Mongo.
  • Set up MongoDB in different environments (Dev, QA, Prod).
  • Used GitHub as the version control tool.
  • Developed scalable modular software packages for various APIs and applications.
  • Implemented procedures for measurement and optimization of performance of new and current systems.
  • Experience in deploying data from various sources into HDFS and building reports using Tableau.
  • Developed a data pipeline using Kafka and Storm to store data into HDFS (a minimal producer sketch follows this list).
  • Performed real time analysis on the incoming data.
  • Configured, deployed, and maintained multi-node Dev and Test Kafka clusters.
  • Performed transformations, cleaning, and filtering on imported data using Hive and MapReduce, and loaded the final data into HDFS.
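
A minimal sketch of the Kafka-producing side of such a pipeline; the broker address, topic name, and payload are placeholder assumptions, not values from the actual project:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Sends one clickstream event to a Kafka topic; a Storm topology on the
// consuming side would then write the events into HDFS.
public class ClickstreamProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic and payload are illustrative only.
            producer.send(new ProducerRecord<>(
                    "clickstream", "user-42", "{\"page\":\"/home\",\"ts\":1}"));
        }
    }
}
```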

Environment: AWS S3, EC2, Spark, Scala, Pig, Hive, Kafka, Hortonworks, MongoDB, Sqoop, Apache Camel, Oozie, HCatalog, Chef, Jenkins, Artifactory, Avro, IBM Data Studio

Confidential, San Jose, CA

Hadoop Developer

Responsibilities:

  • Set up an Oracle Big Data Appliance multi-node environment.
  • Developed code to merge multiple JSON files and load them into HDFS (a minimal sketch follows this list).
  • Created the Hive metastore and tables to take the data from JSON and perform DW queries.
  • Involved in developing Hive UDFs and reusing them for other requirements; worked on performing join operations.
  • Developed SerDe classes.
  • Develop histograms using R
  • Developed fingerprinting rules in Hive which help in uniquely identifying a driver profile.
  • Performed importing and exporting of data from Oracle to HDFS and Hive using Sqoop.
  • Performed source data ingestion, cleansing and transformation in Hadoop
  • Developed automated workflows to schedule the jobs using Oozie.
  • Developed a technique to incrementally update Hive tables (a feature not supported by Hive at the time).
  • Created metrics and executed unit tests on input, output and intermediate data
  • Led the testing team and meetings with onshore for requirements gathering.
  • Assisted the team in creating documents detailing the process involved in cluster setup.
  • Involved in the analysis, design, and development of the application components using JSP, Servlets, EJB components, and J2EE design patterns.
  • Made changes in the JSP pages according to the requirement.
  • Designed Graphical User Interface (GUI) for various WebPages using AJAX, HTML, CSS and JavaScript.
  • Created Ajax forms for update operations
  • Data was converted into JSON using JSP tags.
  • Developed multi-threaded code for web application development.
  • Developed online SQL query facility using JSTL to manage the database.
  • Worked on the Unit testing and Integration testing.
  • Worked on stored procedures using the Oracle database.
  • Registration process handled by an entity bean which communicates with the Oracle database.
  • Developed server side utilities using J2EE technologies Servlets, JSP, EJB.
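
A minimal sketch of merging local JSON files into a single HDFS file via the Hadoop FileSystem API, as referenced above; the input and output paths are placeholder assumptions:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Concatenates every *.json file in a local directory into one HDFS file,
// newline-delimited so Hive can read it as one JSON document per line.
// Paths are placeholders, not values from the actual project.
public class JsonMerger {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();  // picks up core-site.xml
        FileSystem fs = FileSystem.get(conf);

        File[] inputs = new File("/data/incoming")
                .listFiles((dir, name) -> name.endsWith(".json"));
        if (inputs == null) return;

        try (OutputStream out = fs.create(new Path("/warehouse/raw/merged.json"))) {
            for (File f : inputs) {
                try (InputStream in = new FileInputStream(f)) {
                    IOUtils.copyBytes(in, out, 4096, false);
                }
                out.write('\n');  // keep one JSON document per line
            }
        }
    }
}
```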

Environment: Core Java (GUI interfaces, Swing and AWT, threads, networking), J2EE, MuleSoft ESB, Servlets, JSP, XML, design patterns, Oracle 8i, cloud computing (Platform as a Service), SQL, PL/SQL, JBoss, Eclipse, JUnit, R, SAS Base, MapReduce, Oracle Big Data Appliance, Apache Hive, HDFS, Oozie, Sqoop

Confidential, Sunnyvale, CA

Java/Hadoop Developer

Responsibilities:

  • Developed application service components and configured beans using Spring IoC, created Hibernate mapping files, and generated the database schema.
  • Worked with NoSQL and Big Data technologies such as MongoDB, Cassandra, and Hadoop.
  • Worked on JavaScript to validate input and manipulate HTML elements, and developed external JavaScript code that can be reused across several different web pages.
  • Implemented EJB session beans to maintain the mobile session.
  • Implemented methods to validate, invalidate, and keep alive sessions for the login process and to maintain session credentials.
  • Developed REST services to talk with adapter classes and exposed them to the AngularJS front end.
  • High use of Selenium in collecting client information, development, identifying test cases, compatibility testing, automation of test scripts, Flex application testing and design, requirements review, design review, and test plan review.
  • Implemented application level persistence using Hibernate and Spring.
  • Configured Struts, Hibernate framework with Spring MVC.
  • Set up Selenium tooling from scratch and configured various other peripheral tools to perform Selenium tests (a minimal WebDriver sketch follows this list).
  • Experience working with Selenium, QC, Rally, QTP, LoadRunner, JMeter, Fiddler, SoapUI, REST/SOAP testing, and API testing.
  • Expertise in MVC Architecture using JSF and Struts framework and implementing custom tag libraries.
  • Developed the application using Struts Framework which is based on the MVC design pattern.
  • Deployed the application on Weblogic Application Server cluster on Solaris environment.
  • Deployed EJB Components on WebLogic.
  • Creation of REST Web Services for the management of data using Apache CXF.
  • Architected and designed the RESTful web services and developed core component layers such as XML validation, the core service layer, Solr search, and transformation components.
  • Development of AJAX toolkit based applications using JSON.
  • Developed additional UI components using JSF and implemented an asynchronous, AJAX (jQuery) based rich client to improve the customer experience.
  • Involved in the development of the presentation layer and GUI framework using Ext JS and HTML. Client-side validations were done using JavaScript.
  • Involved in adding AJAX, JavaScript components to some of the JSP pages wherever needed.
  • Developed user interface using JSP, AJAX, JSP Tag libraries and Struts Tag Libraries to simplify the complexities of the application.
  • Developed user interface using JSP, JSTL and Custom Tag Libraries and AJAX to speed the application.
  • Developed Servlets and JSPs based on MVC pattern using Struts framework and Spring Framework.
  • Worked on Data Services implementation for the CRUD services.
  • Developed the UML Use Cases, Activity, Sequence and Class diagrams using Rational Rose.
  • Developed Oracle PL/SQL Stored Procedures and Queries for Payment release process and authorization process.
  • Developed programs for accessing the database using JDBC thin driver to execute queries, Prepared statements, Stored Procedures and to manipulate the data in the database.
  • Involved in debugging the product using Eclipse and JIRA Bug Tracking.
  • Involved in JUnit Testing of various modules by generating the Test Cases.
  • Configured Maven dependencies for application building processes.
  • Developed XSD for validation of XML request coming in from Web Service.
  • Implemented a prototype to integrate PDF documents into a web application using iText PDF library.
  • Extensively used QA tools: Selenium, QuickTest Professional (QTP), WinRunner.
  • Designed and developed client and server components of an administrative console for a business process engine framework using Java, Google Web Toolkit and spring technologies.
  • Maintained Makefiles and the like; worked with ClearCase source management.
  • Configured the GlassFish server; designed the shipping rate template upload UI using Adobe Flex and developed Jasper reports.
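
A minimal sketch of the kind of Selenium WebDriver test referenced above; the URL, element locators, and credentials are placeholder assumptions, not values from the actual application under test:

```java
import static org.junit.Assert.assertTrue;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Drives a hypothetical login page end to end and checks that the
// post-login page title looks right.
public class LoginPageTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new ChromeDriver();
    }

    @Test
    public void validLoginShowsDashboard() {
        driver.get("https://example.com/login");  // placeholder URL
        driver.findElement(By.id("username")).sendKeys("demo-user");
        driver.findElement(By.id("password")).sendKeys("demo-pass");
        driver.findElement(By.id("submit")).click();

        assertTrue(driver.getTitle().contains("Dashboard"));
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}
```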

Environment: Java, Hadoop, HBase, Cassandra, Hortonworks, MapReduce, NoSQL, J2EE, Servlets, JSTL, JSF, ICEfaces, XML, iRise, CSS, Spring, Hibernate, Struts, WebLogic, Apache CXF and REST, jQuery, AJAX, Ext JS, JavaScript, JSP and Servlets, Oracle, CRUD, SQL, UML, Eclipse, JUnit, Maven, iText, WebLogic Application Server, Ant, Solaris, Windows, JAXB, JMS, Log4j.
