
Hadoop, MarkLogic (NoSQL) Developer Resume


Phoenix, AZ

PROFESSIONAL SUMMARY:

  • More than 9 years of experience in the IT industry, involved in developing, implementing, testing, and maintaining various web-based applications using J2EE technologies and Big Data ecosystems in Linux environments.
  • Includes 3 years of comprehensive experience as a Hadoop, Big Data & Analytics Developer.
  • Expertise in Hadoop architecture and ecosystem components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience in installing, configuring, supporting, and monitoring Hadoop clusters using Apache and Cloudera distributions and AWS.
  • Knowledge of installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, ZooKeeper, Falcon, Spark, and Flume.
  • Imported and exported data between HDFS and RDBMSs using Sqoop.
  • Knowledge of R, with a background in statistics.
  • Strong knowledge of Kafka, a unified, high-throughput, low-latency platform for handling real-time data feeds.
  • Knowledge of Gradle and Elasticsearch.
  • Knowledge of Falcon for enhanced operations, support for transactional applications, and improved tooling.
  • Experience in importing and exporting data between HDFS and relational/non-relational database systems using Apache Sqoop.
  • Extended Hive and Pig core functionality by writing custom UDFs (see the UDF sketch after this list).
  • Awareness of metadata tools and techniques.
  • Experienced in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
  • Experience in building and maintaining multiple Hadoop clusters (prod, dev, etc.) of different sizes and configurations, and setting up rack topology for large clusters.
  • Worked on NoSQL databases including MarkLogic, HBase, Cassandra, and MongoDB.
  • Experienced with job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experienced in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
  • Experienced in data warehousing and ETL tools such as Informatica and Pentaho.
  • Expert-level skills in developing intranet/internet applications using Java/J2EE technologies, including the Struts framework, MVC design patterns, Chordiant, Servlets, JSP, JSTL, XML/XSLT, JavaScript, AJAX, EJB, JDBC, JMS, JNDI, RDBMS, SOAP, Hibernate, and custom tag libraries.
  • Experience using XML, XSD, and XSLT.
  • Experience with web-based UI development using jQuery UI, jQuery, ExtJS, CSS, HTML, HTML5, XHTML, and JavaScript.
  • Extensive experience in middle-tier development using J2EE technologies such as JDBC, JNDI, JSP, Servlets, JSF, Struts, Spring, Hibernate, and EJB.
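
A minimal sketch of a custom Hive UDF of the kind referenced above, assuming the classic org.apache.hadoop.hive.ql.exec.UDF API; the class name and the normalization rule are illustrative only:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Illustrative UDF: trims and upper-cases a string column so that keys
    // compare consistently during joins.
    public final class NormalizeKey extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // Hive passes NULL through unchanged
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Once packaged into a jar, such a function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in HiveQL.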

TECHNICAL SKILLS:

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, NoSQL DB, CDH3, CDH4, Apache Hadoop.

Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans

IDEs: Eclipse, NetBeans

Frameworks: MVC, Struts, Hibernate, Spring

Programming languages: C, C++, Java, Python, Ant scripts, Linux shell scripts

Build Management Tools: Maven, Apache Ant

Version control: SVN, GitHub, ClearCase

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web/Application Servers: WebLogic, WebSphere, Apache Tomcat

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP

ETL Tools: Informatica, Pentaho, Ab Initio

Testing: WinRunner, LoadRunner, QTP

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

Hadoop, MarkLogic (NoSQL) Developer

Environment: QNXT Database, XSLT, MarkLogic, XQuery, MLCP, SoapUI, REST API, APIC

Responsibilities:

  • Working on the IDigital (Medicaid) project to digitize the process of sharing health plan data with external vendors.
  • Source data standardization and extraction.
  • Transforming the data to the enterprise PIM model.
  • Data ingestion into MarkLogic, internal REST APIs from MarkLogic, and external REST APIs from APIC (see the ingestion sketch after this list).
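
A minimal sketch of document ingestion into MarkLogic, assuming the MarkLogic Java Client API with digest authentication; the host, port, credentials, document URI, and payload are placeholders:

    import com.marklogic.client.DatabaseClient;
    import com.marklogic.client.DatabaseClientFactory;
    import com.marklogic.client.document.XMLDocumentManager;
    import com.marklogic.client.io.StringHandle;

    public class IngestExample {
        public static void main(String[] args) {
            // Placeholder connection details for a MarkLogic app server.
            DatabaseClient client = DatabaseClientFactory.newClient(
                    "localhost", 8000,
                    new DatabaseClientFactory.DigestAuthContext("user", "password"));
            try {
                XMLDocumentManager docMgr = client.newXMLDocumentManager();
                String xml = "<member><id>123</id></member>";
                // Write the document at a URI so downstream REST services can retrieve it.
                docMgr.write("/members/123.xml", new StringHandle(xml));
            } finally {
                client.release();
            }
        }
    }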

Confidential, Eden Prairie, MN

Hadoop and MarkLogic Developer

Environment: Sqoop, CDC (IBM), Apache Hive, HBase, Splunk, MapR, Talend, Unix, MarkLogic, IMS (Identity Management System), MDM (Master Data Management)

Responsibilities:

  • Involved in building an organized repository (data lake) for all raw/unfiltered structured and unstructured data, providing centralized data storage for a common enrichment repository.
  • Common data acquisition from disparate trading partners.
  • Centralized data storage.
  • Generic data validation and standardization framework enabling consistency across common data attributes.
  • Common enrichment services, based on contracted services, that improve the usability of data for specific client use.
  • Standard data provisioning.
  • Performed profiling and performance tuning, and loaded content into the MarkLogic repository.
  • Used HBase as a metastore for processing and created HBase tables with column families (see the sketch after this list).
  • Involved in data ingestion using Sqoop (historical data) and IBM Change Data Capture, which replicates heterogeneous data in near real time.
  • Created Hive tables and views for duplicate-record and data quality checks using QualityStage, and applied source-level security.
  • Monitored the workflows designed by the framework team and created failure-point analyses for all the workflows.
  • Fixed issues caused by Hive server connections.
  • Used Splunk dashboards to evaluate ingestion and snapshot status.
  • Created tables and partitions in Hive.
  • Diagnosed and documented operational problems.
  • Actively participated in documentation updates in OneConnect and in KT sessions for the operations team.
  • Worked in an Agile/Scrum development environment with test-driven development and unit test frameworks, including continuous integration.
  • Developed high-performance applications on MarkLogic Server, including XQuery, XML, and XPath development and application monitoring.
  • Developed server-side XQuery modules in the MarkLogic NoSQL server and REST APIs using the RESTXQ specification, with MarkLogic frameworks such as Roxy, CPF, MLCP, RXQ, and XRay.
  • Streamlined the metadata management framework that maintains a data catalog, captures key data attributes, enforces privacy protections on protected data elements, and supports data lineage.
  • Experienced working with Agile methodologies, Rally, Git repositories (version control), SoapUI, and shell scripting.
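
A minimal sketch of creating an HBase table with a column family, as referenced in the list above; the HBase 1.x client API is assumed, and the table and column family names are illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class CreateMetastoreTable {
        public static void main(String[] args) throws Exception {
            // Cluster settings are read from hbase-site.xml on the classpath.
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Admin admin = connection.getAdmin()) {
                HTableDescriptor table =
                        new HTableDescriptor(TableName.valueOf("ingest_metastore"));
                // A single column family holding ingestion/snapshot status columns.
                table.addFamily(new HColumnDescriptor("status"));
                if (!admin.tableExists(table.getTableName())) {
                    admin.createTable(table);
                }
            }
        }
    }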

Confidential, Hartford, CT

Hadoop Developer

Environment: Hadoop, DB2, SQL, IBM Data Studio, HDFS, Sqoop, Pig, Hive, Zeke, Linux, MarkLogic

Responsibilities:

  • Developed a framework that Sqoops data incrementally, per requirements, based on a timeline, and automated this process using the Zeke enterprise scheduler.
  • Involved in developing an automated framework that Sqoops the data and creates Hive tables on top of that data using shell scripts.
  • Worked extensively with Sqoop for importing data from DB2 to HDFS.
  • Created tables and partitions in Hive.
  • Created developer unit test plans and executed testing in the development cluster.
  • Involved in creating REST services and writing XQueries and XSLT transformations to ingest data into MarkLogic.
  • Involved in the implementation of the MarkLogic framework on Roxy.
  • Involved in developing a framework that creates external and managed tables in batch processing based on metadata files (see the sketch after this list).
  • Experienced with the SDLC, including coding standards, code reviews, source control management, and build processes.
  • Experienced in gathering requirements, converting requirements into technical specifications, and exploring efficient implementation methods.
  • Designed and documented operational problems by following standards and procedures, using IBM Rational ClearCase.
  • Worked on the Hortonworks Hadoop distribution.
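
A minimal sketch of the table-creation step of such a framework, assuming connectivity through the HiveServer2 JDBC driver; the connection URL, credentials, table definition, and HDFS location are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateExternalTable {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // Placeholder HiveServer2 endpoint and credentials.
            try (Connection con = DriverManager.getConnection(
                         "jdbc:hive2://hiveserver:10000/default", "hive", "");
                 Statement stmt = con.createStatement()) {
                // External table, partitioned by load date, over data landed by Sqoop.
                stmt.execute(
                    "CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw ("
                  + "  claim_id STRING, member_id STRING, amount DOUBLE) "
                  + "PARTITIONED BY (load_date STRING) "
                  + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' "
                  + "STORED AS TEXTFILE "
                  + "LOCATION '/data/raw/claims'");
            }
        }
    }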

Confidential, Bakersfield, CA

Hadoop Developer

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig, Apache Sqoop, Oozie, HBase, PL/SQL, MySQL

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop
  • Installed and configured Hive, Pig, Sqoop and Oozie on the Hadoop cluster
  • Developed simple to complex MapReduce jobs using Hive and Pig
  • Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop
  • Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
  • Used Pig UDFs to implement business logic in Hadoop
  • Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other sources (see the UDF sketch after this list).
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
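
A minimal sketch of a Pig UDF in Java of the kind referenced above, assuming the EvalFunc API; the class name and the transformation are illustrative:

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Illustrative EvalFunc: upper-cases its first argument; after REGISTERing
    // the jar it would be called from Pig Latin inside FOREACH ... GENERATE.
    public class ToUpper extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return input.get(0).toString().toUpperCase();
        }
    }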

Confidential, Fort Worth, TX

Hadoop Developer

Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig, Apache Sqoop, Oozie, HBase, PL/SQL, MySQL

Responsibilities:

  • Involved in various phases of Software Development Life Cycle.
  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (see the mapper sketch after this list).
  • Extracted files from Cassandra through Sqoop, placed them in HDFS, and processed them.
  • Used Flume to collect, aggregate, and store web log data from different sources such as web servers and mobile and network devices, and pushed it to HDFS.
  • Analyzed the data by performing Hive queries and running Pig scripts to study user behavior
  • Supported MapReduce programs running on the cluster.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Worked extensively with Sqoop for importing metadata from Oracle
  • Configured Sqoop and developed scripts to extract data from MySQL into HDFS
  • Hands-on experience productionizing Hadoop applications, viz. administration, configuration management, monitoring, debugging, and performance tuning
  • Created HBase tables to store various data formats of PII data coming from different portfolios
  • Trained and mentored analyst and test teams on the Hadoop framework, HDFS, MapReduce concepts, and the Hadoop ecosystem
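
A minimal sketch of a map-only MapReduce job for data cleansing of the kind referenced above; the delimiter and expected field count are illustrative assumptions:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CleanseRecords {

        // Drops malformed rows (wrong field count) and emits the rest unchanged.
        public static class CleanseMapper
                extends Mapper<LongWritable, Text, Text, NullWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",", -1);
                if (fields.length == 5) {            // expected field count (assumption)
                    context.write(value, NullWritable.get());
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "cleanse-records");
            job.setJarByClass(CleanseRecords.class);
            job.setMapperClass(CleanseMapper.class);
            job.setNumReduceTasks(0);                // map-only cleansing pass
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(NullWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }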

Confidential, San Jose, CA

Java Developer

Environment: Java, Servlets, JSP, Vignette Tool, XML, Eclipse 3.3.0

Responsibilities:

  • Played a vital role from requirements analysis through designing the presentation templates and CTDs based on business expectations.
  • Created the UI pages using Java, JSP, and Servlets (see the servlet sketch after this list).
  • Developed web pages of the application using Java.
  • Used the Vignette tool to enter content and generate the pages for the application.
  • Generated the pages in the form of XML and rendered them into live pages.
  • Used Eclipse 3.3.0 as the IDE.
  • Was the only person handling the entire project, from the analysis phase through implementation.
  • Communicated with the clients and business on a daily basis.
  • Supported user Acceptance testing of the application.
  • Successfully delivered the project on time without any failures.
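
A minimal sketch of a servlet backing one of the UI pages mentioned above, assuming the standard javax.servlet API; the attribute name and JSP path are illustrative:

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Illustrative servlet: prepares request data and forwards to a JSP view.
    public class ContentPageServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            request.setAttribute("pageTitle", "Product Overview");
            request.getRequestDispatcher("/WEB-INF/jsp/contentPage.jsp")
                   .forward(request, response);
        }
    }

The servlet would be mapped to a URL pattern in web.xml in the usual way.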

Confidential, OH

Java Developer

Environment: J2EE, Java/JDK, JDBC, JSP, Servlets, JavaScript, EJB, JNDI, JavaBeans, XML, XSLT, Oracle 9i, Eclipse, HTML/ DHTML.

Responsibilities:

  • Developed the Admission & Census module, which monitors a wide range of detailed information for each resident upon pre-admission or admission to the facility.
  • Involved in development of the Care Plans module, which provides a comprehensive library of problems, goals, and approaches. These libraries, and the disciplines used for care plans, can be tailored by adding, deleting, or editing problems, goals, and approaches.
  • Involved in development of the General Ledger module, which streamlines analysis, reporting, and recording of accounting information. General Ledger automatically integrates with a powerful spreadsheet solution for budgeting, comparative analysis, and tracking facility information for flexible reporting.
  • Developed the UI using HTML, JavaScript, and JSP, and developed business logic and interfacing components using business objects, XML, and JDBC.
  • Designed the user interface and checked validations using JavaScript.
  • Managed connectivity using JDBC for querying/inserting and data management, including triggers and stored procedures (see the stored-procedure sketch after this list).
  • Developed various EJBs for handling business logic and data manipulation from the database.
  • Involved in the design of JSPs and Servlets for navigation among the modules.
  • Designed cascading style sheets and the XML part of the Order Entry and Product Search modules, and did client-side validations with JavaScript.
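
A minimal sketch of calling a stored procedure through JDBC, as referenced above; the driver class, connection URL, procedure name, and parameters are hypothetical placeholders:

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class CensusUpdateDao {
        public static void main(String[] args) throws Exception {
            Class.forName("oracle.jdbc.OracleDriver");
            // Placeholder Oracle connection details.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:orcl", "app_user", "secret")) {
                // Hypothetical procedure that updates a resident's census record
                // and reports a status code through an OUT parameter.
                try (CallableStatement cs = conn.prepareCall("{call update_census(?, ?)}")) {
                    cs.setString(1, "RES-1001");
                    cs.registerOutParameter(2, Types.INTEGER);
                    cs.execute();
                    System.out.println("update_census returned " + cs.getInt(2));
                }
            }
        }
    }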

Confidential

Java Developer

Environment: Java 1.5, Java EE 5, Spring, JSP, XML, Spring TLD, Servlets, Hibernate Criteria API, XSLT, CSS, JSF, JSF RichFaces, WSAD 5.1, Java Swing, web services, Apache Axis2, WSDL, GlassFish, JSR 286 API, UML, EJB, JavaScript, jQuery, Hibernate, SQL, CVS, Agile, JUnit.

Responsibilities:

  • Developed controllers for request handling using the Spring framework
  • Worked with command controllers, handler mappings, and view resolvers.
  • Designed and developed application components and architectural proofs of concept using Java, EJB, JSP, JSF, Struts, and AJAX.
  • Participated in enterprise integration via web services
  • Configured JMS, MQ, EJB, and Hibernate on WebSphere and JBoss
  • Focused on declarative transaction management
  • Developed XML files for mapping requests to controllers
  • Coded Spring portlets to build portal pages for the application using the JSR 286 API
  • Used Hibernate templates to access the database (see the DAO sketch after this list)
  • Used DAOs in developing application code
  • Developed stored procedures.
  • Extensively used Java Collection framework and Exception handling.
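
A minimal sketch of a Hibernate-template-backed DAO of the kind referenced above, assuming Spring's Hibernate 3 support classes; the entity, its properties, and the HQL are hypothetical:

    import java.util.List;

    import org.springframework.orm.hibernate3.support.HibernateDaoSupport;

    // Hypothetical persistent entity; mapped via a Hibernate mapping file or annotations.
    class PortalPage {
        private Long id;
        private String name;
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // The SessionFactory is injected through Spring XML configuration.
    public class PortalPageDao extends HibernateDaoSupport {

        public void save(PortalPage page) {
            getHibernateTemplate().save(page);
        }

        @SuppressWarnings("unchecked")
        public List<PortalPage> findByName(String name) {
            return (List<PortalPage>) getHibernateTemplate().find(
                    "from PortalPage p where p.name = ?", name);
        }
    }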
