- Over 7 years of professional experience in IT, with expertise in Enterprise Application Development, including 3 years as a Hadoop Developer with good knowledge of Hadoop ecosystem technologies.
- Hands-on experience with Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, Flume, Spark, and Kafka.
- Experience developing Pig Latin scripts and using Hive Query Language (HiveQL).
- Experience using the NoSQL big-data database HBase; familiar with MongoDB and Cassandra storage concepts.
- Experience writing MapReduce programs, Pig scripts, and Hive queries.
- Extended Hive and Pig core functionality with custom User-Defined Functions (UDFs).
- Working knowledge of Oozie, a workflow scheduler used to manage Pig, Hive, and Sqoop jobs.
- Developed analytical components using Spark and Spark Streaming.
- Good knowledge of implementing Apache Spark applications in Scala.
- Extensive knowledge of AWS (Amazon Web Services) services such as S3, EMR, and EC2.
- 4+ years of experience in Java programming, with hands-on experience in the Spring, Struts, and Hibernate frameworks.
- Well versed in core Java concepts such as collections, multithreading, serialization, and JavaBeans.
- Extensive experience implementing web services based on Service-Oriented Architecture (SOA) using SOAP, RESTful web services, JAX-WS, UDDI, WSDL, and Apache Axis.
- Experience programming SQL, PL/SQL stored procedures, and triggers in Oracle and SQL Server.
- Hands-on experience with version control tools such as SVN, CVS, Visual SourceSafe (VSS), and GitHub.
- Experienced in working with different operating systems: Windows, UNIX, and Linux.
Big Data Technologies: Apache Spark, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Kafka, Impala, Oozie, YARN.
Languages: C, C++, Java, J2EE, Scala, SQL, PL/SQL, Pig Latin, HiveQL.
Frameworks: Spring, Struts, Hibernate, PLAY.
Scripting Languages: Python, Shell scripting, JavaScript, and Perl scripting.
Web Technologies: HTML, XHTML, CSS, XML, XSL, XSLT, Ajax.
Web Services: SOAP, RESTful, AWS, JAX-WS, Apache Axis.
Web Servers: WebLogic, WebSphere, Apache Tomcat.
Application Build Tools: Apache Ant, Apache Maven.
Automation/Scripting: Unix Shell Scripts, Windows Batch Scripts
Application Servers: Apache Tomcat, GlassFish, JBoss, BEA WebLogic, IBM WebSphere
Databases: Oracle 9i/10g/11g, SQL Server, MySQL, Teradata; NoSQL: Cassandra.
Version Control Systems: CVS, SVN.
Confidential - Chicago, IL
Sr. Spark/Hadoop Developer
- Involved in all phases of application development: requirement analysis, design, development, deployment, and testing.
- Migrated required data from Oracle and MySQL into HDFS using Sqoop and imported flat files in various formats into HDFS.
- Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.
- Used Pig as an ETL tool for transformations, joins, and pre-aggregations before storing the data in HDFS.
- Analyzed the data by performing Hive queries and running Pig scripts to validate sales data.
- Wrote MapReduce programs and Hive and Pig UDFs in Java.
- Used Spark SQL to process large volumes of structured data and implemented Spark RDD transformations and actions to migrate MapReduce algorithms.
- Used Tableau for data visualization and generating reports.
- Developed Spark code using Scala and Spark-SQL for faster testing and data processing.
- Converted PL/SQL code into Scala and PL/SQL queries into HQL.
- Involved in scheduling Oozie workflow engine to run multiple Hive and Pig jobs.
Environment: Cloudera, Oracle 12c, HDFS, MapReduce, Hive, Pig, Oozie, Sqoop, Flume, Hue, Tableau, Scala, Spark, ZooKeeper, Apache Ignite, SQL, PL/SQL, UNIX shell scripts, Java, STS (Spring Tool Suite), Maven, JUnit, MRUnit.
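The MapReduce-to-Spark-RDD migration described above followed the map/reduceByKey shape. The production code was in Scala and Java; as an illustrative sketch only, the same shape in plain Python (listed under Scripting Languages), with hypothetical (region, amount) sales pairs standing in for the real records:

```python
from collections import defaultdict

def reduce_by_key(pairs):
    """Mirror of Spark's reduceByKey((a, b) => a + b) on an in-memory list of pairs."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Hypothetical sales records: (region, amount)
sales = [("east", 100.0), ("west", 50.0), ("east", 25.0)]
totals = reduce_by_key(sales)
```

On an actual RDD the same aggregation is a one-liner (`rdd.reduceByKey(_ + _)`), which is what made the migration from hand-written MapReduce jobs attractive.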
Confidential - Detroit, MI
- Developed Pig UDFs to process the data for analysis.
- Involved in loading data from the Linux file system to HDFS.
- Ran ad-hoc queries using Pig Latin, Hive, or Java MapReduce.
- Imported and exported data between HDFS/Hive and external systems using Sqoop and Flume.
- Developed multiple POCs using Scala, deployed them on the YARN cluster, and compared the performance of Spark with Hive and SQL/Teradata.
- Proficient in using Cloudera Manager, an end-to-end tool for managing Hadoop services.
- Reviewed the HDFS usage and system design for future scalability and fault-tolerance.
- Developed Hive queries for the analysts.
- Worked extensively with Sqoop for importing metadata from Oracle.
- Experienced in defining job flows using Oozie.
- Developed Shell Script to perform Data Profiling on the ingested data with the help of hive bucketing.
- Working knowledge of NoSQL databases such as HBase and Cassandra.
- Generated property lists for every application dynamically using Python.
- Managed batch jobs using UNIX shell and Perl scripts.
- Used SVN and GitHub as version control tools.
Environment: JDK 1.6, HDFS, MapReduce, Spark, Scala, YARN, Hive, Pig, Sqoop, Flume, Oozie, Impala, Cloudera, HBase (NoSQL), Cassandra, Oracle 11g, Python, Shell scripting, Perl, Linux, SVN, GitHub.
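The data-profiling step above was done in Shell with Hive bucketing; at its core it is a per-column missing-value count. A minimal sketch of that check in Python (column names and records here are hypothetical, not the actual profiled data):

```python
def profile_missing(rows, columns):
    """Count null/empty values per column, the core check of a data-profiling pass."""
    counts = {col: 0 for col in columns}
    for row in rows:
        for col in columns:
            if row.get(col) in (None, ""):
                counts[col] += 1
    return counts

# Hypothetical ingested records
rows = [{"id": 1, "name": "a"}, {"id": 2, "name": ""}, {"id": None, "name": "c"}]
report = profile_missing(rows, ["id", "name"])
```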
Confidential, St. Louis, MO
- Implemented Pig scripts, integrated them into Oozie workflows, and performed integration testing.
- Loaded data from RDBMS servers to Hive using Sqoop.
- Created Hive tables to store the processed results in a tabular format.
- Developed Java Mapper and Reducer programs for complex business requirements.
- Used different data formats (text and ORC) while loading data into HDFS.
- Created Managed tables and External tables in Hive and loaded data from HDFS.
- Optimized the Hive tables using optimization techniques like partitions and bucketing to provide better performance with HiveQL queries.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from a variety of data sources.
- Created partitioned tables and loaded data using both static partition and dynamic partition method.
- Developed workflows to schedule various Hadoop programs using Oozie and Pig UDFs.
- Created custom user defined functions in Hive.
- Designed and implemented a stream filtering system on top of Apache Kafka to reduce stream size.
Environment: MapR, HDFS, MapReduce, YARN, Hive, Sqoop, Pig, Flume, HBase, Kafka, Oozie, Java, Linux Shell Scripts and SQL.
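The Kafka stream-filtering system above reduces stream size by dropping records that fail a predicate before they are re-published. A hedged sketch of that filter logic in Python (the record fields and the drop rule are hypothetical, for illustration only):

```python
def keep(record):
    # Hypothetical rule: drop heartbeat messages and undersized payloads.
    return record.get("type") != "heartbeat" and len(record.get("payload", "")) >= 5

def filter_stream(records):
    """Apply the predicate to a batch of consumed records, keeping only useful ones."""
    return [r for r in records if keep(r)]
```

In the real system this predicate would sit between a Kafka consumer and producer; the filtering itself is independent of the transport.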
Confidential, Southfield, MI
- Developed the Action and ActionForm classes, based on the Struts framework, to handle the pages.
- Implemented Struts and Spring frameworks.
- Implemented web services to integrate different applications using SOAP and RESTful services with Apache CXF.
- Used JMS for sending the messages to the Export Queue.
- Deployed and tested the JSP pages in WebSphere server.
- Participated in client application development using Swing/JFC components.
- Developed SQL queries for Oracle and SQL server as our application used two data sources.
- Developed PL/SQL stored procedures, triggers.
- Designed and implemented XML parsing for XML order conformations.
- Developed the Session Beans and deployed them in WebSphere application server.
- Developed external style sheets (CSS) to give the application a rich look.
- Designed and developed the front end using Servlets, JSP, DHTML, JavaScript, and AJAX.
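The XML parsing for order confirmations above was implemented in Java; as an illustrative sketch of the same extraction, here it is with Python's standard-library parser (the element names are hypothetical, not the actual order schema):

```python
import xml.etree.ElementTree as ET

def parse_confirmation(xml_text):
    """Pull the order id and status out of a confirmation document."""
    root = ET.fromstring(xml_text)
    return {"order_id": root.findtext("orderId"), "status": root.findtext("status")}

sample = "<confirmation><orderId>42</orderId><status>SHIPPED</status></confirmation>"
order = parse_confirmation(sample)
```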
Jr. Java Developer
- Involved in Design, Development, Testing and Integration of the application.
- Involved in development of user interface modules using HTML, CSS and JSP.
- Involved in writing SQL queries.
- Involved in coding, maintaining, and administering Servlet and JSP components deployed on Apache Tomcat application servers.
- Accessed the database and stored procedures using JDBC.
- Worked on bug fixing and enhancements on change requests.
- Coordinated tasks with clients, support groups and development team.
- Worked with the QA team on test automation using QTP.
- Participated in weekly design reviews and walkthroughs with project manager and development teams.
Environment: Java, Eclipse, Oracle, HTML, Ant, CSS, JSP, JDBC, SQL and Tomcat.
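The JDBC access pattern in this role centered on parameterized queries against Oracle. As a sketch only, the same shape with Python's stdlib sqlite3 standing in for the JDBC driver and Oracle database (table and column names are hypothetical):

```python
import sqlite3

def find_user(conn, user_id):
    """Parameterized lookup, the same shape as a JDBC PreparedStatement."""
    cur = conn.execute("SELECT name FROM users WHERE id = ?", (user_id,))
    row = cur.fetchone()
    return row[0] if row else None

# In-memory stand-in for the application's data source
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")
```

Binding parameters with placeholders rather than string concatenation is the point of the pattern, in JDBC and here alike.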