Sr. Big Data Developer Resume
Peoria, IL
SUMMARY:
- Oracle Certified Java Developer/Big Data Developer with 9+ years of experience specializing in Java/J2EE technologies, working through the complete Software Development Life Cycle.
- Extensively worked with Hadoop and AWS components such as Spark, Spark Streaming, Hive, HDFS, MapReduce, Sqoop, Oozie, Data Pipeline, S3, EC2, RDS, EMR, and Lambda.
- Expertise in SQL, Core Java, Scala.
- Strong experience with Oracle, DB2, MySQL and RDBMS concepts.
- Extensive knowledge in Systems Analysis, Design, Development of various Client/Server, Desktop and Web Applications.
- Experienced with RESTful and SOAP Web Services.
- Created shell scripts to invoke SQL scripts and scheduled them using crontab.
- Excellent skills in Web Technologies like JavaScript, CSS, HTML, Ajax, jQuery and XML.
- Excellent knowledge of Core Java including Multithreading and Collections framework.
- Excellent knowledge of Object-Oriented Programming (OOP) concepts.
TECHNICAL SKILLS:
Languages: Core Java, Scala, SQL, HiveQL.
Hadoop Ecosystem: AWS Cloud, Spark/Spark Streaming, Hadoop, Sqoop, Hive, Oozie, HDFS.
Web Services: SOAP and REST web services.
DB: Oracle 10g/11g, IBM DB2, MySQL, Redshift, RDS.
Systems: Windows, Linux, Unix.
Frameworks: MapReduce, MVC, Spring, Hibernate, JUnit.
Scripting/Web: Unix Shell Scripting, HTML, JavaScript, jQuery, CSS, XML, AngularJS.
Java/J2EE Skills: JSP, Servlets, EJB, JDBC, JMS.
Tools: CVS, SVN, Maven, JIRA, IBM RAD, Toad, AQT, QTP.
PROFESSIONAL EXPERIENCE:
Sr. Big Data Developer
Confidential, Peoria, IL
Responsibilities:
- Actively involved in the development of a Digital Data Warehouse on Cloudera and AWS.
- Developed JSON extracts for the API team to consume normalized data from the Confidential warehouse, using Spark and Scala.
- Improved Spark job performance by tuning resource allocation, repartitioning data, and writing interim files to HDFS (a sketch follows this section).
- Developed AWS Lambda functions that capture events from API Gateway to fetch data from Snowflake.
- Implemented Snowflake connections with AWS Lambda.
- Worked with the AWS team to architect and migrate Confidential into the AWS cloud.
- Explored connectivity from Cloudera to S3.
- Created views in Hive to improve the performance of batch jobs.
- Wrote unit test cases in Scala and Python.
- Worked in an Agile model.
Environment: CDH, Spark, Scala, Java, Python, Hive, AWS Lambda, S3, EC2, EMR, CloudWatch, Impala, Snowflake, Oozie, Sqoop.
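A minimal sketch of the kind of Spark/Scala extract and tuning work described above. The table name, columns, HDFS paths, and partition counts are all hypothetical; real values came from tuning against the cluster's resources:

```scala
import org.apache.spark.sql.SparkSession

object WarehouseJsonExtract {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("warehouse-json-extract")
      // shuffle parallelism was one of the knobs tuned per job; 200 is illustrative
      .config("spark.sql.shuffle.partitions", "200")
      .getOrCreate()

    // hypothetical normalized source table in the warehouse
    val orders = spark.table("dwh.orders")

    // repartition to balance the work, then stage an interim copy in HDFS
    orders.repartition(64, orders("order_date"))
      .write.mode("overwrite").parquet("hdfs:///interim/orders")

    // publish a JSON extract for the API team to consume
    spark.read.parquet("hdfs:///interim/orders")
      .select("order_id", "customer_id", "order_date", "total")
      .write.mode("overwrite").json("hdfs:///extracts/orders_json")

    spark.stop()
  }
}
```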
Big Data Developer Consultant
Confidential, Washington, DC
Responsibilities:
- Designed, configured, and developed on Amazon Web Services (AWS) for a multitude of applications utilizing the AWS stack (including EC2, S3, CloudWatch, SNS, EMR), focusing on high availability, fault tolerance, and auto-scaling.
- Developed POCs using the latest Big Data technologies to modernize traditional systems and demonstrate the impact on data analytics.
- Ingested data from various sources into S3 and RDS.
- Configured EMR clusters and allocated their resources.
- Loaded data into Spark RDDs, created DataFrames, and performed in-memory computations using Spark transformations.
- Developed Spark jobs using Scala and Spark SQL and submitted them to the Spark engine to perform validations and aggregations on CSV and Parquet files.
- Used Amazon S3 to store data in various formats, assigning bucket policies to restrict access.
- Developed Spark UDFs on individual columns to automate the handling of multiple schema validations in a single Spark job (sketched after this section).
- Orchestrated workflow using Data Pipeline to reliably process and move data between different AWS compute and storage services.
- Developed Lambda functions that trigger functionality based on incoming events.
- Triggered SNS from Lambda to notify the results of scheduled jobs, and used CloudWatch to monitor the Lambdas.
- Used Spark Streaming to process streaming data as DStreams.
- Used Amazon RDS to store the processed data from spark jobs.
Environment: AWS, EMR, Amazon S3, Amazon EC2, Amazon SNS, CloudWatch, Java, Scala, Spark SQL, Jenkins.
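A hedged sketch of the column-level UDF validation pattern described above, assuming a headered CSV landing in S3. The bucket names, columns, and the validation rule itself are illustrative; in practice one such UDF per rule keeps all schema checks in a single job:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object CsvValidationJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("csv-validation").getOrCreate()

    // hypothetical CSV input landed in S3 with a header row
    val raw = spark.read.option("header", "true").csv("s3://landing-bucket/daily/")

    // one UDF per rule; this one rejects null/blank values in a column
    val nonEmpty = udf((s: String) => s != null && s.trim.nonEmpty)

    // apply the rule per column so a single job covers several validations
    val validated = raw.filter(nonEmpty(col("customer_id")) && nonEmpty(col("region")))

    // aggregate the clean rows and write Parquet for downstream use
    validated.groupBy("region").count()
      .write.mode("overwrite").parquet("s3://curated-bucket/daily_counts/")

    spark.stop()
  }
}
```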
Software Engineer
Confidential, Indianapolis, IN
Responsibilities:
- Extracted data from Teradata/MySQL into HDFS using Sqoop import/export.
- Designed the ER model for the database.
- Used the partitioning pattern in MapReduce to route records into categories (a sketch follows this section).
- Implemented the persistence layer using the Hibernate API and integrated it with Spring components.
- Worked on batch processing.
- Actively helped teammates resolve technical difficulties.
- Produced REST web services for multiple clients using Jersey.
- Wrote unit test cases using the JUnit framework, implemented server-side and client-side logging with Log4j (Apache open-source API), and wrote Maven dependency scripts to build the applications.
- Worked in an Agile model.
Environment: Java, HDFS, Sqoop, MySQL, JSP, HTML5, JavaScript, jQuery, IBM DB2, WebLogic Server.
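The MapReduce partitioning pattern mentioned above, sketched as a custom Hadoop Partitioner (written in Scala against Hadoop's Java API for consistency with the other sketches; the key layout is an assumption):

```scala
import org.apache.hadoop.io.{IntWritable, Text}
import org.apache.hadoop.mapreduce.Partitioner

// Routes records to reducers by category so each output file holds one category.
// Assumes the map output key is "category:recordId" - the prefix drives routing.
class CategoryPartitioner extends Partitioner[Text, IntWritable] {
  override def getPartition(key: Text, value: IntWritable, numPartitions: Int): Int = {
    val category = key.toString.split(':')(0)
    (category.hashCode & Integer.MAX_VALUE) % numPartitions
  }
}
```

It would be registered on the job with job.setPartitionerClass(classOf[CategoryPartitioner]).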
Software Engineer
Confidential
Responsibilities:
- Involved in the development of technical specification documents.
- Loaded data from the Unix file system into HDFS.
- Developed SOAP web services using JAX-WS and exposed them to external applications (a sketch follows this section).
- Handled events and runtime errors using JSF event listeners and validators.
- Developed the application layout, composed Tiles definitions, managed beans for use with JSF, transfer objects to pass data across layers, and business delegates for invoking business methods.
- Built Hibernate models and applied Java design patterns to implement the DAO layer using Hibernate interfaces.
- Worked with Agile methodologies.
Environment: Java 1.6, Spring Framework 3.0, Hibernate, WebLogic, XML, JSP, JSF, HTML5, CSS3, JavaScript, JDBC, Oracle 10g.
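A minimal JAX-WS sketch of the kind of SOAP service described above. The service name and operation are hypothetical, and a standalone Endpoint publisher stands in for the WebLogic deployment:

```scala
import javax.jws.{WebMethod, WebService}
import javax.xml.ws.Endpoint

// hypothetical contract exposed to external applications
@WebService
class AccountService {
  @WebMethod
  def getBalance(accountId: String): Double = {
    // a real implementation would delegate to the business layer
    0.0
  }
}

object PublishAccountService {
  def main(args: Array[String]): Unit =
    // standalone publisher; under WebLogic the service is deployed instead
    Endpoint.publish("http://localhost:8080/ws/account", new AccountService)
}
```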
Software Developer
Confidential, Little Rock, AR
Responsibilities:
- Involved in all phases of Software Development Lifecycle (SDLC) including Requirements Collection, Analysis of the Customer Specifications, Development and Application Customization.
- Designed and developed the front end using HTML, CSS, and JavaScript with Ajax and tag libraries.
- Involved in the development of the presentation layer using JSP and Servlets in IBM RAD.
- Extensively worked on generating batch reports.
- Involved in Database Design and Normalization.
- Tuned queries that had been causing system crashes.
- Used Spring AOP for cross-cutting concerns and Spring IoC for dependency injection.
- Used Hibernate in the data access layer to access and update information in the database.
- Developed application service components, configured beans using Spring, and created Hibernate mapping files.
- Used Spring AOP to configure logging for the application (a sketch follows this section).
Environment: Java 1.6, MVC, Hibernate, IBM WebSphere, SOAP web services, XML, Servlet, JSP, HTML5/HTML, CSS3/CSS, AJAX, JavaScript, JDBC, DOM, IBM DB2, IBM RAD.
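A sketch of a Spring AOP logging aspect along the lines described above. The pointcut expression and package are assumptions, and the aspect presumes AspectJ auto-proxying is enabled in the Spring configuration:

```scala
import org.aspectj.lang.JoinPoint
import org.aspectj.lang.annotation.{Aspect, Before}
import org.springframework.stereotype.Component

@Aspect
@Component
class LoggingAspect {
  // hypothetical pointcut: log entry into every service-layer method
  @Before("execution(* com.confidential.service..*.*(..))")
  def logEntry(joinPoint: JoinPoint): Unit =
    println(s"Entering ${joinPoint.getSignature.toShortString}")
}
```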
Application Developer
Confidential
Responsibilities:
- Extensively used core Java concepts like Collections, Exception Handling, Generics, and Multithreading during development of business functionalities.
- Designed and implemented business functionalities using Spring framework.
- Wrote numerous test cases for unit testing of the code with the JUnit framework.
- Wrote SQL queries to fetch data from an Oracle database (a sketch follows this section).
- Implemented user input validations using JavaScript and jQuery.
- Designed and developed the front end using HTML, CSS, and JavaScript with Ajax and tag libraries.
Environment: Java, Spring MVC, Hibernate, XML, Servlet, JSP, HTML5, CSS3, JavaScript, Oracle, JDBC.
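A minimal JDBC sketch of the kind of Oracle query described above (in Scala for consistency with the other sketches). The connection details, table, and columns are hypothetical:

```scala
import java.sql.DriverManager

object CustomerLookup {
  def main(args: Array[String]): Unit = {
    // hypothetical connection details; real values came from configuration
    val conn = DriverManager.getConnection(
      "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app_user", "app_password")
    try {
      val stmt = conn.prepareStatement(
        "SELECT customer_id, name FROM customers WHERE region = ?")
      stmt.setString(1, "MIDWEST")
      val rs = stmt.executeQuery()
      while (rs.next())
        println(s"${rs.getLong("customer_id")} -> ${rs.getString("name")}")
    } finally conn.close()
  }
}
```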