Big Data/Java Developer Resume
Alpharetta, GA
SUMMARY:
- Over 2 years of IT experience in Java, the MS SQL Server suite of products (SSRS, SSIS and SSAS) and big data technologies such as Sqoop, Hive, Pig, Spark, Scala, Flume, Kafka, Oozie and HBase.
- Experience in Agile (Scrum) methodologies, with daily Scrum meetings to review status and resolve issues.
- Expertise in Java multithreading, exception handling, HTML and related technologies.
- Proficient in database/data mart design, stored procedures, triggers, indexes, data cleansing and data mining.
- Experienced in working with build tools such as Maven.
- Expertise in working with databases such as MySQL and MS SQL Server, along with exposure to Hibernate for mapping an object-oriented domain model to a traditional relational database.
- Strong skills in RDBMS implementation and development using MS SQL Server, with hands-on experience in T-SQL.
- Hands-on exposure to the Apache Tomcat web/application server.
- Strong understanding of Hadoop services and the MapReduce and YARN architectures.
- Responsible for writing MapReduce programs; a minimal mapper sketch follows this summary list.
- Experienced in importing and exporting data to and from HDFS using Sqoop.
- Loaded log data into HDFS using Flume and Kafka.
- Experience loading data into Hive partitions and creating buckets in Hive.
- Developed MapReduce jobs to automate data transfer from HBase.
- Expertise in data analysis using Pig, Hive and MapReduce.
- Highly experienced in using the SSIS Import/Export Wizard, SSIS Designer and other package execution utilities for performing ETL operations.
- Experience in creating packages and jobs and sending alerts using SQL Mail.
- Expert in writing SSIS configurations, logging, procedural error handling and data error handling.
- Hands-on experience resolving complex issues and handling errors in SSIS.
- A dedicated, hardworking individual with strong analytical and problem-solving skills, able to play a vital role in both team and individual environments.
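The following is a minimal, illustrative sketch of the kind of MapReduce program referred to above: a Java mapper that emits a count per status code from tab-delimited log lines. The input layout, field position and class name are assumptions, not actual production code.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Emits (statusCode, 1) for each tab-delimited log line; a summing reducer
// (e.g. the stock IntSumReducer) aggregates the counts per code.
public class StatusCodeMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text statusCode = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t");
        if (fields.length > 2) {           // assumes at least three tab-separated fields
            statusCode.set(fields[2]);     // hypothetical position of the status field
            context.write(statusCode, ONE);
        }
    }
}
```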
TECHNICAL SKILLS:
Languages: C, C++, JAVA, T-SQL, XML, XSD, Scala
UI/UX Technologies: JavaScript (ECMAScript 5), HTML5, CSS, Bootstrap
Big Data Ecosystem: Hadoop, HDFS, MapReduce, Sqoop, Hive, Pig, Spark, HBase, Oozie, Flume, Kafka, ZooKeeper
SQL Server Tools: SQL Profiler, Query Analyzer, SQL Server 2008/2005 Management Studio, DTS, SSIS, SSRS, SSAS
Architecture: Relational DBMS, Client-Server Architecture, OLAP, OLTP, Hadoop
Databases: MS SQL Server 2008, 2005, 2000
Operating Systems: Windows XP/NT/2000, Windows Server 2007, MacOS, Linux, Unix
Tools: Power BI, Microsoft Visual Studio 2008/2005, Talend, Eclipse IDE, JetBrains IntelliJ IDEA, Cloudera
PROFESSIONAL EXPERIENCE:
Confidential, Alpharetta, GA
Big Data/Java Developer
Responsibilities:
- Involved in the project from requirements gathering through design, testing, deployment and production support; created designs for new requirements and defined the application structure, behavior and business processes.
- Set up the build environment by writing the Maven pom.xml, and handled building, configuring and deploying the application on all servers.
- Followed Agile methodology on the development project and used JIRA to manage issues and project workflow.
- Involved in the project from the POC stage, working from data staging through data mart population and reporting.
- Fully responsible for creating the data model for storing and processing data and for generating and reporting alerts; this model is being adopted as the standard across all regions as a global solution.
- Responsible for providing technical solutions and workarounds.
- Migrated the required data from the data warehouse and product processors into HDFS using Talend and Sqoop, and imported flat files of various formats into HDFS.
- Comfortable with Scala functional programming idioms and iteratee/enumerator streaming patterns; almost all data quality (DQ) checks and end-to-end reconciliation are done in Spark (a Spark sketch follows the Environment line for this role).
- Implemented partitioning, dynamic partitions, indexing and bucketing in Hive.
- Created custom UDFs in Java to overcome Hive limitations on Cloudera CDH5 (a minimal UDF sketch follows this list).
- Used Hive for data processing and batch data filtering, and Spark/Impala for other value-centric data filtering.
- Supported and monitored MapReduce programs running on the cluster.
- Monitored logs and responded accordingly to any warning or failure conditions.
- Worked with big data teams to move ETL tasks to Hadoop.
- Responsible for preserving code and design integrity using GIT.
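As a point of reference for the custom Hive UDFs mentioned above, here is a minimal sketch of a simple (old-style) Java UDF of the kind supported on CDH5. The masking logic, class name and function name are illustrative assumptions, not the actual UDFs from the project.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Masks all but the last four characters of a value; registered in Hive with
// ADD JAR ... and CREATE TEMPORARY FUNCTION mask_tail AS '...MaskTailUdf'.
public final class MaskTailUdf extends UDF {

    public Text evaluate(Text input) {
        if (input == null) {
            return null;                    // pass NULLs through unchanged
        }
        String value = input.toString();
        int keep = Math.min(4, value.length());
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(value.substring(value.length() - keep));
        return new Text(masked.toString());
    }
}
```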
Environment: Maven, Java, Hadoop, Talend, Spark, PIG, Hive, Sqoop, Impala, Scala, HDFS, GIT.
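The DQ and reconciliation work described above was done in Spark with Scala; the sketch below illustrates the same kind of data-quality filtering using the Spark 2.x Java API, to keep all examples in one language. The table and column names are hypothetical.

```java
import static org.apache.spark.sql.functions.col;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Reads a (hypothetical) staged Hive table, keeps only rows that pass basic
// data-quality checks, and writes the result back for downstream reconciliation.
public class DqFilterJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("dq-filter")
                .enableHiveSupport()
                .getOrCreate();

        Dataset<Row> staged = spark.table("staging.transactions");
        Dataset<Row> clean = staged
                .filter(col("trade_id").isNotNull())   // hypothetical column names
                .filter(col("amount").geq(0));

        clean.write().mode("overwrite").saveAsTable("curated.transactions_clean");
        spark.stop();
    }
}
```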
Confidential
Associate Software Developer
Responsibilities:
- Involved in the full SDLC (requirements gathering, analysis, design, development and testing of the application) using Agile methodology (Scrum).
- Used native SQL queries in Hibernate to perform complex select operations for reporting (a minimal sketch follows this list).
- Used Maven as a build tool, automating building, testing, publishing and deployment.
- Involved in the complete project life cycle, from requirements through production support.
- Used Jenkins for Continuous Integration and Git for Version Control.
- Developed SSIS packages with tasks such as FTP, File System, Foreach Loop, Execute Package, Data Flow, Execute SQL, custom Script, Analysis Services and Data Mining tasks.
- Performed database maintenance, replication, tuning and system database migrations.
- Developed SQL Server Integration Services (SSIS) packages to transform data from SQL Server 2008 to MS SQL Server 2012, and created interface stored procedures used in SSIS to load and transform data into the database.
- Identified and worked with Parameters (e.g. cascading parameters) for parameterized reports in SSRS.
- Created stored procedures, triggers, tables and indexes for MySQL and MS SQL Server 2012.
- Created and managed schema objects such as Tables, Views, and Indexes depending on user requirements.
- Defined Check constraints, Business Rules, Indexes and Views.
- Managed historical data from heterogeneous data sources (e.g. Excel, Access).
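For the Hibernate native-SQL reporting queries noted above, here is a minimal sketch assuming a Hibernate 5.2-style Session API (older versions would use createSQLQuery(...).list() instead); the DAO, table and column names are hypothetical.

```java
import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Runs a native SQL reporting query through Hibernate and returns raw rows.
public class OrderReportDao {

    private final SessionFactory sessionFactory;

    public OrderReportDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @SuppressWarnings("unchecked")
    public List<Object[]> monthlyTotals(int year) {
        try (Session session = sessionFactory.openSession()) {
            return session.createNativeQuery(
                        "SELECT MONTH(order_date) AS mth, SUM(amount) AS total "
                      + "FROM orders WHERE YEAR(order_date) = :yr "
                      + "GROUP BY MONTH(order_date) ORDER BY mth")
                    .setParameter("yr", year)
                    .getResultList();
        }
    }
}
```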
Environment: Java, Jenkins, GIT, Maven, JIRA, IntelliJ IDEA, DataGrip, T-SQL, SSIS, SSRS, SQL Server, Microsoft Visual Studio, XML, MySQL, Windows XP.