J2EE Lead Developer Resume
SUMMARY
- Motivated Hadoop-certified, Cassandra-certified, Sun Certified Java/Big Data Architect with strong communication skills, working on the Big Data Platform and providing end-to-end design/development solutions for Big Data.
- 16+ years of Java/J2EE experience spanning architecture, technical design, development, testing and implementation of web-based applications using Agile/RUP methodologies and the BDD/TDD pattern.
- Extensive experience creating Vision documents and Software Architecture Documents (SAD) covering the Use Case, Analysis, Design, Deployment and Implementation models.
- 4+ years of experience working in the Big Data Hadoop ecosystem comprising Apache Spark 2.3, the PySpark API, Databricks Cloud, Docker, MapReduce, Hive, Pig, the Airflow scheduler, Apache Oozie, Sqoop, Flume, ZooKeeper, WebHDFS, Apache Kafka, Apache Avro, Apache Kylin, the Cascading API, Apache Zeppelin, Superset and Druid.
- 4+ years of experience working with Hadoop distributions including Cloudera, Hortonworks and MapR.
- 2+ years of experience working on AWS using Lambda, SNS, SES, Glue API, Data Pipeline, S3, API Gateway, Athena API.
- Currently exploring Amazon Kinesis for message ingestion and the DynamoDB NoSQL database.
- Extensive experience with end-to-end automation using Jenkins, Groovy and shell scripts.
- 2 years of experience using the Boto3 API for Python to upload/download files between on-prem systems and AWS S3 folders (a minimal sketch follows this list).
- Solid experience using the Airflow scheduler to automate all tasks within a DAG for end-to-end workflows.
- Extensive experience creating complete workflow chains from scratch for multiple projects within the client domain using Apache Oozie.
- Workflow chains/scheduling involve MapReduce jobs, Hive, Pig, Java, shell script and email actions, with the output of one workflow fed as input to another.
- 1+ years of experience working with the Ehcache framework, caching data across 10 managed servers (5 production servers) within an Oracle WebLogic clustered environment.
- Solid experience using multicast and unicast settings for data replication across managed servers with Ehcache.
- 6+ years of strong experience working in the Spring framework: Confidential, Spring IOC/AOP, Spring Transactions, Spring JMS, Confidential, Spring JDBC, Spring Integration with Hibernate/iBATIS and JPA, Spring Roo, plus the Apache Struts framework and Tiles.
- 1+ years of experience implementing SOA-based architecture using Oracle Enterprise Service Bus, SOAP/REST web services, JAX-WS/JAX-RS, Apache Axis, Spring WS, WSDL/SOAP/WS-Security/the SAAJ API, JAX-RPC web services and SoapUI.
- Strong development experience in JDK 1.6, JSP, Servlets, Oracle 12c, XML/XSL/XSLT, XSD, EJB 2.0, the Knockout.js framework, JUnit, the Mockito framework, jQuery, JavaScript, AJAX, JSON, JMS, MQ 6.0, MDB, ILOG JRules 6.7 and UNIX/Linux shell scripts.
- Strong experience working with Hibernate: connection pooling, HQL, collections, caching, transactions and optimistic locking.
- Sound experience writing shell scripts (Confidential jobs) to process the input big data files and move them to the appropriate output locations.
- Good experience working with Apache Kylin for sub-second latency and the Cascading/Lingual APIs for writing MapReduce jobs and Hive scripts.
- 5+ years of experience working in Agile, Scrum-based environments following the Test-Driven Development (TDD) pattern.
- Strong experience using ANT/MAVEN for creating build scripts and Jenkins for CI.
- Good experience working on automated test frameworks using JRuby/Cucumber and Jasmine scripts for JavaScript.
- Strong design experience in UML Modeling using RAD v7.5.2, Rational Software Architect v7.5, Rational Rose 2003, Star UML, Enterprise Architect, Microsoft Visio 2007 tools.
- Strong experience using the JProfiler and JMeter tools for performance testing and memory-leak analysis, code review tools such as PMD, FindBugs and Fortify, and SONAR code coverage with Jenkins.
- Solid experience working on all major application servers, including Oracle WebLogic 12.1.1.2, IBM WebSphere Application Server (WAS) 6.1, Apache Tomcat 6.0.20, JBoss Application Server 4.2 and Jetty.
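For illustration, a minimal Boto3 sketch of the on-prem/S3 file movement mentioned in this list; the bucket name and file paths are placeholders rather than values from any engagement.

```python
import boto3

s3 = boto3.client("s3")

# Upload an on-prem file to S3 (bucket and keys are hypothetical).
s3.upload_file("/data/out/feed.parquet", "example-bucket", "feeds/feed.parquet")

# Download it back to a local path.
s3.download_file("example-bucket", "feeds/feed.parquet", "/data/in/feed.parquet")
```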
TECHNICAL SKILLS
J2EE: JDK 1.5, Struts 2.0, Tiles, Spring IOC/AOP, Confidential, Spring Transactions, Spring Webflow, Confidential, JMS, IBM MQ 6.0, MDB, JSP, JSF, Servlets, EJB, XML/XSL/XSLT, XSD, JAXP, Castor, JAXB 2.1, RMI, JavaScript, JSTL, JUnit 3.8.1, Regular Expressions, LDAP, Spring JDBC, Spring Roo, AJAX
Hadoop ecosystem: Apache Spark 2.3, Java/Scala/Python APIs for Spark, MapReduce (MR) programming, Hive, Pig, Apache Oozie, ZooKeeper, Flume, Sqoop, Kafka, Apache Avro
AWS: Lambda, Data Pipeline, SNS, SES, EMR, S3, Boto3 API to download/ upload files from/ to S3, Amazon Kinesis
NoSQL Database: DynamoDB, Apache Cassandra
Other Big Data Frameworks: Apache Kylin, Cascading/ Lingual API for Hive scripts.
Caching Frameworks: Ehcache, Coherence
Automated Test Scripts: JRuby/Cucumber for automated browser-based testing, Jasmine scripts, Mockito JUnit test cases, monitoring test cases/contract test cases
Object Relational Mapping (ORM): Hibernate 3, iBATIS, JPA
SOA Implementation: Oracle Service Bus, JAX-WS 2.0, Apache Axis 1.4, Mule Enterprise Service Bus (ESB), SOAP 2.0, WSDL, Web Services Security, WSRP, IBM WebSphere Message Broker (WMB) ESB, IBM DataPower, Spring JMS 3.0.2 (DMLC), Confidential
Business Rules Management Systems (BRMS): ILOG JRules 4.0, 6.7
Performance tools: Perf4j, JProfiler 5.0, Rational Performance Studio
Reporting Tools: Business Objects XI Java SDK, Web Services SDK, Actuate BIRT Reports, Jasper Reports
UML Modeling: RAD 7.5.2, Rational Software Architect v7.5, Rational Rose 2003, Star UML, Enterprise Architect, Microsoft Visio 2007
IDE: Rational Application Developer (RAD) v7.5.2, JBoss Developer Studio 1.1.0, SpringSource Tool Suite 2.3.1, Eclipse 3.3.2, JBuilder IDE, TextPad, EditPlus, TOAD
Version Management: Rational ClearCase/ClearQuest, TortoiseSVN, CVS, VSS, MKS, StarTeam
Build Tools: ANT 1.8, MAVEN 2.2.1
Methodology: RUP, Agile - Scrum, Extreme Programming (XP)
Database: Oracle 8i, Oracle 10g, DB2, SQL Server, MS Access
Application Servers: Oracle WebLogic 12.1.1.2, IBM WAS 6.1, JBoss Application Server 4.2, Apache Tomcat 6.0.20
Operating Systems: UNIX/ Red Hat LINUX, Windows XP
Defect Tracking Tools: Quality Center, Test Director
Others: Erwin data modeling tool, Remedy, FileZilla, WinSCP, MQ Visual Edit, MPP, JBoss Developer Studio 1.1.0, Perf4j, PMD/FindBugs/Fortify tools for code review, SoapUI, Jenkins CI, Rally, Firebug, JIRA, SONAR for code metrics.
Project Status Tracking Tools: MS MPP, Rally, Tachometric
Message Types: NCPDP 10.6, HL7 v2.x
PROFESSIONAL EXPERIENCE
Confidential
Big Data Architect
Responsibilities:
- Develop a Java Spark job to process iSpot crawl data in S3 in Confidential format, convert Confidential to Parquet and push the output to S3; this is a daily Confidential.
- Develop a Java Spark job to process STB raw data in Parquet and push the output to S3; this is a daily Confidential.
- Currently using Databricks Python notebooks to spin up EMR clusters, run Spark jobs and integrate with Jenkins.
- Developed a shell script to spin up an EMR cluster, run Spark jobs for reports and invoke the AWS validation Lambda.
- Developed a Validation Framework using Python, AWS Lambda and the Glue API which creates the DB, runs the crawler to create tables (Confidential), runs all 50 validation queries at brand level comparing against iSpot data, checks show-stopper conditions and sends an email with the validation results attached; used the Amazon SES API for email attachments (a hedged sketch appears after this list).
- Developed a Data Mover module in Python which uploads data from on-prem to AWS using server-side encryption (SSE).
- Implemented end-to-end automation using Jenkins and shell scripts for iSpot, STB and report processing.
- Currently exploring Amazon Kinesis for message ingestion and the DynamoDB NoSQL database.
- Automated Docker image creation using Jenkins to run MAVEN commands.
- Create the Information Architecture (IA), which explains the different aspects of the data feeds (Campaign, VOD, Linear and Set-Top Box data for Confidential): the source the data is consumed from, the destination it is sent to, who uses it, when it is used, how it is used, etc.
- Created the system architecture/design and carried out software development for the end-to-end AAM (Adobe Audience Manager) and Visible World (VW) project workflows.
- Visible World (VW) project: develop Apache Spark code using the Java/PySpark APIs to perform input file validations, conversions from Confidential/text files to Parquet and from Parquet to Confidential, transformations, and uploads to AWS S3.
- Adobe Audience Manager (AAM): develop Apache Spark code using the Java/Python APIs to create Opt-Outs, process the Confidential input file, join it with AIDB, create the Signal File/Trait Reports and deliver them to Adobe S3 for segment processing.
- Achieved 100% automation of the individual stages in the Visible World (VW) and Adobe Audience Manager (AAM) projects with the Airflow scheduler framework; Airflow organizes work into DAGs, each containing one or more tasks (a minimal DAG sketch appears after this list).
- Automation covers, end to end, all 4 VW phases and all 16 AAM steps.
- Develop a file listener using the Watchdog Python API to send a notification whenever a new file is received from Confidential (a hedged sketch appears after this list).
- Developed a Data Mover module in Python which uploads the STB/AIDB/Opt-Out files from on-prem to AWS S3 buckets; the files are server-side encrypted (SSE) and the ACL is set to "bucket-owner-full-control" (see the upload sketch after this list).
- Develop Spark code to process and convert files from one format to another, e.g. XLS to Confidential, Parquet to Confidential and Confidential to Parquet (a hedged PySpark sketch appears after this list).
- Develop Spark code to implement data quality checks: duplicate record counts, timeliness of input files, etc.
- Execute Spark jobs for AIL (T-Mobile) across all 24 campaign files.
- Analyze the business requirements, produce a design/architecture identifying the different components and flow diagrams, and review it with the team.
- Participate in the end-to-end project life cycle, from requirements through design, development and testing.
- Involved in the complete architecture/design/development of the Citibank Confidential project, which joins the Map File with Citibank encrypted files to create the output files.
- Develop code using the Boto3 API (the AWS SDK for Python) to push files from a specific server's MapR FS location to AWS S3.
- Installed Oracle BDD on a Linux server and created BDD reports by importing a huge Confidential dataset in Confidential format and applying transformations using the Endeca Query Language (EQL).
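A hedged sketch of the validation-framework flow described above (create the DB, run the crawler, run a brand-level query, email the results). All names (database, crawler, bucket, addresses, file paths) are hypothetical, and error handling and query polling are omitted.

```python
import boto3
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

glue = boto3.client("glue")
athena = boto3.client("athena")
ses = boto3.client("ses")

# Create the database and crawl the raw data into tables.
# (Names are hypothetical; create_database fails if the DB already exists,
# and the crawler must be defined beforehand -- start_crawler is asynchronous.)
glue.create_database(DatabaseInput={"Name": "validation_db"})
glue.start_crawler(Name="validation-crawler")

# One brand-level validation query via Athena (the real framework ran ~50).
athena.start_query_execution(
    QueryString="SELECT brand, COUNT(*) AS cnt FROM impressions GROUP BY brand",
    QueryExecutionContext={"Database": "validation_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)

# Email the results file as an attachment via SES.
msg = MIMEMultipart()
msg["Subject"] = "Validation results"
msg["From"] = "etl@example.com"
msg["To"] = "team@example.com"
msg.attach(MIMEText("Validation run attached."))
with open("/tmp/results.csv", "rb") as f:
    msg.attach(MIMEApplication(f.read(), Name="results.csv"))
ses.send_raw_email(RawMessage={"Data": msg.as_string()})
```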
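A minimal Airflow DAG sketch of the kind used for the VW/AAM stage automation described above; the DAG id, task commands, schedule and script paths are assumptions, written against the Airflow 1.x API current at the time.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x import path

default_args = {"owner": "etl", "retries": 1, "retry_delay": timedelta(minutes=5)}

# DAG id, schedule and commands are hypothetical.
dag = DAG(
    "vw_daily_workflow",
    default_args=default_args,
    start_date=datetime(2018, 1, 1),
    schedule_interval="@daily",
)

validate = BashOperator(task_id="validate_input",
                        bash_command="python /opt/etl/validate.py",
                        dag=dag)
convert = BashOperator(task_id="convert_to_parquet",
                       bash_command="spark-submit /opt/etl/convert.py",
                       dag=dag)
upload = BashOperator(task_id="upload_to_s3",
                      bash_command="python /opt/etl/data_mover.py",
                      dag=dag)

# One DAG, three dependent tasks: validate -> convert -> upload.
validate >> convert >> upload
```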
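A minimal file-listener sketch using the Watchdog Python API, as referenced in the file-listener bullet above; the watched directory and the notification action are placeholders.

```python
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = "/data/incoming"  # hypothetical landing directory

class NewFileHandler(FileSystemEventHandler):
    def on_created(self, event):
        # Fires for every newly created path; ignore directories.
        if not event.is_directory:
            print("New file received: %s" % event.src_path)  # notify / kick off processing here

observer = Observer()
observer.schedule(NewFileHandler(), WATCH_DIR, recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```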
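A sketch of the Data Mover upload with server-side encryption and the bucket-owner ACL described above; the bucket and paths are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# SSE-S3 encryption plus the "bucket-owner-full-control" ACL, as described above.
# Bucket and paths are hypothetical.
s3.upload_file(
    "/staging/stb/export.parquet",
    "example-dest-bucket",
    "stb/export.parquet",
    ExtraArgs={
        "ServerSideEncryption": "AES256",
        "ACL": "bucket-owner-full-control",
    },
)
```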
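A minimal PySpark sketch of the format-conversion and data-quality bullets above; the input/output paths and the key column name are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("format-conversion").getOrCreate()

# Read delimited text and rewrite it as Parquet (paths are hypothetical).
df = spark.read.option("header", "true").csv("s3://example-bucket/input/")
df.write.mode("overwrite").parquet("s3://example-bucket/output/")

# Data-quality check: count duplicate records on an assumed key column.
dupes = (df.groupBy("record_id")
           .count()
           .filter(F.col("count") > 1)
           .count())
print("duplicate keys: %d" % dupes)
```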
Confidential
J2EE/ Big Data Platform Lead
Responsibilities:
- Write MapReduce jobs/Hive queries to identify the top call record issues customers are facing and propose solutions to fix them.
- Develop the Confidential Data Adapter using Apache Kafka, Flume (Confidential), WebHDFS, ZooKeeper and the Hive client to ingest Confidential data into the production Hadoop cluster.
- Create workflow chains/schedules from scratch using Apache Oozie to process MapReduce, Hive, Pig, Java, shell script and email actions, with one job's output processed as input to another.
- Migrate existing MARS Scheduler jobs to Oozie and produce a document with instructions on the migration steps.
- Working on a POC comparing Apache Spark with MapReduce as the basis for migrating MapReduce jobs to Apache Spark (a hedged PySpark sketch appears after this list).
- Develop and maintain a MongoDB adapter using Apache Flume to ingest user-preferences data from MongoDB into the Hadoop production cluster.
- Implement the User Stitching module by writing MapReduce jobs and Hive queries to identify deduped and duplicate visitors, and work with the Portal team to create the appropriate graphical user interfaces.
- Implement the Platform Transaction Attribution module using MapReduce jobs to convert orders to local time, working with the Portal team on the graphical interfaces.
- Create Confidential jobs using Linux shell scripts to invoke MapReduce jobs and create partitions in HDFS using Hive queries for specific User Stitching project requirements.
- Prepare training material for other teams explaining the Hadoop data and its different formats, how to access data through Hive, and how to develop/run MapReduce jobs and Pig scripts.
- Attend daily design/status meetings to discuss the project architecture and project status.
- Investigate how to use Apache Kylin to write MapReduce jobs and achieve sub-second latency over billions of rows.
- Research using the Cascading framework in the Big Data Platform to write new MapReduce jobs and migrate existing ones, writing Hive queries with the Lingual API and comparing the performance of the Lingual API against Hive.
- Involved in code reviews of other team members' work, suggesting review comments wherever needed.
- Implement the Test-Driven Development (TDD) pattern by writing JUnit test cases first and then implementing the functionality.
- Pull the source code from the Git repository daily and check files in incrementally to the parent repository.
- Analyze different Big Data frameworks such as Apache Tez, Apache Kylin and Cascading, and produce proposals recommending technologies for different sets of use cases.
- Create a Hive view to join multiple Confidential tables and display them as one table for end customers.
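To illustrate the Spark-vs-MapReduce POC above: a grouped aggregation of the kind that takes a multi-stage MapReduce job down to a few PySpark transformations. The input path and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("top-call-issues").getOrCreate()

# Input path and column names are hypothetical.
records = spark.read.parquet("/data/call_records")

# Top 10 call-record issues by occurrence count.
top_issues = (records.groupBy("issue_code")
                     .agg(F.count("*").alias("occurrences"))
                     .orderBy(F.col("occurrences").desc())
                     .limit(10))
top_issues.show()
```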
J2EE Lead Developer
Confidential
Responsibilities:
- Brought response times down from seconds (when remote DB calls were made) to milliseconds.
- Created the JUnit/Mockito base framework for each story, covering happy-path and negative scenarios.
- Created the cache monitor console, which gives details on the cached data, how much memory is used, how much memory is still available, etc.
- Involved in architectural reviews, design discussions and mock code reviews with team members, suggesting code changes wherever needed.
- Participate in daily scrum calls and retrospective meetings for each iteration, and run brown-bag sessions on new technologies for the team.
- Creating automated test cases using SoapUI to handle different scenarios for each module.
- Working with Experian to implement address verification and credit-worthiness checks for any user enrolled in the system.
- Involved in designing the Usage Tracking module, which logs every action performed by the user for both positive and negative scenarios.
Technologies: Backbone.js, Bootstrap, Confidential, Spring IOC/AOP, Spring JMS, SOAP/REST, Apache Cassandra, Ehcache, JUnit, WebLogic 12.1.1.2, Eclipse, Jenkins, Rational ClearCase, ANT/MAVEN, jQuery, AJAX.
Confidential
J2EE Lead Developer
Responsibilities:
- Created System Architecture document explaining the different flows of the application.
- Involved in developing JAX-WS web services to make web service calls to SAP.
- Developed MDBs to listen on JMS queues for messages coming from FAS.
- Developed OSB (Oracle Service Bus) routers for message processing and routing.
- Created MAVEN scripts for builds and Jenkins jobs for Continuous Integration (CI).
- Using SoapUI to execute web services and QBrowser to post messages to the queue.
- Participate in daily status calls for defect tracking.
- Working in a highly dynamic agile environment.
- Created project-related documents and uploaded them to the EIRS system.
J2EE Lead Developer
Confidential
Responsibilities:
- Created a Validator framework for Review Applications using Spring.
- Used Confidential to make calls to the middleware for account number validation.
- Worked on Address Validation, which uses the Confidential API.
- Managing defects in QC on a daily basis, fixing defects and tracking them to closure.
- Set up Jenkins for CI and integrated it with the build/deployment.
- Integrated Confidential and Jenkins for metrics, checking the code coverage on a daily basis.
- Created JUnit test cases in tandem with Software development.
- Build critical components that can be reused across the project and build UI screens.
- Design and Develop large-scale highly integrated enterprise system using J2EE technologies.
- Conduct peer reviews for the code developed by other team members.