Senior Developer Resume
SUMMARY
- More than 10 years of experience in software development, coding and application support, including more than 3.5 years' experience in Big Data (Hadoop).
- Excellent implementation knowledge of distributed/enterprise/web/client-server systems using Java, J2EE (JSP, Servlets, JDBC, EJB, JNDI, JMS, Custom Tags), XML, Spring, Struts, AJAX, Hibernate, Web Services, Ant, JUnit, Log4j and Maven.
- Hands-on experience integrating Apache Spark with Kafka, Elasticsearch, Cassandra, MongoDB, file-system sources (CSV, XML, JSON) and RDBMS sources (Oracle, DB2).
- Hands-on experience with Scala as the implementation language for Apache Spark applications.
- Hands-on experience writing Python validation scripts.
- Hands-on experience with application servers such as WebLogic 8.1, Tomcat 6, JBoss AS 7.0 and WebSphere (WAS) 7.0.
- Hands-on experience with the Hadoop Big Data stack: MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
- Experienced with IDE tools such as Eclipse 4.2.
- Experienced in web design using HTML, CSS, and jQuery.
- Proficient in XML technologies, including SAX and DOM parsing.
- Good experience in recognizing and reusing design patterns, both classical and J2EE.
- Developed an architecture framework for the presentation and business layers using Spring and Struts, and for data access using Hibernate DAOs.
- Extensively used Rational Rose and Enterprise Architect for Use Case Modeling, to draw Use Case Diagrams, Sequence Diagrams and Class Diagrams.
- Proficient in relational database environments (Oracle, DB2, MS-SQL, and MySQL).
- Strong experience in Systems Development Life Cycles and Object Oriented Design and Development.
- Experienced in working with offshore teams and offshore business process models.
- Proficient in MS Office, particularly Excel, PowerPoint and MS Visio.
TECHNICAL SKILLS
Programming Languages: Core Java, JDBC 2.0, Servlets, JSP, JavaBeans, Python, Scala
Database: Oracle 11g/10g/9i
Web Development: HTML, CSS, jQuery
Frameworks: Spring 3.1/4.x, Hibernate 3.x/4.x, JPA 2.1, Struts 1.3/2, JMS, and web services (JAX-RPC, JAX-WS, JAX-RS)
Other Tools: TOAD, SQL Navigator, FileZilla, HP Quality Center, JIRA, Elasticsearch
IDE Tools: NetBeans 6.5, MyEclipse 6.0, Eclipse 3.7, Eclipse 4.2 Juno
Web Server: Apache Tomcat 5.5/6.0/7.0/8.0
Databases: Oracle 9i/10g (SQL, PL/SQL), DB2, MySQL, Derby, SQL Server Express
Operating Systems: Windows 98/2K/XP/7/8, Unix & Linux OS
Web Technologies: HTML, JavaScript
Utility Tools: Log4j, Ant, Maven
Design Pattern: MVC, Service Locator, Business Delegate, DAO, Value Object, Singleton, Factory, Abstract Factory, Builder, Prototype Design Pattern
Version Control Tools: CVS, SVN, GIT
Code Review Tool: Sonar, Crucible
Database Tool: Toad, SQL Developer
JavaScript Framework: AngularJS 1.0, jQuery
Testing Framework: Mockito, JUnit
Security Framework: Acegi/Spring Security, OWASP security
Big Data Framework: Apache Hadoop, Spark, HDFS, MapReduce, Pig, Hive, Sqoop, Flume, HBase and Storm
NoSQL Database: MongoDB, Cassandra
Message Broker Tool: ActiveMQ, Apache Kafka
Web Service Framework: SOAP, RESTful web services (Jersey), microservices
Caching Framework: Ehcache
Hadoop Distributions: Cloudera distribution 5.12 and HortonWorks (HDP 2.5)
Rest Client: Postman, Advanced Rest client, SOAP UI
AWS Framework: S3, Redshift, EC2, RDS, DynamoDB
Big Data Tools: Tableau, Cloudera Manager, HDP, WebHDFS, AWS, Azure
PROFESSIONAL EXPERIENCE
Confidential
Senior Developer
Responsibilities:
- Developed Pig and Hive scripts for data analysis.
- Integrated Kafka with Spark using the Spark Streaming API (see the sketch after this list).
- Worked with Scala on Spark SQL and Spark Streaming integration.
- Wrote new Oozie coordinator and bundle jobs for existing jobs in the Hadoop cluster.
- Integrated Flume with server log files for analysis through the Hive Thrift server.
- Used Sqoop to import RDBMS data into HDFS.
- Managed a 120-node Hadoop production cluster, monitoring it and adding new nodes.
- Provided shift-based maintenance support, monitoring and debugging Hadoop and Spark jobs.
- Provisioned resources and components to build out the Hadoop project applications.
- Worked on operational changes to application processes; provided L1/L2 call support, fixing high-priority issues.
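A minimal Scala sketch of the Kafka-to-Spark Streaming integration mentioned above, using the spark-streaming-kafka-0-10 direct stream API; the broker address, topic name and group id are hypothetical placeholders, and the ERROR-count transformation stands in for the real job logic.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object KafkaSparkStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-spark-stream")
    val ssc  = new StreamingContext(conf, Seconds(10)) // 10-second micro-batches

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker1:9092",          // hypothetical broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "log-analysis",
      "auto.offset.reset"  -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("server-logs"), kafkaParams))

    // Count ERROR lines per batch as a stand-in for the real transformation.
    stream.map(_.value)
          .filter(_.contains("ERROR"))
          .count()
          .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```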
Technologies Used: HDFS, MapReduce, Linux, Pig, Hive, Sqoop, HBase, Oozie, Spark, Scala, SnapLogic and Oracle
Confidential
Senior Java Developer
Responsibilities:
- Wrote RESTful web services to expose services to clients and created a UI-based application for Trinity processes covering Party, Contract, Activity, ContractOption and ContractFundOption business data.
- Generated pipe-separated flat files of this business-domain data for consumption in the big data platform.
- Integrated Kafka with Spring using KafkaTemplate to read REST-service data and store it in Elasticsearch.
- Read FIA in-force CSV files with Apache Spark, validated them against the Hive table schema, applied column transformations driven by rules from a property file, insert-overwrote the correct records into the target Hive table and wrote rejected records to a Hive error table (see the sketch after this list).
- Created a dashboard UI showing the status of jobs running on the big data platform.
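A minimal sketch of the CSV validation flow described above, assuming a header-bearing CSV and Hive tables named policy_db.fia_inforce and policy_db.fia_inforce_errors; all paths, names and the contract_id rule are hypothetical, and the real job reads its transformation rules from a property file.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object CsvValidationJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("fia-inforce-validation")
      .enableHiveSupport()
      .getOrCreate()

    // Read the incoming CSV feed with its header row.
    val incoming = spark.read.option("header", "true").csv("/data/landing/fia_inforce.csv")

    // Validate the feed's columns against the target Hive table's schema.
    val expected = spark.table("policy_db.fia_inforce").columns.toSet
    val (good, bad) =
      if (incoming.columns.toSet == expected)
        (incoming.filter(col("contract_id").isNotNull),  // example rule from a property file
         incoming.filter(col("contract_id").isNull))
      else sys.error("schema mismatch with policy_db.fia_inforce")

    // Overwrite the target table with clean records; park rejects in an error table.
    good.write.mode("overwrite").insertInto("policy_db.fia_inforce")
    bad.write.mode("append").insertInto("policy_db.fia_inforce_errors")
  }
}
```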
Technologies Used: Java, Spring Boot microservices, Apache Spark, RESTful web services, Oracle, Kafka, Elasticsearch
Confidential
Senior Hadoop Developer
Responsibilities:
- Prepared data using REST APIs.
- Built a data-governance system using a data lake to land data from various sources, e.g. RDBMS, file systems, streaming data and NoSQL tools.
- Integrated Hive with Tableau for dashboard reports and visualization graphs.
- Integrated Kafka with Spark Streaming to read online streaming data, storing the transformed and aggregated results in HBase (see the sketch after this list).
- Used a microservice architecture, with Spring Boot-based services interacting through a combination of REST and Apache Kafka message brokers.
- Performed incremental loads into Hive tables with Sqoop import, validated data loaded into HDFS from the RDBMS with Python scripts, and used Oozie workflows to send notification mails to end users.
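A minimal sketch of the streaming write into HBase, assuming the Kafka + Spark Streaming pipeline has already produced a DStream of (key, count) aggregates; the table name and column family are hypothetical placeholders.

```scala
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.streaming.dstream.DStream

object HBaseSink {
  // Persists (key, count) aggregates from the streaming pipeline into HBase.
  def writeCounts(events: DStream[(String, Long)]): Unit =
    events.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        // One HBase connection per partition, not per record.
        val conn  = ConnectionFactory.createConnection(HBaseConfiguration.create())
        val table = conn.getTable(TableName.valueOf("event_counts"))
        partition.foreach { case (key, count) =>
          val put = new Put(Bytes.toBytes(key))
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("count"), Bytes.toBytes(count))
          table.put(put)
        }
        table.close()
        conn.close()
      }
    }
}
```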
Technologies Used: Java, Hive, Sqoop, Flume, Oozie, Python, Spark, REST, ServiceNow ticketing tool
Confidential
Senior Hadoop Developer
Responsibilities:
- Monitored the Event Engine tool, which raises alerts for failed jobs; accepted alarms and alerts and analyzed job logs by logging into the edge node.
- Worked ServiceNow incident tickets: took assigned incidents and restored service once the issue was fixed.
- Worked on the different parsers in the big data application, e.g. delimited parser, mainframe MR parser, fixed-width parser, JSON parser and RFC-delimited parser.
- At the end of all parsers there is a record-count check: the number of parsed records is validated against the total record count sent by the SOR system as part of the control info (see the sketch after this list).
- Worked on the PDS (Platform Data Standardizer) module, which covers platform-specific functions such as data validation against the ingest/feed layer, Voltage decryption and threshold-based ingestion.
- Worked on the FDS (Feed Data Standardizer) module, which creates Hive tables on top of the PDS output; the output of the Hive query is expected to match the storage schema, after which the external Hive metadata is purged.
- Worked on the data-load module, which follows input-data processing and supports several load types: load-append (appending to existing data), snapshot (deduplicating data and replacing the older file) and organization (transforming existing data into a new data set).
- DQM is usually executed once the data has been standardized and transformed (if required).
- Worked on the masking module, which masks sensitive column values in the data file; a third-party tool, DgSecure, was selected to provide the masking capability and accepts delimited data and a set of structure files as input. Also worked production support for Hadoop and Apache Spark jobs, debugging and fixing issues.
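A minimal sketch of the post-parser record-count check, assuming the SOR control info arrives as a one-line file carrying the expected total; the function and path names are hypothetical.

```scala
import scala.io.Source

object RecordCountCheck {
  // parsedCount comes from the parser; the control file is assumed to hold
  // the SOR's total record count on its first line.
  def reconcile(parsedCount: Long, controlFilePath: String): Unit = {
    val source   = Source.fromFile(controlFilePath)
    val expected = try source.getLines().next().trim.toLong finally source.close()
    if (parsedCount != expected)
      sys.error(s"Record count mismatch: parsed $parsedCount, control info says $expected")
  }
}
```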
Technologies Used: Java, Hive, Sqoop, Flume, Oozie, Python, Spark, REST, ServiceNow ticketing tool
Confidential
Senior Java Developer
Responsibilities:
- Created the login module (Spring Security), user-management module, event module and event-attendee module on a Spring Boot architecture for the DEV, IT, QA, UAT and production environments.
- Integrated the user-management module with the sales platform using Kafka.
- Worked on the workflow-management module: creating workflows, assigning workflows to contacts and adding registration fields to workflows.
- Created RESTful web services using a Spring Boot architecture.
- Used a microservice architecture, with Spring Boot-based services interacting through a combination of REST and Apache Kafka message brokers.
- Worked on configuration changes for the different environments to design, implement and test the application.
- Tuned Kafka node scalability, the number of messages per batch and the number of threads pulling data through the Kafka message broker.
- Worked on the mail-management module: creating segments, assigning them to campaigns and sending campaign mail from email templates.
- Used Apache Spark to analyze event data arriving through REST services and Kafka integration, storing the results in Cassandra (see the sketch after this list).
- Worked in an Agile environment with daily scrum calls and JIRA for tracking new-requirement work and fixes to existing code.
- Used Java 8 lambda expressions and the Stream API in the project.
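A minimal sketch of the Spark-to-Cassandra analysis path using the DataStax spark-cassandra-connector; the keyspace, table, host and input path are hypothetical placeholders, and a per-type event count stands in for the real aggregation.

```scala
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

object EventsToCassandra {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("events-to-cassandra")
      .set("spark.cassandra.connection.host", "cassandra-host") // hypothetical host
    val sc = new SparkContext(conf)

    // Aggregate event counts per type and persist them to Cassandra.
    val events = sc.textFile("hdfs:///data/events") // one event type per line
    events.map(e => (e, 1L)).reduceByKey(_ + _)
          .saveToCassandra("analytics", "event_counts", SomeColumns("event_type", "count"))

    sc.stop()
  }
}
```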
Technologies Used: Java 8, J2EE, Spring MVC, Spring Boot, Microservices, Kafka, Oracle 12c, Spring REST, Cassandra, Apache Spark, Hibernate and JPA
Confidential
Senior Developer
Responsibilities:
- Responsible for migrating jobs from cron to Tidal so that they could be monitored easily.
- Developed Pig and Hive scripts for data analysis.
- Integrated Kafka with Spark using the Spark Streaming API.
- Worked with Scala on Spark SQL and Spark Streaming integration (see the sketch after this list).
- Wrote new Oozie coordinator and bundle jobs for existing jobs in the Hadoop cluster.
- Integrated Flume with server log files for analysis through the Hive Thrift server.
- Used Sqoop to import RDBMS data into HDFS.
- Managed a 120-node Hadoop production cluster, monitoring it and adding new nodes.
- Provided shift-based maintenance support, monitoring and debugging Hadoop and Spark jobs.
- Provisioned resources and components to build out the Hadoop project applications.
- Worked on operational changes to application processes; provided L1/L2 call support, fixing high-priority issues.
- Deployed application service modules on the Azure cloud platform.
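A minimal sketch of running Spark SQL over a Hive table, as in the Scala + Spark SQL work referenced above; the ops.job_history table and its columns are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object HiveSparkSql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-spark-sql")
      .enableHiveSupport() // read Hive tables through the metastore
      .getOrCreate()

    // The same kind of aggregation the Hive scripts ran, expressed in Spark SQL.
    spark.sql(
      """SELECT job_name, COUNT(*) AS runs
        |FROM ops.job_history
        |GROUP BY job_name""".stripMargin)
      .show()

    spark.stop()
  }
}
```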
Technologies Used: HDFS, MapReduce, Linux, Pig, Hive, Sqoop, HBase, Oozie, Cisco Tidal, Spark, Scala, SnapLogic and Oracle
Confidential
Sr Software Engineer
Responsibilities:
- Worked in a Spring MVC and RESTful web services architecture to create various applications along with their infrastructure details, such as LDAP details, architecture and usage details.
- Analyzed historical data using MapReduce (images, video and unstructured data) and Pig and Hive (XML and Excel files); clickstream data was used to record users visiting the website.
- Installed and configured a multi-node Hadoop cluster for data storage and processing.
- Responsible for unit and system testing; developed unit test cases using the JUnit framework.
- Implemented solutions using Hadoop, HBase, Hive, Sqoop, the Java API, etc.
- Imported and exported data between HDFS, HBase and Hive using Sqoop.
- Loaded and transformed large sets of structured and semi-structured data, implementing the solutions on Hadoop.
- Used ActiveMQ for messaging (see the sketch after this list).
- Worked on the front-end layer using the Dojo framework.
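A minimal sketch of ActiveMQ messaging through the standard JMS API; the broker URL, queue name and message payload are hypothetical placeholders.

```scala
import javax.jms.{DeliveryMode, Session}
import org.apache.activemq.ActiveMQConnectionFactory

object QueueProducer {
  def main(args: Array[String]): Unit = {
    val factory    = new ActiveMQConnectionFactory("tcp://localhost:61616")
    val connection = factory.createConnection()
    connection.start()

    // Non-transacted session with automatic acknowledgement.
    val session  = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
    val producer = session.createProducer(session.createQueue("orders"))
    producer.setDeliveryMode(DeliveryMode.PERSISTENT)

    producer.send(session.createTextMessage("""{"orderId": 42, "status": "NEW"}"""))

    session.close()
    connection.close()
  }
}
```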
Technologies Used: Spring MVC, Hibernate, Web Services, DB2, MapReduce, Pig, Hive, Sqoop, Dojo
Confidential
Senior Developer
Responsibilities:
- Worked on various modules such as Customer Search, Policy Search, Policy List, Contact Search, Contact Update, Contact Delete and Fund Inquiry services, which are called from the Salesforce UI through the Cordys interface.
- Understood configuration management using Salesforce.
- Involved in the design and coding of the DAO classes (see the sketch after this list).
- Involved in project development using RESTful service design.
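A minimal sketch of a Hibernate-backed DAO in the Spring/Hibernate style referenced above; the Policy entity and its fields are hypothetical.

```scala
import javax.persistence.{Entity, Id}
import org.hibernate.SessionFactory

// Hypothetical entity; a plain Scala class with vars gives Hibernate
// the no-arg constructor it requires.
@Entity
class Policy {
  @Id var id: java.lang.Long = _
  var holderName: String = _
}

class PolicyDao(sessionFactory: SessionFactory) {

  def findById(id: Long): Option[Policy] = {
    val session = sessionFactory.openSession()
    try Option(session.get(classOf[Policy], Long.box(id))) // get() takes a Serializable id
    finally session.close()
  }

  def save(policy: Policy): Unit = {
    val session = sessionFactory.openSession()
    val tx      = session.beginTransaction()
    try { session.saveOrUpdate(policy); tx.commit() }
    catch { case e: Exception => tx.rollback(); throw e }
    finally session.close()
  }
}
```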
Technologies Used: HTML, Spring, Hibernate, SQL Server 2005, Salesforce, JBoss 7.1
Confidential
Senior Developer
Responsibilities:
- Worked on a comprehensive, highly configurable policy-administration suite for Life, General (P&C) and Group insurance that provides end-to-end services for insurers, including product management, new business and underwriting, policy ownership services, general ledger, claims and renewals, policy/group management, etc.
- Worked on a UI-generator tool that parses the existing screens, reads the proposed layout information from a file into XML, and generates the new JSPs from these XMLs (see the sketch after this list).
- Worked on new-screen generation: a screen designer/BA designs the screen metadata in XML for a new screen, and the tool parses that XML and generates the JSP for it. At its heart, the UI Generator has two engines.
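A minimal sketch of the XML-driven JSP generation idea, assuming screen metadata of the shape <screen><field label="..." name="..."/></screen>; the metadata format, file names and output layout are all hypothetical.

```scala
import scala.xml.XML

object JspGenerator {
  def generate(metadataPath: String): String = {
    val screen = XML.loadFile(metadataPath)
    // One table row per <field> element in the screen metadata.
    val rows = (screen \ "field").map { f =>
      val label = (f \ "@label").text
      val name  = (f \ "@name").text
      s"""<tr><td>$label</td><td><input name="$name"/></td></tr>"""
    }
    s"""<%@ page contentType="text/html" %>
       |<html><body><table>
       |${rows.mkString("\n")}
       |</table></body></html>""".stripMargin
  }

  def main(args: Array[String]): Unit =
    println(generate("screen-metadata.xml")) // hypothetical input file
}
```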
Technologies Used: HTML, Core Java, Spring, Hibernate, SQL Server 2005