Senior Java Software Engineer Resume
MA
SUMMARY:
- Experienced in implementation, with responsibilities including software requirement analysis, technical analysis, coding, and testing.
- Experience in all phases of the software development life cycle (SDLC).
- Involved in meetings to gather information and requirements from clients.
- Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills.
- Seasoned software developer with a strong algorithmic background; practicing architect building cutting-edge solutions leveraging distributed systems (storage/compute/messaging).
Education: Master of Science (M.S.) in Computer Science from SUNY Buffalo, graduated in Dec 2013.
SKILLS:
Programming Skills: Java, Object-Oriented Programming.
Design Patterns Used: Observer, Singleton, Factory, Builder, Publisher-Subscriber, MVC.
Databases: MongoDB, Sybase, PostgreSQL, Oracle 12c, MySQL, DB2 (AS/400).
Programming Models: Hadoop, MapReduce.
Web Programming: JSP, Servlets, HTML, CSS, XML, JavaScript.
Tools: DbVisualizer, Amazon S3, Eclipse, Microsoft Visual Studio 2017, IBM i console.
Operating Systems: UNIX, Linux, AIX, IBM i (OS/400), Red Hat, SUSE Linux, Solaris, Windows 10.
Methodologies: Scrum, Kanban, Waterfall.
Version Control Tools: CVS, Git.
Big Data Tools: Hadoop (HDFS), HBase, Apache Hive, Pig, Sqoop, Flume.
PROFESSIONAL EXPERIENCE:
Senior Java Software Engineer
Confidential, MA
- Certified Scrum Master (CSM) of the data replication team, which consisted of senior architects, senior QA engineers, software engineers, and a principal engineer.
- Defined sprint stories, managed sprint velocity and burndown analysis, and communicated work responsibilities to team members by guiding sprint planning meetings and streamlining the release plan to ensure product features were delivered on time.
- Implemented support for Apache Kafka, SQL Server, and Oracle 12c as target databases/servers for data replication.
- Designed and implemented end-to-end data replication to Amazon Kinesis streams from Oracle, SQL Server, DB2/400, and UDB database sources using the AWS SDK and the Kinesis Producer Library.
- Secured data and connectivity to IBM i (AS/400), MySQL, and PostgreSQL over distributed machines using SSH and SSL encryption.
- Implemented RSL import functionality for the Kafka server using Java and the Bison parser generator.
- Experienced working with product managers and clients on POCs, and with cross-functional Scrum-based teams.
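Replication targets such as Kafka and Kinesis preserve ordering only within one partition or shard, so a replicator must route each change record by a stable key (typically the source row's primary key). A minimal, stdlib-only sketch of that routing idea (class and method names are illustrative, not from the actual product; real pipelines delegate this to the Kafka/Kinesis client partitioners):

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

/** Routes change records to a fixed number of partitions by primary key,
 *  so all changes for one row land on the same partition, in order.
 *  Hypothetical sketch of the partitioning idea only. */
public class ShardRouter {
    private final int partitions;

    public ShardRouter(int partitions) {
        if (partitions <= 0) throw new IllegalArgumentException("partitions must be > 0");
        this.partitions = partitions;
    }

    /** Stable partition index for a primary-key string. */
    public int shardFor(String primaryKey) {
        CRC32 crc = new CRC32();
        crc.update(primaryKey.getBytes(StandardCharsets.UTF_8));
        return (int) (crc.getValue() % partitions); // CRC32 value fits in an unsigned 32-bit range
    }

    public static void main(String[] args) {
        ShardRouter router = new ShardRouter(8);
        // The same key always maps to the same partition, preserving row-level order.
        System.out.println("orders:42 -> partition " + router.shardFor("orders:42"));
    }
}
```

Because the hash is deterministic, replaying the same source changes always yields the same shard assignment, which is what makes per-row ordering reproducible.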
Java Software Developer
Confidential
- Designed post actions and a rule-engine processor in C++ to handle incoming EDIFACT, XML, and JSON messages.
- Resolved user stories from the requirements-gathering phase through load and activation in dev, test, and production systems (complete SDLC).
- Followed up with the team in Nice, France to resolve IRs, PTRs, and issues under a Kanban agile methodology, including taking client calls.
- Developed a MapReduce program to search production log files for application issues and download performance.
- Designed a Kibana dashboard for internal monitoring of application traffic by pulling data from the Elasticsearch engine.
- Wrote MapReduce jobs, HiveQL, and Pig Latin to process source data into structured data and store it in relational or NoSQL databases (HBase, Cassandra).
- Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
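The log-search job above follows the classic MapReduce shape: map each matching line to a key, then reduce by counting per key. A stdlib-only Java Streams sketch of that same pattern, single-process (the log format and field positions are assumed for illustration):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

/** Counts ERROR lines per component — the map->reduce shape of the
 *  Hadoop log-search job, shown in-memory. Log format is assumed. */
public class LogSearch {
    /** Map phase: keep ERROR lines and emit the component name;
     *  reduce phase: count occurrences per component. */
    public static Map<String, Long> errorCounts(List<String> lines) {
        return lines.stream()
                .filter(l -> l.contains(" ERROR "))
                // assumed format: "TIMESTAMP LEVEL component message..."
                .map(l -> l.split(" ")[2])
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> lines = List.of(
                "2024-01-01T10:00:00 ERROR downloader timeout",
                "2024-01-01T10:00:02 ERROR auth bad token",
                "2024-01-01T10:00:03 ERROR downloader retry failed");
        System.out.println(errorCounts(lines)); // downloader=2, auth=1
    }
}
```

On a cluster, the filter/map runs in the mappers over HDFS splits and the counting happens in the reducers; the logic per record is the same.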
Software Engineer in Big Data
Confidential
- Designed and developed the Media Ranker tool for Comcast, which uses ranking algorithms, the Hadoop framework, Java, and Hive to segment audiences based on TV user behavior.
- Developed query templates in Groovy, effective Hive data-partitioning strategies, TV advertisement ranking, and HQL query logic to produce precise, parallelizable data feeds from massive datasets, executing them on a MapR cluster.
- Developed the Media Bridge API tool for Comcast, a software component that resides on the edge of a big data repository and enables a remote client to request privacy-safe aggregate statistics for uploaded audience segments, reporting actual audience behavior ranked by zone.
- Experienced with optimization techniques for improving Hive query performance, creating Hive tables and dynamic partitions, and working with them using HiveQL.
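At its core, the ranking work above orders audience segments by an engagement score, the in-memory analogue of an `ORDER BY score DESC` in HQL. A hedged, stdlib-only Java sketch of that ranking step (segment names and the viewing-minutes metric are invented for illustration; the real tool ran as Hive/Hadoop jobs):

```java
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.Map;

/** Orders audience segments by total viewing minutes, highest first.
 *  Segment names and the metric are illustrative. */
public class SegmentRanker {
    /** Returns segments sorted by descending score; insertion order = rank order. */
    public static LinkedHashMap<String, Double> rank(Map<String, Double> viewingMinutes) {
        LinkedHashMap<String, Double> ranked = new LinkedHashMap<>();
        viewingMinutes.entrySet().stream()
                .sorted(Map.Entry.<String, Double>comparingByValue(Comparator.reverseOrder()))
                .forEachOrdered(e -> ranked.put(e.getKey(), e.getValue()));
        return ranked;
    }

    public static void main(String[] args) {
        Map<String, Double> minutes = Map.of("sports", 1240.0, "news", 860.5, "kids", 2010.0);
        // kids ranks first, then sports, then news
        System.out.println(SegmentRanker.rank(minutes));
    }
}
```

Partitioning the Hive table by a coarse key (e.g. broadcast date or zone) lets each partition be ranked independently, which is what makes the job parallelizable across the cluster.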
Software Developer, AOL
Confidential
- Designed database queries for order processing, designed the database schema with normalization, and worked on DB migration.
- Wrote complex SQL queries to perform various database operations using TOAD, including performance tuning.
- Designed the architecture for the web module and the XML parser of the Order Entry and Product Search modules.
- Designed a portal for monitoring AOL's web traffic and applications.
Software Developer Intern
Confidential
- Designed and wrote code for trend and data analysis of network application response times using MongoDB and Hadoop.
- Teamed up with the TIBCO integration team to develop, integrate, and deploy TIBCO web services.
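A response-time trend analysis like the one above typically smooths raw latency samples with a moving average before comparing periods. A minimal stdlib-only Java sketch of that smoothing step (window size and sample data are illustrative; the actual analysis ran over MongoDB/Hadoop):

```java
import java.util.ArrayList;
import java.util.List;

/** Simple moving average over response-time samples (milliseconds) —
 *  the basic smoothing step behind a latency trend analysis.
 *  Window size and sample data are illustrative. */
public class ResponseTrend {
    /** Returns the moving average of samples over a sliding window. */
    public static List<Double> movingAverage(double[] samples, int window) {
        List<Double> out = new ArrayList<>();
        double sum = 0;
        for (int i = 0; i < samples.length; i++) {
            sum += samples[i];
            if (i >= window) sum -= samples[i - window];   // slide the window forward
            if (i >= window - 1) out.add(sum / window);    // emit once the window is full
        }
        return out;
    }

    public static void main(String[] args) {
        double[] latenciesMs = {120, 130, 125, 300, 310, 305};
        // A rising trend in the averaged series flags a degrading network app.
        System.out.println(movingAverage(latenciesMs, 3)); // [125.0, 185.0, 245.0, 305.0]
    }
}
```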