- Senior J2EE Developer with 12 years of IT industry experience in analysis, design, development, customizations, and implementation of enterprise applications encompassing a wide range of skill sets, roles, and industry verticals.
- Hands-on experience in Big Data: data extraction, transformation, loading, and analysis on the AWS platform.
- Experienced in Agile methodologies and Test-Driven Development to keep projects accountable and on track.
- Proficient in SQL, stored procedures, triggers, functions, indexes, and performance tuning of queries for large databases.
- Adept in generating test cases with test automation and automated regression test tools like JUnit.
- Ensured defects found in Production and QA were logged, monitored, fixed, and reported as complete.
- Performed in the roles of Application Designer, Technical & Development Lead, and Application Support.
- Reviewed and approved external parties' systems and solutions against specifications.
- Participated in peer code reviews, mentored junior developers, and supported build and deployment activities.
- Quick learner and self-starter with the ability to prioritize tasks and meet deadlines.
- A team player experienced in working with combined offshore and on-site teams.
Languages: Java, J2EE, C++, XML, XSL, XSLT, SQL, Python, PHP, Scala, C#
Big data: Hadoop, Hive, Spark, Elastic Search, Redis, Oozie
AWS: S3, SQS, Lambda, Data Pipeline, Redshift, EMR, Dynamo DB
Framework: Spring, Hibernate, Struts, iBatis, Velocity templates, common Python libraries
Web Services: JAX-WS, JAXB, WSDL, SOAP, RESTful, Postman, SoapUI, JSON
Application servers: Tomcat, WebSphere Process Server, JBoss, Jetty, Apache
Database/Tools: Oracle, SQL Server, MySQL, PostgreSQL, TOAD
Build & Test Tools: Maven, Jenkins, Ant, JUnit, DbUnit
IDEs: Eclipse, IntelliJ IDEA, PyCharm, WebSphere IDE, Cast Iron Studio, Visual Studio
Other Tools: Git, SVN, Splunk, PuTTY, Jira, Confluence, AWS CLI
Methodologies: OOAD, Agile, Scrum
Sr. Software Engineer
Confidential, New York City
- Writing a Scala module that reads large sets of unstructured data from an S3 bucket, partitions it, and produces JSON/Parquet files; applying business logic in Spark and loading aggregate data into Hive and S3 via a Python Lambda function.
- Scripting a Python module that reads clients' online/offline order data from S3, processed by Lambda, for attribution.
- Developing a Python module that improves existing ROAS (by up to 600%) by applying a machine-learning algorithm; uses Spark, running on an AWS Data Pipeline/EMR cluster, to compute the aggregate result.
- Creating a Java module that runs on a Lambda function to pull messages from SQS and parse them to identify the input file.
- Building mapper/reducer classes that parse JSON files and produce aggregate data; creating a class that dumps the data to Redis.
- Generating a unique key for each metric; creating a Grafana dashboard to display live metrics.
- Writing a Java module that produces sales data as JSON files in S3, parses them, populates templates with the Velocity engine, and sends the results to clients.
- Developing Python classes that read real-time job data from Oozie and report each job's current status as time/percentage (faster or slower relative to its average run time), including which jobs are slow or failed along with the error message; sending a real-time alert whenever a job is slower than a threshold or fails. All data is sent to a Datadog dashboard.
- Building a Python class that reads database tables to check the accuracy of attribution data (Hive vs. SQL).
Environment: AWS - S3, SNS, SQS, Lambda, Data Pipeline, Redis, Redshift, DynamoDB, EMR, Python, Spark, Hive, Hadoop, Oozie, Java, Scala, SQL Server, Elasticsearch, Logstash, Flask, Datadog, Grafana, AngularJS.
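The SQS-to-Lambda parsing step above can be sketched as follows. This is an illustrative Python sketch (the original module was written in Java); the function name is hypothetical, but the event shape is the standard S3 notification format delivered through SQS:

```python
import json

def parse_s3_event(message_body):
    """Extract (bucket, key) pairs from an SQS message body that wraps
    an S3 event notification, to identify the input file to process.
    Illustrative sketch; not the original module's API."""
    event = json.loads(message_body)
    files = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            files.append((bucket, key))
    return files

# Example: an S3 "ObjectCreated" notification delivered through SQS
body = json.dumps({
    "Records": [
        {"s3": {"bucket": {"name": "orders-bucket"},
                "object": {"key": "2017/06/orders.json"}}}
    ]
})
print(parse_s3_event(body))  # [('orders-bucket', '2017/06/orders.json')]
```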
Sr. Software Engineer
Confidential, New York City
- Designed and developed 15+ major components of Confidential.com, including Membership, Calendar, Payment, Corporate Sales, and CRM. All components were written in Java, Spring, and Hibernate, and used web-services modeling techniques and SOA-based strategies to meet stakeholders' requirements.
- Wrote custom Hibernate mapping tools that generated hbm files, DAO, DAO implementation, and mapper classes.
- Created scripts to optimize HQL queries, avoiding the Select N+1 problem and improving DAO-layer performance.
- Analyzed systems and documented requirements in Confluence using MS Visio.
- Deployed code to dev and test/stage environments and worked with the DevOps team to push code to production.
- Wrote unit and system-integration test cases using the JUnit and DbUnit frameworks; wrote a job to monitor service status.
- Worked closely with QA team and business stakeholders to test system during the testing phase.
- Utilized ESB mediation as a bridge between the integration component, other web services, and the core project.
- Created business-module integrations and process flows on the Enterprise Service Bus using IBM tools.
- Devised a framework used to change endpoint bindings dynamically.
- Cloud integration: built modules to communicate with salesforce.com and mosocloud.com.
- Exposed web services, created connectors to cloud endpoints/databases, and transformed data according to business rules.
- Used the Salesforce API and the SOQL query language to verify data in Salesforce.
Environment: Linux, IBM WebSphere Integration Developer, WebSphere Process Server, Java, Hibernate 3, Spring, Spring Boot, AOP, Joda-Time, Google Guava, SOAP, RESTful, SQL Server, SoapUI, Postman, IBM Cast Iron, SVN, Atlassian Confluence, Jira, Agile, Python, Jenkins, Maven, Salesforce.com, mosocloud.com, FindBugs.
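The Select N+1 problem mentioned above can be illustrated with a minimal, self-contained sketch. The original fix was done in Hibernate/HQL; this uses Python's built-in sqlite3 with a hypothetical members/orders schema to contrast the N+1 pattern (one query per parent row) with a single join query:

```python
import sqlite3

# Hypothetical schema for illustration, not the original Hibernate mappings.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE member (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, member_id INTEGER, total REAL);
    INSERT INTO member VALUES (1, 'a'), (2, 'b');
    INSERT INTO orders VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);
""")

def totals_n_plus_1():
    """Select N+1: one query for the parents, then one extra query
    per parent -- N additional round trips to the database."""
    result = {}
    for mid, name in conn.execute("SELECT id, name FROM member"):
        row = conn.execute(
            "SELECT COALESCE(SUM(total), 0) FROM orders WHERE member_id = ?",
            (mid,)).fetchone()
        result[name] = row[0]
    return result

def totals_joined():
    """Equivalent result in a single round trip: a join with aggregation
    (in Hibernate terms, roughly what a fetch join avoids N+1 for)."""
    return dict(conn.execute("""
        SELECT m.name, COALESCE(SUM(o.total), 0)
        FROM member m LEFT JOIN orders o ON o.member_id = m.id
        GROUP BY m.id, m.name
    """))

print(totals_n_plus_1() == totals_joined())  # True
```

Both functions return the same totals; the joined version issues one query instead of N+1, which is the DAO-layer performance gain described above.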
Confidential - Maywood, NJ
- Involved in design and implementation of system and participated as scrum master with offshore team.
- Used Java to extract metadata (EXIF) from images; used iReport and Spring MVC to create reports.
Environment: Mac, RCP, Java 1.6, iBATIS 2, Hibernate 3, Spring, Agile, JSF, DWR, RichFaces, jQuery, Oracle 10g, Tomcat 6.
- Defined detailed component design and exposed it as a service; generated reports with iReport and JasperReports.
Environment: Java EE, Hibernate 3, Spring, Struts, iReport, JasperReports, Oracle 9i, MySQL 5, PHP, Tomcat, Apache.
- Used Struts and the MVC pattern for the application; wrote Ant scripts to create builds and run automated tests.
Environment: Linux, Windows 2K, Java (1.3), Struts, Oracle Developer (6i), Oracle Reports (6i), Oracle Database (9i), Tomcat.