Senior IT Developer Resume
SUMMARY
- 16+ years of experience across the full Software Development Lifecycle (requirements gathering, functional specification development, project analysis, design, development, testing, maintenance, and end-user support), with 4+ years in Big Data technologies and 12 years in Web technologies; skilled in the analysis, design, development, testing, and deployment of ETL jobs and web/enterprise client-server applications using Hadoop technologies such as Hive, HBase, Spark SQL, Spark Streaming, and Apache NiFi, alongside Java, J2EE technologies, and Node.js.
- Good experience designing and developing ETL jobs on Hadoop using Spark SQL, Spark Streaming, Hive, HBase, MongoDB, and Postgres.
- Good understanding of Hadoop architecture and its underlying frameworks, including HDFS and YARN resource management.
- Experience designing and developing real-time data pipelines with Spark Direct Streaming and Spark Structured Streaming, moving data to and from Hive, HBase, RDBMS (DB2), and MongoDB.
- Experience designing and developing Extract, Transform, Load (ETL) batch jobs using Spark with Scala to move data between databases: transferring data from Hive to an RDBMS, building MongoDB collections from Hive and RDBMS data, and populating Postgres tables from Hive (a sketch of this pattern appears at the end of this summary).
- Experience designing and developing reconciliation frameworks as Spark/Scala ETL jobs that compare data across stores, e.g., RDBMS vs. MongoDB and Hive vs. RDBMS.
- Experience designing and developing data routes using Apache NiFi.
- Experience designing and developing workflows using Netflix Conductor.
- Extensive experience in J2EE architecture; developed server-side applications using technologies such as Node.js, Java, JSP, Servlets, Spring, REST and SOAP web services, JDBC, JNDI, JSTL, JMS, Struts, XML, and XML Schemas.
- Experience designing and developing microservices using Spring REST, Spring Boot, and Node.js with MongoDB and Postgres.
- Experience designing and developing rich user interfaces with React and Redux.
- Experience developing applications using Spring MVC with Hibernate.
- Experience developing Rich Internet Applications and rich user interfaces using jQuery, AngularJS, Ajax, and JSON, with good working knowledge of Node.js.
- Very good experience with browser-side libraries such as jQuery (events, effects, Ajax, selectors, attributes, traversing, manipulation, CSS, utilities, internals) and Prototype.
- Extensive knowledge of Ant, JNDI, XML, XSLT, XPath, XQuery, JAXP, DOM, SAX, RMI, WSDL, SOAP, and REST (RESTful web services).
- In-depth understanding of Apache Spark, gained through developing various proofs of concept.
- In-depth understanding of Scala and its use with the Lift framework.
- Experience writing entities in Scala and Java, along with named queries, to interact with databases.
- Experience generating reports and publishing them in Hyperion.
- Development methodologies: Agile, Rational Unified Process (RUP), Test-Driven Development.
- Testing: supported and participated in application testing using JUnit, including system testing, boundary testing, regression testing, and the UAT phase.
- Experience in the development and deployment of J2EE application archives (JAR, WAR, EAR) on WebLogic 7.0/8.1/10.3, IBM WebSphere 7.1, and Tomcat application servers.
- Experience with relational database systems: Oracle, DB2, Postgres, MS SQL Server, and MySQL.
- Good knowledge of NoSQL database concepts, particularly MongoDB.
- Rich experience in Java and Object-Oriented Analysis & Design (OOAD), particularly developing sequence and class diagrams using MS Visio, Rational Rose, and IBM RSA.
- Extensive knowledge of and rich experience with the RAD 6.0, WSAD 5.1, and Eclipse IDEs.
- Designed and developed stored procedures in Oracle to migrate complex business logic from the DAO layer to the database, which enhanced performance.
- Experience with RUP, Agile, and Extreme Programming (test-driven development, code reviews).
- Experienced in application production support and troubleshooting.
- Excellent team player with the ability to work independently.
- Years of experience in requirements elicitation, analysis, estimation, and design for web projects.
- Thorough knowledge of standard software quality practices.
- Excellent analytical, written, and verbal communication skills; able to interact with individuals at all levels and to pick up new technologies with a minimal learning curve.
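The Hive-to-MongoDB batch ETL work referenced earlier in this summary follows a common Spark-with-Scala shape. Below is a minimal sketch of such a job, assuming the MongoDB Spark connector (3.x) and placeholder database, table, and URI names; none of these identifiers come from an actual project.

```scala
// Minimal sketch of a Hive-to-MongoDB batch ETL with Spark SQL (Scala).
// All database, table, and connection names below are illustrative placeholders.
import org.apache.spark.sql.SparkSession

object HiveToMongoEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-to-mongo-etl")
      .enableHiveSupport()                 // read source tables from the Hive metastore
      .config("spark.mongodb.output.uri",  // MongoDB Spark connector output URI (placeholder)
              "mongodb://localhost:27017/etl.claims")
      .getOrCreate()

    // Extract: read the source Hive table.
    val source = spark.sql("SELECT id, member_id, amount FROM warehouse.claims")

    // Transform: a trivial example rule; real jobs apply full business logic here.
    val transformed = source.filter("amount IS NOT NULL")

    // Load: write the result as a MongoDB collection via the Spark connector.
    transformed.write
      .format("mongo")                     // short name registered by the MongoDB connector
      .mode("overwrite")
      .save()

    spark.stop()
  }
}
```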
TECHNICAL SKILLS
Operating Systems: Windows 95/98/NT/2000/XP, Linux (Red Hat, Fedora), UNIX, AIX, MS-DOS.
Web/App Servers: WebSphere 5.1/6.1/7.1, WebLogic 7.0/8.1/10, Tomcat 5.
Hadoop Technologies: Spark SQL, Spark Streaming, Apache NiFi, Hive, HBase.
WorkFlow Tools: Netflix Conductor
Java Frameworks: Spring REST, Struts 1.3, Spring IoC, Spring MVC, Spring AOP, Hibernate
Web Technologies: JSP, JSTL, Servlets, JavaBeans, Applets, XML, jQuery, REST, SOAP/HTTP, JAXB, WSDL, JNDI, RMI, Ajax, JSON, AngularJS, Node.js, React, Redux, Microservices
IDEs: WebLogic Workshop, IBM RAD 6.0/7.0, WSAD, Eclipse 3.4.
Languages: Java, HTML, JavaScript, Ajax, XHTML, XML.
PROFESSIONAL EXPERIENCE
Confidential
Senior IT Developer
Responsibilities:
- Interacted with business analysts to understand business requirements and implemented them.
- Designed and developed ETL jobs to validate data, stage it at an intermediate location, and load it into the destination database. Beyond the core requirement, the design includes control, alert, replay, and reconciliation features that provide a base for operational dashboard reporting and enable faster recovery after failures. Technologies used: Spark SQL, Hive, and MongoDB/Postgres.
- Designed and developed a streaming data pipeline that collects data from a Kafka middleware system in real time, validates it, transforms it, stages it in Hive, and finally lands it in MongoDB/Postgres. Beyond the core requirement, the design captures business-validation failure records and reports them. Technologies used: Spark Streaming, Kafka, Hive, and MongoDB/Postgres (see the sketch after this list).
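The streaming pipeline above maps naturally onto Spark Structured Streaming. The sketch below is a hedged outline, assuming a JSON payload and placeholder broker, topic, schema, and table names, with a foreachBatch sink standing in for the Hive and MongoDB/Postgres writes.

```scala
// A minimal sketch of a Kafka -> Spark Structured Streaming -> Hive pipeline.
// Topic, schema, and table names are illustrative placeholders.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object KafkaToHiveStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-hive-stream")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // Assumed message schema; the real payload layout is not given in the resume.
    val schema = new StructType()
      .add("id", StringType)
      .add("amount", DoubleType)

    // Source: subscribe to a Kafka topic (placeholder broker and topic names).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "claims-topic")
      .load()

    // Validate/transform: parse the JSON payload and keep structurally valid records.
    val parsed = raw
      .select(from_json($"value".cast("string"), schema).as("rec"))
      .select("rec.*")
    val valid = parsed.filter($"id".isNotNull && $"amount".isNotNull)

    // Sink: micro-batch writes into a Hive-managed table via foreachBatch;
    // the same hook is where a MongoDB/Postgres write would go.
    val query = valid.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write.mode("append").saveAsTable("staging.claims")
      }
      .option("checkpointLocation", "/tmp/checkpoints/claims") // required for fault tolerance
      .start()

    query.awaitTermination()
  }
}
```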
Environment: Hadoop technologies (Hive, HBase, Spark SQL, Spark Streaming (Direct and Structured)), workflow orchestration with Netflix Conductor, Scala, MongoDB, Kafka, Postgres, JSON.
Confidential
Senior IT Developer
Responsibilities:
- Interacted with business analysts to understand business requirements and implemented them.
- Designed and developed ETL jobs to validate data, stage it at an intermediate location, and load it into the destination database. Beyond the core requirement, the design includes control, alert, replay, and reconciliation features that provide a base for operational dashboard reporting and enable faster recovery after failures. Technologies used: Spark SQL, DB2, and Hive.
- Designed and developed an ETL job to validate the new IDs provided by CMS and replace the existing Medicare IDs with the new ones. Beyond the core requirement, the design captures the reasons some of the new IDs failed, maintains data lineage, retains the mapping between old and new IDs, provides a reconciliation report, and gracefully handles any errors and exceptions encountered. Technologies used: Spark SQL and Hive (a sketch of this pattern follows).
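A hedged sketch of the ID-replacement pattern described above: join the source rows against an old-to-new mapping table, keep both identifiers for lineage, and split out rows whose new ID could not be resolved. All table and column names here are assumptions for illustration only.

```scala
// Sketch of an old-to-new ID replacement with lineage and failure reporting.
import org.apache.spark.sql.SparkSession

object IdReplacementJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("id-replacement")
      .enableHiveSupport()
      .getOrCreate()

    val members = spark.table("warehouse.members")    // rows keyed by old_id (placeholder)
    val mapping = spark.table("warehouse.id_mapping") // old_id -> new_id crosswalk (placeholder)

    // A left join preserves every source row so unmapped IDs can be reported, not dropped.
    val joined = members.join(mapping, Seq("old_id"), "left")

    // Lineage: both identifiers stay side by side for the reconciliation report.
    val mapped   = joined.filter("new_id IS NOT NULL")
    val unmapped = joined.filter("new_id IS NULL")    // failures to investigate

    mapped.write.mode("overwrite").saveAsTable("warehouse.members_remapped")
    unmapped.select("old_id").write.mode("overwrite").saveAsTable("audit.unmapped_ids")

    spark.stop()
  }
}
```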
Environment: Hadoop technologies (Hive, HBase, Spark SQL), Scala, DB2.
Confidential
Senior IT Developer
Responsibilities:
- Interacted with business analysts to understand business requirements and implemented them.
- Designed and developed a streaming data pipeline that collects data from a middleware system in real time, validates it, transforms it, and feeds it to the end-system middleware component. Beyond the core requirement, the design captures business-validation failure records and reports them (see the sketch after this list). Technologies used: Spark Streaming, Hive, and IBM MQ.
- Designed and developed ETL jobs to validate data, stage it at an intermediate location, and load it into the destination database. Beyond the core requirement, the design includes control, alert, replay, and reconciliation features that provide a base for operational dashboard reporting and enable faster recovery after failures. Technologies used: Spark SQL, Hive, and MongoDB/Postgres.
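The validation-failure capture mentioned in both bullets can be sketched as a simple split over a business rule, with failures persisted to an audit table for reporting. The rule, table names, and reason text below are placeholders, not details from the actual engagement.

```scala
// Sketch: split records on a business rule, pass good rows downstream,
// and persist the failures for reporting. All names are placeholders.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.lit

object ValidationSplit {
  // Apply one business rule and return the (valid, invalid) partitions of the input.
  def split(records: DataFrame): (DataFrame, DataFrame) = {
    val rule = "amount IS NOT NULL AND amount >= 0" // illustrative business rule
    (records.filter(rule), records.filter(s"NOT ($rule)"))
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("validation-split")
      .enableHiveSupport()
      .getOrCreate()

    val (valid, invalid) = split(spark.table("staging.claims"))

    valid.write.mode("append").saveAsTable("mart.claims")
    invalid.withColumn("failure_reason", lit("negative or missing amount"))
      .write.mode("append").saveAsTable("audit.claims_failures") // feeds failure reporting

    spark.stop()
  }
}
```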
Environment: Hadoop technologies (Hive, HBase, Spark SQL), Scala, MongoDB, Postgres.