Technical Architect / Lead Developer Resume
PA
SUMMARY:
- 13 years of professional experience across the Software Development Life Cycle, including analysis, design, and development of Hadoop/Spark and Java/JEE framework-based applications, with a wide range of skill sets and roles.
- Expertise in Spark technologies: Spark Core, Spark SQL, RDDs, DataFrames/Datasets, and Scala.
- Experience in the Hadoop ecosystem, covering MapReduce, HDFS, Hive, and YARN.
- Expertise in designing and building data pipelines for big data implementations across platforms such as Apache Hadoop and Cloudera (CDH).
- Experience in technologies: Spark Core, Spark SQL, Scala, Hadoop, HDFS, Hive/Impala, HBase, Kafka, Kudu, MapReduce, YARN, SAS, Java and JEE, Python, Spring, XML, Web Services, Hibernate, JavaScript, AngularJS, Servlets, JSP, Struts 2, JUnit, jBPM, JDBC.
- Participated in Coursera machine learning courses.
- Experience with statistical tools such as SAS (pattern recognition, chi-square and t distributions, clustering, variance analysis, etc.).
- Familiar with NoSQL databases such as HBase and MongoDB, and with the semantic RDF database Virtuoso.
- Experience with Spring, Spring MVC, Hibernate, and Web Services.
- Experience in designing business process models using UML, Design Patterns and Rational Rose.
- Hands on knowledge of Object Oriented Analysis and Design (OOAD), Design Patterns, MVC, Multi-tier architectures and distributed architectures.
- Experience in Ant, Maven, ScalaTest, JUnit, Mockito, JavaScript, HTML5, AngularJS, Bootstrap.
- Experience in JIRA, Hudson, Jenkins, Fisheye and Crucible.
- Sun Certified Java Programmer (SCJP) 1.5
- Proven technical and communication skills with strong customer orientation and client interfacing skills.
- Expertise in modeling, architecture, and development tools: Rational Rose, Rational Software Architect (Confidential), Rational Application Developer (RAD).
TECHNICAL SKILLS:
Hadoop Ecosystem: Spark Core, RDD, Spark SQL, Scala, Sqoop, MapReduce, HDFS, YARN, Hive/Impala, Cloudera (CDH), Avro, Kafka, Parquet
Programming Languages: Scala, Python, Java
Database & Technologies: Hive/Impala, HBase, Oracle, MySQL, DB2, Virtuoso, Apache Jena, JDBC
Web Technologies: JSP, Servlets, CSS, HTML5, JavaScript, AJAX, JSON, XML, SOAP, REST, Velocity
Design Patterns: Singleton, Observer, Command, MVC, Factory, Strategy, Façade, Adapter, Service Locator
Testing Strategy: JUnit, Eclipse, Mockito
Configuration: Git, CVS, SVN, Jazz CLM/Rational Team Concert, Rational ClearCase
Defect Tracking: Rational ClearQuest, HP Quality Center, JIRA
Platforms: Windows 2000/XP/7, Linux (CentOS, Red Hat, Ubuntu)
Build/Debugging Tools: Log4j, JUnit, Ant, Maven, Jenkins, Fisheye, Crucible, Hudson
PROFESSIONAL EXPERIENCE:
Confidential, PA
Technical Architect / Lead Developer
Responsibilities:
- Perform change data capture (CDC) and denormalization dynamically for any given source, i.e. the Spark jobs execute against any given source model and generate the bi-temporal CDC output.
- Load and parse Avro source files with custom schemas into Hive tables.
- Design and develop the change data capture (CDC) for slowly changing dimensions (see the sketch after this role's Environment line).
- Design and implement bi-temporality for the CDC processing, persisted as Hive/Impala tables.
- Denormalize the bi-temporal 3NF tables into a dimensional model.
- Design and implement business effective dates and bi-temporality for the denormalization process; generate transformation queries dynamically at runtime for a given source.
- Implemented with Spark 1.6.x using Scala and the Spark SQL API for processing data.
- Develop business process rules for validating the schema.
- Orchestrate workflows with Oozie; CI/CD via Jenkins and Maven builds, Git for source management, and the Elastic Stack (Elasticsearch, Logstash, Kibana) for logging.
Environment: Spark RDD, Spark SQL (Datasets/DataFrames), Scala 2.10, Java, YARN, HDFS, Hive, Impala, CDH 5.7.x, Oozie, ZooKeeper, Kafka, Jenkins, Nexus, JIRA, Confluence, RHEL, Elastic Stack, shell scripting, Avro, JSON, Parquet, Maven, Git, IntelliJ.
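Below is a minimal sketch of the SCD-style CDC step described above, written against the Spark 2.x DataFrame API for readability (the project itself used Spark 1.6.x). The table names and the business_key/row_hash columns are illustrative assumptions; in the actual system the transformation was generated dynamically from the source model.

```scala
// Hedged sketch: SCD Type 2-style change data capture with Spark SQL.
// business_key identifies an entity; row_hash fingerprints its attributes.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object CdcSketch {
  // Returns new row versions for keys that are new or whose attributes changed.
  def capture(source: DataFrame, current: DataFrame): DataFrame = {
    val changed = source.as("s")
      .join(current.filter(col("is_current")).as("c"),
            col("s.business_key") === col("c.business_key"), "left_outer")
      .where(col("c.business_key").isNull || col("s.row_hash") =!= col("c.row_hash"))
      .select(col("s.*"))

    // Bi-temporal bookkeeping: valid_from/valid_to carry the business-effective
    // axis; a separate load timestamp would carry the system-time axis.
    changed
      .withColumn("valid_from", current_timestamp())
      .withColumn("valid_to", lit(null).cast("timestamp"))
      .withColumn("is_current", lit(true))
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("cdc-sketch").enableHiveSupport().getOrCreate()
    val delta = capture(spark.table("stg.customer"), spark.table("dw.customer_dim"))
    delta.write.mode("append").saveAsTable("dw.customer_dim")
    spark.stop()
  }
}
```

A production version would also close out the superseded rows (setting valid_to and is_current = false), which this sketch omits for brevity.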
Confidential, PA
Technical Architect
Responsibilities:
- Complete ownership of the design and development of data pipeline jobs.
- Optimization of KPI calculations over base tables with hundreds of columns and billions of records.
- Compute key performance indicators for business metrics.
- Migration of legacy ETL into optimized Spark data pipeline transformations using Spark DataFrames and Scala.
- Develop Spark jobs using Scala and the Spark SQL API for processing data.
- Perform joins on dimension and fact tables using Spark SQL and persist the results in KPI tables for end-user reports (see the sketch after this role's Environment line).
- Sqoop SQL tables from the RDBMS into the data lake.
Environment: Spark RDD, Spark SQL, Scala, YARN, HDFS, Hive, Impala, Parquet, CDH 5.6.x, RHEL, shell scripting, SBT, IntelliJ.
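A minimal sketch of the fact/dimension join and KPI persistence described above. The dw.sales_fact and dw.product_dim tables, their columns, and the KPI itself are hypothetical stand-ins for the actual star schema:

```scala
// Hedged sketch: join a large fact table to a small dimension and persist a KPI.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KpiJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kpi-job").enableHiveSupport().getOrCreate()

    val facts = spark.table("dw.sales_fact")   // billions of rows
    val dims  = spark.table("dw.product_dim")  // small dimension

    // Broadcasting the dimension avoids shuffling the billion-row fact table.
    val kpi = facts.join(broadcast(dims), Seq("product_key"))
      .groupBy(col("product_category"), col("sale_date"))
      .agg(
        sum("amount").as("total_sales"),
        countDistinct("customer_key").as("unique_customers"))

    // Persist as a Parquet-backed table readable from Hive/Impala reports.
    kpi.write.mode("overwrite").format("parquet").saveAsTable("dw.category_sales_kpi")
    spark.stop()
  }
}
```

The broadcast join is one typical optimization for KPI jobs of this shape: with very wide, very large base tables, avoiding a shuffle of the fact side dominates the job's runtime.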
Confidential, Chesterbrook, PA
Senior Consultant
Responsibilities:
- Complete ownership of the design and development of data pipeline jobs.
- Implemented Spark jobs using Scala and the Spark SQL API for processing data.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala.
- Implemented Spark SQL and Spark UDFs for the data transformations and device data analysis (see the sketch after this role's Environment line).
- Optimized the performance of ingestion and consumption.
- Worked closely with the customer to address and resolve all issues.
- Worked across the complete lifecycle: modeling, ingestion, transformations, aggregation, and the data access layer.
Environment: Spark RDD, Spark SQL, Java, Scala, YARN, HDFS, AWS Cloud, Hive, RHEL, shell scripting, SBT, Beeline, Swagger API.
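A small sketch of a Spark SQL UDF of the kind used for the device-data transformations. The unit-normalization logic and column names are invented for illustration:

```scala
// Hedged sketch: define and apply a Spark SQL UDF over device readings.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object DeviceUdfSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("device-udf").master("local[*]").getOrCreate()
    import spark.implicits._

    // Normalize raw readings reported in mixed units onto one scale.
    val toCelsius = udf((value: Double, unit: String) =>
      if (unit == "F") (value - 32.0) * 5.0 / 9.0 else value)

    val readings = Seq((98.6, "F"), (37.0, "C")).toDF("value", "unit")
    readings.withColumn("celsius", toCelsius($"value", $"unit")).show()

    spark.stop()
  }
}
```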
Confidential, Atlanta, GA
Senior Consultant
Responsibilities:
- Participated in requirement gathering and converting the requirements into technical specifications.
- Developed MapReduce applications to find and analyze useful metrics from the collected data; evaluated the solution in local and distributed modes.
- Analyzed the data from cable and broadband logs and monitored performance issues; Hadoop was used to capture, store, monitor, and analyze the data generated by viewer/user logs.
- Implemented Hive UDFs for the log transformations (see the sketch after this role's Environment line).
- Used Flume to collect logs from the servers and store the data in HDFS.
- Imported data from various data sources, performed transformations using Hive, and loaded the data into HDFS.
- Developed MapReduce jobs in Java for data cleaning and pre-processing.
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Used Hive for data transformations, to analyze the partitioned data, and to compute various metrics for reporting.
- Developed MapReduce jobs to discover patterns in data usage by users.
Environment: Java, MapReduce, Hadoop, HDFS, Flume, Hive, ORC, Eclipse, RHEL.
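The Hive UDFs for the log transformations were written in Java; the sketch below shows the same shape in Scala for consistency with the other examples. The normalization logic is an assumed illustration, not the project's actual transformation:

```scala
// Hedged sketch: a Hive UDF that normalizes a channel name from viewer logs.
// Register in Hive with:
//   CREATE TEMPORARY FUNCTION normalize_channel AS 'NormalizeChannel';
import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.io.Text

class NormalizeChannel extends UDF {
  // Hive resolves evaluate() by reflection; stay null-safe like built-in UDFs.
  def evaluate(input: Text): Text =
    if (input == null) null else new Text(input.toString.trim.toLowerCase)
}
```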
Confidential
Responsibilities:
- Worked with the end user to gather the requirements and produced the DRS solutions document.
- Evaluated the performance of Hive and MapReduce for this solution.
- Used Flume to collect logs from the web servers and store the data in HDFS.
- Developed a MapReduce application to compute useful metrics from the data; performed thorough testing in local and distributed modes, found and fixed bugs in the code, and ensured an issue-free delivery to production.
- Expert-level understanding of MapReduce internals, including shuffling and partitioning, and of the performance bottlenecks of a MapReduce program (see the partitioner sketch after this role's Environment line).
Environment: MapReduce, HDFS, YARN, Oozie, Sqoop, ZooKeeper, Hive, Confidential WebSphere Application Server.
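As an illustration of the shuffle and partitioning internals mentioned above, a custom partitioner controls which reducer receives each key. The vendor-prefixed key scheme below is a hypothetical example, not taken from the project:

```scala
// Hedged sketch: route map output keys of the form "vendor|metric" so that
// all of a vendor's records land on the same reducer.
import org.apache.hadoop.io.{IntWritable, Text}
import org.apache.hadoop.mapreduce.Partitioner

class VendorPartitioner extends Partitioner[Text, IntWritable] {
  override def getPartition(key: Text, value: IntWritable, numPartitions: Int): Int = {
    val vendor = key.toString.takeWhile(_ != '|')
    // Mask the sign bit so the modulo result is always a valid partition index.
    (vendor.hashCode & Integer.MAX_VALUE) % numPartitions
  }
}
```

It would be enabled on the job with job.setPartitionerClass(classOf[VendorPartitioner]).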
Confidential
Lead
Responsibilities:
- Designed the application on the JEE platform using AngularJS, the Spring framework, and REST web services.
- Implemented a POC to demo the application functionality to the business.
- Analyzed the system requirements and prepared estimates for the project.
- Created high-level design documents from the system requirements and interacted with Business Analysts to better understand the requirements; also created detailed LLDs for implementation of the application.
- Worked as a technical lead, mentoring the team and providing design/development guidance.
- Developed UI controllers using AngularJS and Bootstrap.
- Implemented custom directives and several core Angular services for the application.
- Used Angular UI Router for the navigation views.
- Experience using Angular UI/Bootstrap component modules.
- Prepared the design artifacts for implementing the proposed system.
- Implemented the middleware services using Java, Spring MVC, and REST web services.
- Used Spring's RestTemplate to develop the REST services (see the sketch after this role's Environment line).
- Used Spring Batch for log message processing.
- Used LDAP authentication and Spring Security (Spring LDAP Template).
Environment: Java, Web Services, JBoss EAP 6.x, Spring Core, Spring MVC, Spring REST, jBPM, BPMN 2.0, Spring Batch, Hibernate, Eclipse, SVN, SOAP/REST, AngularJS, SASS, CSS, HTML5, SAML, LDAP, SPNEGO, Spring Security, Kerberos, Compass, Bootstrap, Yeoman, JavaScript, Visio, Scrum.
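The middleware bullets mention Spring's RestTemplate; below is a minimal client-side sketch, written in Scala for consistency with the other examples (the project itself was Java), against a hypothetical endpoint:

```scala
// Hedged sketch: calling a REST service with Spring's RestTemplate.
import org.springframework.web.client.RestTemplate

object RestClientSketch {
  def main(args: Array[String]): Unit = {
    val rest = new RestTemplate()
    // GET a resource, expanding the {id} URI variable, and bind the body to a String.
    val order = rest.getForObject("http://localhost:8080/api/orders/{id}", classOf[String], "42")
    println(order)
  }
}
```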
Confidential
Java Lead
Responsibilities:
- Prepared understanding documents (use cases, DSTDs) and functionality specs.
- Reviewed the specifications with the product team and the management teams.
- Complete end-to-end delivery of the project; designed and delivered the Hibernate feature for Confidential.
- Adopted a sprint-based development cycle.
- Used Spring for the MVC framework.
- Used Hibernate for the feature development and modeling.
- Speaker at the Rational Innovate client-focused conference.
- Reviewed beta versions with the customer's technical piloting teams.
Environment: Java, XML, Hibernate, Spring, Eclipse, BPM, WebSphere
Confidential
Rational Software Architect
Responsibilities:
- Owned the BPMN import/export component of BPM, successfully taking over its transition from the Canada labs.
- Successfully delivered features for customers on the Compare and Merge component.
- Added new JAXB and JAX-RS feature support.
- Was instrumental in developing the feature customization of Confidential for the client Curam Software, using Java 1.5 and an MVC architecture; provided feature enhancements for transforming the Curam models (modeling artifacts) with respect to the Curam platform.
- Used Spring for the MVC framework to deliver a visualization component to the customer.
- Developed a new approach that enables Rose models to be imported with virtually no size limit (previously limited to 300 MB).
- Used the YourKit Java profiler for memory analysis and performance improvement.
- Created configuration files for using Hibernate.
- Developed the component handling Rose model migrations.
- Developed and productized the telecommunications extension API (SIP support) for 754; this API provides support for modeling domain-specific elements and artifacts.
Environment: Eclipse, Java 1.5, plug-in development, Rational Software Architect (Confidential), CVS, RTC, JUnit, Web Services, SOA, REST, JAX-RS, Hibernate, Confidential BPM tool.
Confidential
Sr. Java Developer
Responsibilities:
- Designed the application from the business requirements; developed and delivered various enhancements for the product.
- Developed and led various module enhancements for NGMP, Broadband, and Wireless services.
- Used jBPM for executing the various workflow processes.
- Utilized the Struts 2 framework, Java, Velocity, and servlets for the web app, running on the WebLogic application server.
- Implemented a new feature that allows administrators to configure role-based access restrictions for users; the XML and its schema are used to store and configure role-based access, which allows easy maintenance and enhancement with no code changes.
- The Hibernate ORM framework is used for data maintenance.
- JavaScript and AJAX are used for request validation and response retrieval from the server.
- jBPM is used for the process definition of the workflow.
- WebSphere ESB is used for messaging.
Environment: Eclipse, Java, J2EE, Spring, Struts 2, Hibernate, Eclipse plug-in development, JavaScript/AJAX, Servlets, Velocity/HTML, jBPM, WebSphere ESB.
Confidential
Java Developer
Responsibilities:
- Prepared the design from the System Requirements Specification; used Java 2, Hibernate, XML, JDBC, Oracle, an MVC framework, OOAD and design patterns, and Rational Rose for modeling artifacts.
- Implemented the Lab Order interface; the scope included client-side admin changes and a Message Builder that takes a generated order, retrieves order- and patient-specific information, builds an XML object, and transmits it to the external interface/product. Technologies used: Java, XML Schema.
- The scope also included transmitting HL7 messages specific to each lab vendor using XML, Schema, and the JBoss server, and receiving a standard response that is readily matched to the originating order and updated automatically within the patient chart. Java 1.5, Struts 2, Hibernate, and JavaScript were used; WebSphere ESB was used for communication of lab orders.
- Used Crystal Reports for report generation.
Environment: Java, Hibernate, XML, JBoss, JDBC, Oracle, WebSphere ESB, MVC framework, OOAD, and Rational Rose
Confidential
Java Developer
Responsibilities:
- Development and feature support; responsible for resolving technical and workflow issues.
- Led the development of enhancements for different modules using Java 1.5, JavaBeans, JDBC, XML, Oracle DB, Hibernate ORM, design patterns, UML, and Rational Rose.
- The Hibernate ORM framework is used for data maintenance.
- JavaScript is used for request validation.
- Crystal Reports is used for report generation.
Environment: Java, JavaBeans, JavaScript, JDBC, XML, Oracle DB, Hibernate, Crystal Reports, UML, and Rational Rose
Confidential
Java Developer
Responsibilities:
- Defined requirements and was part of the development team for the Questra Insite2 project.
- Developed the module that executes the database health-monitoring scripts, logs the results, and uploads the files to the Insite2 server, using Oracle, JDBC, and Java.
- Used Eclipse as the development environment.
Environment: Eclipse, Java, Oracle, MS SQL Server.