
Technical Architect Resume


OBJECTIVE

  • Accomplished software engineering professional with 16 years of experience in Java, J2EE, Spark Core, Spark SQL, Hadoop, and microservices, using Spring Cloud, Spring Boot, Spring MVC, and Semantic Web technologies to design and develop software applications. Work experience in the Public Sector, Supply Chain, Retail, Insurance, Energy, and Media domains. Willing to take on demanding projects and always striving to implement big data projects using emerging AI/ML technologies in combination with linked data structures such as graphs.

SUMMARY

  • Experience in systems architecture and design using architecture frameworks and design patterns, multi-tier architecture, and distributed software development.
  • Hands-on experience with major components of the Hadoop ecosystem, including HDFS, YARN, and Spark.
  • Strong knowledge of Scala, Hive, Spark Streaming, and Apache Kafka.
  • Strong grounding in object-oriented concepts and functional programming; extensive experience conceptualizing, designing, and implementing MVC and microservices architectures across various projects using J2EE and Spring Cloud frameworks.
  • Good experience designing and architecting big data batch processing and transformations.
  • Experienced in Service-Oriented Architecture (SOA) and publishing web services, including components such as WSDL, SOAP, REST, Apache Axis, and JAX-WS.
  • Developed microservices using Spring Boot, RESTful APIs, and Spring Cloud.
  • Based on the size and complexity of the project, plays the role of an architect and developer, implementing and reviewing application code to meet expected quality standards.
  • Worked with major clients such as the Government of Canada, Coca-Cola (Atlanta, USA), the State of Arkansas, the State of South Carolina, the State of Confidential, New York Life Insurance (USA), PUMA (Germany), GE Energy (Atlanta, USA), Progressive Energy (USA), and Schneider Electric.
  • Played a vital role in projects by rigorously addressing architectural gaps and resolving issues, blockers, and dependencies within and outside the team.
  • Hands-on experience with Test-Driven Development and Model-Driven Development practices.

TECHNICAL SKILLS

Programming Languages: JAVA / J2EE

Frameworks: Spring Boot, Spring MVC, Spring Cloud, Microservices, Struts 1.x/2.x, Spring Data, JPA, JMS, Spring REST, Jersey, Apache Axis, Apache Camel, Servlets, EJB, AWT, Swing, JDBC

Internet Technologies: JSP, HTML, XML, XSLT, JavaScript, jQuery, JSON, Angular, Fetch, Axios, AJAX, CSS, React JS, Angular JS, REST, SOAP, Node JS, npm

Big Data / Hadoop: HDFS, YARN, Spark

RDBMS: Oracle, MySQL, DB2, SQL Server, SQL, PL/SQL

ORM: Hibernate, Spring Data, iBatis

NoSQL / Graph Databases / Semantic Technologies: Fuseki, Jena, Virtuoso, SPARQL, MongoDB, Cassandra

IDE: Eclipse, RAD (Rational Application Developer), IntelliJ, STS, VS Code

Application Servers: WebLogic, IBM WebSphere, Tomcat, JBoss, Portal Server

Messaging: IBM WebSphere MQ, ActiveMQ

Cloud: Microsoft Azure, AWS

Operating Systems: UNIX, Windows, Linux

Configuration Management and Tools: CVS, SVN, Git, Jenkins, JIRA, uDeploy, SonarQube, Maven, CI/CD, SOAP UI, Postman

Design Tools: Visio, RSA, Enterprise Architect

PROFESSIONAL EXPERIENCE

Confidential

Technical Architect

Responsibilities:

  • Preparing technical designs for the functional requirements of the programs.
  • Multi-Benefit Payment Processing: Developed ETL jobs using Azure services and the Hadoop platform with Spark to compute targeted benefit payments for eligible citizens, processing millions of structured records and sending payment advice to banks so payments are deposited into citizens' bank accounts.
  • Involved in developing REST API endpoints for the functional requirements.
  • Involved in designing and writing batch programs that use the Java 8 API to process work in parallel threads across multi-core CPUs, making batches run faster.
  • Involved in designing and writing big data batch processing using Hadoop, HDFS, and Spark to process citizens' benefits for payment to the banks.
  • Loaded and transformed large sets of structured data.
  • Involved in creating Azure Data Factory (ADF) pipelines and monitoring the batch job process.
  • Developed Azure Functions triggered by REST calls to execute additional processing.
  • Used the Parquet format when processing data to reduce cloud operational costs.
  • Configured and created batch process automation using the SOS Berlin job scheduler, writing file transformations, generating output files for processing, and creating data pipelines for the financial batch jobs.
  • Involved in developing the front-end application using React JS, Redux, and Bootstrap.
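The Java 8 multi-core batch pattern mentioned above can be sketched with a parallel stream; the record shape, `computeBenefit` logic, and class names here are hypothetical stand-ins, not the actual project code:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Minimal sketch, assuming a per-citizen benefit calculation: a parallel
// stream fans the per-record work out across the available CPU cores.
public class PaymentBatchSketch {

    // Placeholder for the real eligibility/payment calculation.
    static double computeBenefit(int citizenId) {
        return 100.0 + (citizenId % 10);
    }

    // parallelStream() lets the common fork-join pool split the batch
    // across cores; collect() reassembles results in encounter order.
    static List<Double> processBatch(List<Integer> citizenIds) {
        return citizenIds.parallelStream()
                .map(PaymentBatchSketch::computeBenefit)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Integer> ids = IntStream.rangeClosed(1, 1000)
                .boxed()
                .collect(Collectors.toList());
        List<Double> payments = processBatch(ids);
        System.out.println(payments.size());
    }
}
```

The same shape applies whether the per-record work is a calculation or a transformation step feeding the downstream Spark jobs.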

Environment: Spark Core, Spark SQL, Java 8, React JS, Redux, Node JS, Spring Boot, Microservices, SOS Berlin Job Scheduler, Azure Data Factory (ADF), Azure HDInsight, Blob Storage, Pipelines, Git.

Confidential

Technical Architect

Responsibilities:

  • Preparing functional requirements to design and deliver big data processing for benefits.
  • Involved in designing and developing a big data solution to process recertification notices for beneficiaries, taking previous notice categories into account.
  • Used the Parquet format when processing data to reduce cloud operational costs.
  • Involved in designing and writing big data batch processing using Hadoop, HDFS, and Spark to process citizens' benefits for payment to the banks.
  • Involved in creating AWS EMR clusters and S3 buckets.
  • Involved in developing a REST API for the notice framework that inserts data into notice tables to drive notice generation.
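The category-aware notice selection described above can be sketched as a stream grouping step; the beneficiary IDs and category names below are illustrative, not taken from the actual system:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal sketch, assuming each beneficiary maps to the category of their
// previous notice: group beneficiaries by that category so the batch can
// choose the next recertification notice per group.
public class NoticeGroupingSketch {

    static Map<String, List<String>> groupByPreviousCategory(Map<String, String> previousNotices) {
        // Key: previous notice category; value: beneficiary IDs in that category.
        return previousNotices.entrySet().stream()
                .collect(Collectors.groupingBy(
                        Map.Entry::getValue,
                        Collectors.mapping(Map.Entry::getKey, Collectors.toList())));
    }

    public static void main(String[] args) {
        Map<String, String> prev = new HashMap<>();
        prev.put("B-001", "FIRST_NOTICE");
        prev.put("B-002", "REMINDER");
        prev.put("B-003", "FIRST_NOTICE");
        Map<String, List<String>> groups = groupByPreviousCategory(prev);
        System.out.println(groups.get("FIRST_NOTICE").size());
    }
}
```

In the Spark jobs this grouping would run at scale (e.g. via `groupByKey` or an aggregation over a DataFrame), but the per-category logic is the same.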

Environment: Spark Core, Spark SQL, Java 8, J2EE, Spring, Spring Boot, Spring Cloud, Microservices, AWS, S3, EMR, Apache Kafka, Tomcat, DB2.
