Big Data Analytics Architect Resume

PROFILE SUMMARY:

13 years of experience in IT consulting services as an architect, lead, and developer. Currently working as a Big Data Technical Architect at Confidential for an energy client.

SKILLS:

  • Big Data
  • Big Data & Analytics Technical Design & Architecture
  • Apache Spark
  • IBM Rational Team Concert
  • Relational Database Management Systems (RDBMSs)
  • Hadoop
  • Big Data Analytics
  • Hive
  • Apache Kafka
  • Flume
  • MySQL
  • Kubernetes
  • Hadoop Hortonworks
  • Hadoop Cloudera
  • Scala Programming Language
  • NoSQL HBase
  • Machine Learning
  • Spring AOP (Aspect Oriented Programming)
  • Spring Framework
  • Apache Maven
  • Batch Architectures
  • Batch Applications for the Java Platform
  • GitHub
  • Spring Batch Framework
  • R Studio
  • AWS Redshift
  • Oozie
  • Core Java EE Design Patterns
  • Eclipse Integrated Development Environment
  • IBM Rational Application Developer
  • JEE Architecture
  • JavaServer Pages (JSP)
  • Java Servlet
  • Asynchronous JavaScript and XML (AJAX)
  • Atlassian JIRA
  • Hibernate ORM Java Framework
  • JavaServer Faces (JSF)
  • Spring Web MVC (Model-View-Controller)
  • Java API for XML Web Services (JAX-WS)
  • Apache Struts
  • Spring Data
  • UNIX Shell Scripting
  • Hadoop MapR
  • Sqoop
  • jQuery
  • NoSQL Couchbase
  • Spring Boot Microservices

TECHNOLOGIES WORKED ON:

Big Data Skills: Hadoop, HDFS, MapReduce, Spark, Cloudera, Hortonworks

Stream Processing: Kafka, Flume, Storm, Spark Streaming, Apache NiFi, RabbitMQ, Sqoop

Databases: Hive, AWS Redshift, Oracle, DB2, MySQL, Phoenix

NoSQL DBs: Couchbase, HBase, Redis, Neo4j, Cassandra, DynamoDB

Cloud Storage: AWS S3, Azure Blob

Language Skills: Java, Scala, shell scripting, SQL

Java Frameworks: Spring Boot, microservices, RESTful web services, Hibernate/JPA, Spring, JSF, Struts, batch executor framework, JSP/Servlet

Data Science: Spark ML, Apache Zeppelin Notebook

Containers: Docker, Kubernetes

OS: Unix, Ubuntu, Windows

Azure: Azure Kubernetes Service, Azure Redis Cache, Azure Blob.

AWS: Redshift, S3, EC2, DynamoDB

Business Domains: Energy, Insurance, Pharmacy Retail, Health and Public Safety, Criminal Justice

JOB EXPERIENCE:

Confidential

Big Data Analytics Architect

Responsibilities:

  • Architected the data pipeline for ingesting volume/pressure parameters into the system (Azure Blob) using Spark batch transformations on Kubernetes pods (containerized with Docker) in a Spark standalone cluster on Azure.
  • Designed the Spark application framework for generating analytics events.
  • Designed containerized (Docker) Spring Boot microservice APIs on pods orchestrated with Kubernetes.
  • Built RabbitMQ publishers and subscribers to sync data between ARIS (analytics app) and CIW (mobile app).
  • Served as Spark SME for the architecture, design, and performance of the Spark standalone cluster on Kubernetes pods.
  • Integrated SparkR/Python machine learning programs into ARIS.
  • Designed the Redis cache for performance improvements in the web application.
  • Performed data modeling for the Couchbase NoSQL DB and MySQL DB backing the data visualization reporting tool.
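As an illustration of running a Spark standalone cluster on Kubernetes pods as described above, a minimal Deployment sketch might look like the following; the image name, replica count, and master URL are assumptions, not the actual configuration:

```yaml
# Illustrative sketch only: Spark standalone workers in Docker containers
# managed as Kubernetes pods. Image and service names are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: spark-worker
spec:
  replicas: 3
  selector:
    matchLabels:
      app: spark-worker
  template:
    metadata:
      labels:
        app: spark-worker
    spec:
      containers:
        - name: spark-worker
          image: myregistry/spark:2.4.0   # hypothetical image
          args: ["org.apache.spark.deploy.worker.Worker", "spark://spark-master:7077"]
          resources:
            requests:
              cpu: "1"
              memory: 2Gi
```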

Confidential

Big Data Architect

Responsibilities:

  • Defined the technical architecture and configuration for a data pipeline that continuously streams data from sources such as IBM MQ, logs, and RDBMSs into Hive using the Flafka design (Flume + Kafka).
  • Developed the data model for the Hive data lake.
  • Configured security using Kerberos for HDFS, Hive, and Spark, and SSL for Kafka network streaming.
  • Developed Spark Structured Streaming parsing and transformations for inflowing data.
  • Developed custom components for Flume (interceptors) and Kafka (partitioners, serializers).
  • Developed ETL Spark batch code to denormalize data for consumption by the presentation layer.
  • Performed performance testing and scaling of the data pipeline.
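A data-lake table of the kind modeled above is typically defined as a partitioned external Hive table; the sketch below is illustrative only, with table, column, and path names assumed rather than taken from the actual project:

```sql
-- Illustrative only: partitioned external Hive table for a raw landing zone.
CREATE EXTERNAL TABLE IF NOT EXISTS raw_events (
  event_id  STRING,
  payload   STRING,
  event_ts  TIMESTAMP
)
PARTITIONED BY (load_date STRING)
STORED AS ORC
LOCATION '/data/lake/raw_events';

-- Incremental load: register one partition per daily file drop.
ALTER TABLE raw_events ADD IF NOT EXISTS PARTITION (load_date = '2018-01-01');
```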

Confidential, Chicago

Big Data Architect

Responsibilities:

  • Architected data streams using Apache NiFi, Kafka, Spark Streaming, and Spark batch to stream and transform data directly into Phoenix tables and the AWS Redshift data warehouse using a lambda architecture.
  • Integrated Kafka (streaming data), Spark Streaming (processing), and HBase/Phoenix (storage) for AIP (Confidential Insights Platform).
  • Configured security using Kerberos for HDFS, Hive, and Spark, and SSL for Kafka network streaming.
  • Developed the data model for Phoenix/HBase tables across three layers: staging, intermediate, and denormalized.
  • Developed the data model for the AWS Redshift data warehouse.
  • Developed ETL Spark batch transformations from raw data to final denormalized data in Phoenix tables.
  • Developed machine learning algorithms for win retention probability and loss probability.
  • Transferred data from Amazon S3 to Amazon Redshift.

Confidential

Big Data Support Analyst

Responsibilities:

  • Developed a component (Spark/Hive scripts) that generates automatic load and insert statements to load files into HDFS (Hive data warehouse) through Apache Oozie.
  • Finalized the data model; created Hive/Phoenix DDLs, Hive/HBase databases in HDFS, and Hive/HBase tables. This included understanding client needs and file patterns and deciding on the Hive partitioning scheme and load type (full refresh or incremental).
  • Developed scripts to expose Hive tables as HBase tables by storing them via the HBase storage handler in Hive.
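The Hive-to-HBase conversion mentioned above uses Hive's HBase storage handler; a minimal sketch follows, with table and column names invented for illustration:

```sql
-- Illustrative sketch: a Hive table backed by HBase storage.
CREATE TABLE hbase_orders (
  order_id  STRING,
  customer  STRING,
  amount    DOUBLE
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,info:customer,info:amount"
)
TBLPROPERTIES ("hbase.table.name" = "orders");

-- Populate the HBase-backed table from an existing Hive table:
INSERT OVERWRITE TABLE hbase_orders
SELECT order_id, customer, amount FROM orders;
```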

Confidential

Project Lead/Manager

Responsibilities:

  • Managed the interfaces and batches team for IOWA Confidential maintenance.
  • Solved critical production defects (Java/J2EE application).
  • Improved productivity by developing reusable functionality and automation.
  • Enhanced the application based on user requirements.
  • Trained the team technically and functionally on the technologies used in the application (Spring/Hibernate).
  • Developed new modules for ABMS and the interfaces.
  • Optimized batch performance.

Confidential

Team Lead

Responsibilities:

  • Analyzed, designed, and coded Java programs according to the specified requirements.
  • Worked with application architects, business process architects, specialists, and other system specialists to gather and interpret user/system requirements into design specifications.
  • Used the ABMS/ACSSP frameworks for coding.
  • Improved the performance of Java batches by implementing multithreading with the executor framework.
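The executor-framework pattern referenced above can be sketched as follows; the record type and per-record work are stand-ins for the actual batch logic, not the real implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hedged sketch: split a batch of records across a fixed thread pool
// instead of processing them in a single-threaded loop.
public class ParallelBatch {
    public static List<String> run(List<String> records, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String r : records) {
                // Stand-in for the real per-record work.
                futures.add(pool.submit(() -> r.toUpperCase()));
            }
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // preserves submission order
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

Collecting the `Future`s in submission order keeps results deterministic even though the work itself runs concurrently.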

Confidential

Project Lead

Responsibilities:

  • Designed the overlap analysis application and coded it using the Java executor framework for batches.
  • Managed the team through daily stand-up calls and tracking.
  • Supported the master data management team (IBM Initiate).

Confidential

Support Analyst/Onsite-Offshore Module Lead

Responsibilities:

  • Led the offshore team, communicating client expectations for the applications.
  • Supported the applications technically by resolving L3 technical tickets.
  • Worked as a Java developer for a module in CMS, using Spring and Confidential SDF.
  • Performed deployment and unit testing of the application.
  • Reported weekly to the client and management.

Confidential

Java/J2EE Application Developer

Responsibilities:

  • Designed and coded the web application using Struts, Hibernate, and Oracle DB.
  • Trained new hires on the technologies used in the system.
  • Improved the performance of existing batches in the Tax Mantra application.
  • Performed performance testing and improvement of the application.

Confidential

Java/J2EE Application Developer

Responsibilities:

  • During the design phase, was responsible for creating the functional and technical design documents (SRS/HLD).
  • During development, designed and built two web application modules using Spring, Hibernate, and JSF as the J2EE frameworks, with DB2 as the database.
  • Worked as an ETL developer using Pentaho.
  • Trained the client on the application.
  • Performed maintenance and enhancements after the application's go-live.
