
Solution Architect Resume


Stamford, CT

SUMMARY

  • Over 20 years' experience as a Solution Architect, DW Solution Architect, and Technical Architect/Project Manager for major US banks, healthcare companies, the Arizona State Department of Health, and the property and casualty insurance and pharmaceutical industries.
  • Outstanding technical knowledge and expertise in enterprise data warehousing, migration projects, and business intelligence concepts, design principles, and software architecture.
  • Analytical, highly adaptable professional with extensive experience developing, deploying and evaluating systems aimed at improving quality and efficiency.
  • Skilled in aligning end-user needs with long-term resolutions to complex IT challenges, with a track record of success.
  • Skilled troubleshooter continually focused on identifying, isolating and resolving technical issues.
  • Strong knowledge of and comfort with cloud environments (AWS, Cloud Foundry, OpenShift, etc.), including services such as S3 and EC2.
  • Strong knowledge and comfort within Microsoft-based server environments, along with all peripheral processes.
  • Strong knowledge and comfort within Linux/UNIX-based server environments, along with all peripheral processes, both on-premises and in the cloud.
  • Proficient in Performance Optimization of reports and OLAP Cubes.
  • Working experience with various databases such as MS SQL Server, Oracle, Teradata, and MySQL.
  • Working experience with various NoSQL databases such as HBase, Couchbase, MongoDB, CouchDB, and MapR-DB.
  • Working experience with Hive and Redshift databases in data warehouse environments.
  • Performed data modeling on MDM concepts, consumer product groups, and social media and TV ratings using snowflake and star schemas; converted models from Redshift to a Snowflake database environment.
  • Working experience with various data warehouse constructs such as Redshift, data marts, operational data stores, and DSS support systems.
  • Accomplished communicator skilled in building and strengthening relationships across functions to drive cohesive, strategic operations.
  • Liaised between internal and external team members and customers.
  • Developed timing and mitigation plans to meet financial, scheduling, and performance requirements.
  • Created and maintained clear, concise project plans and developed strategies to meet deadlines.
  • Worked on data model rationalization, standardization, and consolidation processes with the IT leadership team during the IT optimization journey and EA roadmap exercise.
  • Acquired rich transnational experience providing strategic IT solutions and services to global customers, leveraging technology to create revenue opportunities.
  • Worked on identifying, evaluating, defining, and building KPIs (Key Performance Indicators) for BI reporting/dashboards.
  • Established credibility in managing end-to-end large business-critical projects in global delivery model to global customers with distributed multicultural, multi-vendor teams from across the globe.
  • Extensive experience with physical, logical, dimensional, and snowflake data models (a runnable star-schema sketch follows this list).
  • Good working knowledge of project management using Agile, Scaled Agile, and Waterfall methodologies.
  • Developed and architected Extract, Transform, and Load (ETL) processes using Informatica and Talend.
  • Excellent communication skills; delivery-oriented and a self-starter.
  • Specialized in BI reporting tools such as Kibana, Tableau, and Pentaho.
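
To make the dimensional-modeling experience above concrete, here is a minimal sketch of a star schema and the kind of KPI rollup a BI dashboard would run over it. All table, column, and value names are illustrative, and SQLite is used only so the sketch is self-contained and runnable anywhere.

```python
# Minimal star-schema sketch: one fact table keyed to dimension tables, plus a
# KPI rollup of the shape a BI dashboard would use. Names are illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_program (program_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_ratings (            -- grain: one rating per program per day
        program_id INTEGER REFERENCES dim_program(program_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        viewers    INTEGER
    );
    INSERT INTO dim_program VALUES (1, 'ShowA'), (2, 'ShowB');
    INSERT INTO dim_date    VALUES (1, '2019-01-07'), (2, '2019-01-08');
    INSERT INTO fact_ratings VALUES (1, 1, 2500000), (2, 2, 2100000);
""")

# KPI query: total viewers per program, a typical dashboard measure.
for name, viewers in db.execute("""
    SELECT p.name, SUM(f.viewers)
    FROM fact_ratings f JOIN dim_program p USING (program_id)
    GROUP BY p.name
"""):
    print(name, viewers)
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables.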

TECHNICAL SKILLS

Cloud: Google Cloud, AWS, Cloud Foundry private cloud, OpenShift private cloud, containers (Kubernetes, Docker on Linux, UNIX, Windows), Azure.

Front End: Angular (2.0/4.0), React JS, Node JS, Spring MVC, Vert.x

API: RESTful/REST, JAX-RS, JAX-WS, Vert.x

Migration Tools: Talend, Pentaho, IBM DataStage, SSIS

BI Tools: Kibana, Tableau, Pentaho

Search Tools: ELK and Solr

Project Status Tools: Jira, Rally

Security: LDAP, Kerberos, SSL, SSH

Microservices: API gateways such as MuleSoft and Apigee; SOA/J2EE platforms; Spring technologies (Spring Boot, Spring MVC); Hibernate; WebLogic, AquaLogic, JBoss, WebSphere

Modeling & Design: CRC, UML, Rational Rose, Together Control Center, Visio data model and design platforms

Hadoop Administration: Hortonworks (Ambari), Cloudera Manager (CM)

Scripting & Languages: Shell scripting (C, Korn, and Bash), Citrix Server, Python, Perl, PHP, AngularJS (1.6, 2.0), ReactJS, NodeJS

Databases: Oracle (8i/9i/10g/11g/12c), DB2/UDB (10/11), MS SQL Server (2016, 2018)

Build: Maven, Gradle, Ant (UNIX/Linux/Windows)

CI/CD: Jenkins (UNIX/Linux/Windows), Nexus repository

Version Control: Git/GitHub, SVN, CVS, Harvest (UNIX/Linux/Windows), Team Foundation Server

Streams-First Technology: Apache Kafka, Apache Kafka administration, Apache Flink, Splunk Stream, Kafka Streams, Spark Streaming.

Big Data: Hadoop technologies, Hive, Apache Spark, Sqoop, Oozie, HBase, Scala, Python, Java 1.8, RxJava

Relational Databases: Oracle, MySQL, SQL Server, Teradata, DB2, UDB

NoSQL DB: MapR-DB, Cassandra, Couchbase, MongoDB, Redis cache, MemSQL

Data Warehouse: star schema, snowflake schema, Redshift data warehouse (column-based, key-value, and row-based databases).

PROFESSIONAL EXPERIENCE

Confidential, Stamford CT

Solution Architect

Responsibilities:

  • Currently working as Senior Manager on creating a streaming service for Confidential programs, similar to Netflix, Hulu, and Amazon Prime Video, utilizing Kafka data streaming (a sketch of the ingest pattern follows this list). The programs include Confidential Smackdown, RAW, Total Divas, Total Bellas, etc. The technical build uses edge-node cloud computing on AWS, data architecture and modeling on Redshift and Snowflake, and data feeds from external sources via S3 buckets, Lambda functions, and EC2 environments.
  • Performed data modeling on MDM concepts, including consumer product groups and social media and TV ratings, using snowflake and star schemas. The model was converted from Redshift to a Snowflake database environment.
  • Created a search system over supervised data using keys and queries from Cassandra, retrieving results for UIs such as Node.js and Network 2.0 systems. Used Elasticsearch search patterns.
  • Applied machine learning algorithms to the data model; the learned patterns are tagged according to the most-viewed topics discussed in social media chats and online searches, with the data from these platforms retrieved from Kafka streams to build the model.
  • Created the ETL architecture for the external data feed: data is staged in Hive, loaded into Redshift, and later converted to Snowflake. ETL pipeline scripts and jobs were also created in the cloud.
  • Imported and exported large sets of data into HDFS and vice-versa on AWS.
  • Worked on Hive partitioning and Hadoop big data technologies on AWS.
  • Worked on AWS to set up integration between EMR Spark clusters and S3 storage.
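
A minimal sketch of the Kafka-to-S3 staging pattern described in this role, assuming the kafka-python and boto3 client libraries; the topic, bootstrap server, bucket, and key names are hypothetical placeholders, not production values.

```python
# Minimal sketch: consume viewership events from Kafka and stage them in S3
# for downstream Redshift/Snowflake loads. Topic, bucket, and key names are
# hypothetical; credentials come from the environment as usual with boto3.
import json
from datetime import datetime, timezone

import boto3                       # pip install boto3
from kafka import KafkaConsumer    # pip install kafka-python

consumer = KafkaConsumer(
    "viewership-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                      # flush in small batches
        key = f"staging/viewership/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
        s3.put_object(
            Bucket="example-media-lake",       # hypothetical bucket
            Key=key,
            Body="\n".join(json.dumps(event) for event in batch).encode("utf-8"),
        )
        batch.clear()
```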

Confidential, Farmington, CT

Solution Architect/Senior Manager

Responsibilities:

  • Worked as a Solution Architect/Senior Manager utilizing Kafka, Postgres, RabbitMQ, microservices architecture, Pivotal Cloud Foundry, and AWS; worked on technologies such as Apigee, RabbitMQ, and Splunk.
  • Built search over supervised data using keys and queries from Cassandra, retrieving results for UIs such as Node.js and PEGA claims systems. Used Lucene search patterns on Elasticsearch.
  • Used Talend tools to extract the current policy status from various policy admin systems and update the Cassandra database.
  • Each API writes to Postgres saga tables as the failure-tracking mechanism: for each event, the API records when that event was not posted to downstream systems (e.g., RabbitMQ queues), supporting the failover mechanism on the PCF cloud (a sketch of this pattern follows this list).
  • APIs and proxies are deployed as platform-as-a-service using Jenkins pipeline scripts; deployed to various environments, set up GitHub integration, and deployed to a separate space on the PCF cloud, with logging handled via Splunk scripts.
  • Implemented OAuth security, with token verification and generation handled in an additional security layer in Spring Boot using JWT, and encryption applied at the transport layer.
  • Processed posted payments to SAP using a Redis cache, integrating the premium billing cycle and payments; also handled disbursement of payments on an insured's death to multiple beneficiaries through SAP, including retrieval of disbursed check amounts, on the PCF cloud.
  • Involved in Data Modeling sessions to develop models for Hive tables on PCF.
  • Converted agency management and policy customer data for various projects to MDM, hierarchy, and snowflake models using Snowflake tools in a cloud environment.
  • Imported and exported large sets of data into HDFS and vice-versa on AWS.
  • Worked on Hive partitioning and Hadoop big data technologies on AWS.
  • Worked on AWS to set up integration between EMR Spark clusters and S3 storage.
  • Worked on AWS compute (EC2, resizing services during high traffic) and S3 storage for document retrieval, such as email communication attachments, for PEGA cloud (AWS) systems.
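
A minimal sketch of the saga failure-tracking mechanism described above, written in Python with psycopg2 and pika rather than the Spring Boot stack this role actually used; the table name, queue name, and connection strings are hypothetical.

```python
# Minimal sketch: record each saga event in Postgres, then mark whether it was
# successfully posted to RabbitMQ, so failed events can be found and retried.
# Table, queue, and DSN values are hypothetical.
import json

import pika      # pip install pika
import psycopg2  # pip install psycopg2-binary

def post_event_with_saga_log(event: dict) -> None:
    """Record the event, try to publish it, and mark success only if it posted."""
    conn = psycopg2.connect("dbname=claims user=app")   # hypothetical DSN
    try:
        with conn, conn.cursor() as cur:
            # Insert with posted=false first; a retry sweep can find failures.
            cur.execute(
                "INSERT INTO saga_events (payload, posted) "
                "VALUES (%s, false) RETURNING id",
                (json.dumps(event),),
            )
            event_id = cur.fetchone()[0]
        try:
            mq = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
            mq.channel().basic_publish(exchange="",
                                       routing_key="policy-events",  # hypothetical queue
                                       body=json.dumps(event))
            mq.close()
        except pika.exceptions.AMQPError:
            return  # row stays posted=false, so the failover sweep retries it
        with conn, conn.cursor() as cur:
            cur.execute("UPDATE saga_events SET posted = true WHERE id = %s",
                        (event_id,))
    finally:
        conn.close()
```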

Confidential, Phoenix, AZ

Technical Architect Project Manager and Consultant

Responsibilities:

  • Worked as a Solution Architect/Senior Developer utilizing big data, microservices architecture, and AWS; worked on technologies such as Eureka, Zuul, and Hystrix.
  • Migrated to proxies for a service-mesh architecture, defining the servers behind proxies for the microservices.
  • Extracted data from the cloud with Informatica and distributed it using data virtualization.
  • Recommended the horizontal/vertical node scaling needed for capacity planning of existing processes such as Informatica Cloud, Informatica Master Data Management, Talend, and Hadoop in distributed/parallel processing environments.
  • Integrated all data sources posting to the general ledger, balancing ledger, remittance, deferred ledger, etc.
  • Processed posted payments using a Redis cache and integrated the billing cycle with payments.
  • Involved in Data Modeling sessions to develop models for Hive tables.
  • Imported and exported large sets of data into HDFS and vice-versa.
  • Worked on Hive partitioning and Hadoop big data technologies.
  • Worked on AWS to set up integration between EMR Spark clusters and S3 storage (a PySpark sketch follows this list).
  • Worked on AWS compute (EC2, resizing services during high traffic) and S3 storage.
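
A minimal PySpark sketch of the EMR/S3-to-Hive loading and partitioning work described above; the bucket path, database, table, and partition column are hypothetical.

```python
# Minimal PySpark sketch: read staged data from S3 on EMR and write it into a
# partitioned Hive table. Bucket, database, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("s3-to-hive-partitions")
    .enableHiveSupport()          # requires a Hive metastore, as on EMR
    .getOrCreate()
)

# Read raw events staged in S3 (EMR exposes S3 through the s3:// scheme).
events = spark.read.json("s3://example-data-lake/staging/events/")

# Write into Hive, partitioned by event date so queries can prune partitions.
(
    events.write
    .mode("append")
    .partitionBy("event_date")    # hypothetical partition column
    .saveAsTable("analytics.events")
)
```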

Confidential - Englewood, CO

Technical Project Manager and Consultant

Responsibilities:

  • Worked closely with customers and the executive team to assess requirements and functionality for the enterprise applications.
  • Analyzed functional requirements and created JIRA stories.
  • Prepared detailed design documents for multiple APIs and uploaded them to Confluence (documentation).
  • Developed the next-generation user interface for Confidential's front-end applications (Smartpoint, GVM, Galileo Web Services) for consumption by online travel agents, adhering to OD standards.
  • Customized a microservices strategy allowing greater flexibility and scalability for Confidential's systems by designing multiple models for the microservices.
  • Recommended the horizontal/vertical node scaling needed for capacity planning of existing processes such as DataStage, Talend, and Hadoop in distributed/parallel processing environments.
  • Coordinated with and mentored the onshore/offshore development teams on their clarifications.
  • Managed application deployment in Dev/Test/QA/Production environments.
  • Ensured on-time status reporting to managers/Scrum Masters; served as Technical Architect/Developer on proofs of concept.
  • Implemented Drools business rules for processes and procedures.
  • Code version control using Git and GitHub.
  • Built reports using high-performance Impala queries to extract data from Hadoop (see the Impala sketch after this list).
  • Worked on OpenShift to set up integration between Spark clusters and OpenShift storage.
  • Worked on Azure web service compute (resizing services during high traffic) and internal storage.
  • Worked on RxJava map-reduce and performed performance tuning for data processing and loading into Hadoop.
  • Wrote Python scripts for processing data and reporting on dashboards.
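
A minimal sketch of pulling dashboard data from Hadoop with an Impala query, assuming the impyla and pandas libraries; the host, table, and column names are hypothetical.

```python
# Minimal sketch: run an Impala query against Hadoop and pull the result into
# a dataframe for dashboard reporting. Host, table, and columns hypothetical.
import pandas as pd                 # pip install pandas
from impala.dbapi import connect    # pip install impyla

conn = connect(host="impala-coordinator.example.com", port=21050)
cursor = conn.cursor()
cursor.execute(
    """
    SELECT booking_date, COUNT(*) AS bookings
    FROM travel.bookings            -- hypothetical Hadoop-backed table
    WHERE booking_date >= '2019-01-01'
    GROUP BY booking_date
    ORDER BY booking_date
    """
)
rows = cursor.fetchall()
columns = [desc[0] for desc in cursor.description]
report = pd.DataFrame(rows, columns=columns)  # feed this to a dashboard
print(report.head())
```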
