
Solution Architect Resume


PROFESSIONAL EXPERIENCE

Confidential

Solution Architect

Responsibilities:

  • Created initial workshop project plan & schedule to assess enterprise digital architecture assets.
  • Evaluated requirements definitions on-site: J2EE architecture requirements and data lake virtualization areas: platforms, tools, applications, security, and network architectures.
  • Assessed pipeline analytic programming components in the J2EE architecture (Lambda functions), microservices, etc.
  • Developed e-commerce solutions per process models (e-learning, apprenticeship, electrical) for DevOps modeling improvement criteria.
  • Assessed digital-asset source system landscapes in MySQL, Apache, AWS, courseware, MediaWeb, and Lambda functions; Avalara tax, keystroke, digital book, internal web, apprenticeship, and content assets.
  • Recommended process model delivery with Agile/Scrum facilitation on digital lake components and best-practice scaling options.
  • Assessed full-stack repository monitoring systems and virtual memory allocation: Java/JAR security, JVM virtual-instance connection monitoring between clusters, containers, and node clusters (appliance and cloud options, etc.), and REST API components (SOA/ESB programming design).
  • Developed initial architecture documentation: process design, release management, and process artifacts, leading toward future-state deliverables: data migration, pipeline analytics, and data modeling design.
  • Aligned programming techniques across the following J2EE ecosystem components: Java applets, Angular 6.1, and Python.
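
The Lambda-function pipeline components named above can be pictured with a minimal Python sketch. This is purely illustrative, not code from the engagement: the event fields (`sku`, `quantity`, `channel`) and the validation rule are assumptions chosen to mirror an e-commerce ingestion step.

```python
import json

def handler(event, context=None):
    """AWS-Lambda-style handler sketch: normalize an incoming
    e-commerce event and flag whether it is valid for the pipeline.
    Field names here are hypothetical."""
    record = {
        "sku": event.get("sku", "unknown"),
        "quantity": int(event.get("quantity", 0)),
        "channel": event.get("channel", "web"),
    }
    # A record is usable only if it identifies a product and a positive quantity.
    record["valid"] = record["sku"] != "unknown" and record["quantity"] > 0
    return {
        "statusCode": 200 if record["valid"] else 400,
        "body": json.dumps(record),
    }
```

A handler of this shape would typically sit behind an API Gateway or queue trigger, with downstream microservices consuming the normalized record.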

Confidential

Sr. Solutions Enterprise Architect

Responsibilities:

  • Created initial workshop project plan & schedule to assess enterprise architecture frameworks.
  • Evaluated requirements definitions via on-site demos of release configurations for infrastructure data lake virtualization areas: platforms, tools, applications, security, and network architectures.
  • Assessed the Workday solutioning and ticketing process for DevOps modeling improvement criteria.
  • Assessed full-stack source system landscapes in PeopleSoft, Oracle, SAS, Cloudera, Informatica, DataStage, and the Hadoop ecosystem for memory management and on-disk fragmentation across hardware appliances and common Hadoop cluster components: HDFS and MapReduce.
  • Assessed full-stack repository frame setup for virtual memory, security, and ETL data pipelines: JBoss, IDocs, Java, JAR, and JVM virtual-instance connection monitoring across cluster client configurations between portals, ESB hubs, and node setups of master, slave, and data nodes, per ports between hardware and data pipeline applications in Hadoop.
  • Evaluated big data pipeline analytic components and Hadoop ecosystem options (ingestion tools: HiveQL, NoSQL, data migration, scripting, mappers, reducers, cluster appliance and cloud options, etc.) and REST components (SOA/ESB programming design: clustering appliances, RAID frames, port transfer options, COSMOS copler packages, with Attunity microservices, etc.).
  • Developed initial architecture documentation: process design, release management, and process artifacts, leading toward future-state deliverables: data migration, pipeline analytics, and data modeling design (customer catalog taxonomy normalizations at various levels).
  • Developed GCP virtual platforms and data services per project cluster and node, with Exadata and SAS functions on a hybrid on-prem virtual network.
  • Recommended full-scope delivery with Agile/Scrum facilitation on data lake components and best-practice scaling options.
  • Recommended Hadoop options: MapR vs. HDFS for SAS scientific research and on-campus student enrollment attrition criteria models. Aligned Apache product programming techniques across the following Hadoop ecosystem components: Scala, Sqoop, Spark, HiveQL, and NoSQL in Cloudera BDM with Informatica (PowerCenter MDM core repository).
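
The mapper and reducer components cited above can be sketched in plain Python without a running Hadoop cluster. This is a teaching-sized illustration, not the engagement's code: the campus enrollment records are hypothetical, chosen to echo the student-enrollment attrition models mentioned.

```python
from collections import defaultdict

def mapper(record):
    # Map phase: emit one (campus, 1) pair per enrollment record.
    yield record["campus"], 1

def reducer(pairs):
    # Reduce phase: sum the counts for each campus key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Shuffle is simulated by simply collecting all emitted pairs.
records = [{"campus": "north"}, {"campus": "south"}, {"campus": "north"}]
pairs = [kv for r in records for kv in mapper(r)]
print(reducer(pairs))  # {'north': 2, 'south': 1}
```

In a real HDFS/MapReduce or Spark deployment the framework handles the shuffle and partitioning between the two phases; the list comprehension above stands in for that step.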
