
Cloud Data Integration Architect Resume


Plano, TX

SUMMARY:

  • Data Engineer/Cloud Engineer with 14+ years of experience in application analysis, infrastructure design, development, integration, deployment, and maintenance/support for AWS cloud computing, enterprise search technologies, artificial intelligence, microservices, and web and enterprise software applications.
  • Hands-on AWS Technical Architect - Associate with 5 years of experience developing and helping architect enterprise-level, large-scale multi-tier solutions that require complex architectural decisions.
  • Results-oriented approach with exceptional leadership skills.
  • Hands-on experience implementing cloud solutions using various AWS services including EC2, VPC, S3, Glacier, EFS, Kinesis, Lambda, Directory Service, CloudFormation, OpsWorks, CodePipeline, CodeBuild, CodeDeploy, Elastic Beanstalk, RDS, Data Pipeline, DynamoDB, Redshift, etc.
  • Hands-on experience architecting and securing infrastructure on AWS using IAM, KMS, Cognito, API Gateway, CloudTrail, CloudWatch, Config, Trusted Advisor, Security Groups, NACLs, etc.
  • Strong experience with major AWS services such as CloudFormation, CloudFront, CloudWatch, CloudTrail, VPC, RDS, DynamoDB, SQS, and SNS.
  • Experienced in designing and implementing public- and private-facing websites on the AWS Cloud.
  • Good knowledge of application migrations and data migrations from on-premises to the AWS Cloud.
  • Worked on infrastructure with Docker containerization.
  • Experienced in designing, implementing, and testing services in Python within a microservice-oriented architecture.
  • Experience with Amazon S3 for storage, SNS, CloudFront for content access and delivery (CDN), and VPC for network security access as per requirements.
  • Administered various environments across the software development life cycle (SDLC) on Windows, Ubuntu, Red Hat Linux, SUSE Linux, and CentOS.
  • Experienced in building microservices using API Gateway in AWS.
  • Good knowledge of RDBMS and of projects using Oracle, DynamoDB, MapR, SQL Server, SQL, and PL/SQL.
  • Good understanding of monitoring tools such as Splunk and Nagios.
  • Efficiently built, tested, deployed, and maintained any-to-any dataflow pipelines using StreamSets' drag-and-drop integrated development environment.
  • Designing and deploying scalable, highly available, secure, and fault-tolerant systems on AWS.
  • Lift-and-shift of existing on-premises applications to AWS.
  • Selecting the appropriate AWS service based on data, compute, database, and security requirements.
  • Identifying appropriate use of AWS architectural best practices.
  • Estimating AWS costs and identifying cost-control mechanisms.
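Cost estimates like those above usually start from a simple on-demand vs. reserved comparison over AWS's 730-hour month. A minimal sketch, with hypothetical hourly rates used purely for illustration (real figures come from the AWS Pricing API or published price sheets):

```python
# Hypothetical hourly rates for illustration only -- NOT actual AWS pricing.
ON_DEMAND_RATE = 0.0416   # assumed on-demand $/hour for a mid-size instance
RESERVED_RATE = 0.0262    # assumed effective $/hour under a 1-year reservation
HOURS_PER_MONTH = 730     # AWS's usual monthly-hours convention

def monthly_cost(rate_per_hour, instance_count):
    # Monthly cost for a fleet running 24/7 at the given hourly rate.
    return rate_per_hour * HOURS_PER_MONTH * instance_count

def reserved_savings(instance_count):
    # Dollars saved per month by moving the fleet to the reserved rate.
    return monthly_cost(ON_DEMAND_RATE, instance_count) - monthly_cost(RESERVED_RATE, instance_count)
```

For a 10-instance fleet at these assumed rates, the on-demand bill is about $303.68/month and reserving saves roughly $112.42/month; the same structure extends to Spot pricing or storage costs.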

SAP COMPETENCE:

  • Business Process Analysis
  • Architecture Design and Recommendations.
  • Prototyping
  • Process Design
  • Implementation and System Configuration
  • System Maintenance
  • Experience in installation of ECC 5.0, CE 7.1, PI 7.1, CRM 7.0, SRM 5.0, and BusinessObjects.
  • Strong at designing SAP system landscapes, DB layouts, and high-availability and disaster-recovery scenarios.
  • Implementation: completed 4 full life-cycle implementations for ERP 5.0, CRM 7.01, PI 7.1, and CE 7.1.
  • Support pack and enhancement pack installations for various SAP product lines.
  • Experience handling systems such as ECC 5.0, PI 7.1, CRM, SRM, SolMan, TDMS, EP 7.0, and APO. Worked towards increasing efficiency and productivity by devising automations and process improvements (e.g., monitoring, alert notifications, and MIS reports). Took the lead on Oracle technology and anchored performance optimization and Oracle baselining for the team. Java server troubleshooting and performance analysis. Implemented the Tivoli Logfile Adapter.
  • Homogeneous and heterogeneous system copies for ABAP and Java systems.
  • Carried out modifications/tuning at the SAP and database levels to overcome application query performance bottlenecks.
  • Designed and executed backup strategy and disaster recovery management. Implemented online split-mirror backup for all SAP production systems.
  • Controlled the growth of the BW database by giving the functional team visibility into the tables contributing to the growth. Played a pivotal role in archiving functional tables based on the recommendations in the SAP DVM guide, and implemented archiving strategies for Basis tables.
  • Ensured effective operational security for both SAP and database environments.
  • Direct interaction with clients in all the projects I have handled.
  • Sound knowledge of Oracle 10g and 11g and APO liveCache administration.
  • Handled the responsibilities of planning, tasking, scheduling, monitoring, assessing, evaluating, motivating, and mentoring colleagues. Fast learning curve and strong analytical, decision-making, problem-solving, visualization, negotiation, communication, and interpersonal skills.
  • Involved in planning and setting up DR servers for a far DR site. Planned and executed DR drills. Planned and coordinated UNIX patch upgrades. Carried out a POC for PR-DR switchover without Data Guard. Implemented Oracle Flashback Technology for DR drills.

TECHNICAL SKILLS:

SUSE Linux, Oracle 10.2.0.4, Oracle 11g, Windows 2003.

CAREER PROFILE:

Confidential, Plano, TX

Cloud Data Integration Architect

Responsibilities:

  • Installed Informatica PowerCenter 10.2 on an AWS EC2 instance.
  • Leveraged EFS as a NAS shared file system on Informatica servers, along with EBS.
  • Configured RDS for Informatica PowerCenter 10.2.
  • Used a SQL database as the Informatica PowerCenter metadata database on AWS servers.
  • Provided L2 and L3 support for Informatica applications.
  • Installed and configured Informatica PowerCenter 10.1 on Linux servers.
  • Upgraded Informatica PowerCenter from 9.5.1 to 10.1.
  • Responsible for PowerCenter, PMPC, and PowerExchange installations, configuring the PowerCenter domain, nodes, and grids, and creating the different services in the domain.
  • Implemented Informatica PowerCenter DR.
  • Used the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Migrated Informatica mappings/sessions/workflows from Dev and QA to Prod environments through deployment groups.
  • Configured OS profiles and LDAP authentication, and applied patches.
  • Created users, groups, and roles and granted privileges to users. Created folders and relational and application connections, and configured ODBC connectivity.
  • Worked extensively with offshore development teams.
  • Configured master and backup nodes in a high-availability environment.
  • Maintained and updated license compliance as necessary.
  • Documented ETL server installations and migrations, including client components.
  • Used Informatica dynamic deployment groups to deploy Informatica jobs among different environments (DEV, TEST, QA, and PROD).
  • Used Workflow Manager and Repository Manager for creating, validating, testing, and running batches and sessions and scheduling them to run at specified times.
  • Responsible for maintaining repository backups and their restoration.
  • Responsible for migrations from the development repository to the QA and Prod repositories.
  • Configured HPOV alerts for Informatica repository services.
  • Installed Informatica Data Quality 10.1 and its services (Content Management Service, MRS, DIS) on UNIX servers.
  • Handled INC calls and PRB ticket analysis.
  • Upgraded Informatica from 9.5 to 10.1, 10.1 to 10.2 HF1, and 10.1 to 10.2.
  • Worked with Informatica on migrating ICS to IICS, supporting application teams with any issues they faced after the migration to IICS.
  • Troubleshot production failures and provided root cause analysis.
  • Worked on emergency code fixes to production.
  • Worked with the IBM vendor and application team to move Informatica workflows from the Netezza to the Sailfish database on Informatica 10.1.
  • Worked with the DBA to improve Informatica session performance and query performance by collecting statistics and defining relevant indexes on target tables.
  • Worked on gathering the requirements to install Informatica in AWS.
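Repository backups and restorations like those above are typically scripted around Informatica's `pmrep` command-line tool. A minimal sketch that only builds the command lines; the `connect`/`backup` subcommands and flags reflect common `pmrep` usage but should be verified against your PowerCenter version's Command Reference, and the repository, domain, and user names are placeholders:

```python
def pmrep_connect_cmd(repo, domain, user):
    # Connect to the repository service; -X names an environment variable
    # holding the password, keeping it off the command line.
    return ["pmrep", "connect", "-r", repo, "-d", domain,
            "-n", user, "-X", "PMPASS"]

def pmrep_backup_cmd(output_file):
    # Write a repository backup to the given output file.
    return ["pmrep", "backup", "-o", output_file]
```

In a scheduled backup script these command lists would be passed to `subprocess.run` on a host with the Informatica client installed and run from cron.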

Environment: Informatica PowerCenter 9.5/10.1, PowerExchange 10.1, PMPC 9.6.1/9.7, Informatica Data Quality 10.1, Informatica Cloud (ICS & IICS), DVO, Hortonworks 2.2, SAP, Oracle 12, Toad, SQL Developer, SQL*Loader, Windows 7, Windows 10, UNIX/Linux, Netezza, DB2 9.7, SQL Server 2012, Autosys, Sailfish.

Confidential, New York, NY

AWS Architect

Responsibilities:

  • Designed, deployed, managed, and operated scalable, highly available, and fault-tolerant systems on AWS.
  • Responsible for configuring and securing infrastructure on the AWS Cloud.
  • Performed capacity planning and architecture design for the AWS infrastructure.
  • Configured continuous integration and deployment (CI/CD) of code onto the AWS Cloud.
  • Responsible for creating AWS IAM users, policies, groups, etc.
  • Provisioned AWS resources such as EC2, VPC, EBS, AMIs, and S3 buckets, created subnets, and handled all other operational tasks.
  • Designed heterogeneous hybrid cloud solutions integrating EC2 and S3 storage on the AWS Cloud with on-premises physical and VMware virtual servers, networking, and security.
  • Performed configuration, deployment, and support of cloud services including Amazon Web Services. Performed hardening of the AWS root account after requisition.
  • Used On-Demand, Spot, and Reserved Instances as appropriate to the scenario and task.
  • Deployed microservices, including provisioning AWS environments.
  • Provisioned load balancers and auto-scaling groups for microservices.
  • Installed search technologies such as a web crawler on AWS EC2.
  • Installed Apache Tomcat on EC2 instances, along with build tools such as Ant.
  • Installed and configured a MapR cluster on EC2 instances.
  • Installed MapR platform services such as MapR-XD, MapR-DB, and MapR Streams.
  • Installed StreamSets on MapR nodes to provide a pipeline for data streaming from AWS S3 to MapR-DB.
  • Configured Amazon Aurora PostgreSQL RDS and MSSQL RDS and assisted in data migration from PostgreSQL to MSSQL.
  • Installed and configured a graph DB for machine learning and performance tuning.
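A StreamSets pipeline such as the S3-to-MapR-DB stream above is built in the drag-and-drop UI rather than in code, but conceptually it is an origin → transform → destination chain. A plain-Python illustration of that staging, where the record shape and field names are hypothetical:

```python
def transform(record):
    # Transform stage: normalize a hypothetical raw record read from the
    # S3 origin before it is written to the destination table.
    return {
        "key": record["id"],
        "value": record.get("payload", "").strip().lower(),
    }

def run_pipeline(origin_records, destination):
    # Origin -> transform -> destination, mirroring a three-stage pipeline.
    for record in origin_records:
        destination.append(transform(record))
    return destination
```

In the real pipeline the origin and destination are StreamSets' Amazon S3 origin and MapR-DB destination stages; only the transform logic is custom per dataset.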

Confidential, St Paul, MN

AWS Architect

Responsibilities:

  • Migrated from on-premises infrastructure to the AWS Cloud.
  • Focused on high availability, fault tolerance, and auto scaling using AWS CloudFormation.
  • Configured and managed various AWS services including EC2, RDS, VPC, S3, Glacier, CloudWatch, CloudFront, Route 53, etc.
  • Configured various performance metrics using AWS CloudWatch and CloudTrail.
  • Worked on configuring cross-account deployments using AWS CodePipeline, CodeBuild, and CodeDeploy by creating cross-account policies and roles in IAM.
  • Configured security and RBAC models in AWS IAM to authenticate users and applications in the AWS environment.
  • Managed S3 buckets and access control via policies, versioning, lifecycle policies, and IAM permissions.
  • Developed and managed an AWS Elasticsearch cluster for real-time analysis of VPC flow logs via Kibana and Logstash.
  • Leveraged Route 53 for high availability and load balancing of internet-facing applications.
  • Deployed WordPress, an RDS instance, databases, and EC2 instances via CloudFormation.
  • Wrote various Lambda functions for automating functionality in the cloud.
  • Used AWS Route 53 to configure high availability and disaster recovery, keeping the environment up and running in case of any unexpected disaster.
  • Maintained user accounts (IAM) and the RDS, Route 53, VPC, RDB, DynamoDB, SES, SQS, and SNS services in the AWS Cloud.
  • Involved in setting up builds using Chef as a configuration management tool.
  • Deployed and configured Chef Server and Chef Solo, including bootstrapping Chef client nodes for provisioning; created roles, cookbooks, recipes, and data bags for server configuration, deployment, and app stack build-outs.
  • Managed and monitored AWS infrastructure and Linux and Windows components.
  • Deployed and managed many servers, utilizing both traditional and cloud-oriented providers (AWS), with the Chef configuration management platform.
  • Deployed application updates using Jenkins. Installed, configured, and managed Jenkins.
  • Experience in building applications in various architectural styles, such as a microservices architecture consisting of RESTful web services and Docker container-based deployments.
  • Deployed and configured Git repositories with branching, forks, tagging, and notifications.
  • Installed, configured, and upgraded to Informatica PowerCenter 9.5.1 from previous versions, applying EBFs and hotfixes.
  • Created object queries; deleted and purged objects.
  • Formulated deployment plans and weekly audit reports.
  • Created users, groups, and roles in the Admin console (Administrator) and granted privileges to the same.
  • Created static and dynamic deployment groups, labels, and queries.
  • Updated and added ODBC and TNS entries.
  • Created folders and assigned folder permissions to developers depending upon the environment.
  • Involved in migration of Informatica mappings/sessions/workflows from DEV and QA to PROD environments.
  • Created and maintained the relational and loader ODBC connections in the Development, QA, and Production environments.
  • Created backup, restore, and DR for Informatica repositories.
  • Handled ETL batch jobs across all environments (DEV, QA, and PROD).
  • Involved in performance tuning of the Informatica Server and Repository Server.
  • Monitored system capacity and performance; planned and executed disaster recovery procedures.
  • Recorded and maintained ETL software procedures.
  • Provided application support to ETL application development teams.
  • Coordinated with offshore and support teams regularly.
  • Ran UNIX shell scripts for repository backups and domain backups, and scheduled Informatica session runs.
  • Communicated infrastructure needs and direction to management.
  • Implemented procedures to maintain, monitor, back up, and recover the ETL environment.
  • Reported regularly on the health and performance of the ETL environment and jobs.
  • Provided 24/7 support for administration tasks (emergency deployments, FATAL errors on the Informatica Server, unexpected Informatica service failures, repository issues, etc.).
  • Worked on the ServiceNow tool to track all changes and incidents.
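Automation Lambdas like those mentioned above all follow the same handler contract: AWS invokes a named entry point with an event payload and a runtime context. A minimal sketch of a hypothetical function reacting to an S3 put notification (the event shape follows the standard S3 notification format; the logging-style automation itself is illustrative):

```python
import json

def lambda_handler(event, context):
    # Hypothetical automation: collect the S3 URI of each uploaded object
    # from the notification records and return them in the response body.
    processed = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

Deployed behind an S3 event notification or an API Gateway route, the same handler signature serves both triggers; only the event parsing changes.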

Confidential

ERP Consultant - Technical

Responsibilities:

  • Led the team for SAP Basis implementation and operation of all major SAP components, including ECC, BW, SCM, WM, SRM, APO, PI, and Solution Manager.
  • Design, implementation, and maintenance of landscapes end to end (E2E).
  • Responsible for enforcement of IT policies for these systems.
  • Performed the SRM upgrade from SRM 4.0 to SRM 7.0 EHP2 end to end.
  • Performed EHP upgrades.
  • Performed Oracle upgrades.
  • TSM backup configuration.
  • Extensively worked on Oracle stage copies.
  • Worked on the latest installation and upgrade tools, such as SWPM and SUM.
  • Coordinated with application teams on upgrade and performance optimization issues.
  • Performed Oracle RAC conversions.
  • Planned and executed new system builds according to requirements.
  • Worked on Solution Manager LMDB and the Maintenance Optimizer.
  • Provided production/service support for all SAP systems.
  • Responsible for disaster recovery/business continuity planning (DRP/BCP).
  • Configured and maintained online and offline backup/restore using TSM and the Oracle RMAN utility.
  • Performed system refreshes using various conventional methods (TimeFinder, filesystem rename, data copy over LAN).
  • Configured Web Dispatcher and load balancing.
  • Configured CTS+ in the PI landscape.
  • Created backend systems in the Portal; single sign-on and JCo configuration.
  • Adobe Document Services installation, upgrade, and configuration.

Confidential

SAP Basis Consultant

Responsibilities:

  • Client Copy - Local and Remote, Client export/import activities and monitoring.
  • R/3 Kernel Upgrades.
  • Applying support packs using SPAM.
  • Activities pertaining to Transport Management System.
  • Workload analysis via SAP T-code ST03N.
  • Configured operation modes (OP modes) for different workloads.
  • Troubleshooting at the OS and DB levels.
  • Background job administration and job scheduling according to SAP housekeeping standards.
  • Handling customer issues.
  • Involved in tablespace administration, including expansion of tablespaces and adding of data files.
  • Problem analysis: system log, update monitoring, system traces, lock entry deletion, and short dump analysis.
  • Performance analysis of systems.
  • Spool administration.
  • Performing system refresh activities every quarter.
  • Java Engine administration
  • Monitoring the Nestle portal systems in GPMS site.
  • SCOT Configuration and E-mail Processing.
  • Setup of Single Sign On and SNC for Login to R/3 Systems
  • Language Implementation in R/3 system.
  • CCMS Alert Monitoring.
  • Providing On-Call support for pre-prod and prod flash copy and backup issues.
  • Collecting KPI (Key Performance Indicator) reports for production systems in the AMS, AOA, and EUR regions.
  • Taking the GOLDEN offline backup and restoring it on L&T systems.
  • SAP profile maintenance.
  • Starting and stopping the APO liveCache system through LC10.
  • OSS activities (opening OSS connections, providing access keys and developer keys, and searching for OSS notes relevant to a problem).
  • Troubleshooting RFC destination problems.
  • Troubleshooting IDoc issues.
  • Troubleshooting portal/URL unavailability issues.
  • Resolving event tickets generated by Tivoli.
