DevOps Tech Lead Resume
San Antonio, Texas
SUMMARY
- 15 years of IT experience in the banking and financial domains across Asia, North America, and Latin America.
- Experience in API management tools (IBM API Connect with IBM DataPower Gateway, Google Apigee).
- DevOps experience setting up CI/CD pipelines across data centers and AWS cloud environments.
- Infrastructure as Code (IaC) experience automating AWS cloud infrastructure using Terraform modules and templates.
- Experience in Docker Engine and Docker Machine environments, deploying microservices-oriented environments for scalable applications.
- Experience in legacy modernization solution design: architecting modernization through modular design and identifying the need for API and microservices creation, language conversion, and database conversion.
- Served as lead consultant for various development and infrastructure projects, driving them from both offshore and onshore.
- Developer experience in the ALS/IM Systematics (FIS) product suite.
- Honest, assertive straight talker with the right attitude toward work; excellent communication, interpersonal, and leadership skills; an avid learner.
TECHNICAL SKILLS
Programming/Scripting skills: Python, Bash, COBOL, CICS, JCL, REXX; Data formats: JSON, XML
Cloud Environments: Amazon AWS, GCP
CI/CD Tools: Terraform, Jenkins, Git, Bitbucket, Docker, Artifactory
Databases/File System: MySQL, MongoDB, DB2, VSAM, zFS, IMS
Messaging: WebSphere MQ, HTTP/2.0
Job schedulers: AutoSys, Tivoli Workload Scheduler (TWSz)
Vendor Tools/Products: IBM API Connect, DataPower Gateway, AppDynamics, Splunk, Kibana, Swagger 2.0, SoapUI, Postman, JIRA, RLM, IBM z/OS Connect, CICS Transaction Gateway for z/OS, Rational Developer for z, SonarQube, Systematics Core Banking Suite, CA LISA, Serena ChangeMan ERO, RTC
Operating Systems: Ubuntu, RedHat, Amazon Linux, Mainframe z/OS, UNIX Shell Services
Domain Knowledge: Core Banking, Unsecured Loans, Collections, Billing, SWIFT Payments, ACH, FileAct, Check Print
Application/Web Servers: WebSphere, WebLogic, Nginx, Apache, Tomcat
PROFESSIONAL EXPERIENCE
Confidential - San Antonio, Texas
DevOps Tech Lead
Responsibilities:
- Configured, installed, and administered Jenkins continuous integration servers.
- Created Python scripts to automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration.
- Created Terraform templates and modules to spin up AWS resources.
- Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
- Automated continuous deployment, application server setup, and stack monitoring using Ansible playbooks, and integrated Ansible with Jenkins.
- Migrated VMware VMs to AWS and managed services with Ansible.
- Optimized volumes and AWS EC2 instances and created multiple VPCs.
- Designed roles and groups for users and resources using AWS Identity and Access Management (IAM).
- Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
- Maintained the Bitbucket source code repository; performed branching and merging.
- Connected the continuous integration system to the Git repository to build continuously as pull requests come in from developers.
- Integrated Jenkins with Terraform for Continuous Deployments.
- Installed, set up, and troubleshot Ansible; created and automated platform environment setup.
- Extensively used Ansible for Configuration management.
- Configured JIRA workflows based on the needs of the CM team and integrated the project management features of JIRA with the build and release process.
- Automated deployments using a CI/CD pipeline built on Jenkins and RLM.
- Built and supported APIs through the entire lifecycle.
- Created and managed cloud VMs with AWS EC2 command line clients and AWS management console.
- Set up Elastic Load Balancers for different applications to ensure high availability.
- Used Amazon RDS Multi-AZ for automatic failover and high availability at the database tier for Oracle workloads.
- Used Amazon IAM to grant fine-grained access to AWS resources to users. Also managed roles and permissions of users to AWS account through IAM.
- Used AWS APIs to communicate with resources in the cloud and monitored them using CloudTrail.
- Used Amazon S3 to back up database instances periodically, saving snapshots of data.
- Developed JSON templates to build the required AWS stacks.
- Contributed to the continuous integration pipeline: running component builds, creating and running deployment jobs for individual stages in Jenkins, and running automated tests.
- Set up upstream and downstream Jenkins jobs.
- Reduced build & deployment times by designing and implementing Docker workflow.
- Configured Docker containers for branching purposes.
- Worked in the API management console: created REST APIs, added APIs to Products, set rate limits across multiple Plans, and staged/published Products to a Catalog.
- Secured APIs using OAuth 2.0 Access Code and Password flows and basic authentication in both IBM APIM and Apigee.
- Using the API management console assembly, created custom authentication, authorization, and auditing with XSLT transformations and GatewayScripts.
- Worked on API Connect upgrades, customized the Advanced Developer Portal, and created API documentation.
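A minimal sketch of the kind of JSON stack-building script described above, in the style of an AWS CloudFormation template; resource names and properties are illustrative assumptions, not the actual templates used on the project:

```python
import json

def build_stack_template(bucket_name, instance_type="t3.micro"):
    """Assemble a minimal CloudFormation-style stack template as a dict.

    Resource names and properties below are hypothetical placeholders
    for illustration only.
    """
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Example web-tier stack: one EC2 instance and an S3 bucket",
        "Resources": {
            "AppServer": {
                "Type": "AWS::EC2::Instance",
                "Properties": {"InstanceType": instance_type},
            },
            "BackupBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            },
        },
    }
    # Serialize to the JSON form a deployment job would hand to AWS.
    return json.dumps(template, indent=2)

print(build_stack_template("example-backup-bucket"))
```

A generated template like this would typically be fed to a deployment stage in a Jenkins pipeline rather than applied by hand.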
Confidential, Texas
DevOps Consultant
Responsibilities:
- Performed architecture, design, and development of REST APIs using IBM API Connect and Google Apigee.
- Performed IBM API Connect infrastructure design, software installation and configuration, and Advanced Developer Portal installation and configuration.
- Worked on Cloud Management Console configuration with Management and Gateway servers; created organizations and invited developers to API management.
- Spearheaded a proof of concept for migrating existing APIs from IBM API Connect to the Google Apigee platform and executed performance tests on the Apigee platform. The evaluation involved around 90 use cases covering API gateway security, cost analysis, monetization, infrastructure, deployment automation, and disaster recovery.
- Involved in the design, planning, and implementation of Apigee infrastructure.
- Architected the disaster recovery solution and topology for the Apigee and IBM APIM platforms.
- Designed and tuned infrastructure to improve application performance.
- Derived the TCO for the Apigee platform in comparison to IBM APIM and presented the results to senior managers for decision making.
- Wrote OpenAPI specs, worked with SmartDocs, and onboarded APIs to the developer portal; troubleshot and monitored API proxies running on Apigee using the Trace tool and component-level log checks.
- Designed and developed a REST API platform using Apigee/APIM, converting web services from SOAP to REST and vice versa.
- Led customer and conversion projects (conversion from legacy to end-state applications; defined specifications, scheduled and executed mock and real conversions), delivered in sprints.
- Converted existing SOA-based services to RESTful APIs and onboarded them to IBM z/OS Connect.
- Managed the proof of concept for IBM z/OS Connect, which later became the API management strategy for Confidential.
- Set up z/OS Connect files in z/OS UNIX System Services to use WebSphere Liberty profiles and CICS WOLA for JSON and RESTful API connections.
- Set up z/OS Connect in multiple mainframe environments using CICS WebSphere Liberty profiles, exposing existing core application services as RESTful API endpoints.
- Supervised and peer-reviewed monitoring and automation scripts and routines to enhance environment integrity.
- Collaborated with the Process and Tools management team to pilot code and testing using Bitbucket source code repositories and Git commands.
- Reviewed PMRs with IBM and executed the corresponding fixes in all environments.
- Managed and published daily and weekly sprint burndown charts, environment health dashboards, and velocity charts to senior management.
- Supported monthly application releases, reviewed all software upgrades, and collaborated with cross-functional legacy, cloud-based, and next-gen development teams.
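The OpenAPI spec writing mentioned above can be sketched as follows, using the Swagger 2.0 format listed in the skills section; the path, parameter, and response shapes are hypothetical placeholders:

```python
import json

def build_swagger_spec(title, base_path):
    """Build a minimal Swagger 2.0 (OpenAPI) spec as a dict.

    The /accounts/{id} path and its fields are illustrative assumptions,
    not an actual API from the projects described.
    """
    return {
        "swagger": "2.0",
        "info": {"title": title, "version": "1.0.0"},
        "basePath": base_path,
        "paths": {
            "/accounts/{id}": {
                "get": {
                    "summary": "Fetch a single account by id",
                    "parameters": [
                        {"name": "id", "in": "path", "required": True, "type": "string"}
                    ],
                    "responses": {"200": {"description": "Account found"}},
                }
            }
        },
    }

# A spec like this is what gets onboarded to the developer portal.
print(json.dumps(build_swagger_spec("Accounts API", "/v1"), indent=2))
```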
Confidential
Senior Systems Engineer
Responsibilities:
- Work on DB2 queries using SPUFI, QMF and COBOL programs for data extraction.
- Write automated processes for application testing; work on FTP for file transfers across data centers.
- Use VSAM files as the master database for data extraction and reporting.
- Write Easytrieve jobs to create reports and output files.
- Write shell scripts to pull UNIX system reports.
- Set up MDM and FTA agents on TWS workstations.
- Implement automation ideas with the help of latest tools and techniques in DevOps area for continuous integration and deployment, build process and deployment execution.
- Suggest and implement workflows for operational efficiency, effectively using various DevOps tools such as ServiceNow, RTC, Jenkins, Git, and RLM.
- Review syslog, joblog, and console logs to diagnose and resolve problems in z/OS products and systems.
- Use Omegamon to monitor WebSphere MQ and batch and online CICS transactions.
- Schedule and monitor job status and return code conditions in TWS.
- Configure the CICS MQ adapters for effective message communication and bridges for 3270 terminal applications.
- Coordinate and set up ETT automated triggers using TWS for streamlined batch operation.
- Configure mainframe environments, including software change and release management activities, and help implement replication and monitoring technologies using the Agile Scrum framework.
Confidential
Senior Software Engineer
Responsibilities:
- Define DB2 tablespaces with logical partitions for each country, and code for proper application data backup and recovery.
- Extract and develop DB2 stored procedures to enhance modular programming in legacy applications.
- Write efficient source code that runs with optimal memory, MIPS and storage consumption.
- Set up NDM connectivity between all the interfaces.
- Update static and dynamic parameter controls in each environment.
- Build new CICS modules and integrate them with MLI/WebSphere MQ/JMS protocols.
- Work with the Eclipse entitlements team to provision Eclipse users using IBM DB2 database objects.
- Work on the EERS feed file to report Eclipse entitlements.
- Use ChangeMan ERO to build, deploy, and release mainframe code in any environment.
- Support multiple production support jobs to meet SLAs and keep Eclipse screens online and available.
- Develop new Easytrieve and SAR reports and COBOL batch programs.
- Design new jobs and work on DB2 queries using SPUFI, QMF, MS Access, and VSAM for data extraction and reporting.
Confidential, San Antonio, Texas
FIS IM/ALS Onsite Support
Responsibilities:
- Work with Business teams and maintain the various products and pricing features for IM and ALS products through the AM and IM screens.
- Learn the associated IM/ALS APIs and RPIs and execute change requests accurately based on posting dates and other business rules.
- Coordinate with the offshore team on change requests and new business requirements from clients and deliver them with impeccable quality.
- Set up batch processing for both memo and hard posting; ensure the batch-to-online SLA is met.
Confidential
Mainframe Team Lead
Responsibilities:
- Work with Business analysts and business sponsors to finalize functional specifications.
- Prepare test plans; perform coding and testing.
- Conduct internal quality audits and code reviews.
Confidential
Mainframe Developer
Responsibilities:
- Work on requirements analysis and design.
- Prepare test plans; perform coding and code reviews.
- Do unit, integration, and system testing.
- Comply with and maintain all quality-related documents.
