
Product Manager - AWS Architect Resume


Chicago, IL

SUMMARY

  • 17+ years of experience designing and executing innovative data-driven solutions for financial services, CRM, and supply chain
  • Expertise in Data Management, Product Management, Software Development Lifecycle, and Machine Learning concepts
  • Experience in requirements gathering, system analysis, design, data modeling, development, and testing of enterprise-wide Business Intelligence analytic solutions for various business verticals, with transactional systems like Oracle ERP Applications, PeopleSoft, and Siebel, and with Snowflake, following SDLC, Agile, and DevOps methodologies
  • Extensive experience in the design and implementation of Continuous Integration, Continuous Delivery (CI/CD), Continuous Deployment, and DevOps processes for Agile delivery
  • Experience with configuration management tools like Chef, Puppet, and Ansible
  • Certifications in Amazon Web Services, Oracle, SAS
  • Experience in Systems Analysis and Design, Product Strategy, Automation and Replication of Environments, Systems integration design, and overall Project Management
  • Experience in ticket management using JIRA, Confluence, and ServiceNow
  • Experience defining processes and ensuring process adherence to deliver the best quality of service
  • Oracle Certified Associate with SQL, PL/SQL, OBIEE, WebLogic, and OCI certifications
  • Experience with web development, Amazon Web Services, Python, and the Django framework
  • Experience with Snowflake multi-cluster warehouses
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables
  • Extensive experience in architecting and administering OBIA using ETL Informatica PowerCenter and OBIEE
  • Provisioned AWS EC2 instances with Auto Scaling groups and load balancers in a newly defined VPC, and used Lambda functions to trigger events in response to requests for DynamoDB
  • Experience migrating existing AWS infrastructure to a serverless architecture (AWS Lambda, Kinesis) using Lambda, API Gateway, Route 53, and S3 buckets
  • Experience implementing services using security mechanisms such as OAuth, SAML, and Single Sign-On data security
  • Experience in Designing, Architecting, and implementing scalable cloud-based web applications using AWS
  • Experience in providing highly available and fault tolerant applications utilizing orchestration technologies like Kubernetes
  • Experience in Cloud Computing technologies
  • Experience with AWS Lambda, Athena, RDS, S3, API Gateway, and the AWS CLI
  • Experience with AWS services like EC2, S3, RDS, Redshift, Lambda, Glue, Stitch, IAM, QuickSight, and Kinesis
  • Worked on CI/CD (continuous integration and deployment) data pipelines and applied automation to environments and applications using DevOps workflows
  • Architect experience leading multiple initiatives for cloud-based applications, working closely with various development teams and team members
  • Dockerized applications by creating Docker images from Dockerfiles
  • Experienced in Branching, Merging, Tagging and maintaining the version across the environments using SCM tools like GIT and Subversion (SVN) on RHEL and Ubuntu platforms
  • Experience automating the deployments on AWS using GitHub, Terraform and Jenkins
  • Experience using Matillion to build data transformations for integration into Amazon Redshift
  • Experience with data visualization platforms like Oracle Data Visualization and AWS QuickSight
  • Experience using Python to automate WLST scripts in OBIEE and ETL code for AWS Glue
  • Experience working with web services using SOAP and WSDL
  • Experience writing scripts for automation and monitoring using Shell, Python, and Bash
  • Accomplished in navigating both established corporate setups and startup environments
  • Proficient in business relationship management, fostering positive relations with stakeholders and driving operational success
  • Experienced leading globally located cross-functional delivery teams in dynamic and ambiguous environments
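The summary above mentions Lambda functions triggering events for DynamoDB. As a minimal, hypothetical sketch of that pattern (table layout, field names, and event shape are illustrative assumptions, not taken from the resume), a handler might map an API Gateway request body to a DynamoDB-typed item; the actual `put_item` call is left commented out so the sketch stays runnable offline:

```python
import json

def build_item(payload: dict) -> dict:
    """Map a request payload to a DynamoDB item (typed attribute values).

    Attribute names are illustrative assumptions.
    """
    return {
        "order_id": {"S": str(payload["order_id"])},
        "status": {"S": payload.get("status", "NEW")},
        "amount": {"N": str(payload.get("amount", 0))},
    }

def lambda_handler(event, context=None):
    """Lambda entry point: parse the API Gateway event and build the item.

    In a real deployment, boto3 would persist the item; the call is
    omitted here so the sketch runs without AWS credentials.
    """
    payload = json.loads(event["body"])
    item = build_item(payload)
    # boto3.client("dynamodb").put_item(TableName="orders", Item=item)
    return {"statusCode": 200, "body": json.dumps(item)}
```

Separating the pure mapping (`build_item`) from the AWS call keeps the handler unit-testable without mocking DynamoDB.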

TECHNICAL SKILLS

Cloud Computing: Amazon Web Services EC2, S3, ELB, Auto Scaling, Glacier, storage lifecycle rules, VPC, Elastic Beanstalk, CloudFront, functional knowledge of Import/Export, Route 53, CloudWatch, CloudTrail, OpsWorks, IAM, SNS, ElastiCache, Snowball

Database: Oracle 11g/12c, HBase, DynamoDB, Redshift, S3, Aurora, PostgreSQL, MongoDB, Teradata, DB2 Mainframe, Microsoft Access

Programming: PL/SQL, SQL, Python, R programming, SAS, Pig, Hive, Spark, Java, HTML5

Applications: OBIA, Oracle Fusion, PeopleSoft, Big Data

Middleware: WebLogic, SOA suite

Tools: OBIEE, BIP, OTBI, Oracle Visual Analyzer, AWS QuickSight, Crystal Reports, Discoverer reports, Microsoft Excel, PowerPoint

ETL Tools: Informatica IICS, ODI, MapReduce, AWS Glue, Stitch

Methodologies: Agile, DevOps, SDLC

Containerization Tools: AWS ECS, Docker, Kubernetes

Scripting: Shell, Bash, Ruby, and Perl

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Product Manager - AWS Architect

Responsibilities:

  • Deployment and management through AWS CloudFormation on EC2 (Elastic Compute Cloud) instances and maintenance of Amazon S3 storage
  • Optimized volumes and EC2 instances, created multi-AZ VPC instances, and used IAM to create new accounts, roles, and groups
  • Deployed custom Chef cookbooks and Puppet modules to configure machines in different environments with the appropriate packages/services and versions
  • Configured S3 to host static web content, with versioning and lifecycle policies to back up files and archive them in Glacier
  • Configured a CI/CD (continuous integration/deployment) pipeline in Jenkins for automatic deployment of artifacts/applications to the required servers or environments, with a clean new build triggered whenever code commits are made to the SCM tool Git
  • Utilized CloudWatch to monitor resources such as EC2, CPU memory, Amazon RDS DB services, DynamoDB tables, EBS volumes; to set alarms for notification or automated actions; and to monitor logs for a better understanding and operation of the system
  • Developed and executed Shell scripts and worked on Python Scripting in different projects for automation of regular repeating tasks
  • Created and maintained Highly Available and Fault Tolerant infrastructure in Amazon VPC using EC2 wif Elastic load balancing and Auto scaling groups in AWS cloud
  • Automated AWS components like EC2 instances, security groups, ELB, RDS, and IAM through AWS CloudFormation
  • Implemented AWS CodePipeline and created CloudFormation JSON and Terraform templates for infrastructure as code
  • Worked on Glue crawlers to scan data in all kinds of repositories, classify it, extract schema information from it, and store the metadata automatically in the AWS Glue Data Catalog
  • Worked on AWS data pipeline for Data Extraction, Transformation and Loading from the homogeneous or heterogeneous data sources
  • Worked with the marketing team to structure use cases, identify key KPIs, extract data, and develop reports and visualizations
  • Performed cleansing, de-duplication, and harmonization of data gathered from various sources to create a golden dataset for training the models
  • Collaborated with two offshore teams and an internal team to streamline data pipeline efficiency and model-training processing times over six months, enabling custom mappings of data models
  • Streamlined and enforced detailed data quality checks wif statistical significance and anomaly detection for accuracy and completeness
  • Built a process to log and track the defects and generate a detailed defect report which systematically lowered the defect rate and enhanced the data quality
  • Managed the data access through role-based security protocols
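The data-quality work above mentions statistical checks with anomaly detection. A minimal sketch of one such check (not the project's actual code; threshold and approach are assumptions) flags values whose z-score exceeds a cutoff:

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=3.0):
    """Return values lying more than `threshold` standard deviations
    from the mean, a simple z-score anomaly check."""
    if len(values) < 2:
        return []  # not enough data for a sample standard deviation
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all values identical: nothing is anomalous
    return [v for v in values if abs(v - mu) / sigma > threshold]
```

In a pipeline like the one described, a check of this shape would run per column after loading, with failures logged to the defect report.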

Confidential, Topeka, KS

Senior Technical Consultant

Responsibilities:

  • Instrumental in creating and maintaining a customer master data hub, along with the item master data required for procurement, which improved searchability
  • Implemented a matching process to merge customers and create unique customer profiles in the customer data hub, reducing redundancies
  • Led a group of nine engineers to build a Purchase Orders system for services and supplies which increased the process efficiency
  • Supervised and headed the development, deployment, and support of integrations by translating and transporting several data objects between third-party systems and Confidential systems which streamlined the user experience and the overall satisfaction
  • Branching, Tagging, Release Activities on Version Control Tools like GIT
  • Part of an Interface team and worked on the following
  • Inventory Inbound Interface
  • Inbound Meter Reading Interface
  • Bulk Fuel Inbound Interface
  • AP Invoice Inbound Interface
  • AP Invoice Outbound Interface
  • Item Interface to Demantra
  • Location Interface to Demantra
  • Outbound Order Acknowledgement Interface
  • Outbound Shipment Notice Interface
  • GPS Meter Reads Web Services Interface
  • Part of an Extension team and worked on the following
  • Capturing Confidential Tracking Numbers
  • Parts Warranty Core Returns Notification
  • Purge Sales Order History from Demantra
  • Part of a Conversion team and worked on the following
  • Stamps Sourcing Assignment Conversion
  • Customer Conversion
  • FA Locations Conversion
  • As part of the tokenization initiative, developed a program that would create a bank account with the token and purge the active credit card in the system
  • Created a custom interface program that kills inactive MWA JDBC sessions
  • Implemented barcode functionality in the BI Publisher reports
  • Worked on the personalization of the PO form for the consigned and non-consigned items
  • Extensively used the HP Load runner for the performance testing of various custom objects
  • Created PL/SQL packages to load data from EBS into OBIEE
  • Developed the following barcode and label reports
  • PICK Slip Report
  • PACK Slip Report
  • PICK List Report
  • WH Label Report
  • Transfer Document Report
  • Office Master Report
  • Office History Report
  • Order Release Schedule Report
  • Order Release Schedule Summary Report
  • Improved extract, transform, load (ETL) efficiency by leading two engineers and introducing a new reporting process, expediting data analysis functions
  • Designed preventive testing scripts using HP ALM for heavily customized and frequently used applications, and generated redundancies to mitigate node overload
  • Built reports based on data extracts from Oracle EBS and OBIEE, increasing analyst efficiency
  • Mentored SQL data analysts and provided subject-matter-expert support to various teams company-wide
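The customer master data hub work above involved a matching process to merge duplicate customers. As a hedged sketch of that kind of record matching (field names and the normalization rule are illustrative assumptions), records can be grouped by a normalized key and collapsed into one profile per key:

```python
def match_key(record: dict) -> tuple:
    """Build a normalized (name, email) key for duplicate detection."""
    name = " ".join(record.get("name", "").lower().split())
    email = record.get("email", "").strip().lower()
    return (name, email)

def merge_customers(records):
    """Collapse duplicate records into unique customer profiles.

    The first record for a key wins; later duplicates only fill in
    fields the profile is still missing.
    """
    profiles = {}
    for rec in records:
        profile = profiles.setdefault(match_key(rec), {})
        for field, value in rec.items():
            profile.setdefault(field, value)
    return list(profiles.values())
```

A production matching engine would use fuzzier comparisons and survivorship rules, but the group-by-normalized-key step is the core of the de-duplication described.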

Confidential, Jamesburg, NJ

Technical Consultant

Responsibilities:

  • Created a custom Java program that was extensively used for scanning by handheld devices
  • Created a custom workflow which would update the service contract based on the options selected in the customer profile page
  • Created following programs to mimic Back-to-Back Orders
  • Requisition for specific sales order lines based on Quality plan and value sets
  • POs for above created requisitions
  • Receiving interface to receive goods for above created PO
  • Customized the Requisition form to show vendors and vendor sites from the ASL list for items with the USED APPROVED FLAG
  • Auto-Create PO form to restrict from creating POs for requisition created as a part of above
  • Modified existing Drop ship order related programs
  • Part of a conversion team and created the following:
  • Item conversion program, including item cost and item cross reference
  • Conversion program to load Locators
  • Program for item sub-inventory and locator assignments
  • Developed On-hand conversion program which can handle LOT and SERIAL controlled items
  • Developed following programs to support new software items warranty
  • Sales order form to make some attributes mandatory for items created for software
  • Dummy IB for above items and created relationship for these IBs
  • Personalized Transaction Move order to capture existing license number for extended warranty items
  • Warranty update program to extended warranty program for above items
  • Created a PL/SQL package to calculate sales reps' profit percentage based on quality plans defined for transaction types, categories, and order line attributes
  • Developed a ship confirm interface to process files received from UPS with details of shipped orders
  • Modified workflow to send notification for overbooked items through Tele-services
  • Worked as a part of SCM production support team. Supported month-end and year-end process
  • Added New Menu item for Label creation process wif different formats
  • Added item description field to pick confirm screen
  • Added LOT validations for PO receipts
  • Developed the following custom reports
  • Account Reconciliation reports
  • Inventory turns, Scrap, Obsolete & Excess Inventory reports
  • Orders, In-transit, Inter-company and Shipment tracking reports
  • Plan vs Actual & Plan vs Plan production planning reports
  • Discrete Job pick list, Job shortage, Job-lot composition, Job variance reports
  • Serial and Lot Traceability Reports
  • PPV and IPV analysis reports
  • Contracts sell rate and gross margin reports
  • PO buyer monitoring and spend & Vendor management reports
  • Sales and Invoice Aging Reports

Confidential

Technical Consultant

Responsibilities:

  • Involved in the design and architecture of a custom Sales Order form, including creation of extension tables and development of a form that can create/update/cancel/copy/delete orders and can also be used to create/release holds and hold sources
  • Used standard OM APIs to achieve all of the above operations. This form has the ability to default different values based on standard and custom setups
  • Worked closely with business users and supported CRP1 and CRP2
  • Modified existing 11.0.3 order import program to work in 12i
  • Converted all 11.0.3 Order entry and shipping module procedures, packages, triggers and views to 12i
  • Created different transaction types, defaulting rules, holds and quick codes
  • Converted 11.0.3 organization-based views to 12i organization-based synonyms
  • Created different profile options to make WIP/OE forms site specific
  • Created different setup forms to store different custom setups
  • Used Oracle alerts to send notifications whenever an international order is created and whenever special orders are created
  • Modified standard Generic Line workflow for different transaction types
  • Used holds & Line workflows together as approvals. This can be used in Automatic and Manual approvals
  • Developed XML reports like Inventory details report, Price evaluation report and many other Inventory related reports
  • Added special triggers to Custom Order Entry form so that Inter-company revenue split can be provided at the time of order entry

Confidential, Santa Clara, CA

Technical Consultant

Responsibilities:

  • Involved in finalizing a new business process architecture to integrate the Oracle database with three other databases
  • Designed new data flow architecture and created required shell scripts for Secured FTP wif error handling capability
  • Defined and created new Flex fields, Segments and Value sets to support new business requirements
  • Used Data Loader scripts to load data into Flex fields, segments and Value sets
  • Extensively Modified Shipping INBOUND/OUTBOUND interface programs to accommodate changes in business process
  • Developed an interface program to perform miscellaneous transaction to clean up the given items from sub inventories
  • Modified Shipping documents which include Pick, Pack and Invoice reports using Reports 6i to provide more information to 3rd parties
  • Developed shell scripts for error handling and email notification
  • Created test cases and involved in making test plans for SIT and UAT
  • Created ASCA validation reports
  • Used Subversion for version control and Remedy for change requests

Confidential, Chelmsford, MA

Technical Consultant

Responsibilities:

  • Worked on complete SDLC from gathering requirements to Deployment
  • Developed GL interface to import GL data from a third-party tool
  • Worked with business users to gather business requirements and provided reports on financial, manufacturing, supply, and forecast data for the Executive S&OP implementation
  • Developed queries for summaries of AR aging reports and used different lookup techniques in Excel to compare the data
  • Created import BOM interface program to import BOM items, routes, sequences and departments
  • Customized workflow to send notifications about monthly payments
  • Developed Order Management, Purchase Order, Inventory, Receivables, Forecast, Shipping, Warehousing Management, Pricing queries to compare summary information of the standard and custom reports
  • Used Pricing APIs to attach derived pricing attributes to orders imported from legacy systems
  • Created and modified reports using Discoverer and XML Publisher
  • Involved in training and demonstration to end users
  • Used Hints, Indexes, Other tuning techniques and advanced SQL Programming techniques like analytical techniques and WITH statement to increase the performance of the reports
  • Created new materialized views and used existing views to increase the performance of the reports
  • Developed test scripts, tested in four different database instances, and provided documentation for all the reports
  • Created Corporate Source report to show the worldwide material status and other required information about them

Confidential, Jamesburg, NJ

Technical Consultant

Responsibilities:

  • Worked in the project phase of Solution design, SIT, UAT and Go live for SCM enhancement projects
  • Developed a shipping interface program to receive shipments that have been in transit for more than a given number of days
  • Developed an inventory interface program for sub-inventory transfers and miscellaneous issues
  • Customized the OM Order Line - Return Receipt/Inspection workflow to send notifications informing the individual who created the return that it has not been received or inspected within the defined criteria
  • Modified the existing sales order interface and ship confirm interface to consolidate and split order lines
  • Developed programs to send PO data to third party and modified existing inbound PO open interface program
  • Developed conversion program for validation while importing the GL data into interface tables
  • Modified Service Contract forms using form builder
  • Assigned tickets to the other team members and supervised their release to production. Used Apps to maintain production tickets
  • Created, updated and closed several TARs to solve issues related to Order Management Production problems
  • Interacted with business users to create the project plan and documentation for Consumable Order Line Consolidation
  • Modified existing procedures to consolidate order lines, when communicating wif 3rd party shipping tool and split the order lines when communicated back from the external system

Confidential, Monterey, CA

Technical Consultant

Responsibilities:

  • Involved in writing code using Base SAS and SAS/Macros to extract and validate data on the mainframe as well as Windows
  • Created SAS datasets in a local SAS directory from raw data files and modified existing datasets using Set, Merge, Sort, Join, Update, and conditional statements
  • Responsible for converting the business rules to SAS Datasets
  • Performed extensive QC (Quality Check) on the data obtained from various data sources
  • Used SAS Macros to write re-usable code
  • Created checkers on the mainframe to compare the data files containing the raw scores of each student with those in the reports
  • Created different reports as per the business requirements
  • Created checkers to verify the validity of the data in the reports generated

Confidential, Richmond, VA

Technical Consultant

Responsibilities:

  • Involved in the design of the defense engine to detect the potential internal fraud cases
  • Documented the High-Level and Technical Design documents
  • Responsible for converting the business rules to SAS defense scripts
  • Coded using SAS/SQL to extract data from Teradata and Oracle tables
  • Created SAS datasets in local SAS directory from raw data files
  • Performed extensive QC (Quality Check) on the data obtained from various data sources
  • Developed the defense scripts using SAS/SQL, Base SAS & SAS/Macros
  • Used SAS Macros to write re-usable code
  • Wrote scripts to send error notification emails to the business
  • Wrote shell scripts for UNIX and scripts for NT
  • Wrote shell scripts to invoke SQL scripts and SQL*Loader, create logs, and manipulate data files
  • Wrote PL/SQL packages for common DML actions
  • Wrote stored procedures for specific analytic reports
  • Wrote large aggregation SQL queries using in-line views
  • Used ANALYZE, DBMS_STATS, EXPLAIN PLAN, SQL TRACE, and TKPROF to tune SQL queries
  • Used SQL hints and indexes to improve SQL performance
  • Wrote SQL Loader programs and control files for multiple file, parallel, high-volume data loads
  • Used table hash partitioning to ensure that all SQL and SQL Loader programs process by partition
  • Used table compression after data loading when necessary
  • Converted some SAS modules to Oracle PL/SQL stored procedures
  • Wrote unit test cases and performed unit-testing for each developed program
  • Performed data verification for new applications where data had to match the application to be replaced
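The final bullet above describes verifying that a new application's data matched the application it replaced. A minimal sketch of that comparison (key and row shapes are illustrative assumptions) diffs two row sets by ID, reporting rows that are missing, extra, or changed:

```python
def diff_datasets(old_rows, new_rows, key="id"):
    """Compare two lists of row dicts keyed by `key`.

    Returns (missing, extra, changed):
      missing - keys present only in the legacy data
      extra   - keys present only in the replacement
      changed - keys present in both whose rows differ
    """
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    missing = sorted(set(old) - set(new))
    extra = sorted(set(new) - set(old))
    changed = sorted(k for k in set(old) & set(new) if old[k] != new[k])
    return missing, extra, changed
```

In the projects described, the same shape of check could be driven from SQL extracts of both systems, with the three result lists feeding the verification report.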
