
Hadoop Admin Resume


San Jose

SUMMARY

  • Over 16 years of experience in the IT industry encompassing analysis, design, development, implementation, upgrades, administration and support.
  • 12 years of experience in Oracle E-Business Suite (ERP). Experience includes managing sizeable teams, analyzing gaps in the product, estimating technical components and cost, offshore development coordination and management, and RICE development using Oracle technologies such as Forms, Reports, APIs and the Oracle AIM methodology.
  • 5 years of experience as a Hadoop Admin/Developer.
  • Hands-on experience in installing, configuring, supporting and managing clusters on the Hortonworks, MapR and Cloudera distributions.
  • Cluster capacity planning, performance tuning, cluster Monitoring, Troubleshooting.
  • Design Big Data solutions for traditional enterprise businesses.
  • Good experience with Big Data technologies such as the Hadoop framework, MapReduce, Hive, HBase, Pig, Sqoop, Spark, Kafka, Flume, ZooKeeper and Oozie.
  • Experienced in writing complex MapReduce programs that work with different file formats such as Text, SequenceFile, XML, JSON and Avro.
  • Working experience with the Cloudera platform using VMware Player in a CentOS 6 Linux environment.
  • Used monitoring and alerting daemons such as Cloudera Manager, Ambari alerts, Nagios and Check_MK.
  • Adding/removing new nodes to an existing cluster.
  • Backup configuration and Recovery from a Name Node failure.
  • Decommissioning and commissioning the Node on running cluster.
  • Installation of various Ecosystems and Daemons.
  • Experience securing clusters using Linux permissions, LDAP and Apache Ranger (managed through Ambari).
  • Experienced in authorization, authentication, auditing, data encryption and security administration using Apache Ranger and Apache Knox on Kerberized clusters.
  • Excellent command in creating Backups & Recovery and Disaster recovery procedures and Implementing BACKUP and RECOVERY strategies for off-line and on-line Backups.
  • Involved in benchmarking Hadoop/HBase clusters and file systems with various batch jobs and workloads.
  • Making cluster ready for development team working on POCs.
  • Experience in minor and major upgrades of Hadoop and its ecosystem components.
  • Experienced in scheduling backups and recovery of the entire EDW databases across various geographical locations for the business continuity and response time.
  • Efficient in vacuuming and analyzing database through regular cleanup of old logs, deleting old data, collecting statistics and tuning queries for efficient database running.
  • Extensive experience in creating Roles, Users and providing privileges to roles and user management.
  • Expert in extending the core functionality of Hive and Pig by writing custom UDFs in Java and Python based on user requirements.
  • Very good experience with partitioning and bucketing concepts in Hive; designed both managed and external Hive tables to optimize performance.
  • Optimization and performance tuning of HiveQL, and formatting table columns using Hive functions.
  • Experienced in maintaining and monitoring the database backup, disc space.
  • Experienced in troubleshooting and analyzing problems such as slowdowns, revalidating distribution keys for structured data distribution and updating the dev team accordingly.
  • Hands-on experience in analyzing log files on servers for the master, segments and mirrors and finding root causes.
  • Extensive experience in creating and modifying resource queues.
  • Experienced in modifying connection limits on the master and segments.
  • Extensive knowledge of creating schemas, databases and tables with distribution keys, and partitions with orientation and append-only options.
  • Extensive knowledge of creating external tables, loading data using gpfdist and analyzing error logs.
  • Experienced in maintaining and deleting the partitions, and terminating freeze transactions and connections.
  • Extensive experience in Big Data - Hadoop solution implementation for Operational Reporting and Analytics using HIVE, Sqoop. Good Knowledge in SPARK & SPARK SQL.
  • Experience in UNIX shell scripting and scheduling jobs using crontab.
  • Good hands-on experience in coding, development and EXPLAIN/ANALYZE query tuning.
  • Very good knowledge of cluster architecture and its components.
  • Proficient in data analysis and documenting the findings.
  • Expertise in Development of interfaces and conversion programs to integrate Oracle Applications modules to import data from various sources into Oracle Applications using Data Load, PL/SQL, SQL*Loader.
  • Strong programming experience in creating application specific Procedures, API, Packages, Functions, Triggers and other database objects using SQL and PL/SQL.
  • Possess good interpersonal, presentation and developmental skills with Strong analytical and problem solving approach and an excellent team player.
  • Very good knowledge about the ERP systems and has proven experience in developing and implementing ERP solutions for various clients.
  • Self-confident and positive approach to challenging assignments.
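
The custom Hive UDF work mentioned above is often implemented as a streaming script called from HiveQL's TRANSFORM clause. A hedged sketch in Python (the column layout and the cleanup rule are hypothetical illustrations, not details from this resume):

```python
#!/usr/bin/env python
# Minimal sketch of a Hive "streaming UDF", invoked from HiveQL as e.g.:
#   SELECT TRANSFORM (user_id, raw_amount)
#   USING 'clean_amount.py' AS (user_id, amount) FROM sales;
# Column layout and cleanup rule are illustrative assumptions.
import sys

def clean_amount(raw):
    """Strip a currency symbol and thousands separators; normalize to 2 decimals."""
    stripped = raw.strip().lstrip("$").replace(",", "")
    try:
        return "%.2f" % float(stripped)
    except ValueError:
        return "0.00"  # malformed records default to zero instead of failing the job

def transform(lines):
    """Process tab-separated rows the way Hive streams them on stdin."""
    out = []
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            out.append("\t".join([fields[0], clean_amount(fields[1])]))
    return out

if __name__ == "__main__":
    sys.stdout.write("\n".join(transform(sys.stdin)))
```

Hive streams each input row to the script on stdin as tab-separated text and reads tab-separated rows back, so the same script can be unit-tested locally on a plain list of strings.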

TECHNICAL SKILLS

Databases: Oracle, MySQL, PostgreSQL

ERP: Oracle ERP R12, 11i/11.5.9/10.7, including Order Management (OM), TCA (Trading Community Architecture), General Ledger (GL), Accounts Receivable (AR), Accounts Payable (AP), Purchase Order (PO), Inventory (INV), Application Object Library (AOL) and SysAdmin.

GUI Tools: Oracle Developer 6i/2000 (Forms 6i/4.5, Reports 6i/3.0), SQL*Loader, Oracle Database 8/8i/9i/10g

Languages: Java, Python, SQL, PL/SQL, Postgres

Web Related: HTML, XML, JSON

Tools & Utilities: pgAdmin III, Aginity Workbench, TOAD, Benthic, PuTTY, WinSCP, Golden Gate, PVCS, GitHub, Jenkins, Microsoft Office tools, Quality Center, Remedy

Operating Systems: Windows Server/Vista/XP/2000/2007/2008, Ubuntu, UNIX/Linux, CentOS

Security: LDAP, AD, Kerberos, Apache Ranger, Apache Knox, Sentry

Big Data: Hortonworks HDP 2.2 to 2.5.0.0, Cloudera CDH 4.x/5.x, Apache Hadoop, Pig, Hive, HBase, Sqoop, Flume, Puppet, Chef, ZooKeeper, Oozie, Spark, Kafka, Ambari, Cloudera Manager, Pivotal HD, Apache Solr, AWS

PROFESSIONAL EXPERIENCE

Confidential, San Jose

Hadoop Admin

Responsibilities:

  • Currently working as a Hadoop admin on the CDH 5.7 distribution for 4 clusters spanning Dev, QA and PROD and comprising 100 nodes.
  • Responsible for Cluster maintenance, Cluster Monitoring, commissioning and decommissioning Data nodes, Troubleshooting, Manage and review data backups, Manage & review log files.
  • Extensively worked on capacity planning, design and installation of the clusters by fulfilling the business requirements.
  • Day-to-day responsibilities include solving developer issues, troubleshooting jobs, providing immediate solutions to reduce impact, documenting them and preventing recurrence.
  • Worked on Performance tuning at cluster level to improve the overall performance for the application running.
  • Experience on new component installations and upgrading the cluster with proper strategies.
  • Very good hands-on experience with Linux admin tasks when OS-level issues cause problems for the Hadoop cluster.
  • Experience installing new discovery tools such as Tableau, Greenplum and Informatica and integrating them with Hadoop components.
  • Monitoring systems and services, architecture design and implementation of deployment, configuration management, backup, and disaster recovery systems and procedures.
  • Hands-on experience with cluster upgrades and patch upgrades without data loss and with proper backup plans.
  • Changing the configurations based on the requirements of the users for the better performance of the jobs.
  • Experienced in guiding teams through the application onboarding process, from obtaining access to preparing a run book for using the Hadoop cluster.
  • Involved in snapshots and mirroring to maintain backups of cluster data, including remote backups.
  • Installation of various Hadoop Ecosystems and Daemons.
  • Experienced in managing and reviewing log files.
  • Hands-on experience configuring and managing security for Hadoop clusters using Kerberos, with LDAP/AD integration at the enterprise level.
  • Working experience maintaining MySQL/Postgres databases: creating databases, setting up users and maintaining backups of cluster metadata databases.
  • Helping the users in production deployments throughout the process.
  • Experienced in production support, resolving user incidents ranging from sev1 to sev4.
  • Managed and reviewed Log files as a part of administration for troubleshooting purposes. Communicate and escalate issues appropriately.
  • As an admin, followed standard backup policies to ensure high availability of the cluster.
  • Developed Hive scripts and loaded data into Hive tables on HDFS.
  • Scheduled Hive jobs through the UC4 scheduling tool.
  • Worked on moving data marts into Hadoop using Python, Hive, Sqoop and Tableau.
  • Worked on Hadoop job optimization while moving data from Teradata into Hadoop.
  • Involved in Hive partitioning and bucketing, performing different types of joins on Hive tables and implementing SerDes such as RegexSerDe.
  • Designed, developed and tested complex ETL jobs using Teradata.
  • Designed and developed UC4 (Automic) jobs and workflows to schedule batch jobs.
  • Worked on JIRA Agile Software Development Tool.
  • Expertise in GIT Version Controlling Software
  • Created and developed reports and dashboards using the Tableau visualization tool.
  • Involved in Analyzing system failures, identifying root causes, and recommended course of actions. Documented the systems processes and procedures for future references.
  • Worked with systems engineering team to plan and deploy new environments and expand existing clusters.
  • Part of Version upgrade from CDH 4.X to CDH 5.X.
  • Monitored multiple clusters environments using Cloudera Manager monitoring services like alert publisher, service/host monitor and Nagios.
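
Commissioning and decommissioning data nodes, as mentioned in the responsibilities above, is mostly exclude-file bookkeeping plus one admin command. A hedged sketch (the file path and hostnames are illustrative):

```python
# Hedged sketch of the bookkeeping behind decommissioning HDFS DataNodes:
# maintain the contents of the exclude file, then ask the NameNode to
# re-read it. Hostnames below are illustrative.

def updated_exclude(current_lines, to_decommission):
    """Merge hosts being decommissioned into the exclude-file contents."""
    hosts = {line.strip() for line in current_lines if line.strip()}
    hosts.update(to_decommission)
    return sorted(hosts)

# Typical workflow (real HDFS commands; they require a live cluster):
#   1. Rewrite the file named by dfs.hosts.exclude
#      (e.g. /etc/hadoop/conf/dfs.exclude) with the list returned above.
#   2. Run:  hdfs dfsadmin -refreshNodes
#   3. Wait until the nodes show "Decommissioned" in the NameNode UI before
#      stopping their daemons and removing them from the workers file.
```

Commissioning is the mirror image: remove the host from the exclude file, add it to the include/workers file, and run the same refresh command.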

Environment: CDH 5.7, Hue, Hive, Pig, Sqoop, Flume, ZooKeeper, Spark, HBase, MySQL, Python, shell scripting, Red Hat Linux, Sentry, Kerberos, RStudio, Historian, Greenplum, Solr, Jupyter Notebook, Teradata.
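
The log-file review described in the bullets above typically starts with a small triage script over log4j-style daemon logs. A hedged sketch (the sample lines and logger names are illustrative, not from a real cluster):

```python
# Tally WARN/ERROR/FATAL lines per daemon logger from log4j-style Hadoop logs.
import re
from collections import Counter

# log4j layout: "2016-03-01 10:15:03,001 ERROR org.apache....DataNode: message"
LINE_RE = re.compile(r"^\S+ \S+ (WARN|ERROR|FATAL)\s+([\w.$]+)")

def triage(lines):
    """Return a Counter keyed by (level, logger) for problem lines only."""
    counts = Counter()
    for line in lines:
        match = LINE_RE.match(line)
        if match:
            counts[(match.group(1), match.group(2))] += 1
    return counts

SAMPLE = [
    "2016-03-01 10:15:02,123 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: heartbeat ok",
    "2016-03-01 10:15:03,001 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: disk failure on /data3",
    "2016-03-01 10:15:04,777 WARN org.apache.hadoop.yarn.server.nodemanager.NodeManager: local dir slow",
    "2016-03-01 10:15:05,900 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: disk failure on /data3",
]
```

Sorting the resulting counts surfaces the noisiest daemon first, which is usually where root-cause analysis starts.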

Confidential, San Jose

IT Analyst

Responsibilities:

  • Gathered details about current processes and requirements from Business.
  • Designed and developed Reports & Dashboards using Tableau Data visualization Reporting Tool
  • Designed the process to load data from Teradata to Anaplan Planning Tool for Dashboard Creation
  • Worked as Lead in Data Modeling, Dimensional Modeling and Physical Design of Data warehouse projects.
  • Design, Architect and implement highly efficient & scalable ETL/ELT Processes using Informatica.
  • Created and maintained Standards and Best Practices documents for Data warehouse Design and Development
  • Performance Tuning of SQL queries in Teradata & Informatica workflows to meet SLAs.
  • Designed Scalable Solutions in migrating large (Terabytes of Data) volume of Data to EDW Teradata Warehouse.
  • Work with QA & Business teams for unit, integration & UAT testing
  • Developed $U Uprocs, Sessions, Tasks, Management Units (MUs) & Rules, and scheduled Informatica workflows using the $U scheduling tool.
  • Actively involved in supporting UAT (User acceptance testing), Production Deployment & Normalization.
  • Followed up with the infrastructure, DBA & Informatica support teams to set up the Dev, QA & Production environments and resolved any environment issues that affect project timelines.
  • Document best practices, Continuous process improvement with bi-Monthly sessions with teams.

Environment: Informatica 9.5/8.x, Talend, SQL, ER/Studio 7.6.0, Tidal scheduling tool, Tableau reporting tool, OBIEE, Teradata V15.0, Oracle, PVCS, GIT, UCS UNIX server, Hadoop, Hive, Pig, Spark, Spark SQL, FastLoad, TPump, MultiLoad, FastExport & TPT

Confidential, San Jose

IT Analyst

Responsibilities:

  • Gathered details about current processes and requirements from Business.
  • BI POC for CAB/CCB meetings, performing impact assessment on change requests
  • BI impact assessment for CCW Releases
  • BI impact assessment for SOE releases
  • BI Impact assessment for Revenue attribution
  • BI impact assessment for SAJO project
  • BI Impact assessment for NGCCRM .
  • Experience writing detailed documentation including RD050, MD050, CV40 and BR100
  • Worked with global project and Operations teams to clearly and concisely communicate proposal issues and statuses.
  • Supported quarter, year-end reconciliation and closing process
  • Provided timely communication and escalation of issues to the Operations Leadership Team.
  • Worked as Lead in Data Modeling, Dimensional Modeling and Physical Design of Data warehouse projects.
  • Design, Architect and implement highly efficient & scalable ETL/ELT Processes using Informatica.
  • Created and maintained Standards and Best Practices documents for Data warehouse Design and Development
  • Designed Scalable Solutions in migrating large (Terabytes of Data) volume of Data to EDW Teradata Warehouse.
  • Extensively used Teradata utilities such as FastLoad, FastExport, MultiLoad and TPump.
  • Developed $U Uprocs, Sessions, Tasks, Management Units (MUs) & Rules, and scheduled Informatica workflows using the $U scheduling tool.
  • Actively involved in supporting UAT (User acceptance testing), Production Deployment & Normalization.
  • Followed up with the infrastructure, DBA & Informatica support teams to set up the Dev, QA & Production environments and resolved any environment issues that affect project timelines.
  • Worked on multiple concurrent projects by coordinating with global teams on providing end to end solutions delivery
  • Designed and developed Reports & Dashboards using Tableau Data visualization Reporting Tool
  • Designed and Developed Job Groups and Jobs and schedule informatica workflows using Tidal Enterprise (TES) Scheduling Tool.
  • Worked as Lead in Data Modeling, Dimensional Modeling and Physical Design of Data warehouse projects.
  • Created and maintained Standards and Best Practices documents for Data warehouse Design and Development
  • Performance Tuning of SQL queries in Teradata & Informatica workflows to meet SLAs.
  • Designed Scalable Solutions in migrating large (Terabytes of Data) volume of Data to EDW Teradata Warehouse.
  • Investigating and resolving Issues and Change Requests from customer
  • Work with QA & Business teams for unit, integration & UAT testing

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle Apps modules, Teradata, Dollar Universe ($Universe) scheduling tool, Informatica 8.x, ERwin 7.6.0, OBIEE & Business Objects reporting tools, Oracle, PVCS, UCS UNIX server, FastLoad, TPump, MultiLoad, FastExport

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • Gathered details about current processes and requirements from Business.
  • Worked with various stakeholders like internal and external partners for setting up new Operating Units for different countries.
  • Participated in 3 different phases of Business Process Simulations
  • Performed the Gap Analysis of 11i and R12 OM processes
  • Worked with business to implement Oracle out of box functionalities.
  • Interacting with various Cross-functional team to gather business requirements
  • In one R12 instance, simulate additional Cisco scenarios identified for Phase 1.2 without integrations in an Oracle Hosted Environment
  • Re-execute BPS 1.1 scenarios with Cisco configuration & data
  • Simulate limited cross-functional Order to Cash scenarios within R12
  • Gathered requirements and implemented changes for Ordering, Invoice management, Vendor dispute processing for Indirect procurement and AP/ AR netting/ offset.
  • Analyzed, designed and developed an interface to load GL period rates.
  • Implemented Subledger Accounting (SLA)
  • Implemented Milestone based Revenue Recognition, Revenue Transfers, Revenue Reconciliation, Amortize Revenue, and Allocation Posting.
  • Designed and developed enhancements for Purchasing, Payables and Requisitions.
  • Experience writing detailed documentation including RD050, BR100, MD050
  • Developed conversion strategy and mapping (CV40) for various data objects like open AR invoices, customers, sales orders, etc.
  • Wrote Test Scripts and trained users on the process steps.

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, Oracle Apps Modules, R12

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • Worked with various stakeholders like internal and external partners for setting up new Operating Units for different countries.
  • Performed Gap Analysis and designed custom solutions for different requirements.
  • Review standard Oracle capabilities in the Order-to-Cash space for new buy sell entities
  • Developed the BP080 (To-be future process) documents for AP, AR, PO, FA and GL.
  • Designed and developed solutions for end-to-end processes for Evaluation Orders, Donations Orders, Service Fulfillment, RMA, etc.
  • Interacting with various Cross-functional team to gather business requirements
  • Designed the party and customer data model using TCA from business requirements.
  • Working in integrating Oracle ERP OM and AR with RevPro system for revenue recognition process.
  • Worked on Internal order transformation - Ordering & Booking entity are different
  • Working with Revenue Team in defining the revenue recognition process for software and subscription parts.
  • Developed RD050 documents for various interfaces and customizations.
  • Wrote the BP080 (To-be future process) for doing OM, AR and RMA processes which include entitlement checking, closed-loop RMA process, Install Base checking, etc.
  • Analyzed the Oracle 11i AS-IS and the Oracle R12 TO-BE processes and documented the business impacts for Revenue Recognition process.
  • Worked in mapping the TO-BE process for standard revenue recognition for both products and services.
  • Configured Revenue Recognition rules, AGIS, Intercompany Relations and Trade compliance business processes in Oracle.
  • Developed the process training manuals for various business groups.
  • Developed BR100 (setup) document for finance modules.
  • Gathered requirements and implemented changes for Ordering, Invoice management, Vendor dispute processing for Indirect procurement and AP/ AR netting/ offset.
  • Implemented E-business Tax for Germany, Singapore and Australia
  • BIE rule setups by country
  • Experience writing detailed documentation including RD050, BR100, MD050
  • Wrote Test Scripts and trained users on the process steps.
  • Provided support to QA teams on UAT/BAT testing
  • Developed the process training manuals for various business groups.
  • Provided KT sessions to Production support team on new enhancements

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, Oracle Apps Modules, R12

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • As part of PLS 1.1, worked on data migration and inbound interfaces for Parts, MFRs, MPNs, AMLs, Commodities, TG, BU and PF object creation, and worked on FA Case migration into Agile.
  • Worked with the Business team to get the requirements.
  • As part of Data Migration and the ERP Adapter, wrote FRDs and TDDs for Cisco Product Lifecycle Management (PLM).
  • Developed EDI interfaces to LECO and scheduled them through $U jobs.
  • Responsible for analyzing impact on various cross-functional teams within Cisco IT including Manufacturing, Customer Advocacy, Quote to Cash etc.
  • As part of PLM data migration migrated legacy system data into Agile System.
  • As part of PLS 1.3, wrote FRDs and TDDs for the ERP Adapter's bi-directional integration with Agile PLM for ECOs and BOMs.
  • Worked on an outbound interface to import Items, AVLs, AMLs, PSLs, item-specific updates and item attribute updates using the INCOIN API.
  • Wrote triggers on Agile objects and replicated data from SJ to PLS and from ESM to PLS using CDB replication for Items, BOMs and ECOs.
  • Worked on export and Import of Agile related config database dumps.

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, Oracle Inventory, Oracle Bill Of Materials, Order Engineering, Oracle Purchasing, Oracle Costing, Agile methodologies.

Confidential, San Jose

Oracle Apps 11i Techno functional Consultant/Analyst

Responsibilities:

  • Worked with the Business team to get the requirements and converted them into Functional Specification Documents based on BRD’s.
  • The Buried FRU (BFRU) project is a scalable model to fix and correct understated Install Base reporting caused by buried FRUs.
  • Custom logic scripts were written as part of this release to identify and capture previous product TAN versions in the Buried FRU table, correspondingly correcting the EDW and Quality Business Layer (QBL) Install Base numbers for all products over a five-year history period.
  • Developed the Oracle Alert for BFRU Notification
  • Wrote the scripts for BOM deactivation
  • Wrote the scripts for CLEI code upload from the CLEI database into ERP
  • Wrote the scripts for BOM transfer from one org to another and from the GLO org to local orgs
  • Wrote the LECO process logic change for the Cisco LEAN project and wrote the shell scripts for GECO and LECO notifications
  • Wrote the scripts for auto-populate and auto-slam for LECO Auto Slam orgs
  • Tuned the BOM Explosion and Eng Local Copy Program
  • Wrote the scripts for changing sourcing rules and wrote the scripts to end date sourcing rules for LEAN Org’s
  • Wrote the scripts for auto test team to capture ECO changes on MRCS side
  • Monitored and fixed the issues and created the $U jobs for ECM tools
  • Wrote the scripts to flip the item status
  • Worked on monthly AVP process
  • Wrote the scripts for the BOM Grading report to identify BOM items that did not follow the BOM attributes policy, provide a means to measure BOM attribute accuracy per BU and product family, and control the BOM structuring and attribute management process for all BOMs of Cisco design, pre- and post-LEAN
  • Worked on RoHS release 3.0 as part of PID repository and PAS tool integration
  • Tuned a couple of programs as part of the P2R Pacific release

Environment: UNIX, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, Oracle Bill Of Materials, Order Engineering, Oracle Purchasing, Oracle WIP, Oracle Costing, Service Contracts, Install Base.

Confidential

Oracle Apps 11i Techno functional Consultant/Onsite Coordinator

Responsibilities:

  • Reviewed the AS-IS analysis documents received from Offshore, made changes where applicable and submitted to on-site team
  • Participated in Architecture Review Process and Technical design reviews.
  • Involved in the gap analysis for Oracle Applications 11i Upgrade Project
  • Involved in setting up and testing the business process for different types of Orders and Returns.
  • Communicated the key decisions, coding standards to Offshore team on a regular basis
  • Created the technical documentation for Service item setup forms and IC Engine forms by analyzing the AS-IS analysis documents in detail
  • Worked on Cisco Auto Cancellation Batch Program and Cisco Contract Item Types Download
  • Reviewed the migrated code (From 10.7 to 11i) pertaining to OM track, extensively tested and delivered with superior quality
  • Developed the Cisco customized alerts
  • Developed the triggers on TCA tables for Customer data replication
  • Used the TCA APIs extensively in custom batch processing to replicate customer data between databases
  • Worked on Create and Maintain Price List functionality implemented with Oracle Advanced Pricing, defined formulas to derive custom prices, and worked on several QP repricing tasks
  • Implemented the Cisco 11i Purge & Archive framework in Everest release for Sales Orders, Interfaces and Receivables transactions
  • Created and Customized Workflows and embedding them into apps
  • Involved in resolving OM bugs by creating TARs
  • Proactively found performance issues and coordinated with Offshore for quick resolution
  • Involved in bug fixes, testing and preparation of documentation
  • Delivered the assigned WUDs of the OM track on time, with a few WUDs delivered well before the due date
  • Responsible for creating System Test plans and for knowledge transfer to offshore QA resources
  • Supported Enterprise test cycles to fix the issues related to OM

Environment: Unix, Windows NT, Oracle 9i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, AR, Oracle Order Management (11i), Oracle 11i Quotes, Service Contracts, Install Base.

Confidential, San Francisco

Oracle Apps 11i Techno functional Consultant

Responsibilities:

  • Collected, analyzed, converted the business requirements into functional specifications and then created the technical specifications
  • Data migration/conversion analysis/design/development for - Oracle Service Contracts, Oracle TCA, Oracle Installed Base
  • Worked on Install Base post-processing functions: store deployment customer information and contact details; transfer ownership; update installed-at address; split installed base quantity; add additional parties (reseller, distributor & agent) to the installed base record; create a system and associate it with the license and subscription; create relationship types between products; expire an installed base product; end-date the original license after the customer receives a FOC; update deployment contact information; create an installed base product; use additional attributes to store more information, including legacy data (converted installed base records carry the originating sales order number in an additional attribute named Legacy Order Number); support a mass update feature; handle returns; etc.
  • Worked extensively on Install Base APIs to transfer ownership, create a system, update installed-at address, split installed base quantity, etc.
  • All Installed Base trackable records from the Platinum database will be converted into Oracle using this conversion
  • Developed a package to generate the Install Base data for downstream systems reporting
  • Worked on customer conversions by using TCA API’s
  • Customized the standard Oracle API for doing the Contract level and Line Level re-price
  • Worked on the interface that encompasses all data elements to be extracted from Oracle ERP to the Onyx/CMS system at Confidential.
  • Worked on an outbound interface from Oracle Order Management that sends pick release information, including all order/line details, to ECOMM.
  • The standard contract authoring form restricts users to adding one product at a time; to solve this, developed a new form for adding multiple products to a contract.

Environment: Unix, Windows NT, Oracle 8i, SQL, PL/SQL, Oracle *Forms 6i, Reports 6i, Oracle Bill Of Materials, Order Inventory, Service Contracts, Install Base, Workflow 2.0/2.5.

Confidential

Oracle Apps 11i Techno functional Consultant

Responsibilities:

  • Assist with Business documentation
  • Business requirement specifications
  • User requirement specifications
  • Functional design documents
  • Business Solution Documentation
  • Technical documentation
  • Manuals
  • Techno functional experience in Order Management Module like Customers, Bill-To and Ship-To Sites and addresses, Items, ATO and PTO Configurations/Bills Of Material (BOM), Order Types, Order and Line Workflows, Transaction Types, Defaulting, Processing Constraints, Credit Checking, Order and Line holds, Price list, Discounts (Automatic and Manual), Attachments, Sales Credits, Sales Orders, Returns, Scheduling, Copying, Drop Shipment, Booking, Pick Release, Trips, Ship Confirm, Ship Sets, Deliveries, Service, Order Closing, Cancellation and Fulfillment, Order Import, Process Order, APIs, Reports.
  • Designed and developed the workflow sub process which handles the post booking validations, credit check and export processing
  • Developed a concurrent program to change the item status for the given items
  • Modified the packages to accommodate the 11i OM database changes
  • Extensively used Workflow Builder to develop the custom activities and notifications to build complex workflow processes as per the business scenarios.
  • Discussed with client and technical resources to ascertain custom reporting and workflow requirements
  • Worked on designing and developing AR transactions import process using Auto Invoice interface
  • Developed production completion report, MRP requirements by SBU, Custom supply and demand report, Custom capital request report using Reports 6i
  • Upgraded Revenue Adjustments Summary Report, MRO receipts report, Custom open PO by PO type, Receivables Interface Eligible Orders Report from Reports 2.5 to Reports 6i
  • Customized the workflow to populate the line level activity status into a line extension table for easy tracking and reporting
  • Customized the workflow to send notifications to customer service reps right after order booking
  • Monitored Oracle TARs and coordinated with the Oracle support team.

Environment: UNIX, Windows NT, Oracle 8i, SQL, PL/SQL, Oracle Forms 6i, Reports 6i, Oracle Order Management (11i), Work In Process, Inventory, Purchasing, Bill of Materials, AR, Workflow 2.0/2.5.
