Solution Architect Resume
SUMMARY:
- Proven 15+ years of technology consulting experience in Data Quality, Data Integration, Master Data Management, Metadata Management, and Data Governance solutions, enabling clients to rely on their data and make effective decisions. Extensive experience leading multi-disciplinary teams, managing complex end-to-end global implementations, and delivering tangible business outcomes.
- Expert in the Oracle Fusion suite of solutions, especially PLM (IM and PD) and PIM.
- Experienced in Oracle PIM/PLM modules such as INV, EGO, and FND.
- Extensively worked on architecting and implementing MDM concepts such as trust, validation rules, match & merge, message trigger setup, packages, batch group setup, and user exits.
- Extensively worked on Informatica Data Quality (IDQ) for data analysis, cleansing, matching, conversion, exception workflow management, and reporting and monitoring.
- Expertise in Informatica B2B Data Transformation (DT).
- Worked on conversion of unstructured data in Excel, PDF, HTML, SWIFT, and XML formats.
- Expertise in Informatica PowerExchange (PWX) for Oracle EBS, SAP ECC/BW, and web services.
- Expertise in real-time Informatica PowerExchange for JMS.
- Well versed in the REST APIs of Oracle HCM and SCM Cloud modules; instrumental in integrating web service authentication involving Salesforce and Fusion Middleware (see the REST sketch after this list).
- Designed and developed POCs for EDQ connectors, mainly CDH-EDQ and JDE-EDQ.
- Working experience with PIM features such as bulk item imports, migrations, data imports, UDAs, role-based security, and business events.
- Worked with Oracle for 5 years, developing skills in Oracle Applications Framework (OA Framework), ADF, APEX, SQL, PL/SQL, Unix, and Java/J2EE technologies.
- Worked as an Informatica (ETL) architect in system analysis, design, development, and testing of data warehousing solutions, specializing in real-time data integration (PowerExchange for JMS).
- Served as Informatica administrator for upgrades of the PowerCenter, IDQ, and MDM platforms.
- Worked on integration of various data sources, including Oracle 9i/10g/11g, Teradata, Greenplum, SAP ECC, Oracle EBS, JMS, XML, and flat files.
- Worked on performance tuning of Informatica ETL, IDQ, and MDM.
- Proof-of-concept exposure to data governance tools such as the Collibra Data Governance Platform, Informatica Axon, and Enterprise Information Catalog.
- Trained on the Hadoop ecosystem, covering Hadoop, Hive, HBase, Sqoop, Kafka, and Spark.
- Exposure to tools such as UiPath, Automation Anywhere, Informatica Big Data Management, Informatica Big Data Relationship Management, Informatica Cloud Customer 360, and Tableau 9.3.
- Worked across verticals: financial services, asset management, insurance, F&B, manufacturing, life sciences, retail, and distribution.
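A minimal sketch of the kind of Fusion REST integration referenced above: a Java client calling an Oracle SCM Cloud resource with basic authentication. The host, credentials, and the itemsV2 resource path are illustrative assumptions; check the pod's REST API catalog under /fscmRestApi/resources for the exact resource names and versions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class FusionScmClient {
    public static void main(String[] args) throws Exception {
        // Placeholder pod URL and credentials -- replace with your environment's.
        String host = "https://your-pod.fa.example.oraclecloud.com";
        String auth = Base64.getEncoder()
                .encodeToString("integration.user:password".getBytes());

        // Query the Product Hub items resource (resource path assumed for illustration).
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(host + "/fscmRestApi/resources/latest/itemsV2?limit=25"))
                .header("Authorization", "Basic " + auth)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```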
SPECIALTIES:
- Enterprise Data Management
- Cloud Solutions Architecture
- Technology Design, Strategy and Implementation
- Quick Technology Adoption
ORACLE APPLICATIONS PROJECT EXPERIENCE:
Solution Architect
Confidential
Responsibilities:
- Facilitate requirement-gathering sessions to understand requirements and map them into the BRD.
- Understand customer pain points in maintaining data quality and derive data quality rules.
- Play a key role in analyzing business requirements, documenting them, and validating functional testing.
- Work closely with business users, conducting process playbacks and system integration testing.
- Discuss and establish the best possible integration solutions for the client to integrate the MDM system with downstream/upstream systems.
- Create unit test cases and execute them before release.
- Create the conversion strategy to move data from legacy systems to Product Hub Cloud and CDM Cloud.
- Coordinate with the MDM technical team and allocate tasks.
- Conduct Conference Room Pilots (CRP) and User Acceptance Testing (UAT).
- Serve as the single point of contact/responsibility for building and executing the technical implementation plan.
- Demonstrate PHC, CDM, and iLink solutions to prospective clients.
Solution Architect
Confidential, Carlsbad, CA
Responsibilities:
- Document and detail end-to-end business processes.
- Design the solution for new item introduction and change management for product data updates.
- Work with business teams and with Oracle to analyze and document key design decisions.
- Prepare design and technical design documents for adapters from PDH to endpoint systems such as Hybris, TRAX, and EDW.
- Design the PDH-to-JDE connector solution.
Confidential, Temple, TX
Solution Architect
Responsibilities:
- Designed and configured processes for consolidating products and customers.
- Designed end-to-end integration from PDH to Demantra, plus inbound integration from AS400 and JDE to PDH and CDH.
- Worked with business users on building the product taxonomy.
Confidential, St Louis, MO
Solution Architect
Responsibilities:
- Designed and implemented Oracle PD, defining the change management process for new parts across product life cycles.
- Introduced product BOMs and designed the new BOM part introduction solution.
- Defined and implemented product rules: match rules to find duplicates during item import, and assignment/validation rules for product governance and standardization.
Confidential, Tustin, CA
Solution Architect
Responsibilities:
- Designed and configured processes for item creation, configured the product workbench, and set up change management for product data updates.
- Cleansed customer, supplier, and product data using data quality tools such as EDQ.
- Designed and integrated an address cleansing service (SmartyStreets) with EDQ (see the sketch after this list).
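A minimal sketch of the kind of address verification call used in this integration, based on the public SmartyStreets US Street Address API; the credentials are placeholders, and how EDQ invokes the service (e.g., via a scripted processor or gateway) is not shown here.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class AddressCleanser {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials -- issued per account by SmartyStreets.
        String authId = "YOUR_AUTH_ID";
        String authToken = "YOUR_AUTH_TOKEN";

        String url = "https://us-street.api.smartystreets.com/street-address"
                + "?auth-id=" + authId
                + "&auth-token=" + authToken
                + "&street=" + URLEncoder.encode("1 Infinite Loop", StandardCharsets.UTF_8)
                + "&city=" + URLEncoder.encode("Cupertino", StandardCharsets.UTF_8)
                + "&state=CA";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Accept", "application/json")
                .GET()
                .build();

        // The response is a JSON array of candidate standardized addresses;
        // an empty array means the address could not be verified.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```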
Confidential, St Louis, MO
Solution Architect
Responsibilities:
- Developed the as-is and end-state plans and built prototypes for EDQ and EDQP data cleansing.
- Designed the process flow using EDQ, EDQP, and ODI.
- Used ODI to pull third-party data from MIF and O&M and stage it for EDQ to access.
- Used EDQ to compare existing golden records from PS against MIF, O&M, and other data to find matches and pull related records into staging.
- Used EDQP Governance Studio to visualize the data and edit descriptions against comparable descriptions, thereby standardizing and regularizing them.
- Used the EDQ Match Console to identify true matches for new item introduction.
Confidential
Solution Architect
Responsibilities:
- Designed the PDH and SDH solutions to meet client requirements.
- Demonstrated PDH and SDH capabilities with prototypes of the required functionality.
- Leveraged available technologies such as EDQ and ODI to provide streamlined, simple conversion and data load options.
- Designed the integration strategy and proposed the best design for client needs; suggested batch integration with daily loads for products and fortnightly changes to hierarchies.
- Designed the review, approval, and change management solution for new item introduction.
- Prepared design and technical design documents for adapters from PDH to endpoint systems such as Magento and Endeca.
- Planned real-time integration of PDH with JDE (the transactional system).
MDM Consultant
Confidential
Responsibilities:
- Analyzed functional and non-functional requirements, seeking clarifications to better understand them.
- Defined data models in MDM (customers, products, and suppliers) for migrating legacy applications to the Oracle MDM system.
- Worked closely with the PIM business owner at the client location to define the change management process, workflows, templates, and the new product initiation process.
- Worked closely with the Order Management and Pricing teams on implementing requirements.
- Designed and developed Oracle RICE objects based on requirements.
- Converted metadata such as catalogs, attributes, attribute groups, pages, search formats, display formats, and ICC associations.
- Converted data (customers, products, and suppliers).
- Integrated MDM systems with various web commerce portals and transactional systems.
- Prepared LLD/detailed design documents based on the HLD and briefings from the module lead.
MDM Consultant
Confidential, Austin, TX
Responsibilities:
- Consolidated Agile and I-Drive (future and legacy systems) and pulled product data into PDH.
- Integrated CNET to consume and refresh product attributes in PDH.
- Created a product data service layer for downstream systems.
Confidential, Alpharetta, GA
Solution Architect
Responsibilities:
- Set up the end-to-end change order process to act as new item introduction.
- Set up a custom program to extract item information from R12 EBS; the program extracts all 'FPSTOCK' items and associated data objects in batches of 5,000 item records (see the batching sketch at the end of this list).
- Created item batches for importing item data files: using a 'Data Steward' user role, selected the 'Import Map' file for loading items and associated data, and uploaded the XML file containing the data to be imported.
- Set up new product data synchronization to Fusion PDH that inspects item and associated data tables/views to identify new and updated items (based on a custom date profile); the program generates a tailored XML message and calls KPIT iLink to transmit the file to Fusion Cloud.
- Set up the 'Item Import Job' process in Fusion PDH, which performs an upsert (create/update) depending on whether the item record already exists in the system.
- Cleansed, profiled, and deduplicated customers and created golden records using Oracle EDQ.
- Converted golden records into CDH and stored match records as cross-references.
- Used EDQ as a business rules engine to validate conversion data, and exposed the same process to external systems as a web service.
- Performed functional setups for configuring attribute groups, attributes, and pages for both PDH and CDH.
- Configured workflow processes.
- Converted items and customers into PDH and CDH.
- Installed EDQ on Apache Tomcat, including server specification and compatibility matrix build-out.
- Installed EDQP on WebLogic 11g and created DataLenses and DSAs.
- Installed address verification with EDQ for validating customer and vendor addresses.
- Created jobs to cleanse existing JDE Address Book data using EDQ business rules functionality.
- Created conversion jobs for data inbound to JDE from various source systems such as Movex and Encap; the jobs validate the data and transform it for loading into JDE.
- Built a connector from JDE to EDQ to run EDQ jobs from JDE and for bulk conversion.
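A minimal sketch of the batching logic described above for the R12 extract: pull 'FPSTOCK' items and write one XML payload per 5,000-record batch for hand-off to iLink. The connection details, view name, columns, and XML element names are illustrative assumptions (not the actual import-map schema), and the Oracle JDBC driver is assumed to be on the classpath.

```java
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FpstockExtract {
    private static final int BATCH_SIZE = 5000;

    public static void main(String[] args) throws Exception {
        // Placeholder connection details and view -- the real extract would read
        // the R12 item master and associated data, filtered to 'FPSTOCK' items.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//ebs-host:1521/EBSDB", "apps", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT item_number, description FROM xx_fpstock_items_v")) {

            int row = 0, batch = 0;
            StringBuilder xml = null;
            while (rs.next()) {
                if (row % BATCH_SIZE == 0) {               // start a new 5,000-record batch
                    if (xml != null) writeBatch(xml, ++batch);
                    xml = new StringBuilder("<ItemBatch>\n");
                }
                // NOTE: production code would XML-escape these values.
                xml.append("  <Item>")
                   .append("<Number>").append(rs.getString(1)).append("</Number>")
                   .append("<Description>").append(rs.getString(2)).append("</Description>")
                   .append("</Item>\n");
                row++;
            }
            if (xml != null) writeBatch(xml, ++batch);     // flush the final partial batch
        }
    }

    private static void writeBatch(StringBuilder xml, int batchNo) throws Exception {
        xml.append("</ItemBatch>\n");
        try (FileWriter out = new FileWriter("fpstock_batch_" + batchNo + ".xml")) {
            out.write(xml.toString());                     // file handed off for transmission
        }
    }
}
```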