Sr. SAP HANA / BODS Consultant Resume
San Francisco, CA
PROFESSIONAL SUMMARY:
- 9 years of information technology experience in SAP BW/HANA, SAP Business Objects, MicroStrategy, Tableau, SAP Data Services, Information Steward, Data Quality Management, and data warehousing (OLAP and OLTP) applications, including 3 full life-cycle implementations and production support across domains such as Utilities, Networking, and Life Sciences. Experienced in requirements gathering, analysis, design, configuration, development, testing, administration, and maintenance of ETL processes involving SAP systems, non-SAP systems, databases, SAP data migration, and data warehouses. Over 2 years of extensive experience in SAP HANA data modeling, data provisioning, and report development.
- Strong experience in SAP data migration from legacy systems to SAP systems using LSMW, IDocs, and custom functions.
- Expertise in implementing Rapid Marts, batch and real-time jobs, performance tuning, ABAP data flows, and complex scenarios using most of the available transforms.
- Strong expertise in SAP HANA/S4HANA Data Provisioning/Replication techniques like SLT, SAP BODS 4.1/4.2 and Direct Extractor Connection (DXC).
- Extensively worked with the SAP HANA Modeler (HANA Design Studio), designing schemas and creating packages, attribute views, analytic views, calculation views, analytic privileges, stored procedures, and CE functions.
- Exceptional record of delivering effective design, development, and implementation of business applications as a Techno-Functional consultant.
- Extensive experience with SAP BW/BI components such as HANA in-memory optimized and standard InfoCubes, DSOs, InfoObjects, MultiProviders, DataSources, InfoSets, Virtual Providers, SPOs, Hierarchies, Start/End/Transfer Routines, Open Hubs, Transformations, Transient Providers, Composite Providers, IDocs, and User Exits.
- Strong experience in Business Objects Data Services 4.2 and architecture as Administrator and Developer.
- Expertise in Database Design, Data Modeling, Data Profiling, Data Quality, Address Cleansing, Data Cleansing, Data Migration, Data Conversion, Data Integration, Data Transformation, Data Remediation, Data Validation, Data Reconciliation, Data Marts, Data Mining, Data Audits, and Master Data Governance.
- 3+ years of experience in Information Steward for Data Profiling and rules.
- Expertise in SQL, PL/SQL Procedures, SQL Query Execution Plans, Database Design & Maintenance, Scripting, and Programming.
- Expert in Data Quality to Parse, Standardize, Cleanse, Enhance, Match, Consolidate, and Validate Data.
- Experience in Star, Snowflake schema defining Logical and Physical Data Modeling.
- Used SAP CRM IBase objects for hierarchical object data migration to SAP.
- Expertise in requirements gathering, analysis, design, development, testing, and implementation of SAP MDM 5.5 and 7.1 solutions for Material, Product, Customer, Vendor, and Business Partner.
- Implementation of Projects using ASAP Methodology. Expertise in using Solution Manager, HP Quality Center.
- Experience in ALE, IDoc, RFC, BAPI, XML, WSDL, XSLT, WAS, LSMW, DB2, Oracle, SQL, and ABAP.
- Good Exposure to Customer Master, Vendor Master, Sales and Distribution (SD), Material Management (MM), Production Planning (PP), Finance & Controlling (FICO), Human Resources (HR), Supplier Relationship Management (SRM), Customer Relationship Management (CRM) and Supply Chain Management (SCM).
- Able to migrate, synchronize, and replicate data among different operational systems and data sources.
- Experienced with Informatica Power Center.
- Strong leader with experience mentoring developers and advising technical groups on ETL best practices.
- Team player with excellent communication and problem-solving skills.
- Proficient in working with Oracle 11g and MS SQL Server 2012 running on Windows and UNIX.
- Expertise in conducting corporate training sessions for end users.
- Used Scrum and Kanban Agile methodologies.
TECHNICAL SKILLS:
ETL Tools: SAP BODS 4.2, Information Steward (IS), Data Quality Management (DQM) and Informatica
Reporting and Analytics Tools: SAP BO 4.2, Crystal Reports, MicroStrategy, Tableau, and QlikView
Data Modeling Tools: SAP HANA/BW 7.4 and Oracle BI
Databases: Oracle, SQL Server, and Teradata
Methodologies: Waterfall and Agile (tracked with JIRA)
Defect & Incident Process: Remedy for incident management, ChaRM for change management, and HPQC for defect management
PROFESSIONAL EXPERIENCE:
Confidential, San Francisco, CA
Sr. SAP HANA / BODS Consultant
Responsibilities:
- Created mapping documents for loading data into Reltio MDM Cloud application.
- Load IMS Health system data into Reltio MDM using BODS 4.2.
- Created Repositories, Users & assigned the security in the Multi-Team Environment.
- Worked in the BODS 4.2 environment, although the entire project implementation had originally been on 4.0.
- Worked on IDocs and batch/real-time jobs to extract data from the FTP site using .NET code and load it into the staging database for multiple projects.
- Worked with complex transformations and logic to stage data.
- Extracted data from the IMS secure FTP site as flat files and loaded it into Reltio using BODS scripting; prepared a standard incremental-load template that enables job failure/success notifications.
- Created global variables to read files from the shared location in the BODS Designer; also worked on the FTP process to put and get files on the data-exchange servers.
- Authored standards and guidelines (best-practices documentation) as well as functional and technical design specification documents.
- Migrated all of the team's projects from DEV to TEST to PS to PROD and scheduled the jobs as part of the administrator role.
- Worked with Data Quality and address cleanse transformations for cleansing the data.
- Provided technical guidance and knowledge transfer on effectively using BODS 4.2.
Confidential, San Francisco, California
SAP Business Objects / BODS/Tableau Consultant
Responsibilities:
- Interacted with the end users to understand the business requirements and derived technical specifications from functional specifications
- Derived high level Technical Design Documents based upon the Functional Requirements Document
- Provided analysis and recommendations for the ETL implementation to load data from SuccessFactors to SAP BW
- Developed 30+ inbound interfaces in SAP BODS to load the data from SuccessFactors to SAP BW by using SFAPI adapter
- Developed 9 Webi reports based on BW BEx queries, including detail and summary reports such as staffing status, scheduled time to start and fill, candidate relocation and requisition audit, and requisition dump
- Created “Candidate All Activity Status” and “Relocation Eligible by New hire and Transfer” dashboards by using Tableau
- Developed Tableau workbooks to perform year over year, quarter over quarter, YTD, QTD and MTD type of analysis for candidacy progression status and requisition audit
Environment: SAP BO 4.2, SAP BW 7.5, BEx Analyzer, Query Designer, MicroStrategy, Tableau, Oracle 11g
Confidential, San Francisco, California
SAP Business Objects / BODS/Tableau Consultant
Responsibilities:
- Interacted with the end users to understand the business requirements and derived technical specifications from functional specifications
- Derived high level Technical Design Documents based upon the Functional Requirements Document
- Provided analysis and recommendations for the ETL implementation to load data into the data warehouse
- Designed the strategy for the Business Objects universe (FAS) and Business Objects reports; worked on CDW data modeling
- Designed and maintained Business Objects semantic layers (universes) using BO IDT
- Implemented aggregate awareness on summary tables in universes using BO IDT
- Developed 52 Webi reports and 2 dashboards based on the universe (FAS Data Mart)
- Created detail and summary Webi reports showing utilization, meter reads, daily payments, monthly leak surveys, open leaks, etc.
- Created Year-to-Date, Rolling-12-Months, and reusable report filters
- Created universe-level custom filters, cascading prompts, hierarchies, and custom LOVs
- Created 130 report jobs for different business units using the InfoBurst job scheduler
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau
- Developed Tableau workbooks to perform year over year, quarter over quarter, YTD, QTD and MTD type of analysis
Environment: SAP BO 4.2, Information Design Tool, BO Explorer, Analysis for Microsoft Excel, SAP BW 7.5, Query Designer, MicroStrategy, Power BI, Tableau, Oracle, and SQL Server
Confidential, Indianapolis, Indiana
SAP HANA/BODS Consultant
Responsibilities:
- Involved in analyzing and designing the ECC 6.0 to BI 7.3 data flow; responsible for deploying solutions for FI-AP, GL, and HR data extraction, IT Capital Projects budget analysis reports, version comparison, and the IT Capital dashboard.
- Extensively used Information Steward to help cleanse data and create preload reports.
- Worked with Tableau to extract data and create dashboards/reports for management.
- Gathered and analyzed the requirements to replicate ECC content in HANA using data provisioning.
- Leveraged SAP SLT and Business Objects Data Services (BODS) to acquire data into the SAP HANA data mart.
- Integrated BOBJ 4.0 and BODS 4.2 with SAP HANA SP11.
- Configured BODS ETL jobs in Data Services to call process chains in BW to send data to SAP HANA
- Resolved issues related with delta loads to SAP HANA via data services.
- Loaded data into table definitions, suspending and resuming data loads; created information objects (packages, attribute views, analytic views, and calculation views, both graphical and SQLScript-based)
- Involved in Monitoring and troubleshooting data loads.
- Created Technical specifications and standard modeling documents for SAP HANA dataflows.
- Conducted user sessions on SAP HANA information models.
- Built complex BODS jobs to perform full/delta loads using source/target CDC techniques from SAP ECC into the corresponding SAP HANA tables.
- Created complex transformations in BODS data flows per modeling requirements.
- Loaded SAP ECC tables using SAP SLT data replication.
- Performed SLT transformations and enhanced tables with missing fields.
- Performed data loads via SAP BW extractors (DXC) to HANA from legacy BW.
- Extracted data from FI sources such as Accounts Payable and General Ledger (GL), and enhanced standard DataSources such as 0FI_GL_4 and 0FI_AP_4 to meet the client's requirements.
- Worked with Production Support team to solve issues like failure due to invalid characters, data inconsistency and job failures.
- Performed data modeling and developed analytic views for the transactional data, attribute views for the master data, and calculation views as the final models in HANA Studio.
- Migrated HANA content from the development and testing environments to the production environment using delivery units.
Confidential, Sunnyvale, CA
Senior SAP Data Migration Consultant
Responsibilities:
- Gathered requirements from the business spec and functional spec
- Analyzed the functional specs provided by the process teams and filled any technical gaps before assigning them to a developer
- Prepared functional specification documents (FSDs) for the respective objects
- Prepared the corresponding technical design documents
- Obtained sign-off from the respective teams, then handled the full development of the corresponding objects
- Defined and developed ETL jobs to facilitate data warehousing
- Developed stored procedures and HANA models (information views), data provisioning, and transformations based on hierarchies and aggregation factors for BO consumption
- Built multiple reports and dashboards on calculation/analytic views using BO Webi and Tableau
- Built multiple analysis workbooks using SAP Analysis for Office for finance users
- Interacted with clients directly, provided input during technical design and object development, and ensured the technical team received the right inputs from the functional team
- Performed performance tuning on ETL jobs while creating interfaces to various systems (retrieving and pushing data), including inbound interfaces (SFDC, Integrated Planning, and Agile) and outbound interfaces (Demantra, Hyperion, OBIEE, Serus, and a marketing system)
- Created a real-time job to standardize customer details: data flows from 5 boundary systems via CRM interfaces to DS, where DS standardizes the name and address using SAP address directories and checks for duplicate records against existing CRM data. If a duplicate is found, the job returns the BP IDs with a matching percentage; otherwise, it returns the standardized name and address details, using the parsing, correction, standardization, and duplicate-matching techniques in DQM (Data Quality Management).
- Handled the threshold for determining duplicate records entered on the source CRM GUI screen through a real-time job, using the Match Editor transformation to calculate the threshold percentage (the sum of the weightages of the matching fields); using the match code patterns, the percentage similarity of the entered record to the existing records is calculated
- Reviewed my delivered code and technical specs, performed self-QC, put forward best practices, and ensured adherence to those standards before moving objects to Quality for further testing
- Prepared functional unit testing documents
- Created ATL files and provided solid support in maintaining a robust BODS landscape
- Installed Data Services 4.x and worked with different transport methods (direct import of ATL, shared directory, FTP, and custom transfer) to import data from SAP R/3 and BI systems
- Prepared ETL flows and performed data cleansing, matching, and auditing by creating real-time jobs in Data Services
- Cleansed and profiled data in Data Services and troubleshot issues with data-type mismatches and formatting
- Worked extensively with the Repository Manager, Job Server, Workbench, Information Steward, Administrator Console, and Metadata Integrator to build ETL, business intelligence, and data quality projects against heterogeneous data sources (including XML, R/3, and BI) with full-load and incremental-load capabilities in batch or real-time data flows
- Used the BODS Management Console for administrative tasks like scheduling and monitoring jobs and for creating users and datastores
- Made extensive use of ABAP data flow processing in Data Services to improve job performance
- Prepared test scripts, initiated the sign-off process, and moved them into the HPQC tool
- Handled defects raised by the functional team during integration and user acceptance testing and tracked their progress to ensure they were fixed on time
- Report generation: built queries in BEx Analyzer and created BO Webi reports based on BEx queries
Environment: SAP HANA Studio, SAP SLT, Webi Rich Client, SAP Analysis for Office, SAP BODS 4.2, IS (Information Steward), DQM (Data Quality Management), Tableau 9.3, MS Excel, SQL Server, SharePoint, Microsoft Office, Windows 7
Confidential
SAP BOBJ/BODS Consultant
Responsibilities:
- Actively involved in analyzing the business requirements, functional specifications.
- Involved in meetings with the clients to implement business rules and scope the objects.
- Applied proven experience in understanding business requirements and delivered solutions to the client.
- Created/Updated Tags using Web Services for borrower’s account from US Department of .
- Analyzed, profiled, cleansed, and integrated different customer data sources and consolidated customer information into a single master data mart.
- Documented all designs and ETL process flow changes/support.
- Created BODS Jobs, Work flows, data flows and scripts in Data Services to pull data from the legacy systems and load the data into SAP ECC from Debt Manager Database.
- Created and distributed mapping design and migration guide outlining SAP best practices.
- Designed and developed efficient data quality validation methodologies.
- Implemented efficient performance tuning procedures to eliminate bottlenecks at the source, target, and mapping levels.
- Gathered and assessed issues from internal business units creating custom solutions to resolve roadblocks.
- Worked with Information Steward to profile the legacy data.
- Designed the WSDL file to load client data into Debt Manager using web services and update data according to the requirements.
- Exported the web services WSDL file and tested jobs using SoapUI.
- Involved in migrating Vendor Master/Customer Master Data from legacy system to SAP ECC using IDocs.
- Migrated BODS jobs from DEV to PROD using a multi-environment central repository.
- Used SAP AIO methodology for mapping, validation, enrichment and loading data into ECC.
- Involved in technical design of all the objects for SAP BODS jobs and migrations.
- Worked with Business Partners and Customer Master for migration.
- Worked extensively with data profiling tools (SAP Information Steward 4.0) on an ongoing project to understand data quality before loading into SAP.
- Worked extensively with complex Jobs, work flows, data flows, try catch blocks, lookups, Match, Table Comparison, Pivot, Reverse Pivot, Data Transfer transformations.
- Involved in data profiling like source column data profiling, detail data profiling, used validation and address cleansing transformations by using data quality in SAP BODS 4.2.
- Involved in writing the custom functions as per business requirements for loading the data into SAP.
- Worked extensively with global variables and parameters on BODS 4.2.
- Identified bugs in existing mappings by analyzing the data flow and evaluating transformations, fixed the bugs, and redesigned the existing mappings to improve performance
- Worked on setting up Tidal in test and then production environment to schedule daily/weekly/monthly jobs.
Environment: SAP BODS, SAP BO XI2 (Webi), Oracle 10g, MS Excel, SQL Server, SharePoint, Microsoft Office, Windows 7
Confidential
SAP BODS Consultant
Responsibilities:
- Responsible for gathering information from clients regarding business requirements. Collaborated with business technicians to research existing business and system processes.
- Interacted with business analysts and End client to understand technical and functional requirements for migrating data from legacy Farm application to SAP CRM.
- Used the Information Steward tool to remediate data before loading.
- Worked with SAP CRM and SAP HANA as target system.
- Cleansed data using SAP DQM process.
- Worked with Farm Records/GIS Records from legacy systems.
- Used IBase to implement hierarchical objects to load into SAP CRM.
- Consolidated data from two different systems and, using complex transformations, sent it to SAP CRM via IDocs.
- Installed BODS 4.0 on Windows Server 2008 at the client site and configured the whole Data Services application.
- Installed AIO templates for SAP ECC and CRM.
- Used customized IDocs to load data into SAP CRM.
- Loaded farm records data into CRM Grantor Management.
- Used Business Objects Data Services 4.0 for ETL extraction, transformation, and loading of data from heterogeneous source systems such as Excel and flat files.
- Developed various workflows and data flows in which data is extracted from sources like Oracle 11g and loaded into SQL Server staging tables; from the staging tables, data was loaded into SAP CRM with different business rules.
- Used Business Objects Data Services data cleansing transformations for de-duplication, house-holding and address parsing.
- Created complex Data Services mappings to load the data warehouse, the mappings involved extensive use of transformations like Key Generation, SQL Query Transform, Table Comparison, Case, Merge, Map Operation, lookup function etc.
- Worked extensively with local repositories, central repositories, the Job Server, and Web Admin client tools.
- Used Data Services inbuilt functions like Aggregate Functions, Database Functions, Date Functions, Lookup Functions, Math Functions, System Functions and Validation Functions.
- Responsible for Debugging and testing of Data services Jobs.
- Created parameters, global/local variables, scripts, projects, jobs, workflows, and data flows, and administered the Web Management Console.
- Configured the mappings to handle updates while preserving existing records using the History Preserving, Table Comparison, and Key Generation transforms (slowly changing dimensions).
- Involved in data profiling like source column data profiling, detail data profiling and used validation and address cleansing transformations by using data quality in Data Services 3.2.
- Responsible for scheduling jobs according to the client’s requirement.
- Set up rules and profiling for Data Remediation with Information Steward.
- Imported rules from Information Steward to Data Services.
- Ran queries in SAP CRM to fetch data for clients.
- Created function modules using SAP ABAP.
- Ran Data Services jobs and created reports for data remediation for the USDA MIDAS project.
Environment: SAP BODS, SAP BO XI2 (Webi), Oracle 10g, MS Excel, SQL Server, SharePoint, Microsoft Office, Windows 7
Confidential
SAP BODS Consultant
Responsibilities:
- Requirement gathering and Business Analysis.
- Design processes to minimize data quality risks and proactively identify potential data issues.
- Created mappings for employees and their salaries; created a UNIX script to run the jobs on a scheduled nightly basis and send an error/success log to the responsible parties.
- Used the Validation transformation to validate data arriving in two forms, compared them, and sent a report.
- Used the audit function to capture row counts before and after the data load and sent statistics by mail.
- Loaded data into Data Warehouse using complex logic during the mappings.
- Created the Conversion Requirement Specification (CRS).
- Created data profiles using the profiler repository.
- Used GPO Data Dictionary for Address Cleansing for US.
- Implemented data cleansing methods to de-duplicate/cleanse data.
- Wrote UNIX shell scripts for scheduling and moving data from one server to another and embedded them in Autosys to run them at night.
- Used different transformations to extract data from legacy systems according to the business logic in DSE.
- Performed de-duplication/matching using DSE.
- Conduct source system data profiling, uncovering source system issues, defining an appropriate remediation methodology, and initiating remediation activities.
- Responsible for developing various components such as maps, process flows, exception handling, and scheduling, with functionality for pulling the appropriate data from legacy systems and applying transformation logic.
- Conduct data analysis activities including data assessments, detailed data profiling, data cleansing, data remediation, data reconciliation, conceptual modeling.
- Perform data profiling and communicate with the Information Custodians to capture and document data anomalies and potentially define needed correction/cleansing logic.
- Define and implement auditing process for complex data.
- Develop functional specifications for specific data objects encompassing extraction and transformation.
- Design and assist the business in data reconciliation.
- Performed address cleansing and data cleansing using DSE Data Quality.
- Used complex transformations like Pivot/Reverse Pivot to align data into the correct form for the data warehouse.
- Checked objects in and out of the central repository.
- Used the Management Console to run batch jobs.
- Created jobs, workflows, dataflows, and scripts with local/global variables using DSE.
- Used Conditional, Try/Catch, and While Loop constructs in DSE jobs.
- Migrated publication data from legacy systems to SAP ECC 6.0, involving Media Master, Vendor Master, Customer Master, pricing conditions, etc.
Environment: SAP BODS, SAP BO XI2 (Webi), Oracle 10g, MS Excel, SQL Server, SharePoint, Microsoft Office, Windows 7