
BODS Architect and Lead Developer Resume


Michigan

SUMMARY:

  • A seasoned Agile data warehouse, business intelligence & ETL dimensional/semantic modeling architect and SAP data migration, governance & conversion expert. 15+ years of extensive experience in SAP BusinessObjects Data Services (BODS/BODI), Information Steward, data profiling & data quality. Expert in all areas of the ETL space: source/target data analysis, data modeling, data architecture, business analysis, requirements analysis, & ETL development.
  • Deep BODS experience, including the SAP EIM suite - DS, DQ, DM & DQM, RDS, BODS interfaces with SAP MDG & MDM, and Information Steward; used all versions of BODS: 4.2.5, 4.1, 4.0, 3.2, 11.7, 6.5 and earlier. Managed and governed BODS technical architecture and design; managed & coordinated the BODS code promotion process, BODS user management, repository management, server management, and performance tuning.
  • More than 15 Successful projects on SAP Data Migration, Data Modeling, Architecting landscape, Sizing, Administration, Deployment, Upgrade & Optimization, Development, Testing, ETL/BODS Framework, BODS Audit jobs, BODS Code migration/Promotion, Change management, Production Rollout and Production support after go-live, using SAP BODS & its tools (Rapid Data Migration to SAP ERP & SAP CRM, DQM for SAP, Information Steward, Rapid Marts, All-in-One pre-built Jobs for SAP).
  • Extensive experience working on data integration, master data conversion, data migration & data cleansing in SAP ECC/CRM/BW/HANA & non-SAP environments using Data Services. Used SAP Rapid Deployment Solutions (RDS) packages (v1.603 & v3.42) for Rapid Data Migration to ECC. Multiple projects implementing SAP Best Practices Data Migration (BPDM) packages. IDoc processing, reprocessing & error handling experience, including Rapid Data Migration to SAP ERP and SAP CRM v3.42, which is compatible with BODS v4.2.5.
  • Expert Level hands-on Knowledge of Integration of SAP BODS DQM with SAP ECC, BW & CRM systems. Implemented Business Address cleansing & Data Cleansing in SAP environment (ECC & CRM).
  • Data extractions from SAP applications (using extractors, tables, IDocs & ABAP data flows) & loads to SAP ECC/BW (using IDocs, master & transaction SAP structures, LSMW). Expertise in executing and resolving issues related to the LSMW Workbench and its recordings.
  • Expert level exposure and hands-on experience in SAP EIM suite DS, DQ, DM, and Information Steward.
  • Experience Extracting Data from PeopleSoft. Used BODS as web-service in SAP ECC & CRM environment and used web-service as source in BODS jobs/data flows.
  • End-to-end experience: Installation, Configuration, Job Server Clustering, Development, Upgrade, Testing, Deployment, Support & Administration.
  • SQL Expert: HANA SQL, SQL Server & T-SQL, Oracle & PL/SQL, Teradata & its utilities, IBM DB2
  • Data Extraction & load from/to all major databases (SQL Server & T-SQL, Oracle & PL/SQL, Teradata & its utilities, IBM DB2), flat files & spreadsheets.
  • Information Steward 4.2.5 (Data Insight, Cleansing Package Builder, Metapedia, Metadata Management): installed and configured Information Steward (IS) v4.2.5, defined profiling rules, identified scorecard requirements & configured scorecards. SAP Information Steward for the Data Quality platform: experience in Match Review configuration setup & match strategy changes. Created new rules for data quality scorecards and executed integration projects. Used the 'Data Insight' data quality module of Information Steward extensively for basic profiling, rule-based profiling, rule development and scorecard implementation.
  • Experience with Information Steward Installation and configuration. Configured for SAP Authorization, worked as Administrator/developer. Users, Groups (Administrators, Analyst, Rule Approver, Scorecard Manager, User, Data Steward, & CPB User) & Access privileges (roles etc.) management & Administration. Rights for connections, projects, tasks, folders & objects. Used extensively for data quality & rule validations.
  • Experience working with BW process chains. Experience loading data through web services; configured the RFC web server on the Management Console.
  • Experience monitoring BO & IPS through the CMC to keep servers running at optimal performance. Experience applying fix packs, service packs and major releases; configuring and supporting SSO (Single Sign-On) using Windows AD in a SAP/BOBJ environment. CMC skills: adding users, group security, access levels, folder security, application security, authentication methods, user aliases, and mapping users to AD groups.
  • In depth experience in BODS in implementation, support & maintenance operations.
  • Extensive hands-on experience in Data Services configuration, development, migration and troubleshooting. Strong knowledge of data integration, data profiling and data quality. Heavily used DI transforms such as Query, Validation and Case, as well as DQ transforms such as Match, Associate, Data Cleanse & Address Cleanse. Many years of experience tuning job performance for high-volume and complex transformation scenarios, and configuring DS components including job servers and repositories. Expertise in analyzing current business requirements and architecting solutions within the SAP BI (Business Intelligence) BODS/Data Services ETL & BO (Business Objects) environment. 20+ years of experience developing semantic data models in HANA, SQL Server, Oracle, DB2 & Teradata data marts, with the relevant SAP BODS ETL mappings.
  • Used BODS to import (converting into NRDM) and export metadata for XML (hierarchical) documents (files or messages), then used & processed XML files as source and target. Defined the format of an XML file or message (.xsd or document type definition, .dtd).
  • Extracted data from, and am familiar with, the data and table structures of the following modules: SAP Production Planning, customer master, material master, vendor master and COPA, SD, MM tables. Migrated legacy data to SAP ECC including the following: customer master, material master, vendor master, cost centers, GL accounts, and other financial entities.
  • SAP applications worked with: SAP ECC, Service Excellence IS Retail, AFS (Apparel & Footwear), CRM & Production Planning (PP).
  • SAP Business Objects Rapid Marts for SAP applications: All-in-One solution from SAP.
  • Implementation of SAP Business All-in-One Best Practices (AIO/BP) Data Services Pre-built jobs (Preconfigured Migration Objects) for Rapid Data Migration to SAP ERP & SAP CRM. Experience in Data Migration techniques like migration with AIO packages and data quality transformations, receiving and sending iDocs and integration with SAP ABAP to Data Services.
  • Experience in Integration SAP suite of BI products with Business Objects including Business Objects Data Services (Rapid marts), Business Objects 4.1, Crystal Reports and SAP Integration Kit.
  • Strong familiarity with Central Management Console & Lifecycle Management.
  • A few of the many clients I have worked for: Kellogg, Confidential, Komatsu, Heinz, Confidential, Confidential, Newport, Covance, Levi’s, Apple, Ticket Master/Live Nation, etc.
  • Recent Roles: BODS ETL Architect, SAP Data Migration Lead, Master Data Lead, Legacy Data Migration Lead to ECC/CRM, ETL Architect, Data warehouse Architect, Data Warehouse Manager, BODS Lead consultant, Data Migration Lead, ETL Lead, onshore/offshore Lead/Developer.
  • Expert in writing BODS custom functions.
  • Extensive Experience with Data governance & its best practices including convergence of data quality, data management, data policies, business process management, and risk management surrounding the handling of data in an organization.
  • Expert level experience in job lifecycle, deployment, Source Control and Methodology (Dev/QA/Prod), Enterprise Best Practices/Reusable Patterns/Frameworks (Logging, Error Handling, Notification, Reporting), Audit/Monitor/Growth Plan of Jobs.
  • Expert level experience in Cleansing, Enhancements, Standardization, address cleansing & Advanced ETL performance tuning for Data Services jobs.
  • Provided technical leadership in following areas: Data Integration, design, development, optimization, testing, and deployment & Administration.
  • Highly detail oriented, strong commitment to timely execution of tasks.
  • Strong Data Analysis, Modeling skills.
  • Strong documentation, communication and organizational skills.
  • Very comfortable on client site interacting with directors, managers, DBAs, developers, end users, testers, functional, non-technical staff, off shore resources and all other team members.
  • Professionalism and a good sense of client service. I do things right the first time, 95 percent of the time.
  • Experienced in working with cross-functional teams. Excellent team player with good organizational, communicational, analytical and logical skills. Collaborated and achieved consensus among diverse groups. Provided task delegation and workload management to the other ETL developers.
  • Ability to handle multiple tasks and work independently as well as in a team, experienced in interacting with Business/Technology groups.
  • Created & maintained high standards of data integration including high level & low level design documents. Created BODS development standards.
  • ETL code, test plan & test result, deployment/migration document and knowledge transfer to production support.
  • Source System Analysis and Semantic Data Modeling using ERWin & other tools.
  • Ability to design scalable models for Performance goals.
  • Knowledge transfer and client staff training as last phase of project.
  • Several Projects on Data Warehousing (using Agile Methodologies), Data Integration, Data Migration, Data Conversion, Data Quality, & MDM (Master Data Management) implemented successfully from end-to-end (design, installation, configuration, development/upgrade, Testing, deployment & support).
  • Metadata in repository tables & views; recovery mechanisms. RDBMS database tables, ERP SAP R/3/ECC/BW (using IDocs, extractors & tables), flat files & spreadsheets used as ETL sources. Optimization techniques used: push-down, bulk loading, lookup caching, parallel execution, table partitioning. Installed, configured, managed & maintained a multi-user ETL job development environment. Central & local repository management & BODI job migrations; 24x7 ETL production support.
  • Developed several BODS ETL jobs for Star Schema Data warehouse including audit BODI ETL jobs & synchronization of BODI ETL jobs. Load balancing in Server groups. BODI Job Scheduling, execution & monitoring. Expertise in creating Workflows using object library and tool palette to execute data flows. Expert in creating and administering the metadata Data Integrator repository in a multi user environment. Expert in publishing batch jobs and real time services via web services using Data Integrator Administrator/Access server.
  • Exploited most features of SAP BusinessObjects Data Services 4.2.5 (BODI/BODS): Auditing, Scripting Language, Global & local Variables, Email Alerts, Debugging, Profiler Server & Repository, Job Server(s) (port 3500) Configurations, Datastore Configurations, Access Servers (port 4000), System Configuration Set Ups, Job Scheduling, Administrator (port 8080), Impact and Lineage Analysis, Data Quality (Profiler), Try-Catch, BODI functions. Error handling & trace, statistics and error logs. Implemented Change Data Capture (CDC) logic. Created & configured Microsoft Excel workbook file formats on the UNIX platform using the Management Console/Administrator and Designer.
  • Expert in designing and developing simple and complex transformations in data flows. Expertise in developing, testing, debugging, documenting, and deploying complex batch jobs and real-time jobs to extract data from a variety of data sources, transform the data, and load it to specified destinations using SAP BusinessObjects Data Services (ETL). Expert-level experience in cleansing data in Business Objects Data Integrator using various data types and mappings. Expert in implementing recovery mechanisms for unsuccessful batch job executions. Expert in writing complex SQL DML statements (SELECTs, UPDATEs, INSERTs, and DELETEs) on database tables. Expert in performance tuning & optimization.
  • Data Warehouse Architecture using Agile methodologies (scrum/Sprint), Data Modeling,
  • MDM (Master Data Management) Implementation.
  • Dimensional Data Modeling, Star Schema/Snowflake Schema (Kimball/Inmon methodologies),
  • Data Marts, Data warehouses Design and Load processes.
  • Writing Detail Design Specifications & Detail Technical Specification.
  • ETL, OLAP, OLTP and Report delivery methodologies.
  • Relational Database Systems like Oracle, SQL Server, DB2, Teradata, and Informix. Unix/Linux Platform including Perl, awk, sed, vi editor, and shell scripts.
  • SAP HANA in-memory database; SAP Landscape Transformation (SLT) Replication Server: downloaded & installed SAP HANA from the Service Marketplace (SMP) on a Linux machine; experienced in uninstalling, stopping & starting HANA servers. Extracted data from customer master, material master, COPA (Cost & Profit Analysis) & several other tables from an SAP ECC 6.0 system & loaded it to the HANA database using Data Services (through SLT Replication Server). Created a connection to the Business Objects Information Design Tool (IDT). Created an information space in Explorer using HANA tables.
  • SAP HANA Modeling using HANA Studio, Created packages, Attribute views, Analytic views, Calculation views. SQL Editor: Created schemas, tables (row & column storage) & procedures. Development, Debug, Modeler, Team Synchronizer, Admin Console Perspectives used. Worked on Proof of Concept (PoC). Worked on Sandbox system, & Amazon cloud based HANA servers. Completed HANA training by Infogen, working on HANA certification.
  • SAP HANA database connected to Data Services. Experience taking backups & restoring the HANA database. Created users in HANA Studio by importing files & other methods; managed rights, authorization & access privileges for users to HANA objects. SAP Visual Intelligence connected to HANA. Roles created using HANA Studio.
  • Created scripts and complex stored procedures in HANA Studio. Data analysis; variables and restricted measures created & used in analytic views. Secured views by managing access; created analytic privileges. Created several views (analytic, calculation & attribute) based on SAP tables for COPA (Cost & Profit Analysis), material master and customer master. Experience in SAP HANA programming including SQL, SQLScript, and CE script; administration including managing schemas, semantic data model objects, and import/export of content; SAP HANA security including user management, roles, and privileges.
  • More than 20 years of hands-on experience in System Analysis, Design, Development, Documentation & Project Management in the fields of data Warehouse, Client server technologies which includes several years of experience in Data warehousing tools including SAP Business Objects Data Integrator (BODI) / Data Services (BODS) /DQ/DQM & Business Intelligence reporting tools SAP Business Objects Administration, Universe Design & Development, Crystal Reports. Extensive Experience in SAP BusinessObjects Data Services 4.1, (BODI, 14.1).
  • Over 14 years of experience as SAP Business Objects Data Services BODS/BODI Data Warehouse Architect, ETL Architect.
  • Over 14 years of development and 9 years of Administration & Management experience with Data Warehouse applications.
  • 20+ years of professional experience; successfully delivered several projects in the US IT industry.
  • 20+ years of design, development and implementation experience with products and solutions (SDLC).
  • Over 12 years of experience in managing technical teams on shore and off-shore.
  • Sr. Business Intelligence Data Architect, having experience of successfully completing full life cycle Data Warehouse Implementation from requirements till post production support.
  • Ability and experience to lead the design, development and implementation of data warehouse solutions in fast paced environment. Maintained and distributed project schedules for complex projects. Managed and communicated with vendors and business users to resolve complex issues.
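The Change Data Capture (CDC) logic mentioned in the summary can be illustrated with a minimal sketch: compare an incoming source snapshot against the current target by primary key and classify each row as an insert or an update. All names here are hypothetical illustrations; in BODS this is typically done with the Table_Comparison and Map_Operation transforms rather than hand-written code.

```python
# Minimal sketch of target-based change data capture (CDC).
# classify_changes() and the "id" key are illustrative names,
# not part of any BODS API.

def classify_changes(source_rows, target_rows, key="id"):
    """Return (inserts, updates) of source rows relative to the target."""
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)        # new key -> insert
        elif existing != row:
            updates.append(row)        # same key, changed attributes -> update
    return inserts, updates
```

Rows that match the target exactly fall through untouched, which is what keeps a delta load cheap compared to a full reload.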

TECHNICAL SKILLS:

ETL Tools: BODI/BODS/Data Services v4.2.5, Data Quality, Information Steward, Informatica Power Center 8.1 & v8.6

Scheduling Tools: Tidal, Maestro, Appworx to schedule BODS jobs.

Databases: HANA, Oracle 10g & 11g PL/SQL, MS SQL Server 2008 & 2012, T-SQL, MongoDB (NoSQL)

Database Tools: SQL Query Analyzer, Management Studio, stored procedures, DB2 UDB 8.1, Teradata 7.1, BTEQ, MS Access 2007/2010/2013, Informix, Ingres.

Master Data Management Tools: SAP MDM, SAP MDG.

ERP Applications: SAP R/3 Service Excellence, SAP ECC, BW, SQVI.

Database Development & Administration Tools: Toad v9.6 for Oracle (PL/SQL, stored procedures), SQL Assistant for Teradata, and Teradata utilities (MultiLoad, FastExport).

Data Modeling Tools: CA Erwin Data Modeler, ER Studio, Embarcadero, Toad Data Modeler v3.3, Visio. Handling logical and physical data models. E-R Diagrams, Translating E-R Data objects into Relational Constructs, Resolving Relationships, Normalizing a Data Model, Implementing a relational Data Model.

Business Intelligence Tools: Business Objects Enterprise XI R2 SP3 & 3.1 & Crystal Reports, Webi, Xcelsius.

O.S. Utilities: Perl Scripting 5.8 (UNIX), C-Shell, Korn-Shell, Sed, Awk, vi editor.

File Transmission and Encryption Utilities: TCP/IP, FTP, sFTP and PGP/GPG encryption, WS-FTP, Telnet

Source Code Management Tools: MS Team Foundation Server (TFS), MS Visual SourceSafe, CVS & Eclipse version control, Perforce.

Job Scheduling Tools: AutoSys, UC4 V8 Scheduler.

Other Tools: Quality Center, Track-It, DevTrack, SnagIT, APOS Business Intelligence Tools (Real Time Monitor, InfoScheduler, Bursting Manager), Jira, Oracle StatsPack.

Operating Systems: HP-UX, Windows, Linux, UNIX-AIX (Korn, C shells), MS-DOS.

Hardware: HP UX-10, AIX 11, UNIX System DRS/NX6000, UNIX Sun Solaris, IBM-compatible, Motorola 68030, Intel 80386 (UNIX), Dell Pentium 400, NCR Pyramid.

PROFESSIONAL EXPERIENCE:

Confidential, Michigan

BODS Architect and Lead Developer

Responsibilities:

  • Installed the Hadoop adapter on the BODS Management Console and created a datastore in Designer to use the Hadoop adapter. Used Hadoop as source and target in BODS.
  • Used the HDFS file format to work with big data. Installed Pig and Hive clients on the Hadoop edge node. AOS, CLS, ZBB & Joywave jobs successfully deployed in production. Developed and extensively tested the master data refresh job and deployed it in production successfully. KNA, KLA, purchase master, HR, foundation, material and several other modules for master data.
  • Created, modified several SAP BODS Jobs to support the Enterprise dashboards and reporting database.
  • Performed complex analysis of data impact and lineage that links dashboard data metrics to the database sources including complex SAP BODS Jobs, Workflows, and Data flows. Provided data using BODS for major dashboards that display KPI of the organization.
  • Responsible for coding and unit testing of applications capable of handling large volumes of data in a batch environment.
  • Assisted in the detailed design of the ETL application. Assisted in the triage of code breaks when required. Updated the existing development-related technical documents, making sure they were always current. Used Hadoop on the Linux platform.
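The batch work described above hinges on processing large volumes in fixed-size chunks rather than materializing everything at once. A hedged sketch of that idea, with hypothetical names (this is not BODS code, where batching is a dataflow/loader setting):

```python
# Illustrative chunked batch processing: feed successive fixed-size
# slices of an arbitrary row iterator to a load() callback.

def process_in_batches(rows, batch_size, load):
    """Apply load() to successive slices of rows; return the batch count."""
    batch, batches = [], 0
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            load(batch)
            batches += 1
            batch = []
    if batch:              # flush the final partial batch
        load(batch)
        batches += 1
    return batches
```

Keeping memory bounded this way is what makes the same job viable for both a small delta and a full history load.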

Tools Used: IPS 4.1 SP5, SAP Business Objects Data Services (BODS) v4.2.5, Information Steward v4.2.5, Oracle 11g, MS Access, Excel as source and target, Toad, stored procedures, views, sequences. Oracle Database 11g Enterprise Edition v11.2. CMC, CCM, SQL Server 2014 stored procedures, T-SQL, Data Services Management Console, Designer, Repository Manager, Central Management Console.

Confidential, Santa Clara, CA

SAP ETL Data Services/Information Steward Architect & Lead Developer

Responsibilities:

  • The objective of this project was to determine the master, or golden, record from customer data.
  • The BODS Match transform was used heavily, and the Match Review configuration feature of Information Steward was also used extensively.
  • Cleansed the data, then created groups based on a first-name/last-name break key and match criteria for email, phone number, address, city, state and postal code: a) delta for source data; b) cleansed & standardized all data.
  • Created match groups and filtered all groups where the master record was identified.
  • Performed clerical review of match groups in Information Steward.
  • Loaded all data with history (SCD2) to Target database.
  • Completed the match review process, integrated Information Steward 4.2 with the results, managed groups and pushed the data back to BODS for processing. Incremental match processing. Standardized and validated addresses using address directories.
  • Configured email in Information Steward.
  • Scheduled tasks for content-type (address, name, phone, SSN, etc.) data profiling in IS.
  • Created several rules & scorecards for all quality dimensions (accuracy, completeness, conformity, uniqueness, etc.).
  • Created & published data cleansing solutions (DCS) using the Data Cleansing Advisor (in Information Steward) for content-type data (person, email, phone, address, etc.).
  • The data cleansing solution was used in Data Services Workbench and deployed to the Data Services repository/Designer. Its output was used to find high-confidence groups, suspect groups and unique rows.
  • Match review was configured in Information Steward for suspect groups so that, through clerical review, master record data could be prepared.
  • Rows were moved out and moved in, and new groups were created for unmatched rows.
  • Created data cleansing rules based on data validation advisor.
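The break-key matching described above can be sketched minimally: partition candidate records by a cheap key (here, first initial plus last name) so the expensive match comparison only runs within each partition. The key choice and function names are hypothetical illustrations, not the BODS Match transform itself.

```python
# Illustrative break-key partitioning ahead of record matching.
from collections import defaultdict

def break_key(record):
    """First initial of the first name plus the lowercased last name."""
    return (record["first"][:1] + record["last"]).lower()

def build_match_groups(records):
    """Partition records into candidate match groups by break key."""
    groups = defaultdict(list)
    for rec in records:
        groups[break_key(rec)].append(rec)
    return dict(groups)
```

Only records sharing a break key are then compared on email, phone, address, city, state and postal code, which keeps matching tractable on large customer sets.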

Tools Used: IPS 4.1 SP5, SAP Business Objects Data Services (BODS) v4.2.5, Information Steward v4.2.5, Oracle 11g, MS Access, Excel as source and target, Toad, stored procedures, views, sequences. Oracle Database 11g Enterprise Edition v11.2. CMC, CCM, Data Services Management Console, Designer, Repository Manager, Central Management Console.

Confidential, Chicago

SAP ETL Data Services/Information Steward Architect & Lead Developer

Responsibilities:

  • Information Platform Services (IPS), SAP Business Objects Data Services (BODS) and Information Steward (IS) were architected & deployed successfully using four Windows 2012 (R2) servers in each environment: development, QA and production. Performed detailed sizing for BODS/IS processing power. SAP vendor master data (LFA1, LFB1, LFM1, etc.), including purchasing org & company code information, was cleansed & successfully migrated back to SAP ECC using IDocs.
  • Used function Z_INBOUND_STATUS_FOR_DI to get the status of each IDoc processed on SAP ECC. Configured partner profiles on the SAP ECC side. Used the SAP Rapid Deployment Solutions (RDS) package v3.42, which works with BODS 4.2.5.
  • Several issues were resolved during the deployment. Trained client staff on BODS and Information Steward. Configured Active Directory (AD) / Single Sign-On (SSO) authentication using the CMC. Configured connections in Information Steward for BAAN, SAP ECC & SQL Server using the CMC, plus many other project tasks such as creating users and groups, configuring the repositories so they won't prompt for a password multiple times, creating central & local repositories & registering them in the CMC, and fixing issues with the job server not working from laptops.
  • Fixed an issue related to dynamic ports: client security did not allow them, so fixed ports were used instead.
  • Configured RFC on the Management Console and resolved related issues. Developed, configured, exported and imported rules in Information Steward and migrated them to other environments and to BODS. Resolved IDoc partner configuration and RFC configuration issues. Downloaded media (software) from the SAP Marketplace. Resolved issues with BAAN vendor master data (TTCCOM020220). Installed postal and eLOT directories on 3 BODS job servers in 3 environments. Created several profiling tasks including address, column, content, dependency, redundancy and uniqueness.
  • Created schedules for profiling and rule tasks in the CMC. Configured Information Steward connections in the CMC for: i) data profiling, ii) data that failed rules, iii) data review.
  • Created databases required for IPS, BODS & IS. Created IS repository and Job server dedicated to IS repository.
  • Created ODBCs required for IPS (CMS & Audit) and BODS job servers to work. Configured failed database for Information Steward profiling. Data cleansing advisor: Setup, cleanse and match result. Scorecards: Configured for end users. Exported and imported projects and objects in Information Steward. Created and associated custom attributes.
  • Configured match review with a best-record strategy: a) main strategy, b) 1st tie-breaker strategy, c) 2nd tie-breaker strategy. Created Windows scripts (using al_rwjoblauncher.exe) and configured them with the external scheduler REDWOOD to trigger & schedule BODS jobs.
  • Created scripts by exporting BODS jobs to .bat/.sh & .txt files using the export execution command. Created and executed LSMW Workbench recordings to load master data to SAP ECC.
  • Identified, developed, created, tested, debugged, documented and maintained data quality scorecards in SAP Information Steward as well as BODS data jobs that were used to create Stage tables.
  • Audited data in accordance with the Master Data Governance program that defines and monitors the process of creating, modifying, storing and deleting data. Developed and delivered management dashboards that describe the state of master data quality across the enterprise.
  • Helped develop new KPIs for business processes and data improvement.
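The scorecards above boil down to scoring a column against a rule for one quality dimension and reporting the passing percentage. A hedged sketch with hypothetical rule functions (Information Steward expresses such rules declaratively rather than in code):

```python
# Illustrative data-quality scorecard metrics: each function scores
# a list of column values for one dimension as a percentage.

def completeness(values):
    """Share of non-null, non-empty values, as a percentage."""
    ok = sum(1 for v in values if v not in (None, ""))
    return 100.0 * ok / len(values)

def uniqueness(values):
    """Share of values that occur exactly once, as a percentage."""
    ok = sum(1 for v in values if values.count(v) == 1)
    return 100.0 * ok / len(values)
```

A scorecard then rolls these per-rule percentages up by dimension and domain so a data steward can see where cleansing effort is needed.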

Tools Used: IPS 4.1 SP5, SAP Business Objects Data Services (BODS) v4.2.5. Rapid Deployment Solutions (RDS) packages v3.42. IDoc CREMAS05, Migration Services application, ODP/extractors, SAP Landscape Transformation Replication Server (SLT, for ABAP/RFC and non-ABAP source systems) add-on v2.0. SAP ECC, SAP HANA Studio v2.0.8, SAP HANA DB v1.00.82.00394270. Microsoft SQL Server .0.5058, Microsoft SQL Server 2008, Oracle 11g, Oracle 9i, MS Access, Excel as source and target. Information Steward v4.2.5, Oracle, PL/SQL Developer, stored procedures, views, sequences. Oracle Database 11g Enterprise Edition v11.2. SQL Server 2012, T-SQL, stored procedures. SAP ERP R/3 4.6/6.0, SAP ECC 6.0 (R701), Data Services Workbench, Data Services Management Console, Designer, Repository Manager, Central Management Console.

Confidential, Pittsburgh

Global BI ETL and BOBJ/BODS Architect/HANA Modeler, SAP Data Services ETL Architect & Lead Developer

Responsibilities:

  • SAP HANA modeling using HANA Studio: modeled objects for the NSV project (below); created packages, attribute views, analytic views and calculation views.
  • SQL Editor: created schemas, tables (row & column storage) & procedures. Used the Development, Debug, Semantic Data Modeler, Team Synchronizer and Admin Console perspectives.
  • The purpose of this project was to create a global platform for over 30 countries to analyze their sales data at the lowest possible level (channel, product and customer).
  • Various analytical reports include Global Daily Sales Performance report, Price Volume Mix Analysis, Product Category performance for various zones/regions, etc.
  • Global BI platform is created on SAP HANA using SAP Business Objects (BOBJ) to map to various local ERP/DW systems (BPCS, SAP R/3, SAP ECC, Oracle BI, etc.) to collect data on daily basis and process it centrally to facilitate:
  • Local finance teams with data submission every day/month
  • W HQ team with quicker data analysis and provide guidance to local sales teams
  • Architected from scratch and designed all dimension and fact tables; created ETL design documents for all countries (based on their unique source systems). All countries were divided into 8 zones; each zone contains an individual country or a group of countries, and a delta job runs once a day for each zone. Several changes were made to how zone loads are handled so that they complete successfully and do not step over one another. Several custom functions were created using BODS script.
  • Designed, developed, tested and migrated from development to QA and prod within the given deadline. The biggest bottleneck was getting clarity from business stakeholders. Used VLOOKUPs extensively during testing.
  • Made sure best practices were followed in coding and that knowledge transfer was conducted and completed successfully. Participated in discovery sessions and workshops. Completed functional specifications to provide feedback to business and management on issues and risks surrounding data governance initiatives. Developed and tested all technical solutions per functional specifications and conducted user training as needed. Customized, configured & implemented data governance software (SAP MDG 7.2) for customer and vendor data. As data governance lead, was responsible for planning, design, establishment and execution of the data governance framework across all entities and branches.
  • Used HANA packages, stored procedures, schemas, HANA security and user rights; HANA analytic, calculation and attribute views. Scheduled BODS batch jobs using an automation tool: IBM Tivoli Workload Scheduler (TWS) Maestro, previously Unison.
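The per-zone delta scheduling constraint above (zone loads must not step over one another) can be sketched as strictly sequential execution with fail-fast semantics. Zone names and the runner callback are hypothetical; the real orchestration was done in TWS/Maestro.

```python
# Illustrative per-zone delta orchestration: run one zone load at a
# time, in order, and halt the chain on the first failure so later
# zones never run against a half-loaded predecessor.

def run_zone_deltas(zones, run_delta):
    """Run each zone's delta load in sequence; return completed zones."""
    completed = []
    for zone in zones:
        if not run_delta(zone):    # a failed zone halts the chain
            break
        completed.append(zone)
    return completed
```

Serializing the zones trades some wall-clock time for the guarantee that no two zone loads contend for the same fact tables.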

Tools Used: SAP Business Objects Data Services (BODS) v4.2.3. IDocs, PSA/DTP (BW), ODP/extractors, SAP Landscape Transformation Replication Server (SLT, for ABAP/RFC and non-ABAP source systems) add-on v2.0. SAP ECC, SAP MDG 7.2, SAP HANA Studio v2.0.8, SAP HANA DB v1.00.82.00394270. Microsoft SQL Server .0.5058, Microsoft SQL Server 2008, Oracle 11g, Oracle 9i. IBM Tivoli Workload Scheduler (TWS) Maestro Job Scheduler v9.1. MS Access, Excel as source and target. Information Steward v4.1, Oracle, PL/SQL Developer, stored procedures, views, sequences. Oracle Database 11g Enterprise Edition v11.2. SQL Server 2012, T-SQL, stored procedures. SAP ERP R/3 4.6/6.0, SAP ECC 6.0 (R701), Data Services Workbench, Data Services Management Console, Designer, Repository Manager, Central Management Console.

Confidential

SAP Data Services Migration Lead Consultant

Responsibilities:

  • Used Information Steward to access data and create scorecards for data quality. Utilized Information Steward for MDM reporting purposes. Developed Information Steward exception reports for the fleet management project; scheduled reports and notified end users when reports had exceptions. The source for these reports was MDM. Used BO EIM products (Data Services/Data Integrator, Metadata Manager and Information Steward) extensively. Full life-cycle experience with Information Steward 4.1 and Data Services profiling and cleansing of data objects. Deep understanding of data quality management methodologies and approaches to data profiling and cleansing. Experience with data quality dimensions, rules and metrics, e.g., accuracy, completeness, conformity, consistency, integrity, timeliness, uniqueness, validity, relevance, and availability relative to a business rule. Used Information Steward’s Cleansing Package Builder; implemented best practices, development standards & roles and responsibilities for Data Services and Information Steward.
  • Used BODS to import (converting into NRDM) and export metadata for XML (hierarchical) documents (files or messages), then used & processed XML files as source and target. Defined the format of an XML file or message (.xsd or document type definition, .dtd). Processed XML workers' claim files in BODS jobs.
  • The client acquired two new companies (Innuce & FIT) in Europe; new processes and modules were created to extract data from their systems and integrate it into the existing fleet management OLTP, mart, and central data warehouse systems (using Agile methodologies, Scrum/sprints) and applications (Insight, EVOS, GDT, Lease Wave, Intelli-Fleet, Infinium). The following activities and tasks were performed:
  • Installed and configured the ETL environment; developed the ETL framework, error-handling routines (trap, store, process, and inform), and the restartability feature of BODS; configured clustering for multiple job servers and server groups. Designed, created, implemented, and tested dataflows, workflows, scripts, custom functions, and jobs to extract data from various data sources. Troubleshooting and performance tuning reduced the runtime of several jobs from several hours to under one hour. Deployed dataflows, workflows, scripts, and jobs to QA and Production environments using central repositories and ATL files.
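The trap/store/process/inform pattern plus restartability can be sketched in a few lines. BODS implements this with try/catch blocks, scripts, and a job-status control table rather than Python, so the table, job, and function names below are hypothetical illustrations of the pattern only:

```python
# Illustrative sketch of the trap / store / inform error-handling and
# restartability pattern. All table and job names are hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE job_status (job TEXT PRIMARY KEY, status TEXT)")
db.execute("CREATE TABLE error_log (job TEXT, error TEXT)")

def notify(job, error):
    print(f"ALERT: job {job} failed: {error}")    # stand-in for an email alert

def run_job(job, step):
    """Skip jobs already marked DONE (restartability); trap, store, and
    report any failure instead of crashing the whole batch."""
    row = db.execute("SELECT status FROM job_status WHERE job=?", (job,)).fetchone()
    if row and row[0] == "DONE":
        return "SKIPPED"                          # already completed: restart-safe
    try:
        step()                                    # the actual load logic
        db.execute("INSERT OR REPLACE INTO job_status VALUES (?, 'DONE')", (job,))
        return "DONE"
    except Exception as exc:                      # trap
        db.execute("INSERT INTO error_log VALUES (?, ?)", (job, str(exc)))  # store
        notify(job, exc)                          # inform
        db.execute("INSERT OR REPLACE INTO job_status VALUES (?, 'FAILED')", (job,))
        return "FAILED"
```

Re-running the batch after a failure then skips completed jobs and retries only the failed ones, which is the essence of the restartability feature described above.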
  • Developed and configured the semantic data model from scratch for ETL processes and universes in the Enterprise Data Warehouse project environment. Created ETL processes and managed the entire multi-system ETL architecture. Created datastores and file formats for the ETL team. Developed complex scripts and custom functions. Coached and mentored client staff with hands-on training on the BODS ETL tool.
  • Developed and deployed complex, mission-critical mappings. Resolved issues found during data review and scrutiny of source and target tables. Applied appropriate logic for error handling, data cleansing, and data quality issues. Installed and configured SAP Data Quality for US and Germany address cleansing. Created and maintained best-practice and naming-convention documents for development, testing, and bug reporting. Found and fixed issues to optimize jobs in the BODS ETL tool; applied code changes and migrated them to production. Resolved issues caused by massive data volumes during the history load. Standardized field mappings and the data dictionary, and coordinated offshore ETL efforts. Installed and configured Information Steward for data quality rules.
  • Architected the ETL solution around the following main objects:
  • As data governance architect/lead for SAP MDG, provided expertise in data architecture, master data, reference data, the data dictionary, glossaries, profiling, and quality. Supported data governance project management of a new system implementation for database integration. Acted as liaison among stakeholders, including management, US entities, consultants, and system vendors, to facilitate effective and efficient communication.
  • The data governance project “Data Source Catalog” (DSC) supported a number of critical functions and controls, including records management (and compliance with retention regulations such as Dodd-Frank), archiving, and eDiscovery (through identification of a data store's owners and custodians, facilitating timely retrieval of data from the sources in response to legal or regulatory requests). It also supported IT planning, data architecture, application integration, rationalization, data sourcing, the enterprise archival strategy, and information lifecycle management processes in general.

Tools Used: SAP Business Objects Data Services (BODS) v4.2, Information Steward v4.1, Oracle PL/SQL Developer, stored procedures, views, sequences, Oracle Database 11g Enterprise Edition v11.2, SQL Server 2012, T-SQL, SAP ERP R/3 4.6/6.0, SAP ECC 6.0 (R701), SAP MDG 6.1, Data Services Workbench, Data Services Management Console, Designer, Repository Manager, Central Management Console, MS Access and Excel as source and target.

Confidential, Dallas & Crestview, FL

Legacy Data Migration Lead to ECC/CRM /SAP Data Services Senior Consultant/Architect

Responsibilities:

  • Extracted data from SAP 4.6/ECC 6.0 tables for Production Planning. Pulled data for several reports, such as Bill of Material (BOM), routing operations, open/close PO, and cost of poor quality, using ABAP dataflows. Interacted with business users and managers to gather business requirements. Participated in meetings with functional users to determine flat-file layouts, data types, and column and table naming conventions. Worked on dimensional modeling as per business needs.
  • Configured and managed repositories via the administrator console as per business requirements. Promoted (migrated) jobs, workflows, dataflows, and other objects from the Development to the Production repository.
  • Created mappings using the Designer and designed workflows to build the data warehouse per business rules. Transforms used included Case, Merge, Sequence Generator, Query, Map Operation, Table Comparison, SQL, and Validation.
  • Configured the mappings to preserve existing records on updates using the History Preserving transform (Slowly Changing Dimension Type 2). Developed and modified dataflows according to the business logic.
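The Type-2 behavior described above (compare incoming rows against the current dimension rows, close out a changed version, and insert a new current one) can be sketched in plain Python. In BODS this is done declaratively with the Table Comparison and History Preserving transforms, so the column names and date handling below are hypothetical illustrations of the logic only:

```python
# Illustrative SCD Type-2 sketch: on a changed attribute, close out the
# old dimension row and insert a new current version, preserving history.
# Column names (key, attrs, valid_from, valid_to, is_current) are hypothetical.
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Apply Type-2 changes from `incoming` rows to the dimension table."""
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    for row in incoming:
        old = current.get(row["key"])
        if old is None:                                   # brand-new key: insert
            dim_rows.append({**row, "valid_from": today,
                             "valid_to": None, "is_current": True})
        elif old["attrs"] != row["attrs"]:                # changed: version it
            old["valid_to"] = today                       # close the old version
            old["is_current"] = False
            dim_rows.append({**row, "valid_from": today,
                             "valid_to": None, "is_current": True})
        # unchanged rows are left alone (no update issued)
    return dim_rows

dim = [{"key": "V1", "attrs": {"color": "red"},
        "valid_from": date(2013, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"key": "V1", "attrs": {"color": "blue"}}], date(2014, 6, 1))
```

After the change, the dimension holds two rows for key V1: the closed-out red version and the new current blue one, which is exactly the history the report queries rely on.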
  • Resolved technical issues and identified architecture and code modifications needed to support changing user requirements for multiple Data Services jobs and applications. Recovered data from failed sessions and workflows. Debugged execution errors using the Data Services logs (trace, statistics, and error) and by examining the target data. Analyzed the types of business tools and transforms to be applied. Tuned performance for large data files by increasing block size and cache size, and implemented performance-tuning logic on sources, workflows, dataflows, and the SAP ECC 6.0 target system to achieve maximum efficiency. Wrote unit test cases.
  • Conducted user training sessions and assisted in UAT (User Acceptance Testing).
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Implemented security for the central repository at the group and object level. Scheduled BODS batch jobs using the automation tool IBM Tivoli Workload Scheduler (TWS) Maestro Job Scheduler (previously Unison).
Tools Used: Data Services Management Console, Designer, Repository Manager, Central Management Console, SQL Server 2008 R2, T-SQL, Stored Procedures, IBM Tivoli Workload Scheduler (TWS) Maestro Job Scheduler v9.0.
