- Strong experience with and an in-depth understanding of SAP Data Services (BODS), Data Quality/FLDQ, Information Steward, SAP S/4HANA, SAP ECC, SAP CRM, and data warehousing
- Have led several data migration projects involving master data and transactional data for objects like Business Partner, Installed Base, Material, PP, SD, Customer, Vendor, Purchase Order, Inventory, BOM, etc. using BODS
- Played a key role in designing BODS-related processes; designed a global template/framework to be followed across the project during development of BODS jobs
- Experience in architecture design, data modeling, data extraction, and data cleansing using BODS (FLDQ)
- Experience in migration through AIO jobs using IDoc and LSMW recording methodologies.
- Data extraction through SAP Rapid Marts, ABAP data flow, etc. using Data services (BODS)
- Experience in validating, tracing, and debugging batch jobs
- Created workflows and dataflows for data extraction from flat files for master and transactional data.
- Experience in Installation of Data Services, Information Steward, SQL Server
- Solid Experience in SAP Hana 1.0, SQL Server 05/08/R2/12, Oracle 11g, DB2
- Have experience in SAP, specializing in SAP ABAP conversions
- Have working experience developing LSMW conversion programs
- Implemented Rules and Data Quality Scorecards in Information Steward.
- Expertise in EDW extended star schema, including InfoCube design, DSO design, InfoObjects, InfoSources, Transfer/Update rules, and Transformations.
- Experience working with Change Data Capture at both source and target levels; implemented Slowly Changing Dimension (SCD) Types 1, 2, and 3.
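The Type 1 vs. Type 2 behavior can be sketched in Python (an illustrative analogue only, not BODS code; in Data Services the same effect comes from Table_Comparison and History_Preserving transforms, and the `customer_id` field name is hypothetical):

```python
from datetime import date

def apply_scd(dimension, incoming, scd_type):
    """Apply an incoming record to a dimension table (list of dicts).

    scd_type 1: overwrite the current row in place (no history).
    scd_type 2: expire the current row and insert a new versioned row.
    Illustrative sketch, assuming a customer_id natural key.
    """
    key = incoming["customer_id"]
    current = next((r for r in dimension
                    if r["customer_id"] == key and r["is_current"]), None)
    if current is None:
        # Unseen key: insert as the current version.
        dimension.append({**incoming, "is_current": True,
                          "valid_from": date.today(), "valid_to": None})
    elif scd_type == 1:
        current.update(incoming)           # overwrite, no history kept
    elif scd_type == 2:
        current["is_current"] = False      # expire old version
        current["valid_to"] = date.today()
        dimension.append({**incoming, "is_current": True,
                          "valid_from": date.today(), "valid_to": None})
    return dimension
```

(Type 3 would instead keep a limited history in extra columns such as `previous_city`, omitted here for brevity.)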
- Pushed data out of SAP BI using BODS and Open Hub services.
- Experience working with BODS against different data sources (flat files, Oracle, SAP ECC, JDE ERP, Salesforce, Microsoft SQL Server).
- Database architect with extensive knowledge and experience in Business Intelligence and data warehousing, including dimensional modelling
- Extensively developed mappings and worked with BODS transformations such as Map Operation, Table Comparison, Effective Date, History Preserving, SQL, Key Generation, Query, Case, Merge, Validation, Pivot, Reverse Pivot, and Row Generation.
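As a rough illustration of what the Table Comparison transform does, the following Python sketch classifies source rows as inserts or updates against a target table (a simplified analogue under assumed dict-based rows; the real transform also emits deletes and supports generated-key comparison):

```python
def table_comparison(source_rows, target_rows, key):
    """Classify source rows as INSERT ("I") or UPDATE ("U") opcodes
    against the target, in the spirit of the BODS Table_Comparison
    transform. Rows identical to the target are discarded."""
    target_by_key = {row[key]: row for row in target_rows}
    opcodes = []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            opcodes.append(("I", row))   # new key -> insert
        elif existing != row:
            opcodes.append(("U", row))   # changed row -> update
        # identical rows produce no opcode
    return opcodes
```

Downstream, a Map Operation-style step would turn these opcodes into the actual DML issued against the target table.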
ERP: SAP S/4HANA, SAP R/3 - ECC 6.0, CRM
ABAP: LSMW (Recording/BDC), IDocs, BAdI, RFC FMs
ETL Tool: Data Services (BODS) 4.2/4.0/3.2, Data Insight/Information steward, DQXI, SSIS
Software Tool: Microsoft Visual Studio, BIDS, STATA, Rational Rose, MS Visio
Databases: SAP Hana 1.0(SPS 10,11), Microsoft SQL Server 2005/08/12, Oracle 11g
Confidential, Kansas City, MO
Data Migration Lead
- Extensively worked on SAP ECC/CRM data migration (BODS), conversion, and interface projects; involved from project inception through go-live across multiple releases, including requirements gathering, analysis, design, and implementation of USDA's Farm Records, Acreage Reporting, and Business Partner objects.
- Worked extensively with the business and legacy system developers to understand the legacy database and gather mapping requirements for the migration.
- Was involved in creating functional specification documents and mapping documents for legacy-to-target data field mappings.
- Was involved in installing SAP Data Services, configuring the CMC, and setting up repositories using the Data Services Repository Manager
- Involved in configuring real-time jobs for Business Partner USPS directory validation, suggestions, quarterly updates, etc.
- Led development of SAP CRM Installed Base object for 6 million farms, 10 million tracts and 50 million fields using SAP AIO Best Practices/BPDM methodologies.
- Played a key role in designing, implementing, and deploying data cleansing and Data Quality (FLDQ), and in setting up real-time jobs for data quality, quarterly dictionary updates, etc., to achieve 100 percent clean data with zero tolerance for missing conversion data.
- Used Information Steward for data profiling and for presenting profiling results, DQ results, updates, etc. to the customer.
- Involved in design and development, including data modeling, data extraction from SAP (ABAP Dataflow), scheduling, monitoring, and data loading into the Enterprise Data Warehouse.
- Played a critical role in performance optimization of ETL objects to handle a volume of over 50 million fields and 12 million customers while meeting a tight cutover schedule.
- Was responsible for several mock conversion cycles, generating reports on test results for the business and developing strategies to resolve issues.
- Expert level understanding in database schema design and development, developing stored procedures, functions, database performance tuning, database maintenance, installation, user security, etc.
- Worked closely with CRM team in developing the Function Module which is used for data remediation by Data Services
- Developed complex dataflows and jobs using different transformations such as SQL, pivot, map, validation, table compare, sequence generator, global address match, etc.
- Worked closely with government leads and analysts on cutover planning activities.
- Provided regular status tracking and reporting of project progress and managed contingencies.
- Supported the MIDAS application post go-live, performing data remediation, developing various reports, and fixing bugs while working closely with the business.
Environment: ETL Tool: SAP Data Services (BODS) 4.0/4.2, Data Quality, SAP Information Steward
Other Technologies: SAP CRM, SAP Hana, Oracle 11g, etc.
Confidential, Washington, D.C.
Data Migration SME
- Implementation of various migration objects including Material Master, S&D, HR, Financial Accounting (FI/CO), etc.
- Involved in high-level design, low-level design documents, unit and integration test case preparation, and ETL performance improvement.
- Extensively worked with Local Repositories, Central Repositories, Job Server and Web Admin Client tools, Migration Services.
- Have good experience with the Web Admin/Management Console (i.e. scheduling and monitoring jobs).
- Involved in Administrative Tasks like Repository Configuration, Job Server configuration, Central Repository configuration, Job Scheduling, Monitoring.
- Involved in data profiling (source column profiling and detailed profiling) and used Validation and address cleansing transformations via Data Quality in SAP BODI 12.2.0
- Implemented AIO methodology in the development of Data Services jobs. Designed and developed simple and complex transformations for migrating MS Access and SQL Server sources into SAP R/3 via IDocs and LSMW.
- Worked on BODI/BODS transformations such as Query, Hierarchy Flattening, Map, Pivot, Reverse Pivot, Table Comparison, and temp tables.
- Worked on US Address and Global Address cleansing and Match transformations to clean the data by using Quality transformations in BODI/BODS.
- Developed scripts and custom functions to make the code reusable, and used lookup functions to reference lookup-table data.
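A lookup function of this kind can be approximated in Python (a minimal sketch of the idea behind a Data Services lookup against a reference table, not the actual `lookup_ext()` API; the column names are illustrative):

```python
def lookup(reference, key_col, return_col, key, default=None):
    """Return return_col from the first reference-table row whose
    key_col equals key, or the default when no row matches.
    Sketch of an ETL lookup against an in-memory reference table."""
    for row in reference:
        if row.get(key_col) == key:
            return row.get(return_col)
    return default
```

In a real job the reference table would be a database table (often cached), and the default would typically be a sentinel value routed to an error or remediation target.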
- Identified bugs in existing mappings by analyzing data flows and evaluating transformations, fixed the bugs, and redesigned existing mappings to improve performance.
- Applied business rules to the source data as required and converted it to the SAP-required format.
- Involved in installation, configuration, maintenance, migration, and upgrades of several BODI versions up to BO XI 4.0/3.2
- Involved in Migration of Jobs and workflows from Development to Test and to Production Servers to perform the integration and system testing.
- Extensively used TRY/CATCH blocks to handle exceptions and wrote scripts to automate the job process.
- Validated data against SAP transparent (T) tables and performed reconciliation.
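The try/catch-plus-scripting pattern can be sketched in Python (an illustrative analogue of a BODS try/catch block wrapped around a job with retry scripting; `run_job_with_retry` is a hypothetical helper, not a Data Services API):

```python
def run_job_with_retry(job, max_attempts=3, log=print):
    """Attempt the job; on failure, log the error (as a catch block
    would) and retry up to max_attempts before re-raising."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()                    # the "try" body: run the job
        except Exception as exc:            # catch-all, like a BODS catch block
            log(f"attempt {attempt} failed: {exc}")
            if attempt == max_attempts:
                raise                       # exhausted retries: surface the error
```

In practice the catch block would also write to an audit table and notify operators rather than just logging.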
- Applied various functions and formulas to the source data.
- Analyzed data to verify that it met the requirements.
- Extensively worked with SQL Server, including stored procedures, querying, and T-SQL.
- Design of a dashboard capable of displaying reconciliation exceptions and sending alerts.
- Defined the SIT test cases/test scripts, reviewed unit and integration test cases, and was involved in SIT test plan and test case creation for unit and system testing.
- Executed and validated test cases in Quality Center 9.0 and raised defects for failed test scripts.
- Created UNIX shell scripts to execute commands, examine text files from external source systems on different servers, and automate file transfers.
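The file-examination step of that automation, shown here as a Python sketch rather than a shell script (the directory layout, `.txt` extension, and `H|` header-record convention are assumptions for illustration):

```python
import os
import shutil

def collect_ready_files(inbox, staging, required_header="H|"):
    """Scan an inbox directory, keep only text files whose first line
    starts with the expected header record, and move them to staging
    for the ETL jobs to pick up. Files failing the check stay put."""
    moved = []
    for name in sorted(os.listdir(inbox)):
        if not name.endswith(".txt"):
            continue
        path = os.path.join(inbox, name)
        with open(path, encoding="utf-8") as fh:
            first_line = fh.readline()
        if first_line.startswith(required_header):
            shutil.move(path, os.path.join(staging, name))
            moved.append(name)
    return moved
```

The actual transfer from remote servers would sit in front of this (e.g. via sftp/scp in the shell scripts); this sketch covers only the local validate-and-stage step.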
Senior Data Migration Specialist
- Develop SAP focused migration solutions from legacy to SAP ECC/CRM and large-scale enterprise data warehouse solution.
- Responsible for mentoring, guiding, and leading a team of four, ensuring they follow established procedures and quality standards and deliver optimal design objects
- Development of AIO jobs, including mapping, validations (lookup, mandatory, format), and enrichments for Customer Master, Vendor Master, Material Master, BOM, Purchase Order, PIR, etc.
- Install BODS, Info Steward, configure data dictionaries, create repositories, manage version control of DS jobs, load/maintain lookup tables using Migration services.
- Work with analysts to understand the legacy system, create mapping documents, implement and generate profiling results, and check legacy data referential integrity; provide results to business stakeholders using Information Steward and evaluate risks and solutions.
- Solely responsible for the design of a framework of Data Services jobs covering data extraction, transformation, and loading. Briefly, the framework and processes are as follows: extract the data as-is and stage it, apply filter conditions to scope the data, implement cleansing and transformation rules, and share results with clients using Information Steward.
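That staged framework can be sketched as a small Python driver (names and the per-stage row counts are illustrative; the real implementation is a set of Data Services jobs, with results surfaced through Information Steward):

```python
def run_framework(extract, scope_filter, rules, load):
    """Run the staged pipeline: extract as-is, scope with a filter,
    apply cleansing/transformation rules in order, then load.
    Returns per-stage row counts for reconciliation reporting."""
    staged = list(extract())                          # 1. extract AS-IS and stage
    scoped = [r for r in staged if scope_filter(r)]   # 2. filter / scope the data
    cleansed = []
    for row in scoped:                                # 3. cleansing + transformation
        for rule in rules:
            row = rule(row)
        cleansed.append(row)
    load(cleansed)                                    # 4. load the target
    return {"staged": len(staged), "scoped": len(scoped),
            "loaded": len(cleansed)}
```

The returned counts mirror the kind of stage-by-stage reconciliation figures shared with clients after each run.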
- Develop data services jobs to load source data from multiple source systems to staging environments.
- Data remediation of legacy data and repeat various cleansing strategies till the data is cleansed and load data into SAP ECC.
- Develop custom and standard AIO jobs; perform mapping, validation, and transformation. Reconcile the source data (legacy SQL Server database) with the target data and report.
- Performance tuning, configuring SAP IDocs in SAP ECC, batch processing of IDocs in SAP, etc.
- Design architecture and develop a data warehouse solution using dimensional modelling techniques - star schema/snowflake.
- Use SAP Rapid Marts to extract data from the SAP system and load it into large data warehouse solutions developed in SQL Server and Netezza.
- Design and develop metadata management framework for data services jobs to perform nightly load into data warehouse from SAP that handles scheduling, monitoring/notifications, error handling, etc.
- FirstLogic (SAP Data Services DQ today) - involved extensively in data cleansing, standardization, matching, data loading, and maintaining data dictionaries. Used FirstLogic for data quality, creating break groups and matching customer/prospect records, cleansing addresses, etc.
- Installed global address cleansing and USA address cleansing packages. Performed column, address, dependency, redundancy and uniqueness profiling.
- Used Information Steward's Data Insight to analyze data integrity and completeness via data profiling, column queries of different types, referential integrity checks, and custom queries, and maintained trend reports
- Communicate with business owners, engaging them in conversations about matching rules and profiling information and helping them understand data quality
- Develop complex stored procedures and functions; optimize performance through indexing and database partitioning; schedule backups (full, transaction log, and differential); and perform day-to-day routines on SQL Server
- Used Subversion and later Microsoft TFS for version control of SQL scripts, stored procedures and other objects.
Environment: ETL Tool: Data Services/BODS 3.2, SAP BO Info Steward, SSIS, Microsoft Visual Studio/BIDS
Other Technologies: Agile Development, Netezza, SQL Server 2012/2008/R2, C#, VB.Net, Subversion (SVN) and Microsoft TFS
- Developed customer-focused MDM solutions and data migrations from legacy systems to SQL Server, built ETL packages in Data Integrator, and gained CRM experience
- Designed and developed the metadata management framework for both source data and ETL processes
- Developed Data Integrator jobs to consume data from disparate sources such as flat files, Excel, and SQL Server and stage them
- Used a combination of DI and FirstLogic (SAP DQ today) for creating break groups and matching customer/prospect records, cleansing addresses, etc.
- Worked extensively on building complex stored procedures, functions (UDFs), etc.
- Developed custom VB.NET code to download files via FTP and SFTP
Environment: BO Data Integrator, SQL Server 2005, FirstLogic (SAP DQ), VB.net