BODS Consultant Resume
Bothell, Washington
SUMMARY:
- Over 8 years of IT experience in Analysis, Design, Development and Implementation of Data Warehousing and other applications using Object Oriented Analysis, Design and Methodologies
- Over 7 years of ETL experience with different versions of Business Objects SAP Data Services/Data Integrator
- Over 4 years of Business Intelligence experience with complete life cycle implementations using Business Objects XI R3.1/XI R2/6.5 (Supervisor, Designer, InfoView, Business Objects Reporting, Business Objects SDK, Xcelsius, Web Intelligence, Publisher, BO Set Analyzer, Data Integrator)
- Over 6 years of experience working with ETL Data Conversion and large data sets
- Experience in using Teradata tools and utilities like BTEQ, SQL Assistant, Fast Load, Multi Load
- Possess strong Data Modeling skills; experienced with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications
- Worked on data migration from legacy systems to SAP systems
- Performed data migration using SAP Best Practices as well as LSMW
- Experience in developing data models in HANA 1.0 SP5/6/7.
- Expertise in Data Modeling: Attribute Views, Analytic Views, Calculation Views, Stored Procedures, business rules as Decision Tables, and Analytic Privileges
- Hands-on experience in SAP HANA data provisioning using SAP BO Data Services 4.1, with knowledge of the SLT Replication Server
- Experience with data Extraction, Transformation and Loading (ETL) of data from the SAP ECC system to the SAP HANA system
- Good knowledge of SAP HANA Architecture; well versed in importing metadata and table definitions, performing initial loads and replication, and resuming and suspending data loads using SAP HANA Studio
- Developed, configured, and maintained data extraction, transformation, and loading in SAP BODS for different databases such as Oracle, Teradata, DB2, Sybase ASE, and MS SQL Server
- Experience in conducting JAD sessions, interacted with various business users and acted as a liaison between the business and development teams.
- Designed and developed Data Marts by following Star Schema and Snowflake Schema Methodology, using industry leading Data Modeling tools like ERWIN and EMBARCADERO ER Studio.
- Experienced in optimizing performance in relational and dimensional database environments by making proper use of Indexes and Partitioning techniques.
- Performed various operations like Data Cleansing, Data Scrubbing, Data Profiling and maintained data governance.
- Extensive knowledge of and hands-on work with the Data Services Management Console (DSMC)
- Designed, developed, tested, and supported the Extract, Transform and Load (ETL) processes necessary to load and validate the data warehouse, using UNIX shell scripting
- Well versed in writing SQL scripts on various Relational Databases
- Experience working on performance tuning and optimization of Teradata queries
- Experience with SAP Sybase tools such as SRS and PowerDesigner, as well as the SAP Sybase ASE database
- Expertise in Metadata management; performed address, dependency, and redundancy profiling using SAP Information Steward
- Bridged the gap between Business and IT using SAP Information Steward for effective Data Governance, profiling data to give instant visibility into data quality levels with end-to-end impact analysis and data lineage
- Responsible for Enterprise master data, metadata management, data quality, data governance strategy planning and implementation.
- Experience working with Data Quality Management, Metadata Management, Master Data Management, Process Modeling, Data Dictionary, Information Stewardship, Data Profiling, Data Quality, and Data Model Standards.
- Created metadata management program with business and technical definitions and data lineage.
- Used Information Steward Metapedia to create a business glossary of terms related to the business data.
- Experience in working with Data Resource Management team (DRM), Enterprise Metadata Repository team (EMR), Corporate Data Dictionary team (CDD), Integrated Governance Board (IGB) for data quality, data Integration of enterprise data assets.
- 4+ years of Data Quality experience covering data profiling, parsing, cleansing, standardization, and matching to make sure data is correct
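A minimal sketch of the kind of UNIX shell validation used in the ETL work above: a row-count reconciliation between a source extract and its loaded target. The function name, and passing counts in directly rather than querying them via BTEQ or SQL*Plus as the real jobs would, are illustrative assumptions.

```shell
#!/bin/sh
# Row-count reconciliation between a source extract and its loaded target.
# In practice the two counts would come from BTEQ / SQL queries; here they
# are passed in directly so the check itself stays visible. Hypothetical.

reconcile_counts() {
  src_count=$1
  tgt_count=$2
  if [ "$src_count" -eq "$tgt_count" ]; then
    echo "LOAD OK: $tgt_count rows"
  else
    echo "LOAD MISMATCH: source=$src_count target=$tgt_count"
    return 1
  fi
}
```

A wrapper like this would typically run as the last step of a load script and drive the job's exit status.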
TECHNICAL EXPERIENCE:
LANGUAGES: Unix Shell Scripting, SQL, PL/SQL, SQL Plus, Transact SQL, ANSI SQL, Java, C, C++
RDBMS: Oracle 10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.0, MS SQL Server 2012/2010/2008/2005, Teradata, Sybase
ETL: Business Objects Data Services XI 3.1/3.2/4.0/4.1/4.2, BO Data Integrator, SSIS, Pentaho
Environment: Sun Solaris 2.6/2.7, HP-UX 10.20/9.0, IBM AIX 4.2/4.3, MS-DOS 6.22, Win NT, Red Hat Linux, Win 3.x/95/98, Win 2000, Win XP
Business Intelligence: Business Objects 4.0/XI 3.1/R2 (CMC, CMS, CCM, Designer, DeskI, Web Intelligence, InfoView, Dashboard Manager, Performance Manager), Tableau, QlikView, SAP Information Steward
Technology: SAP Business Objects 4.0/4.1, Business Objects Data Services XI 3.1/3.2/4.0/4.1/4.2, SAP Information Steward 4.0/4.1, SSIS
MAJOR ASSIGNMENTS:
Confidential, Bothell, Washington
Environment: SAP BO Data Services 4.2, SAP Information Steward, SAP ECC, SQL Server, SAP IS-U, SAP HANA
BODS Consultant
Responsibilities:
- Involved in gathering business requirements, functional requirements and data specification.
- Used SAP Info Steward Data Insight to analyze data quality, data profiling and to define validation rules for data migration in Data services.
- Created Data Quality scorecards with various quality dimensions defined by the rules in Information Steward.
- Extensively worked on troubleshooting errors.
- Performed data profiling using the profiler in Data Services.
- Created local and central repositories.
- Scheduled Batch jobs using Data Services Management Console.
- Created Jobs, Work flows and Data flows according to the Business requirements and implemented the business logic.
- Built Data Services jobs to extract data from the source system (SAP ECC).
- Built Data Services jobs to cleanse the data based on the SAP Information Steward (IS) rules captured and configured.
- Shared result sets for review by Data and Business SMEs at PSE.
- Built Data Services jobs that use IS rules as validations to generate the data set for manual cleansing.
- Built DS jobs/load scripts to load cleansed data into SAP ECC.
- Created the developer, administrator and profiler users.
- Created the data stores to connect BODS to the source and target systems.
- Worked with different transformations like Map Operation, Query, Data Transfer, Merge, Key Generation, Pivot, SQL and Validation.
- Developed jobs to extract customer Master data from ECC, cleanse it based on various cleansing rules, and load it back into SAP ECC.
- Used ABAP BAPIs to load data into ECC.
- Worked on both the Global Address Cleanse and USA Regulatory Address Cleanse transforms.
- Performed column, address, dependency, redundancy and uniqueness profiling.
- Created validation rules and rule bindings for data profiling according to business requirements.
- Created data sources to load data into SQL Server (the Staging database) before and after performing cleansing on the Extract tables.
- Generated Data Quality reports on the dashboard for data analysis purposes.
- Used the Query, Match, and Validation transforms and the lookup function for data integration purposes, and breakpoints and audit functions to observe execution.
- Performed address cleansing by using USA address cleanse.
- Worked on Extracting and loading the data into SAP ECC.
- Debugged execution errors using the Data Integrator logs (trace, statistics and error) and by examining the target data.
- Worked on the check-in and check-out of objects and managed version control.
- Identified bottlenecks in the flows and used different performance tuning techniques such as source-based tuning, target-based tuning, and degree of parallelism.
- Used the Information Steward Metapedia module to configure Business Glossary terms, group them into Categories, and create Custom Attributes for Metapedia Terms and Categories.
- Built the CRUD roles and Workflows based on the Data Governance requirements.
- Have setup User Security by assigning users to various custom groups according to their access privileges.
- Worked on Information Steward Metadata Management Module
- Configured various Metadata Integrators to connect to SAP ECC, CRM, BW, BODS, BOBJ, PowerDesigner, Oracle, and SQL Server.
- Configured MITI Integrator to connect to AWS Cloud.
- Scheduled the Metadata integrators to extract metadata from various source systems and loaded Information Steward Metadata Repository with the collected metadata objects and relationships.
- Generated Impact, Lineage and Usage Reports on Metadata Objects and relationships.
- Have Set up User Security for all the Metadata Management Users.
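The validation rules above can be approximated outside BODS as a simple pass/fail record split; this sketch mirrors what a Validation transform bound to an Information Steward rule does. The pipe-delimited layout and the five-digit postal-code rule are hypothetical examples, not the project's actual rules.

```shell
#!/bin/sh
# Pass/fail split on a pipe-delimited extract: records whose 3rd field is a
# 5-digit postal code PASS; everything else FAILs and would be routed to a
# manual-cleansing data set. Layout and rule are hypothetical.

validate_postal() {
  awk -F'|' '$3 ~ /^[0-9][0-9][0-9][0-9][0-9]$/ { print "PASS|" $0; next }
             { print "FAIL|" $0 }'
}
```

In the real jobs the FAIL stream fed the manually cleansed data set that was loaded back into SAP ECC.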
Confidential, Chicago, Illinois
Environment: SAP BO Data Services 4.1, SAP Business Objects 4.1, SAP ECC, SAP Information Steward, SAP HANA, IBM DB2
SAP BODS /BO Developer
Responsibilities:
- Worked with business process owners and stakeholders to understand and document the requirements; interacted with the manager to understand the business requirements and implemented the business rules to redesign and enhance the existing Data Flows.
- Designed, developed and implemented solutions with data warehouse, ETL, data analysis, and BI reporting technologies.
- Responsible for day-to-day operations and technical deployment, ensuring the solution complied with quality standards and regulations, met users' requirements, and applied standards and best practices where appropriate.
- Participated in the review and approval of the technical transformation requirements document.
- Used technical transformation document to design and build the extraction, transformation, and loading (ETL) modules.
- Performed source data assessment and identified the quality and consistency of the source data.
- Extensively worked on Data Services 4.1 for migrating data from one database to another database.
- Performed BODS administration and configuration.
- Performed extensive ETL performance tuning.
- Extracted metadata from SAP (CO-PA) and loaded it into SAP HANA using SLT replication.
- Managed the ETL environment with regular updates to items such as the address libraries received from our vendors.
- Worked with tables involving Hierarchies and resolved them using Hierarchy flattening whenever required.
- Worked on complex data integration projects, including multi-source-system, multi-subject-area, multi-data-entity data environments.
- Resolved performance issues encountered while doing full loads and delta loads using Data Services.
- Implemented various performance optimization techniques, such as caching and pushing down memory-intensive operations to the database server.
- Defined a collaborative metadata management solution for an end-to-end view of sources, transformations, target stores, and uses, with lineage diagrams, a glossary, an end-user portal, business rules, confidentiality, privacy, and security.
- Implemented Server Groups to use advanced performance tuning features such as parallel data flows, dividing a data flow into sub data flows, and Degree of Parallelism.
- Made modifications to existing universes by adding new tables in the data warehouse, creating new objects and exported the universes to repository.
- Responsible for designing, building, testing, and deploying Dashboards, Universes and Reports using Web Intelligence through Business Objects.
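The full-load versus delta-load distinction above comes down to processing only changed keys. A minimal sketch of the delta side, using sorted key files in place of Data Services' change detection (the file names and one-key-per-line layout are assumptions for illustration):

```shell
#!/bin/sh
# Delta-load sketch: given yesterday's and today's sorted key extracts,
# emit only the keys that are new today (the insert set), instead of
# reprocessing the full extract. File names are hypothetical.

find_inserts() {
  # $1 = previous key file, $2 = current key file (both sorted)
  comm -13 "$1" "$2"
}
```

`comm -13` suppresses lines unique to the first file and lines common to both, leaving only the new keys.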
Confidential, ENGLEWOOD CLIFFS, NEW JERSEY
Environment: SAP BO Data Services 4.1, SAP Information Steward 4.1, SAP ECC, Toad, SQL Server, SAP Business Objects 4.0, Tableau 8.2
Sr BI Developer
Responsibilities:
- Responsible for gathering customer requirements, ETL Column and field level mapping, acquiring customer acceptance, and managing ETL process from initial design to final implementation and deployment.
- Documented the analysis of the existing ETL jobs with information about the data flow, tables, stored procedures, source, target and their equivalent sources.
- Defined database Data Stores to allow Data Services/Data Integrator to connect to the source or target database.
- Used Business Objects Data Services/Data Integrator for ETL extraction, transformation and loading data from heterogeneous source systems.
- Fixed data quality issues at the source arising from changes in the source system.
- Worked on the low-level design for all the BODS jobs impacted by the change in the source system.
- Involved in data migration from legacy systems to SAP using the AIO methodology as well as LSMW.
- Extensively used SAP Best Practices to load the data.
- Worked with high-, medium-, and low-complexity objects while loading the data; met with the process team to help them effectively and accurately complete the mapping of data from the legacy system to the SAP system.
- Involved in the technical design of all the objects involved in the SAP AIO methodology and LSMW.
- Worked extensively with work flows, data flows, try/catch blocks, lookups, and validations.
- Worked on the vendor master, customer master and material master IDocs: CREMAS, DEBMAS and MATMAS.
- Interacted with both Legacy and Functional teams to design the mapping to load data into Customer Master IDOC (DEBMAS06).
- Worked extensively in loading of master data from the legacy system to the SAP system.
- Worked extensively on SAP Information Steward.
- Performed extensive data quality management and data profiling using Information Steward.
- Profiled source data, implemented cleansing rules and metadata information.
- Integrated Metadata Management, data quality assessment, data quality monitoring, cleansing package builder for data stewards and business analysts to collaborate and govern trustworthiness of data.
- Used Information Steward Metapedia in providing a central location for defining standard business vocabulary of words, phrases, or business concepts.
- Created and maintained data dictionary (metadata) of database objects and their use and meaning.
- Verified duplicate records with regard to their parent objects in order to remove all the duplicate records.
- Established and maintained the overall data conversion design of the system.
- Assisted with the design of data conversion strategies.
- Analyzed legacy system data quality and performed data cleanups.
- Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
- Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
- Worked closely with Business users. Interacted with ETL developers, Project Managers, and members of the QA teams.
- Converted existing BO reports to tableau dashboards
- Created different KPIs using calculated key figures and parameters.
- Developed Tableau data visualizations using Cross Tabs, Heat Maps, Box and Whisker Charts, Scatter Plots, Geographic Maps, Pie Charts, Bar Charts and Density Charts.
- Provided metrics on data quality and conversion activities.
- Created Scripts like starting script and ending script for each job, sending the job notification to users using scripts and declaring the Local and Global Variables.
- Created and administered Local and Central Repositories for multi user environment.
- Migrated and tested jobs in different instances and validated the data by comparing source and target tables.
- Automated all error handling, error escalation, and email notification procedures.
- Performed Unit Testing and Integration testing for the redesigned BODS jobs.
- Worked on setting up Metapedia business glossary terms, and trained business users to easily search for content using familiar business vocabulary and synonyms and to efficiently navigate to terms through the Information Steward interface.
- Trained business users to understand how different terms relate to each other and to other Information Steward objects in Metadata Management (such as BI reports and Universes) and Data Insight (such as rules or scorecards).
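The per-job notification scripting above can be sketched as a small wrapper that builds the message a starting/ending script would mail out. The job name and the commented-out delivery step are hypothetical; only the message assembly runs here.

```shell
#!/bin/sh
# Build the notification text an ending script would send on job completion.
# Actual delivery (e.g. via mailx) is left commented out; names are
# hypothetical.

job_notification() {
  job=$1
  status=$2   # 0 = success, non-zero = failure
  if [ "$status" -eq 0 ]; then
    echo "Job $job completed successfully"
  else
    echo "Job $job FAILED with status $status - escalating"
  fi
  # mailx -s "ETL job $job" etl-support@example.com  # delivery not shown
}
```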
Confidential, IRVING,TX
Environment: SAP BO Data Services 4.0, Erwin 8, Teradata, Oracle 10g/11g, Toad, SQL Server 2012, Tableau 8.1
SAP BI Developer
Responsibilities:
- Interacted with business analysts and the end client to understand the technical and functional requirements for creating new jobs. Wrote Data Services/Data Integrator scripts for file existence checks, daily incremental reads, daily incremental updates, workflow-level script reads, workflow-level script updates, workflow-level error handling, and job dependency.
- Defined database Data Stores to allow Data Services/Data Integrator to connect to the source or target database.
- Created Business Requirement Documents (BRDs), such as the SRS and FRS, and integrated the requirements with the underlying platform functionality.
- Translated Business Requirements into working Logical and Physical Data Models.
- Developed the physical data model and designed the data flow from source systems to Teradata tables and then to the target system.
- Designed the technical specifications document for Teradata ETL processing of data into the master data warehouse, and strategized the integration test plan and implementation.
- Developed complex MultiLoad and FastLoad scripts for loading data into Teradata tables from legacy systems.
- Used advanced data modeling concepts such as Family of Stars, Conformed Dimensions, and the Bus Matrix in order to handle complex situations.
- Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML).
- Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER Studio.
- Data modeling in Erwin; design of target data models for enterprise data warehouse (Teradata).
- Created dashboards for analyzing POS data using Tableau 8.0.
- Consistently attended meetings with the client's subject matter experts to acquire functional business requirements.
- Developed case studies to understand new product launches, promotion effectiveness, trend detection, and seasonal forecasting.
- Created Scatter Plots, Stacked Bars, Box and Whisker Plots, Bullet Charts, Heat Maps, Filled Maps and Symbol Maps according to deliverable specifications.
- Applied expert-level Tableau calculations, including complex, compound calculations on large, complex data sets.
- Created custom functions (date range, time functions, array functions, and conditional formatting functions) in the reports.
- Scheduled extract refresh for weekly and monthly reports.
- Worked on workbook Permissions, Ownerships and User filters.
- Used Business Objects Data Services/Data Integrator for ETL extraction, transformation and loading of data from heterogeneous source systems.
- Extracted data from different sources like SAP ECC and flat files, and loaded it into the requested target flat files, Excel sheets, and SQL Server tables in Data Services/Data Integrator.
- Integrated DQ workflows with DI ETL jobs using cleansing and matching logic.
- Cleansed data from different sources, matched records within the source data and against the corporate data to find the masters and subordinates, and avoided duplicate entries into the system.
- Used Data Insight to profile the source data and assess its quality.
- Created scripts, such as starting and ending scripts for each job, and declared local and global variables.
- Defined a separate data store for each database to allow Data Services/Data Integrator to connect to the source or target database.
- Tuned simple and complex transformations in Data Flows.
- Used the Business Objects Data Services/Data Integrator interactive debugger to test the jobs and fix bugs.
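The master/subordinate matching above boils down to grouping records on a match key. This awk sketch keeps the first record per key as the master and flags later ones as subordinates, a much-simplified stand-in for the Match transform; the field layout and the use of field 2 as the match key are hypothetical.

```shell
#!/bin/sh
# Group records by a match key (field 2, e.g. a standardized company name):
# the first occurrence becomes MASTER, later ones SUBORDINATE. A simplified
# stand-in for the Match transform's grouping; layout is hypothetical.

flag_masters() {
  awk -F'|' '!seen[$2]++ { print "MASTER|" $0; next }
             { print "SUBORDINATE|" $0 }'
}
```

Subordinate rows would then be merged into or suppressed against their master to avoid duplicate entries.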
Confidential, Houston, TX
Environment: SAP Business Objects 4.0/SAP Business Objects Data Services 4.0, Oracle 11g, SAP HANA, SAP BW 7.3
SAP-BO/BODS Consultant
Responsibilities:
- Developed the universes using the Information design tool of BO 4.0 and published them.
- Converted the universes from BO 3.1 to BO 4.0 and published them.
- Involved in using Crystal Xcelsius to Transform Microsoft Excel spreadsheet data into interactive flash media files for the upper level management.
- Scheduled the migrated Crystal Reports using the BI Launch Pad and emailed them to the end users.
- Used the Import Wizard to migrate content from the Development environment to QA and from QA to Production.
- Involved in scaling servers based on server performance and load.
- Created BEx queries on SAP BW using the Business Explorer, to be used for Universe design.
- Responsible for centralized support of the Business Objects suite.
- Used SAP BW as source and HANA as target with SAP Data Services as the Integrator.
- Developed Jobs in SAP Business Objects Data services which do the ETL between BW and HANA
- Designed and developed queries to generate reports in BEx Analyzer.
- Worked extensively on query designer.
- Ensured the smooth operation of the system from day to day and recovered the system in the event of system failure to ensure continuity of service.
- Developed Data Models (Attribute/Analytical/Calculation Views) using the SAP HANA Studio.
- Extensively used Structures, cell definitions, calculated key figures, restricted key figures, new selections, new formulas, exceptions and conditions.
- Responsible for Analysis, Design, Migration and Implementation of Business objects suite XI 3.1 on Development environment.
- Provided technical consulting and advice on the enhancement and modification of these systems.
- Responsible for gathering user requirements.
- Designed, implemented, distributed, and maintained Universes based on the Enterprise Data Warehouse.
- Implemented row-level security to filter firms based on the Sales Group user login.
- Used Star Schema and Snowflake Schema in the data modeling phase, and Contexts and Aliases for resolving loops in the Universes.
- Created breaks, master-detail, sorting, drill down, slice and dice, alerts, joins, hierarchies, and ranking features to generate ad hoc, automated, and complex Web Intelligence and Desktop Intelligence reports.
- Created custom calendar to schedule the Web Intelligence reports.
- Utilized Business Objects' built-in @functions (@Aggregate_Aware, @Select, @Prompt, @Where) to improve performance and to overcome intricacies related to data integrity.
- Designed and created Dashboards using Xcelsius, QaaWS, and Live Office as a proof of concept for Xcelsius products.
- Responsible for Xcelsius Dashboard System Implementation.
Confidential, Austin, TX
Environment: SAP Business Objects Data Services 4.0, SAP Information Steward 4.0, Life Cycle Manager, Teradata, SAP ECC 6.0, SAP BW 7.1
ETL Developer
Responsibilities:
- Participated in requirements gathering, analysis and structured Business Intelligence platform based on user requirements
- Created Data Flows to load data from flat file, CSV files with various formats into Data Warehouse.
- Involved in creating batch jobs for data cleansing and address cleansing.
- Involved in creating the batch jobs for de-duplicating the data for different sources.
- Worked in the BODS 4.0 environment; the entire project implementation was on 4.0.
- Worked on converting full-load jobs into incremental loads using Map Operation, Table Comparison, and Auto Correct Load.
- Extensively used the Query, Map Operation, Table Comparison, Merge, Case, SQL, and Validation transforms in order to load data from source to target systems.
- Extensively used the lookup and lookup_ext functions in Data Integrator to load data from source to target via lookup tables.
- Designed & developed error catching procedures in BODI jobs.
- Scheduled and monitored jobs using the Data Integrator Management Console.
- Defined a separate data store for each database to allow Data Integrator to connect to the source or target database.
- Worked extensively with the data profiling tool SAP Information Steward 4.0 on an ongoing project to understand the quality of the data before loading it into SAP.
- Created Scripts like starting script and ending script for each job, sending the job notification to users using scripts and declaring the Local and Global Variables.
- Created and administered Local and Central Repositories for multi user environment.
- Developed jobs to extract data from SAP Extractors & Tables and SAP BW DSOs.
- Migrated and tested jobs in different instances and validated the data by comparing source and target tables.
- Automated all error handling, error escalation, and email notification procedures.
- Generated several Metadata reports for Data Integrator mapping, and job execution statistics.
- Mentored end users in working with the Business Objects user module and provided technical expertise.
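The Map Operation / Table Comparison / Auto Correct Load pattern above is essentially an upsert. As a flat-file sketch (the file names, pipe-delimited layout, and field 1 as the key are assumptions for illustration):

```shell
#!/bin/sh
# Flat-file upsert, keyed on field 1: delta rows overwrite matching target
# rows and unmatched delta rows are appended -- the same effect Auto Correct
# Load achieves against a database table. Layout is hypothetical.

upsert() {
  # $1 = target file, $2 = delta file; merged result on stdout
  awk -F'|' 'NR==FNR { delta[$1] = $0; next }
             $1 in delta { print delta[$1]; delete delta[$1]; next }
             { print }
             END { for (k in delta) print delta[k] }' "$2" "$1"
}
```

Reading the delta file first lets the second pass over the target decide, row by row, between update and pass-through, with leftover delta rows emitted as inserts.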