Business Analyst/Business Intelligence Analyst Resume
SUMMARY:
- Diversified cross-industry background focused on corporate and business-to-business initiatives for established Fortune 500 clients; led several IT projects.
- Strong experience in strategy, Scrum/Agile, roadmaps, enterprise architecture, data management, data governance & ownership, and Master Data Management (MDM), including corrective and preventive governance.
- Data stewardship and database analysis, design & development; business process management; business intelligence analysis; advanced analytics with Big Data and Data Vault structures; ETL/ELT execution, defining the data integration strategy, architecture, and processes on data integration projects.
- Data quality (collaborating on the definition and development of data integration technical design, development, and data standards), data warehouse & business intelligence design, legacy re-engineering, risk assessment, and project management standards; experienced with Agile and Waterfall methodologies.
- Industry experience: Financials (Banking, Brokerage, Asset Management), Costing & Accounting (General Ledger, Accounts Payable, Accounts Receivable, Cash Management), Insurance, Pharmaceuticals, Retail, Manufacturing, Supply Chain/Inventory Management, Health & Medicaid, Telecommunications, Research & Development.
- Core competencies and major deliverables:
- 10+ years as a Data Modeler/Data Analyst
- 5+ years as a Process Modeler/Sr. Business Analyst/Business Systems Analyst/Solution Designer
- 5+ years as a BI Analyst, Data Warehouse Designer/BI Architect (ETL/ELT), and Data Migration & Data Governance Analyst
- 6+ years as an Enterprise Data Architect/Information Architect
- Big Data experience in Agile environments.
- Knowledge of and experience in all areas of the project life cycle in a highly structured change management environment, using proprietary methodologies, Agile techniques, and RUP (Rational Unified Process).
- Good understanding of project processes and ability to analyze business problems and identify solutions. Worked extensively with teams to meet business requirements, create prototypes, and analyze workflows.
- Experience with web services and web-technology applications, and with infrastructure design changes. Also created ETL conceptual designs and data mappings using SSIS, Informatica, and SAS DI, with exposure to DataStage.
- Experienced in creating Business Requirement Documents, User Requirement Specifications, and Functional Requirement Specifications. Experienced in creating Data Flow Diagrams, Use Cases, State Diagrams, Sequence Diagrams, Component Diagrams, Use Case Diagrams & Activity Diagrams. Have also worked with MDM technologies such as SQL Server MDS and Informatica MDM.
- Well versed in BI and data integration tool sets such as SQL*Loader, SSAS & SSRS, OBIEE EM, Cognos, Business Objects, Crystal and Excel reporting, and Tableau, with exposure to Domo. Good working experience across all project sprints using Scrum.
- Skilled in performing gap analysis and impact analysis by identifying existing technologies and documenting the enhancements needed to meet end-state requirements. Extensive understanding of operations management and logistics. Exposure to change request management tools such as ClearQuest.
- Define and refine data architecture, data standards, data security & policies, relevant technology, and data architecture processes.
- Create Data Flows, a feature of Azure Data Factory (ADF) that allows graphical data transformation logic to be developed and executed as activities within ADF pipelines.
- Create Azure Data Lake and Azure Analytics resources and databases in the Azure portal, providing data storage as a service for big data computation.
- Team player and self-starter with excellent communication, coordination, documentation, project planning, and interpersonal skills; responsible and accountable.
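The ETL data-mapping deliverables mentioned above can be illustrated with a minimal sketch. The column names and the mapping below are invented for illustration, not taken from any client project:

```python
# Minimal sketch of applying a source-to-target column mapping during an ETL step.
# The mapping and field names are hypothetical examples.

SOURCE_TO_TARGET = {
    "cust_no": "customer_id",
    "cust_nm": "customer_name",
    "open_dt": "account_open_date",
}

def apply_mapping(source_row: dict, mapping: dict) -> dict:
    """Rename source columns to their target names, dropping unmapped columns."""
    return {target: source_row[source]
            for source, target in mapping.items()
            if source in source_row}

row = {"cust_no": 1042, "cust_nm": "Acme Ltd", "open_dt": "2015-03-01", "junk": "x"}
print(apply_mapping(row, SOURCE_TO_TARGET))
```

In practice the same mapping document also drives validation, since unmapped source columns are silently excluded from the target.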
TECHNICAL SKILLS:
Technical Environment: MS Office 2007 Suite (Visio, Word, Excel, MS Project), Lotus Notes, MS SharePoint
Business/Modeling tools: Erwin, Sparx Enterprise Architect, ER/Studio, PowerDesigner, Proforma, Rational Rose, RUP & Designer
Databases: Oracle/Oracle EBS, MS SQL Server, Azure Data Lake, Azure Data Lake Analytics, DB2, Teradata & Sybase, IDMS, DBS/R
ETL/Migration: ETI, SAS DI, SSIS/DTS, Databricks/Spark, ADF (Azure Data Factory), Data Flow, Informatica PowerCenter 8.5, RPD, OBIA, ODI - Oracle Data Integrator (Fusion Middleware), DataStage; exposure to Talend & IBM InfoSphere
Business Intelligence tools: SQL*Loader, SSAS/SSRS, Power BI, OBIEE EM, Cognos, Business Objects, Crystal and Excel reporting, Tableau, exposure to Domo; GUI tools: WinSQL, TOAD, ISPF & SQL Navigator
CRM/Applications: Siebel, PeopleSoft, Oracle Applications, Lawson Applications, Manugistics; exposure to MS Dynamics and SAP modules
Operating Systems: MS-DOS, Windows XP/Vista/7, Unix/Linux/AIX, IBM OS/TSO; exposure to COSMOS
Project Methodologies/Frameworks/Concepts: Agile SDLC/Scrum/Iterative/Spiral, Waterfall, SSADM, RAD, BPMN, BPEL, Taxonomy, Enhancement & Maintenance (EMM), Zachman Framework, TOGAF, Design Patterns, UML, Rational Unified Process (RUP), MDM (Master Data Management)/MDS & DQS; exposure to Six Sigma
Programming Languages: PL/SQL, T-SQL, SQL, U-SQL, Base SAS, Fortran, COBOL, C, C++, Pro*C, Java 2, VBA, JavaScript, HTML/DHTML, Python, XML; exposure to Pascal & NoSQL
Testing Tools: Manual testing, LoadRunner, WinRunner for GUI testing; exposure to Selenium
Version Control: PVCS, Continuus
Technology: Desktop, Microsoft, Microsoft Azure Portal, .NET, Oracle, Teradata & IBM mini/mainframe
Industry-Specific Models: IAA (Insurance), FSLDM (Financial), IFW Framework (IBM's BDW model specific to financial institutions)
Servers: IIS, web servers/Oracle Web Server, IBM servers/enterprise servers; SOAP, API, cloud computing
Requirement Analysis tools: Rational Requisite Pro
Defect/Test Tracking Tools: Rational ClearQuest, Test Director; exposure to SOAP/JSON/REST & JIRA
Enterprise Security: Protocols, controls, and best practices; exposure to Netezza, Hadoop, MapReduce, Hive, Kafka, Sqoop, IBM Information Analyzer
PROFESSIONAL EXPERIENCE:
Confidential
Business Analyst/Business Intelligence Analyst
Responsibilities:
- In this assignment, revisited the insurance domain: gathered requirements for various insurance lines such as Auto, Property, Heavy Equipment, Genesis Liability, Alberta Housing, and Kindergarten, validated them with subject matter experts (SMEs), and reviewed all business requirement/spreadsheet documents to build a high-level understanding of the business processes.
- Reviewed various insurance documents and validated all transactions and data in the line-of-business (LOB) spreadsheets against their respective processes and data elements.
- Created data mappings for each line of business (LOB) with defined primary keys and natural/business keys, and created the organization structure using key fields such as core-member and non-core-member structures, in the Figtree application delivered as Software as a Service (SaaS).
- Maintained data sources (spreadsheets for each line of business) and migrated the spreadsheet data for each LOB into the back-office target databases using the generic load processes of the back-office applications as the ETL flow, following the data mappings I defined for each LOB. Also helped the client create insurance templates in the back-office application for printing during the renewal process, and created use cases for each LOB.
- Created the document management process for the storage, retrieval, and printing of these insurance documents.
Confidential
Data Architect
Responsibilities:
- In this assignment, reviewed Oracle PeopleSoft (Payroll, Student Registration & Finance) and other functionalities, validating all business requirement documents against the KPIs identified by the business analyst. Initially a two-month contract, later extended by five months.
- Reviewed all KPIs with business stakeholders and validated each KPI matrix against its respective processes and data elements.
- Designed and created a dimensional matrix of all KPIs required by business stakeholders, with data mappings and defined primary and natural/business keys. Also created the ETL conceptual design and a data migration strategy.
- Identified data sources for each dimensional matrix, with proper granularity defined for each fact table (KPIs as output) and all sources defined for each functional domain. Created source data mapping documents for the respective functionalities. Created conformed Date, Time & Project dimensions using T-SQL stored procedures in SQL Server 2016. Performed root cause analysis, proactively recommended reporting improvements using Power BI, and created Excel reporting processes to answer the KPIs the client had documented.
- Created conceptual/logical data models of an enterprise warehouse for each KPI functional area and validated the data marts and MDM (MDS/DQS) models with the business stakeholders who knew the data best; also performed data governance for all conformed dimensions in the project, with ETL implementation using SSIS and SSAS.
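The conformed-date-dimension work above was done in T-SQL; a hypothetical Python equivalent sketches the same idea (column names are illustrative):

```python
# Hedged sketch: generating rows for a conformed Date dimension,
# analogous to a T-SQL stored-procedure approach. Column names are illustrative.
from datetime import date, timedelta

def build_date_dimension(start: date, end: date) -> list[dict]:
    """One row per calendar day, with a surrogate key in YYYYMMDD form."""
    rows = []
    d = start
    while d <= end:
        rows.append({
            "date_key": int(d.strftime("%Y%m%d")),   # surrogate key
            "full_date": d.isoformat(),
            "year": d.year,
            "quarter": (d.month - 1) // 3 + 1,
            "month": d.month,
            "day_of_week": d.isoweekday(),           # 1 = Monday
            "is_weekend": d.isoweekday() >= 6,
        })
        d += timedelta(days=1)
    return rows

dim = build_date_dimension(date(2016, 1, 1), date(2016, 1, 7))
print(len(dim), dim[0]["date_key"])
```

Because every fact table joins to the same pre-generated rows, date attributes stay consistent across all data marts, which is the point of a conformed dimension.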
Confidential
Analyst/Database Analyst
Responsibilities:
- In this assignment, reviewed P6, EcoSys, ARM/Risk, and Tempus functionalities, validating all business requirement documents for these KPI functionalities with the business analyst. Initially a two-month engagement, extended for about two more months before the project scope shifted to another technology.
- Reviewed KPIs with business stakeholders and validated the KPI matrix against its respective processes and data elements.
- Created a dimensional matrix of all KPIs required by business stakeholders, with data mappings and defined primary and natural/business keys. Also created the ETL conceptual design and a data migration strategy.
- Provided data sources as a data vault for each dimensional matrix, with proper granularity defined for each fact table (KPIs as output) and all sources defined for each functional domain. Created source data mapping documents for the respective functionalities. Created conformed Date, Time & Project dimensions using T-SQL stored procedures in SQL Server 2012. Performed root cause analysis, proactively recommended reporting improvements, created Excel reporting processes to answer the documented KPIs, and prepared BI reporting using Power BI.
- Created conceptual/logical data models of an enterprise warehouse for each KPI functional area and validated the data marts/models with the business stakeholders who knew the data best; also performed data governance for all conformed dimensions, with ETL implementation using Informatica PowerCenter 9/10.
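The data vault sourcing mentioned above relies on stable keys for hubs and links; a common pattern (shown here as a hypothetical sketch, not this project's actual code) is hashing the normalized business key:

```python
# Hedged sketch: deriving a Data Vault hub hash key from a business key,
# a common Data Vault 2.0 loading pattern. Key values are illustrative.
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """MD5 over the delimited, normalized business key."""
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key yields the same hub key regardless of whitespace/case:
print(hub_hash_key("proj-001", "p6") == hub_hash_key(" PROJ-001 ", "P6"))
```

Normalizing before hashing is what lets loads from different source systems land on the same hub row.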
Confidential
Technical Analyst/Data Engineer
Responsibilities:
- In one assignment, reviewed release documents for the deployment of OBIEE components to various environments and for the migration of ODI (Oracle Data Integrator) components developed in the development environments. Initially a one-month contract, extended until completion.
- Migrated catalogs and XML files to OBIEE Analytics, and the RPD to OBIEE EM, across the Development, Quality, and Production environments.
- Integrated EBS modules into the data warehouse target, created data mappings in ODI, created data lineage, and validated all mappings with team members for quality checks.
- Moved all required source data into the staging area and data warehouse using the SDE, SIL, and PLP phases of Oracle ODI via the Designer tab. Troubleshot ODI failures using the Operator tab, handled database schema changes using the Topology tab, and maintained ODI object security; validated the staged data sets created during project execution with the client's consultant who knew the data/subject area best, and performed data quality checks using manual testing.
- In another assignment, completed certification in the 'Information Security' and 'Management of Information' e-courses at the client's location. Delivered the insert/update/delete operations coded in C# for all interface documents handed over to me by the end client/team lead.
- Created data mapping documents for moving NAICS/NOCS information into the defined target schema. Identified NAICS/NOCS data sources and created DDL and SQL scripts to move the data into the target schema.
- Created data mappings after analyzing the source and target data sets and validated all mappings with subject matter experts.
- Moved all required source data into the staging area using SQL Server SSIS, validated the staged data sets with the client's consultant who knew the data best, and performed data governance for the SAS MDM initiative.
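The staging-area validation described above typically boils down to a few mechanical checks. Here is a hedged sketch of that kind of post-load check; the field names and sample data are illustrative, not from the actual project:

```python
# Hypothetical post-load staging checks: row-count reconciliation,
# required-key completeness, and key uniqueness.

def quality_checks(source_rows, staged_rows, key):
    """Return human-readable data-quality findings (empty list = clean load)."""
    findings = []
    if len(source_rows) != len(staged_rows):
        findings.append(
            f"row count mismatch: {len(source_rows)} source vs {len(staged_rows)} staged")
    missing = sum(1 for r in staged_rows if not r.get(key))
    if missing:
        findings.append(f"{missing} staged row(s) missing required key '{key}'")
    keys = [r[key] for r in staged_rows if r.get(key)]
    if len(keys) != len(set(keys)):
        findings.append(f"duplicate values in key '{key}'")
    return findings

source = [{"code": "11"}, {"code": "21"}]
staged = [{"code": "11"}, {"code": "11"}, {"code": None}]
for finding in quality_checks(source, staged, "code"):
    print(finding)
```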
Confidential
Data Modeler/Data Engineer
Responsibilities:
- Learned the business and technical culture of the company during onboarding in San Francisco, California (walkthrough of document management and business processes). Trained in the open-source Apache Hadoop file system. This assignment was two months, later extended by five months. Worked with Azure Data Lake and Azure Analytics using U-SQL in the Azure portal.
- Prepared for clients' technical/business requirements and prepared data/process flow diagrams to understand the various assignments of prospective clients where the company planned to place me for consulting work.
- Reviewed various projects and their architectures and evaluated their requirements to determine the contribution needed on each project once selected. Created in-house business requirement documents, conceptual/logical data and process models, and the respective architecture documents for various projects.
- Understood the data flow diagrams (DFDs) of various processes in the company and their data warehouse application needs.
- Planned architecture documents with conceptual and logical data models using the Erwin modeling tool, along with their target implementations.
Confidential
Solution Designer
Responsibilities:
- As a Solution Designer, reviewed the reporting requirements already gathered under the 'LEGO' project and met with subject matter experts to understand the 'as is' and 'to be' requirements for the defined project work, including the target schema (multidimensional data model). The contract was budgeted at three months to meet the client's project plan.
- Proactively created and automated reports and intelligence via Tableau to drive improvements across the business.
- Created a package deck covering the use case/story for all scoped reports in general and for the LEGO initiative program (Ontario telecom regulatory reporting needs). The LEGO reporting use case was reduced to just three reports; analyzed and validated the data feeds for these reports in the target Oracle database and identified their data sources/objects in the Super System (SS) legacy mainframe.
- Used the Informatica metadata server to define data lineage from the various SS (Super System) sources to the Oracle target. Identified data mappings and SDS documents (high-level documents feeding the DDD, the detailed design document), and had the ETL design signed off for implementation using Informatica PowerCenter 9 as the data migration activity.
- Also involved in the Source Cable data migration, reviewing the PL/SQL code/logic for the migration from the legacy SS system to the Oracle target. Modified the SDS (System Design Specification), the data mapping/data lineage, and the DDD (Detailed Design Document) per the solution defined for the Source Cable migration task.
Confidential
Solution Designer
Responsibilities:
- As a coordinator, ran requirements gathering sessions to understand the 'as is' and 'to be' hardware and software infrastructure of various tracks such as the TEST, TEST2, PPE, PSE, PROD (SEO & MAN), and DEV environments. Analyzed and modified the deployment schedule for the various environments, planned the application releases to be deployed, and documented and created the plan for deploying software (application code and system software) and hardware across environments.
- Created scope documents, gap analysis documents, and impact lists based on decision requests (DRs) collected during meetings with business team members/subject matter experts (SMEs); understood their legacy interfaces such as CPS (brokerage), SAS (financial accounting), and WinFund (clients), with their data sources feeding ODS IGFS (mutual funds) and ODS IGSI (stocks), ultimately targeting the CAI databases (multidimensional data model), which serve downstream applications such as Pathway, CAV, and a reporting DB server application using Power BI.
- In another project, as a data modeler/integration consultant, reviewed the requirements already gathered, understood the 'as is' and 'to be' requirements in the industry, analyzed the 'as is' data models, and created a conceptual model around the critical data elements (base elements and their dependent elements) identified during the data harmonization process; helped create databases in Microsoft Azure environments using U-SQL.
- Included the conceptual model image in the critical data requirements package deck (containing the business glossary, data definitions, and business usage) created by the business/technical analysts in discussion with subject matter experts (SMEs), and helped create Azure Data Lake and Azure Analytics resources in the Azure portal with the respective databases.
- Connected data from multiple sources to a central database repository, performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes.
- Created a package deck covering the use case/story of the business glossary for the data elements in use, their business lineage (governing rules/data governance), and their data lineage (transformations, data rules, and processes for the source/target data domains) for the RDARR initiative program (Risk Data Aggregation & Risk Reporting), along with a taxonomy of the RDARR data element domains.
- Used and understood FSLDM models and built equivalent conceptual models around the CDEs (critical data elements) and their dependent data elements used in the selected critical enterprise reports. Created normalized logical models using FSLDM template models.
Confidential
Data Analyst/DBA
Responsibilities:
- As a consultant, ran requirements gathering sessions to understand the 'as is' and 'to be' data requirements, performed data analysis, understood process flows and their semantic/conceptual models, and used the Zachman Framework to perform a data gap analysis of the Resource Connection Store (RCS) and Historical Parameter Store (HPS) database domains, including identification of the correct data sources feeding RCS and HPS. The engagement was three months, later extended for another three.
- Documented the analysis plan, performed data analysis, and created the data audit document for RCS, HPS, and their upgrade. Used the data harmonization process to define the business data glossary, data quality rules, and data mappings for data elements within the Zachman Framework.
- Understood all data development and integration initiatives using ETL and BI tools, including data analytics and data warehousing (multidimensional data modeling from a data migration point of view).
- Reviewed the existing ETL stored procedures, with their identified data sources and landing schemas defined in RCS and HPS, and analyzed their data mappings/data lineage against the analysis requirements.
- Supported and maintained the existing applications and ETL jobs with minor or no changes.
- Created resources-to-services, resources information, and services information conceptual models after using the Zachman Framework to analyze the identified data gaps.
- Analyzed and designed against the end client's business requirements to deliver the main enterprise-wide data audit task.
Confidential
Data Warehouse/BI Analyst
Responsibilities:
- As a BI Analyst/Data Integration Analyst, ran requirements gathering sessions, performed analysis, and created process and data flows for the insurance functionality. Connected data from multiple sources to a central database repository such as SQL Server. Performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes to answer the client's KPIs. The engagement was one year, later extended by four months.
- Created and modified logical data models and DDL, and created project plans for the data warehouse design and its implementation. Helped with data conversion projects, consolidating data from the ESIS, Optimal, and Lawson financial databases running the customer insurance and Lawson financial applications.
- Created various data marts derived from the consolidated data repository to support the company's financial and business reporting requirements; analyzed, designed, developed, tested, and documented ETL programs using SSIS/T-SQL stored procedures and SSAS, with exposure to SSRS.
- Supported and enhanced the existing ETL processes, ensured they met performance and operational requirements, supported other projects tied to Optimal database changes, and assessed the impact on the data marts/data warehouse and BI reports for each application release. Prepared test plans and test cases for all required changes.
- Performed data management and governance for the data management initiative program: creating the business data glossary with business lineage, data quality rules, metadata documentation, and data lineage mappings for all sourced data elements used in the insurance Optimal applications and the Lawson financial package domains.
Confidential
Architect/BI Analyst
Responsibilities:
- Created data models in both OLTP and OLAP environments, along with their integration processes, to address the business requirements gathered for data warehouse needs in medical insurance/Medicaid migration. The engagement was two months, later extended by four.
- Helped complete the data consolidation and data delivery cycles using Oracle, SQL Server, and SSIS/SSRS/SSAS as the data integration, reporting, and BI tools. Applied the change management process learned during project implementation.
- Created various cubes for analytical reporting requirements, and executed and maintained the ETL data migration solution. Performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes. Explored databases, servers, and storage to meet the company's analytics requirements, applying my Facets expertise.
- Evaluated Netezza from the company's investment point of view, determining the right combination of database, servers, and storage to optimally meet the analytics requirements.
Confidential
Data Analyst/Architect
Responsibilities:
- Created Business Requirement Documents (BRDs) for data migration and for changes to the data requirements of various functionalities in the health/Medicaid migration sector.
- Modified various healthcare business processes to meet the needs of medical practitioners and patients, and created data mapping/data lineage documents to support data migration using various data integration tools and reporting using various BI tools.
- Applied the change management process, performed root cause analysis, proactively recommended reporting improvements, and automated Excel reporting.
- Analyzed, designed, developed, tested, and documented ETL programs. Connected data from multiple sources to central database repositories such as DB2 and SQL Server.
- Performed all data integration initiatives using ETL and BI tools such as DataStage and SAS DI, including analytics using Cognos/SSRS, SSAS, and the DAX query language; designed data warehousing, application migration/consolidation, and information delivery; and provided Facets expertise.
- Performed MDM analysis and design against the end client's business requirements, arriving at data profiling activities using the company's centralized data repository, an information governance catalogue supporting data management processes such as the business data glossary, data quality rules, metadata documentation, and data lineage mapping, using SQL Server technologies with exposure to MDS.
- In another project, learned Tina, the main product (document management and business processes such as records management, BOM, etc.) for aerospace/defense, oil & gas, and geomatics information needs.
- Supported and maintained the existing ETL processes through the scheduler.
Confidential
Solution Designer
Responsibilities:
- Documented the SDS (System Design Specification) and SRS (System Requirements Specification), reverse engineered the existing PL/SQL code, and modified the SDS and SRS to match the code. Planned activities within the software development life cycle (SDLC), allocated resources and IT tasks, and created and executed the IT project plan.
- Tested the functional modules per the test plans and test cases I created for each business/functional requirement. At times performed regression testing of application changes driven by business needs.
- In another contract with Alberta Finance, applied the change management process learned during project implementation. Learned the oil & gas/geomatics information needs. Created a CDO model with the listed prices at various gas stations as data elements, and generated an XML schema from the CDO model of the listed prices coming from the gas stations.
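The model-to-XML step above can be sketched with a minimal, hypothetical example; the element names and price data are invented for illustration, not from the actual CDO model:

```python
# Hedged sketch: serializing a simple price model (akin to a CDO model of
# gas-station listed prices) to XML. Element names are illustrative.
import xml.etree.ElementTree as ET

def prices_to_xml(stations: list[dict]) -> str:
    """Build a ListedPrices document with one Station element per entry."""
    root = ET.Element("ListedPrices")
    for s in stations:
        el = ET.SubElement(root, "Station", id=str(s["id"]))
        ET.SubElement(el, "Location").text = s["location"]
        ET.SubElement(el, "PricePerLitre").text = f'{s["price"]:.3f}'
    return ET.tostring(root, encoding="unicode")

xml_doc = prices_to_xml([{"id": 1, "location": "Edmonton", "price": 0.989}])
print(xml_doc)
```

Once the structure is fixed like this, an XML schema (XSD) describing it can be derived and shared with downstream consumers.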
Confidential
Data Designer/Engineer
Responsibilities:
- As an Integration Analyst, gathered financial business requirements covering bonds, stocks, mutual funds, and retail banking, created the Business Requirement Documents (BRDs), and obtained their approval.
- Defined the roadmap for data integration projects and reverse engineered existing reports to identify the correct sources.
- Performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes supporting data management business needs.
- Identified the correct sources for data integration activities and created data mapping/data lineage documents. Helped implement the data mappings/ETL processes using PowerCenter and IBM InfoSphere as the integration tool sets.
- Gathered reporting needs and used BI tools to fulfill them. Applied the change management process learned during project implementation. Connected data from multiple sources to a central database repository such as Oracle.
- Implemented the data mappings using PowerCenter 8 for the entire data integration team, using real-time integration, advanced data transformation, and data virtualization.
Confidential, IL
Data & Process Modeler
Responsibilities:
- As an Analyst/Data Modeler & Process Modeler, created Business Requirement Documents (BRDs) after discussions with subject matter experts (SMEs) across various business units in the insurance sector.
- Ran the requirements gathering process and identified the processes, with their respective data elements, needed to fulfill the business requirements.
- Created process models using BPMN, the respective data models, and a process-entity matrix.
- Created models using various tools and submitted all process and logical models for approval by business stakeholders/SMEs. Designed taxonomies to meet the analytics requirements.
- Implemented data mapping/data lineage using SSIS for the entire data integration team, using real-time integration and advanced data transformation.
- Performed MDM analysis and data design for various subject domains in the insurance sector, creating a data catalogue for each domain; the solution supports data management processes such as the business data glossary, data quality rules, metadata documentation, and data lineage mapping.
Confidential
Data Modeler/Engineer
Responsibilities:
- Learned retail industry processes (including MM, PM, and Sales Order) and the business setup by reading documentation.
- Created a geographical conceptual model and process models using BPMN.
- Validated all models with subject matter experts and domain team members.
- Created/modified interface specifications and data mappings/data lineage. Created logical/warehouse models as a centralized repository with metadata definitions conforming to information governance catalogue processes, supporting data management processes such as the business data glossary, data quality rules, metadata documentation, and data lineage mapping, using a modeling tool. Connected data from multiple sources to central database repositories such as SQL Server and Oracle.
- Performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes for the respective reporting needs.
- Performed all data integration/migration/ETL initiatives using packages, including analytics, data warehousing, consolidation, data governance, and movement of data from legacy to relational technologies.
Confidential
Designer/BI Analyst
Responsibilities:
- Created the ETL process and design; learned the business processes using Siebel applications as sources.
- Identified the reporting needs of the medical instrument manufacturing industry, identified the data sources built in the Siebel database layer, and created staging areas for all Siebel sources using Oracle PL/SQL stored procedures.
- Created data mapping/data lineage documents and implemented them using Oracle data integration tool sets, programming, and analytics technologies. Connected data from multiple sources to a central database repository such as Oracle. Created various reports to help management forecast and make sound decisions in the company's interest. Created universes for various BI reporting needs.
- Performed root cause analysis, proactively recommended reporting improvements, and created Excel reporting processes.
- Delivered enterprise data integration and metadata management solutions using packages, along with data governance, data quality, data synchronization, and data warehousing techniques.
- Conceptualized MDM (master data management) processes while creating a central data repository with an information governance catalogue supporting data management processes such as the business data glossary, data quality rules, metadata documentation, and data lineage mapping, plus the definition of its data consolidation (ETL) processes.
Confidential
Technical Business Analyst
Responsibilities:
- Learned the wireless/wireline telecommunication processes (billing, event processing, and provisioning) defined using Siebel, along with their data requirements defined in the middle layers and mapped to the Siebel back-end data layer.
- Created the conceptual ETL design and implemented it using data mapping/data lineage documents as input.
- Validated all existing data models in the Siebel data layers, along with their business components in the middle layer, ensuring proper data mapping across the UI, middle, and data layers of Siebel Analytics/OBIEE using the company's identified data sources. Created the universe relevant to the BI reporting task.
- Performed data quality checks and created processes to remove any data/process issues occurring in Siebel order processing, with all issues captured in Remedy.
- Created reports for all captured troubleshooting tickets. Performed data scrubbing activities to remove any data errors.