
Business Data Architect / Business Data Reporting Analyst/ Business Systems Analyst Resume


Sunnyvale, CA

PROFESSIONAL SUMMARY:

  • Extensive experience with Business Intelligence, Data Warehousing, ETL processes, and BI reporting projects.
  • Extensive experience in Business Analysis and Database Analysis.
  • Extensive experience with Data Analysis, Data Quality, Data Transformation, and Data Integration.
  • Extensive experience writing ETL (Extraction, Transformation, and Load) processes to capture, transform, and load company and enterprise data from various applications into data warehouses.
  • Did extensive work interacting with users, verifying and documenting business processes, clarifying them with users when needed, and getting them implemented by developers.
  • Did extensive work in Data Engineering, Data Analysis, and reverse engineering SAS programs (using SAS procedures and SAS macros, e.g. getdrgs, callmod, memdata).
  • Participated in User Acceptance Testing, verifying the business processes and rules being delivered to the users.
  • Extensive work as a developer in PL/SQL, SQL, SQL*Plus, PRO*C, C, and Unix shell scripting on Oracle 7.3, 8i, 9i, and 10g in UNIX and Windows NT environments.
  • Experience with the ETL tool Informatica PowerCenter 7.1 and Informatica Data Explorer (IDE/IDQ).
  • Extensive experience developing, enhancing, and maintaining ETL (Extract, Transform, Load) processes for data marts.
  • Good knowledge of tools and techniques for Business Intelligence, Data Warehousing, Electronic Banking, E-Commerce, and marketing data warehouses.
  • Did reconciliation of data between Sun's SunAware Data Warehouse and Oracle 11i ERP Applications (AR, OE & GL).
  • Supported and enhanced the marketing data warehouse systems of Hewlett-Packard and interacted with business users on business requirements and issue resolution.
  • Extensive experience gathering, clarifying, and verifying requirements with the business, both face-to-face and in virtual meetings.
  • Have excellent interpersonal, verbal, and written communication skills.
  • Ability to work independently as well as in a team.

TECHNICAL SKILLS:

LANGUAGES: PL/SQL, SQL, SQL*Plus, ORACLE*Forms 3.0, ORACLE*Forms 2.5, PRO*C, Unix shell scripting, Perl, C, Oracle Forms 4.5 and Forms 6.0, Reports 2.5, Reports 6.0, and the Export, Import, and SQL*Loader utilities.

SOFTWARE: Metavance (healthcare system), TERADATA, Oracle 10g, Oracle 9i, Oracle 8i, Oracle Enterprise Manager 2.0.4, ORACLE 7.3/7.1/7.0, Oracle Developer/2000, Designer/2000, Business Objects, UNIX shell scripts, Informatica 7.1, Visual Basic 3.0, PVCS Tracker, VSS (version control software), Erwin, SSRS, System ESS (an ERP package), BRIO (a reporting tool for data warehouses), Oracle Internet Application Server (IAS), BugTraq (a tool to track bugs and their status during reconciliation), SQA (automated testing tool), SQL Navigator, Hadoop/Hive, Tableau, TOAD (Tool for Oracle Application Developers, a database development tool), SQL Developer, Informatica Data Explorer (IDE).

OPERATING SYSTEMS: Windows NT Workstation 4.0, Windows NT Server, HP-UX, Windows 95/98, Sun Solaris 2.5.1, UNIX (server), and Sun Solaris Release 5.8.

PROFESSIONAL EXPERIENCE:

Confidential, Sunnyvale, CA

Business Data Architect / Business Data Reporting Analyst/ Business Systems Analyst

  • Understood the current manual process for generating individual attribute data reports and for combining them into a consolidated attributes report, as well as the current manual data quality process and the data quality analytical reports it produced.
  • Automated the manual attributes flat file creation process through dynamic processing.
  • Automated the manual process of data quality check rule processing and generation of individual rule-violation files through dynamic processing.
  • Integrated all existing rules into the dynamic automated process.
  • Published the attributes file and rules files to SharePoint automatically through a batch process.
  • Automated the manual process of creating individual AddParts data quality check rule report files.
  • Added a new automated process for AddParts attributes flat file generation.
  • Identified new data quality rules in meetings with business users/configuration analysts.
  • Analyzed APC data based on the attribute functionality given by the user, identifying new quality check business rules per user input.
  • Identified data quality issues and their causes; discussed issues with users and proposed data quality check rules to highlight impacted items.
  • Presented the data to users in a meaningful, understandable visual Excel format.
  • Verified the newly identified business rules with the business/CAs.
  • Integrated the newly identified and validated rules into the automated process for generating the analytical report.
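
The rule-driven automation described above can be sketched as a metadata-driven check: each data quality rule is stored as a SQL predicate and applied dynamically. This is an illustrative sketch only; the table and column names (dq_rules, dq_violations, item_id) are hypothetical, not from the actual system:

```sql
-- Hypothetical rule metadata table: each row holds one data quality rule
-- as a SQL predicate to be applied against an attributes table.
CREATE TABLE dq_rules (
  rule_id    NUMBER PRIMARY KEY,
  rule_name  VARCHAR2(100),
  target_tab VARCHAR2(30),
  predicate  VARCHAR2(4000)   -- e.g. 'part_weight IS NULL'
);

-- Dynamically apply every rule and log violating rows per rule.
DECLARE
  v_cnt NUMBER;
BEGIN
  FOR r IN (SELECT * FROM dq_rules) LOOP
    EXECUTE IMMEDIATE
      'INSERT INTO dq_violations (rule_id, item_id) ' ||
      'SELECT :1, item_id FROM ' || r.target_tab ||
      ' WHERE ' || r.predicate
      USING r.rule_id;
    v_cnt := SQL%ROWCOUNT;   -- rows flagged by this rule
    DBMS_OUTPUT.PUT_LINE(r.rule_name || ': ' || v_cnt || ' violations');
  END LOOP;
  COMMIT;
END;
/
```

New rules identified with the business can then be integrated by inserting a row into dq_rules, with no code change to the batch process.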

Environment: ORACLE 11g, SQL, PL/SQL, SQL Developer, TOAD, APC (Oracle Advanced Product Catalog), MS Excel, Visio, PowerPoint, QuoteEdge (quoting tool), Windows Task Scheduler, SharePoint (for publishing analytical reports), ALM (for issue and enhancement tracking).

Confidential, San Jose, CA

Business Intelligence Reporting Analyst/ BI Systems Analyst / BI Data Analyst

  • Understanding the current BRIO dashboards and the DPA and staging jobs to understand the process and flow of SIP system data.
  • Analyzing and interpreting data issues in the Brio dashboard for unmapped quotas reported by users, and tracing the issues back to their root causes.
  • Analyzing and interpreting data issues in the Brio dashboard for the Global Sales Summary reported by users and finding the root causes of the issues.
  • Creating pivots in Brio to analyze and investigate data issues in the Brio reports.
  • Discussing the identified data issues with users and finding resolutions together with them.
  • Gathering requirements from users and preparing the business requirements document for the OE BBB Summary report enhancement.

Environment: ORACLE 11g, SQL, PL/SQL, SQL Developer, MS Excel, Visio, PowerPoint, BRIO (reporting), JIRA (issue and project tracking tool), DPA (Decision Point Warehouse Administrator).

Confidential, South San Francisco, CA

BI Reporting Analyst / Sr Data Analyst / Business Systems Analyst

  • Identifying, Analyzing and interpreting the patterns of data issues in the complex data sets of various modules of GPRS system.
  • Finding the technical solution and business logic for fixing the missing or incorrect data issues identified.
  • Writing & maintaining the Data Mappings and Transformation logics to fix missing and incorrect data.
  • Writing Oracle SQL and PL/SQL scripts to implement the logic to fix the data issues and enhance the data quality.
  • Assisting programmers with writing ETL logic and resolving any data or business process questions they have.
  • Enhancing Tableau reports and dashboards by analyzing their queries and modifying them per business needs.
  • Creating ad hoc Oracle data reports for presenting and discussing data issues with the business.
  • Addressing data issues identified in GPRS data, clarifying the fix strategy with the business, and presenting complex data findings in an understandable manner for the business.
  • Working with business users to identify root causes of data gaps and negative data patterns, and developing corrective actions accordingly.
  • Coordinating with the validation team and providing technical details to compile Incident Reports for the identified data issues and their fix logic.
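
A data fix of the kind described above is commonly implemented as a MERGE driven by the documented mapping. This is a generic, hedged sketch; the table and column names (gprs_registration, gprs_fix_mapping, corrected_status) are hypothetical stand-ins for the actual GPRS objects:

```sql
-- Sketch: correct missing or incorrect values in a target table from a
-- reference (mapping) source, per a documented transformation rule.
MERGE INTO gprs_registration tgt
USING (SELECT item_id, corrected_status
       FROM   gprs_fix_mapping) src
ON (tgt.item_id = src.item_id)
WHEN MATCHED THEN
  UPDATE SET tgt.status = src.corrected_status
  -- only touch rows that are actually missing or wrong
  WHERE tgt.status IS NULL
     OR tgt.status <> src.corrected_status;
COMMIT;
```

Keeping the corrections in a mapping table makes the fix auditable and repeatable, which helps when compiling Incident Reports for the validation team.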

Environment: ORACLE 11g, SQL, PL/SQL, TOAD, ERStudio Data Architect 8.5, MS Excel, Visio, Tableau, Power point, Business Objects(Reporting), TRUmigrate (ETL tool).

Confidential, Pleasanton, CA

BI Reporting ETL Analyst /Sr BI Data Analyst

  • Addressing issues reported by business users in the standard dashboard reports, and analyzing them to identify their causes.
  • Getting reporting issues resolved by the developers after identifying whether the root cause is report-related or ETL-related.
  • Creating ad hoc data extracts/reports per user needs.
  • Performing data validation, data cleansing, and data reconciliation between the ERP and OWS environments in production for various functional areas such as AP, GL, Inventory, and SC.
  • Investigating and analyzing any discrepancy found between these environments and getting it fixed in the code by the developers.
  • Reconciling and validating data between the OWS and MDW environments in production for various functional areas such as AP, GL, Inventory, and SC.
  • Notifying users of any discrepancy between these environments so that they are aware of its effect on the Cognos reports.
  • Validating and performing ongoing reconciliation of data between ERP and MDW, addressing any gaps identified, finding their causes, and getting them resolved and fixed in the code.
  • Cognos is used for creating the reporting dashboards.
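
Reconciliation between environments of the kind described above usually compares aggregates side by side. A minimal sketch, assuming hypothetical GL tables (erp_gl_lines on the ERP side, mdw_gl_fact in the warehouse); only accounts with a variance are returned:

```sql
-- Sketch: reconcile GL totals between the ERP source and the warehouse.
-- A FULL OUTER JOIN also surfaces accounts present on only one side.
SELECT NVL(e.account, w.account)                  AS account,
       e.total_amt                                AS erp_amount,
       w.total_amt                                AS dw_amount,
       NVL(e.total_amt, 0) - NVL(w.total_amt, 0)  AS variance
FROM   (SELECT account, SUM(amount) AS total_amt
        FROM   erp_gl_lines
        GROUP  BY account) e
FULL OUTER JOIN
       (SELECT account, SUM(amount) AS total_amt
        FROM   mdw_gl_fact
        GROUP  BY account) w
ON     e.account = w.account
WHERE  NVL(e.total_amt, 0) <> NVL(w.total_amt, 0);
```

The same pattern applies per functional area (AP, Inventory, SC) by swapping in the corresponding source and warehouse tables.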

Environment: ORACLE 10g, SQL, SQL Developer, MS Excel, Visio, Cognos, Cognos Query Studio, Informatica, PeopleSoft EPM.

Confidential, San Francisco, CA

Lead Business Intelligence Data Analyst / Business Intelligence Reporting ETL Analyst

  • Projects: RSM, TMR, Specialty Benefits.
  • Working as Senior Data Analyst / Lead ETL Analyst for the implementation of Enterprise Data Warehouse (EDW) and Trend Monitoring Reports (TMR).
  • Leading a team of five people to reverse engineer the current TMR system's SAS code (using SAS procedures and SAS macros, e.g. getdrgs, callmod, memdata) and to carry out the Current State Analysis of the TMR application.
  • Reverse engineered the existing TMR system, which consists of SAS programs, Oracle data marts, MS Access programs and databases, and final Excel reports.
  • Identified and explored all other source needs beyond the data marts, such as corporate reserve data, completion factor data, and CAC data, which are provided by other departments and used to generate the current TMR reports.
  • Conducted JAD Sessions with the Business Users and Technical Owners to review the Reverse Engineered Outputs and Current State Analysis Outputs, Flow Diagrams and all identified other sources etc. to verify the correctness of our understanding of the current system and get the Sign off from them.
  • Identifying the business logic in the currently running SAS programs and getting it verified and reviewed by the business users and technical owners, to be used for ETL mappings.
  • Working on ETL Mapping documents, from EDW as the new implemented source to TMR reports with the identified Business Rules in SAS programs.
  • Working on source to target mappings for EDW and understanding EDW model by interacting and working closely with Data Modelers and SMEs and participating in Logical / physical Data Model sessions/reviews.
  • Understood the existing Data Model for the source data marts in order to integrate them and map them to the new implemented EDW.
  • Analyzing healthcare-related key subject areas - Claims, Capitation, Member, Group, Subgroup, Provider, Product, HMO, PPO - by examining and analyzing the sources and source data. Analyzing data on large databases, as the databases holding data for these key areas are large.
  • Having working sessions with Data Modelers to review new EDW Models for above mentioned subject areas and meetings with SMEs to understand existing IA datamarts in order to map the IA marts to new EDW source to target.
  • Led the effort of analyzing ETL business rules from the existing Rosetta data marts to the newly designed Enterprise Data Warehouse (EDW).
  • Creating ETL mapping documents from the Rosetta data marts, as source, to the newly modeled Enterprise Data Warehouse (EDW), as target, for subject areas such as Claim Headers, Claim Lines, Groups, Benefits, Group Benefit, Membership & Revenue, and Member Benefit.
  • Coordinating with developers to resolve issues related to the ETL mapping business rules from ODS and the Rosetta data marts to EDW, and clarifying them with the business.
  • Addressing developers' data-related issues by profiling the source data, and clarifying them with the business to define proper business rules for handling source data issues.
  • Addressing post-production and UAT issues with the business and getting them resolved with developers, for the data coming from the multiple sources of ODS and the Rosetta data marts to EDW.

Environment: ORACLE 10g, SQL, SAS Enterprise Ver 4.1.0.471, SAS Programming, ERWIN data modeler, INFORMATICA DATA EXPLORER (IDE/IDQ), Informatica, Visio, Business Objects, TOAD, Mingle (defect tracking tool), MS Access.

Confidential, Pleasanton, CA

Senior Business Intelligence Data Analyst/ Business Intelligence Analyst/Business Systems Analyst

  • Worked on Revenue Cycle as Business Systems Analyst. Worked on ETL mappings and business rules from the source system CLARITY to a new staging data mart being implemented in Teradata, and from this staging data mart to the Medifinance extract target tables.
  • Coordinated with developers to resolve ETL rule issues; identified elements missing from the staging data mart that are needed by the Medifinance reporting system, clarified their business rules with the business, and added mapping rules for them to the ETL document for the developers.
  • Working on Membership Datawarehouse (MDW) of Confidential including the areas like Members, Benefits, Products, Groups, etc as Database Analyst and Business Systems Analyst, and same also for Enterprise Datawarehouse (EDW).
  • Membership Datawarehouse (MDW): It extracts data from FS and CM Sources and maintains data for 5 CM Regions (OH,CO,GA,NW,HI) and 2 FS Regions (SCAL and NCAL) to provide this data to our Customer projects.
  • The data in MDW is enterprise-level membership data covering various business functions: Eligibility, Eligibility Plan, Health Record ID, Group, Group Contract, Contract Plan, Group Demographics, Group Hierarchy, Medicare-Medicaid, Facility Assignment, Org Unit, Product, etc.
  • Understanding Source Data from these systems of FS, CM and TMS Regions coming into MDW, also understanding business processes and data mappings of enterprise to department / module level for these regions and for various above functional modules.
  • Doing impact analysis for MDW/EDW based on requirements from customer projects in areas such as Members, Benefits, Products, and Groups, and getting any data or business process changes or enhancements required by the MDW customer projects implemented in the ETL code or data model.
  • Doing ETL Mappings Specs for these seven regions for FS and CM and coordinating and communicating with ETL developers to get Source to Target mapping developed or enhanced as and when needed.
  • Owning the ETL Mapping Specs for EDW Datawarehouse and also for MDW. Addressing and resolving any clarifications needed by developers and downstream systems about the Mapping Business rules and reflecting the same changes in the mapping specs.
  • Working on source to target mappings for EDW/MDW and understanding EDW/MDW model by working closely with Data Modeler and SMEs and participating in Logical / physical Data Model sessions/reviews.
  • Also discuss and coordinate with Data Modeler to implement the changes in the Data Model as and when needed by the downstream systems.
  • Verifying and validating the business processes of these functional modules' data flows from source to target (MDW) as needed, based on bugs detected or reported, and getting the ETL code and data model changed with the developers per the requirements.
  • Did User Acceptance Testing along with business users and other IT for the MDW 1.0 go-live, to verify the business functionality of the various modules mentioned above and to validate some of the users' final reports, confirming that the report data from MDW exactly matched the reports currently generated by users.
  • Worked on BigQ as Business Systems Analyst.
  • Worked on the Missed Opportunity and Hospital Performance Improvement Dashboard projects as Business Analyst for requirements gathering, interacting in face-to-face meetings/sessions with clinical staff, who were the key people generating the missed opportunity and hospital performance improvement reports.

Environment: TERADATA, ORACLE 10g, SQL, SQL Plus, SQL Developer, ERWIN data modeler, Informatica, Unix O/S, Remedy, Quality Central.

Confidential, San Francisco, CA

Sr Data Analyst/ Datawarehousing Analyst/ Business Systems Analyst

  • Understanding business requirements for interfaces from the Metavance healthcare system to the downstream systems. (MetaVance is a HIPAA-ready system that supports consumer-directed health plans, medical management, and claims processing; it helps administer health insurance products such as HMO, PPO, CDHP, indemnity, and hybrid managed care programs.)
  • Understanding the data generated through the health information system, Metavance, related to areas such as Claims, Members, and Providers.
  • Participating in business requirement gathering and verification sessions for various interfaces.
  • Understanding business requirements, clarifying them with users when needed, and getting appropriate changes made to the ETL code or data model per the requirements.
  • Interacting with users to clarify business processes and business needs.
  • Working with business users to understand, analyze, and document the requirements.
  • Doing UAT of interfaces with business users and verifying the business functionality being delivered to them.

Environment: Metavance (a fully integrated enterprise system for healthcare organizations to administer program benefits), Business Objects Data Integrator, Oracle 9i, Oracle 10g, SQL, SQL*Plus, SQL Developer, UNIX O/S, WinSCP.

Confidential, South San Francisco, CA

Business Data Reporting Engineer

  • As Business Intelligence and Data Warehouse Reporting Engineer, creating reporting sites for various clients' promotions as and when they start new promotions.
  • Generating ad hoc BI reports per users' daily requirements to help analyze their business strategies, and to that end understanding their business processes and verifying them with users.
  • Also helping design the data marts for their data warehouse and bring data from the OLTP system into the data warehouse, to create ad hoc reports from it.
  • Interacted with users and held business requirement verification sessions.
  • Working with business users to understand and analyze their data and business process requirements.
  • Generating BI data reports per user requirements and needs.

Environment: Oracle 9i, Oracle 10g, PL /SQL, Oracle Stored Procedures, Java for front end, SQL, SQL plus, SQL Developer, TOAD, UNIX O/S

Confidential, South San Francisco, CA

Datawarehousing/ETL/Business Intelligence

  • Working on the Commercial Data Warehouse; did ETL business process verification between source and target.
  • Understanding the mapping documents and business rules/processes for source-to-stage and stage-to-target transformations.
  • Doing data validation between source and stage and between stage and target, such as counts and lookup/custom business rules.
  • Validating the Informatica mappings against the design document's source-to-target mappings and business rules.
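
Source-to-stage and stage-to-target validation of this kind typically boils down to count and key checks. A minimal sketch, assuming hypothetical tables src_orders, stg_orders, and tgt_orders_fact for the three layers:

```sql
-- Sketch: row-count validation across the load pipeline.
-- A nonzero row_diff flags a discrepancy at that checkpoint.
SELECT 'src_vs_stage' AS checkpoint,
       (SELECT COUNT(*) FROM src_orders) -
       (SELECT COUNT(*) FROM stg_orders)      AS row_diff
FROM   dual
UNION ALL
SELECT 'stage_vs_tgt',
       (SELECT COUNT(*) FROM stg_orders) -
       (SELECT COUNT(*) FROM tgt_orders_fact)
FROM   dual;

-- Keys present in stage but missing from the target:
SELECT order_id FROM stg_orders
MINUS
SELECT order_id FROM tgt_orders_fact;
```

Lookup and custom business-rule checks follow the same shape, with the rule's predicate replacing the raw count comparison.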

Environment: Oracle 9i, Informatica Power Center 7.1.3, PL/SQL, Oracle Stored Procedures, SQL, SQL*Plus, TOAD, UNIX O/S.

Confidential, San Jose, CA

Sr. Datawarehouse Developer Analyst/ IT Analyst/ETL Developer

  • Understanding the requirements and business processes and clarifying them with the business owners.
  • Understanding the BI reporting needs of users and implementing them in the design and code.
  • Working on the Global Workforce Data Warehouse (GWDW) of Confidential .
  • Working on the Cisco University Data Mart of GWDW, to generate reports that help management in decision-making processes.
  • Understanding and improving the data design and model.
  • Wrote Oracle scripts to generate data for report creation purposes.
  • Developing ETL processes from source to staging, and then from staging to the data mart tables, for the Cisco University Data Mart.
  • Performance tuning of the queries used in the ETL processes.
  • Also testing the ETL processes for correctness of business functionality.
  • Writing packages and procedures to create BI reports to be displayed by the front end.
  • Once data is extracted from the source, transformed, loaded into the data mart, and processed in the data warehouse for BI reporting, the BI tool Siebel Analytics picks up this data to generate BI reports.
  • Working in Channels group on Mobility project. Writing backend APIs to be called from the Java Front end screens.
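
The BI report packages and procedures described above might look like this minimal sketch, returning report data through a ref cursor for the front end to render; the package, procedure, and table names (cu_reports, cu_enrollment_fact) are hypothetical:

```sql
-- Sketch: a package procedure that returns report data as a ref cursor.
CREATE OR REPLACE PACKAGE cu_reports AS
  PROCEDURE enrollment_by_quarter(p_year IN NUMBER,
                                  p_out  OUT SYS_REFCURSOR);
END cu_reports;
/
CREATE OR REPLACE PACKAGE BODY cu_reports AS
  PROCEDURE enrollment_by_quarter(p_year IN NUMBER,
                                  p_out  OUT SYS_REFCURSOR) IS
  BEGIN
    OPEN p_out FOR
      SELECT TO_CHAR(enroll_date, 'Q') AS quarter,  -- calendar quarter
             COUNT(*)                  AS enrollments
      FROM   cu_enrollment_fact
      WHERE  EXTRACT(YEAR FROM enroll_date) = p_year
      GROUP  BY TO_CHAR(enroll_date, 'Q')
      ORDER  BY quarter;
  END;
END cu_reports;
/
```

Returning a ref cursor keeps the SQL in the database while letting the front end (or a BI tool) consume the result set directly.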

Environment: Oracle 9i, PL/SQL, Oracle Packages, Oracle Stored Procedures, SQL, SQL Plus, TOAD, UNIX O/S, UNIX Shell Scripting, Siebel Analytics

Confidential, Cupertino, CA

BI Systems Analyst

  • The 4U data mart has a star schema. Working on the 4U Data Mart to verify the integrity of the star schema and do data and business process verification; understanding the data model and verifying the validity of the business logic implemented per the business requirements. Data is populated through ETL programs into the staging tables from the different sources, and from the staging tables it is populated into the targets. There are one-time loads as well as incremental loads of data. All ETL and data loading is done by Informatica mappings, which move and process data from the source to the target tables.
  • The Business Objects reports then pick up that data from the target tables for reporting purposes, and various reports are generated. Making sure that Business Objects picks up the correct data coming from the source, joins and processes it correctly in the 4U Universe, maps the correct source data to the BO classes and subclasses, and reports it correctly in the BO reports.
  • Understanding the business requirements of the data mart and making sure data is processed and populated correctly from staging to the target tables; also making sure, by interacting with the business owner, that the BO reports meet the users' needs and display the correct data as expected by the target users.
  • Carried out coordination role with on/off shore teams.

Environment: Oracle 9i, Informatica Power Center 7.1, Business Objects 6.1, PL/SQL, Oracle Packages, Oracle Stored Procedures, SQL, SQL Plus, Appworx scheduler, TOAD, Unix O/S, Unix Shell Scripting.

Confidential, Palo Alto, CA

Data Warehouse Analyst / BI Reporting / Production Support for HP / ETL / Data Warehouse Developer

  • Did maintenance and enhancement for the following HP data warehouses: Marketing DataWarehouse 1 (MDW1) and PartnerOne (MDW3).
  • Analyzed and validated the quality and accuracy of the production data; also analyzed and validated the built-in business rules and processes in the above data warehouse programs.
  • Addressed, analyzed, and resolved all user issues related to these systems' data or business processes by interacting and clarifying with users.
  • Communicated with business users to understand their ad hoc data reporting needs and got those reports generated by developers from the data warehouse.
  • Communicated with business users to address and resolve data or business process issues, to identify changes in the existing business processes, to identify weaknesses, bugs, or needed enhancements in those processes, and to get them implemented by developers.
  • Validated the quality and correctness of the data coming in from the source systems, identified the causes of data errors and issues, and got them fixed at the source systems.
  • Wrote Oracle stored procedures or scripts to generate reports for users from the marketing data warehouse, or to work around problems in the production systems until they were fixed at the source system.
  • Used TOAD for all database and data querying activities, such as querying tables, tracking job status, and updating tables.
  • Scheduled jobs using the cron and Maestro schedulers in Unix.

Environment: Oracle 9i, Informatica Power Center 7.1, Business Objects 6.1, PL/SQL, Oracle Packages, Oracle Stored Procedures, SQL, SQL Plus, Appworx scheduler, TOAD, Unix O/S, Unix Shell Scripting.
