Business Intelligence Architect/Data Architect Resume
Houston, TX
SUMMARY:
- Dynamic, results-oriented DW/BI professional with 13.5 years of IT experience, focused primarily on BI/DW, performing various roles in end-to-end implementations.
- Over 10 years of experience as a Data Architect, with the ability to deliver a data management vision in support of the organization's business goals.
- Functional knowledge encompasses Government, Sales and Marketing, Insurance, Telecom, Retail, Manufacturing and Financial domains.
- Worked extensively on client-server and BI applications and acquired in-depth knowledge of various ETL tools (Informatica, SSIS, Oracle Warehouse Builder, SAP BODS, ODI, Pentaho), data modeling tools (Erwin, Visio, Sybase PowerDesigner), BI tools (SSRS, SSAS, MicroStrategy, OBIEE, SAP BusinessObjects XI R3.2/R3.1/R2/R1/6.5 (Designer, Desktop & Web Intelligence Rich Client, InfoView, Crystal Reports)), databases (Oracle, Teradata, SQL Server, Netezza), and Unix shell programming.
- Strong experience in translating business requirements into conceptual, logical and physical data models.
- Management and implementation of database models, data flow diagrams, database schemas, DB scripts, and DTD schemas, conforming to data standards in support of a robust data management infrastructure.
- Management and development of an enterprise tailored taxonomy.
- Extensive experience as a QlikView designer: creating dashboards using straight tables, pivot tables, containers, line charts, bar charts, etc., granting access to users, and publishing the reports.
- Worked as Tableau developer with 9.0 and 9.2 versions, well versed with Tableau desktop and Tableau server products using different data sources like SAP HANA, SQL Server and Excel data sources.
- Good knowledge in SAP HANA modeling and implementation of data modeling using packages, Attribute view, Analytic View and Calculation Views.
- Sound knowledge of SAP BODS 4.1 ETL for loading data from SAP ECC systems into SAP HANA, with exposure to SAP HANA data provisioning using SAP LT Replication Server (SLT).
- Self-taught working knowledge of Big Data technologies: Hadoop (HDFS), NoSQL, IoT, Python, etc.
- Extensive experience as DW/BI Architect - Job functionality involved proposal writing, project estimation, scoping, SOW finalization, warehouse tool evaluation, design, modeling, development and performance optimization of several BI/DW solutions encompassing complete life cycle of implementation. Regular reporting to client and various internal stakeholders and delivery as per quality, time and budgetary norms.
- Experienced in relational databases such as SQL Server, Oracle, Sybase, DB2, etc.
- Production Support on Oracle, DB2, SQL Server, Informatica, SSIS and WebLogic platforms covering upgrades, multiple environment builds and superior user experience
- Broad experience designing and implementing MDM for agency wide data strategy.
- An agent of change with skills in gathering, analyzing and writing requirements pertaining to a new product/business process, understanding the overall architecture and domain & facilitating seamless implementation of new product/systems/processes to meet changing business scenarios.
- Self-taught in defining and designing the MITA-SOA framework. For a COTS application, gained an in-depth understanding of defining business processes and capabilities, understanding MITA goals and objectives, defining the legacy systems and migration strategy, defining standards, identifying the business services and common services, and developing a data model and technical capability matrix.
- An effective team player with strong communication, relationship management, analytical, coordination, and planning skills. Comfortable interacting with people and clients at any level of business for smooth project execution.
- An efficient team leader: guiding teams toward optimal design solutions, implementing the technical design, providing peer code reviews, representing the team in seeking approvals for production implementation, and leading the team to meet deadlines and deliverables.
- Possesses strong experience in SDLC (Software Development Life Cycle), Agile Methodology. Skilled in OO Analysis and Design, Data Modeling 3rd Normal Form, ER Modeling.
- Deep understanding of Inmon/Kimball methodologies and MDM Processes for establishing single version of truth. Strong ability to appropriately apply these DW/BI Concepts with excellent problem solving and analytical skills to disaggregate and structure problems.
- Providing post-implementation, application maintenance and enhancement support to the business stake holders regarding the BI/ETL solution.
TECHNICAL SKILLS:
Job Function: Gathering functional specifications, Design of Program Specifications, developing High-level and low-level data flows, creating reports & dashboards, Implementing MDM, developing data to object level mappings and workflows, Testing, QE & Implementation
Databases: Oracle 11g, Teradata 13.10, SQL Server 2016/2014, DB2, MS Access, HPDM
ETL Tools: Informatica Power Center 10.1/9.6, IDQ, IDE, Oracle Warehouse Builder (10g), Pentaho DI, SAP BODS 3.x/4.x, SSIS
Data Modeling: Data Modeling 3rd Normal Form, ODS Data Modeling, Dimensional Modeling (Snowflake and Star Schema), Knowledge of QMS Methodology, CA Erwin Data Modeler 7.3, SAP HANA Studio, Sybase Power Designer 16.1.0, Visio Pro 2013/ Office 365
Business Intelligence Tools: Oracle Forms 12c/11g R2, SSRS, SSAS, YOTTA, OBIEE, MicroStrategy 9.2, SAP HANA, SAP BW 7.3, SAP BusinessObjects XI R3/6.5.1/5.x (Universe and Report Designer, Web Intelligence), Cognos 10/8, BI Framework Manager, Report Studio, Query Studio, Analysis Studio, Metric Studio, Event Studio, Crystal Reports, Xcelsius 4.5/2008, Power BI, QlikView 11.0, Tableau 8.0
Other Tools: TIVOLI, SQL Navigator 6.5, VISIO, ERWIN, TOAD, NEXUS, CITRIX, CONTROL-M (Batch Scheduling), REMEDY and Quality Center (Defect Tracking), VSS and Clear Case (Version Control).
Languages: SQL, PL/SQL, T-SQL, C, C++, Java, JavaScript, XML, HTML
OS: Linux, UNIX (Shell Scripting) & Microsoft Windows
PROFESSIONAL EXPERIENCE:
Confidential, Houston TX
Business Intelligence Architect/Data Architect
Responsibilities:
- Understanding the AS-IS existing system of manually hand-written reports sourcing from excel/csv data sources/tracker files.
- Transforming, Standardizing and publishing HPE’s Market Share and Size data, which is consumed by other business groups across HPE
- Architecting the solution to make HPE's Market Share and Size data available across common platforms.
- Performing data profiling
- Representing the AS-IS and TO-BE states of the systems using data flow diagrams, UML Class diagrams in Microsoft Visio.
- Creating conceptual, logical and physical data models and implementing the data model in the database using the SSMS (SQL Server Management Studio) visual data modeling interface.
- Design and Develop Data Lake to ingest data (ELT) from various sources, store data, enforce data security, Process and transform data, Provide insights.
- Designing Metadata Catalog and data lineage (End to End Source to Target Mapping Matrix).
- Implementing MDM in SQL Server to enable an enterprise-wide single view of data: a golden copy of master data.
- Experienced with Master Data Services features including hierarchies, granular security, transactions, data versioning, and business rules.
- Worked on Master Data Services Add-in for Excel to manage data and create new entities and attributes
- Data is sourced from multiple sources, standard rules and business processes are implemented, Data Quality is performed, Golden Version of Master data is now made available to all customers.
- Changes are cleansed, checked, tracked and audited along with information about who is making the change on all key dimensions.
- Design a centralized, scalable database to host the HPE's SSDM (Share Size Data Mart). Conform to an enterprise wide standard taxonomy.
- Architect & Model the SSDM aiming towards single version of truth.
- Writing T-SQL queries using various objects such as views, tables, joins, subqueries, triggers, stored procedures, and other advanced SQL concepts.
- Excellent SQL tuning skills and the ability to troubleshoot and recover from data failures.
- Develop an industry standard database architecture that will allow for the storing and reporting of HP Market Share and Size data.
- Integrate the various Share and Size data sources into one common, cloud based database with a best-in-class taxonomy.
- Migrate legacy HP Sizing code sets from SAS and other application architectures into common sizing data mart.
- Automate the end to end data loading process, working with specific share vendors to develop a common methodology for file delivery.
- Practice agile principles using SCRUM during project life cycle.
- Participate actively in daily Scrum Stand-up meetings, responsible for conducting backlog grooming, Sprint planning/review/Retrospective meetings.
- Work on various components of SSMS tool like Object explorer, Template explorer, Solution explorer to develop/debug queries and scripts.
- Creating ETL source to target mapping specification documents and sharing with team in India and Mexico for ETL code development.
- Lead Offshore and Onsite development teams in ETL and Tableau development. Provide design and test case guidance. Involve in QA testing or defect analysis.
- Design and develop SSIS code employing several transformations, including Fuzzy Lookup and data cleansing.
- Design and Create OLAP Cube in SSAS, define metrics, hierarchies, junk dimensions, define KPIs.
- Designed OLAP cubes with star schema and multiple partitions using SSAS.
- Design and create SSRS reports and dashboards per business specifications. These reports are designed to source from SSAS cube. MDX queries were also written.
- Used Microsoft Power BI Power Query to extract data from external sources and reshape it into the required format in Excel, and created SSIS packages to load the Excel sheets into the database.
- Used Power BI Power Pivot to develop data analysis prototype, and used Power View and Power Map to visualize reports.
- Created Tableau dashboards with rich graphical visualization enabling parameter input, drill down and drop down and roll up capabilities.
- Developed Tableau data visualization using Cross tabs, Heat maps, Box and Whisker charts, Scatter Plots, Geographic Map, Pie Charts and Bar Charts and Density Chart.
- Publish and share the reports with business users and project manager over Tableau Server.
- Worked with Research Analysts and the Business users to understand the requirements and layout of the QlikView executive dashboard.
- Developed executive dashboards in QlikView with rich graphical visualization enabling parameter input, drill down and drop down and roll up capabilities on Geography, Quarter and Category dimensions.
- Design and Create database objects to be able to source the QlikView dashboard.
- Dealt with huge data volumes close to 50 Million records in the dashboard.
- Performance tuning of dashboards was carried out to limit the response time to under 5 seconds.
- Publish and share the reports with business users and project manager over QlikView Server.
- Implemented data level security using Section access so that specific region teams can view their data.
- Developed and designed the dashboards with different types of charts in QlikView. Designed visualizations like pie chart and bar chart.
- Documented the technical part in Qlikview, so that it can be reviewed and shared with the team and also can be used for later purpose.
- Train end users on how to access and report against new integrated data marts.
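The MDM "golden copy" bullets above can be illustrated with a minimal survivorship sketch. This is a hypothetical Python illustration of source-priority survivorship with lineage auditing, not the project's actual implementation; all source names, priorities, and fields are made up:

```python
# Minimal golden-record sketch: merge conflicting master records from
# several sources using source-priority survivorship, and record which
# source won each attribute (the audit/lineage idea described above).
# All names here are illustrative placeholders.

SOURCE_PRIORITY = {"ERP": 1, "CRM": 2, "Tracker": 3}  # lower = more trusted

def build_golden_record(records):
    """records: list of dicts with a 'source' key plus attribute fields.
    For each attribute, keep the value from the highest-priority source
    that supplies a non-empty value."""
    ranked = sorted(records, key=lambda r: SOURCE_PRIORITY[r["source"]])
    golden, lineage = {}, {}
    for rec in ranked:
        for field, value in rec.items():
            if field == "source" or value in (None, ""):
                continue
            if field not in golden:            # first (most trusted) wins
                golden[field] = value
                lineage[field] = rec["source"]  # audit which source won
    return golden, lineage

records = [
    {"source": "Tracker", "name": "HP Inc", "region": "AMS", "segment": "x86"},
    {"source": "CRM", "name": "HP Inc.", "region": ""},
    {"source": "ERP", "name": "HP Inc.", "segment": "Servers"},
]
golden, lineage = build_golden_record(records)
# golden  -> {'name': 'HP Inc.', 'segment': 'Servers', 'region': 'AMS'}
# lineage -> {'name': 'ERP', 'segment': 'ERP', 'region': 'Tracker'}
```

In a real MDM hub these rules would live in MDS business rules and match/merge logic; the sketch only shows the survivorship-by-priority idea.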
Technology/Software: CA Erwin 7.3 (Data Modeler, Data Navigator), Informatica Power center 10.1, MS SQL Server 2016/2014, MDS, DQS, SSMS (SQL Server Management Studio), SSIS, SSRS, SSAS, MDX Programming, Power BI, PL/SQL Programming, Microsoft SharePoint, Microsoft .Net Technologies, TSQL, NoSQL, Predictive analytics, QlikView 11.0, Tableau 8x.
Confidential, Washington DC
Sr. Enterprise Data Architect
Responsibilities:
- Understanding the current system - ITS (Integrated tax system) primarily a mainframe-based application system written in COBOL with DB2 planned for migration to Visual Studio .NET based application interface on SQL Server 2014 Database with Integration, Reporting and Analytical capabilities.
- Understand all the current interfaces to be able to articulate the unified data needs.
- Effective interactions with all stake holders and executive team to develop requirements, define, present and implement solutions.
- Identify the operational vs strategic/analytical reporting requirements.
- Designing to interface with the New Source System MITS which is a replacement to ITS. Creating data flow diagrams, UML diagrams in Visio to represent both the current to new systems.
- Implement the best industry practices of enterprise information strategy, enterprise data warehouse, master data management, design/model data, data integration and transformation, data analysis, data mapping, data governance, and data quality to obtain optimal data and translating it into meaningful and actionable information for making strategic business decisions.
- Managing design and Implementation of Agency Wide Data Strategy with main focus on MDM.
- Prepare Technical Design documents to project the solution approaches, identify the optimal solution and represent in ARB for design approvals.
- Design CDC and historical data retention strategy, recovery strategy.
- Design and implement the conceptual, logical and physical data models using Erwin on Oracle 11g db environment and SSMS Visual data modeling interface for SQL Server DB.
- Install the database model to Oracle 11g database.
- Continue maintaining the Erwin Model and incremental changes / enhancements are performed as per requirements.
- Design and development of SQL, PL/SQL packages, procedures, functions, triggers etc.
- Identify areas of improvement and be able to make balance recommendations.
- Tool evaluation and be able to recommend a cost effective unified environment setup for the currently diversified source and target application systems.
- Ensure data quality and a single version of truth, avoid redundancy, and model it right the first time.
- Implementing MDM to provide an enterprise-wide, 360-degree view of master data to all customers.
- Working knowledge on Microsoft Dynamics CRM module.
- Lead ETL development effort by designing the solution and providing Technical design specifications of Source to Target mappings, Traceability Matrix, Mapping diagrams to help develop the ETL code in SSIS & Informatica.
- Assist in QA testing and Defect Analysis
- Constantly coordinate with ETL team members to ensure accurate implementation as per requirements.
- For the MITS project, designed and deployed reports with SSRS tools, utilizing dynamic and cascading prompts, sub-reports, charts, parameterized reports, and conditional and dynamic reports. Set up advanced functions, dashboards, drill-through/drill-down, and KPIs/indicators in reports.
- Involved in working with SSAS cubes and developed various dashboard reports.
- Performance tuning by analyzing and comparing the turnaround times between SQL and SSRS.
- MDX functions were used while designing reports in query studio.
- Practicing agile methodologies of project development life-cycle with iterative and incremental deliverables. Thus, focusing more on end-user requirements and meeting them in quick turnaround times and enhancing if any feedback received.
- Identify the current requirements for Oracle Forms developed in 2005 and be able to re-write them in COGNOS.
- For One Financial View, Designed & developed the reports using Cognos 10 Report Studio, Query Studio, Analysis Studio reports.
- Worked as an administrator on Cognos suite of products.
- Design & development of multiple reports using various templates like list reports, cross-tab reports and chart reports.
- Involved in working with OLAP cubes both in SSAS and Cognos and developed various dashboard reports.
- Extensively worked on Analysis Studio to develop multi-dimensional reporting.
- Developed Standard Reports, Charts, Drill Through Reports, Master Detail Reports, Map Reports Using Report Studio and Query Studio.
- Created complex dashboards with pie charts, crosstab and Multi prompts in Report Studio.
- Developed various reports using functionalities like Render Variables, Conditional Blocking, Cascading Prompts, and Conditional Formatting.
- Exercising POCs for the following projects:
- Legacy DWP Decommission
- Migration of Oracle Forms on 10g to 11g
- Rewrite Oracle Forms to COGNOS reports
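The CDC strategy mentioned above can be sketched with a simple snapshot-comparison approach: diff two extracts on a business key to classify inserts, updates, and deletes. This is an illustrative Python sketch only; the key and column names are hypothetical, and a production design would use log-based CDC or database change tracking rather than full-snapshot diffs:

```python
# Snapshot-comparison CDC sketch: compare the previous and current
# extracts on a business key and classify each row as an insert,
# update, or delete. Keys and columns are illustrative placeholders.

def detect_changes(previous, current, key="taxpayer_id"):
    prev = {row[key]: row for row in previous}
    curr = {row[key]: row for row in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys() if curr[k] != prev[k]]
    return inserts, updates, deletes

previous = [{"taxpayer_id": 1, "status": "open"}, {"taxpayer_id": 2, "status": "closed"}]
current  = [{"taxpayer_id": 1, "status": "paid"}, {"taxpayer_id": 3, "status": "open"}]
ins, upd, dele = detect_changes(previous, current)
# ins -> taxpayer 3 (new), upd -> taxpayer 1 (status changed), dele -> taxpayer 2
```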
Technology/Software: CA Erwin 7.3 (Data Modeler, Data Navigator), Informatica Power center 10.1, IDQ, IDE, MDM RE, Informatica Axon, Oracle 11g, Oracle Forms, MS SQL Server 2014/2008, SSIS, SSRS, SSAS, PL/SQL Programming, HPSM, Agile Principles and Tools, IBM Cognos 10.x, SAP BO, Microsoft SharePoint.
Confidential, Durham, NC
DB Engineer/ Database Architect
Responsibilities:
- Closely work with business, delivery & technical stakeholders in developing requirements, and planning for implementation
- Perform analysis and review of Business, Functional and Data Requirements.
- Assess Use cases and Story points using Top Team Client and JIRA for backlog items.
- Practice of Lean and Agile Software development methodologies.
- Implementing Scrum to track iterative, incremental deliverables and continued customer interaction, responding to changes/enhancements quickly
- Bi-weekly deployment to production thereby enhancing productivity.
- Use of Kanban board to visualize the progress of project per team member, understand the dependencies/showstoppers/issues and to communicate the status.
- Define the technical requirements of the solutions.
- Using Sybase Power Designer, Define, Design and Implement the conceptual, Logical and Physical data models for Master data, data and star schema data models for Oracle DB.
- Writing Procedure, Functions, Triggers, Views etc. in Oracle, SQL Server, DB2, Complex SQL etc.
- ETL & PL/SQL Stored Procedure design & implementation for product and internal tool features.
- Act as Scrum master (In Rotation) as well as Development lead in monitoring the progress of project, ensure timely deliverables, represent team in Architecture review boards to get design exits and code exits.
- Handle data life-cycle and deployment scripts and utilities.
- Great exposure to high volume OLTP Challenges.
- Work together with development team to improve application performance by tuning queries.
- Work on requirement to Tag FMR & PGA Funds which are non-MACS Offerings, as ‘MACSALLOWED’ to allow GAA MACS team & Portfolio Managers to access research published on MACS offerings. In this regard, designed and implemented the creation of IB data structures in VPADMIN schema.
- Work closely with the MDM team on mastering data. For any changes to key dimensions, work with the team to make the appropriate entries, define business rules, maintain DQ (data stewardship), audit changes, and notify the required personnel.
- Work closely with DBA’s to set-up DB Structures and ensure performance optimization, restartability and re-usability and avoid redundancy in data.
- Reconciliation and evidence creation for data movement from source systems to target database (DW).
- Close association with APP and Services team to ensure accurate integration of the data tier
- Coordinate receiving sign-offs from various teams across various stages of SDLC
- Attend CAB/Go-No Go Meetings to ensure successful install of the release.
- Coordinate with Business to ensure the business needs are met.
- Coordinate with L3 to close on High Priority defects in production (If any).
Technology/Software: Sybase PowerDesigner 16.1.x, Informatica 9.2.0, IDQ, IDE, Oracle 11g, PL/SQL Programming, Unix Scripting, Autosys, Clear Case, GitStash, HPSM, Agile Principles and Tools, Scrum, Kanban Board, Oracle APEX, MDM
Confidential, Houston, TX
Solution Architect
Responsibilities:
- Closely work with business, delivery & technical stakeholders in developing requirements, and planning for implementation
- Apply the knowledge of Data Warehousing and Business Intelligence to design and develop enterprise solutions which are focused on improving specific business processes with HP.
- Responsible for the overall solution and its implementation for the March 2014 release.
- Work in Agile mode, implementing Scrum and tracking the daily progress of a 14-member, globally distributed team through regular stand-ups. Identify any showstoppers or dependencies early in the life-cycle.
- Work with development and business team members to build appropriate data views and architecture.
- Creating database model diagrams for various subject areas like Demographics, Sales, Order, Shipment etc. (Conceptual data model, Logical data model and Physical data model) and representing in architecture forums to receive database signoffs and coordinate with DB teams to get the models installed.
- Integral part of MDM team to model Master Data, define standard rules, business processes, Data Quality parameters, Determine End-User Group permissions, Audit Changes on Master Data.
- Provide a single golden copy of master data corporate-wide; this data was used by global customers.
- Experience in designing, building, and maintaining Universes, resolving issues such as loops and traps using aliases and contexts, and designing complex objects using the @Prompt and @Aggregate_Aware functions.
- Lead ETL development effort in Onsite/Offshore model. Provide needful guidelines and designs. Represent team in ARB (Architecture review board) to seek approvals and exits for successful deployment of project solution.
- Implemented Security Features of Business Objects like row level, object level to make the data secure.
- Define the technical requirements of the solutions.
- Works with team members to manage data within the Data Warehouse/Business Intelligence structures.
- Performing Data profiling.
- Work on closing any gaps in requirements to mapping data elements in the EDW (Finance).
Technology/Software: Informatica 9.2.0, SAP BOXI 3.1 (Designer 12.3.0.601, Infoview 12.1.0, Web Intelligence), Yotta, Agile Methodologies, DIAL, YMS, HPDM, Neo view, Tibco, Tibco Spotfire,Share point, MDM
Confidential, Madison, WI
Business Intelligence Analyst
Responsibilities:
- Closely work with business, delivery & technical stakeholders in developing requirements, and planning for implementation
- Understanding BRD thoroughly to Prepare and seek approval on technical high level and low level data design documents.
- Working with ERWIN Data Modeler 7.3 version to design the Star Schema Data Model (All three - Conceptual, Logical and Physical data models).
- Designing the data model for adding RESUME data pertaining to job seeker into JCS DM.
- Model the data featuring SCD type 1, type 2 dimensions and include DW Plumbing columns.
- Creation of indexes to facilitate performance tuning.
- Creation of reusable data structures namely TIM DIM and PRC CNTRL table for purpose of restartability, CDC and performance optimization.
- Implementing data models iteratively and incrementally in agile manner to address the immediate need and refine based on feedback received thus ensuring productivity and user satisfaction.
- Creation of Change Management requests and acts as DBA liaison to get the data model installed in DEV/ACC/PRD environments.
- Was involved in end-to-end development of whole of ETL solution for this project.
- Process Improvement:
- The existing system purges and reloads data on a weekly basis and does not pull incremental data. In this context, designed and developed a prototype facilitating a daily CDC pull, which minimizes load time and provides stakeholders with near-real-time data in highly performance-optimized code. The prototype was implemented successfully, and the same approach is followed in adding RESUME data to the JCS DM.
- Retired the legacy Argent scheduler jobs and converted all of them to run on Control-M.
- Creation of Data Lineage document from the Metadata repository using Meta data queries. This document is of value addition to business stake holders in mapping the Object from Report all the way back to Legacy application screens.
- Create and present High level & low-level ETL flow design.
- Reconciliation and evidence creation for data movement from source systems to target database (DW).
- Own Creation of mappings, sessions, workflows using Informatica 9.1
- Own Creation and use of control-m scripts to automate workflows on production for one-time/daily and weekly loads.
- Conducting reviews (design & code self/peer reviews)
- Lead the team and provide guidance to 2 DWD FTEs on Informatica 9.1, helping bring them up to speed with developer-level experience.
- Involved in Interactions with end users regularly for requirements gathering.
- Involved in designing and developing Database Views, Universe Design as well as report development to suit the Business Requirements in Business Objects.
- Analyzed all tables and created various indexes to enhance query performance.
- Created Cascaded Prompts, conditions at Universe level for various reports to specify the reporting parameters as per business requirements.
- Designed and Created Contexts for resolving Loops as well as fan and chasm traps in the Universes.
- Worked on enhancing the Performance of SQL Queries in Oracle.
- Worked with the business directly to make sure reports are meeting the Business requirements.
- Design and develop reports in quick turn-around time and get them reviewed. Quick implementation of feedback and work towards deliverables in agile manner.
- Involved in testing for reports that were already developed to make sure the reports contain correct data.
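The SCD Type 1/Type 2 modeling with "plumbing" columns mentioned above follows a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new version is inserted with fresh effective dates. A minimal Python sketch of the Type 2 case, using hypothetical column names (`eff_from`, `eff_to`, `is_current`) rather than the project's actual schema:

```python
# SCD Type 2 sketch: expire the current row and insert a new version
# when a tracked attribute changes. Columns are illustrative "plumbing"
# columns, not the project's real schema.
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended expiry date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """dim_rows: existing dimension rows (dicts); incoming: new source row."""
    for row in dim_rows:
        if row[key] == incoming[key] and row["is_current"]:
            if all(row[c] == incoming[c] for c in tracked):
                return dim_rows            # no tracked change -> nothing to do
            row["is_current"] = False      # expire the old version
            row["eff_to"] = today
            break
    new_row = dict(incoming, eff_from=today, eff_to=HIGH_DATE, is_current=True)
    dim_rows.append(new_row)
    return dim_rows

dim = [{"seeker_id": 7, "city": "Madison", "eff_from": date(2010, 1, 1),
        "eff_to": HIGH_DATE, "is_current": True}]
apply_scd2(dim, {"seeker_id": 7, "city": "Milwaukee"}, "seeker_id",
           ["city"], date(2011, 6, 1))
# dim now holds two rows: the expired Madison version and a current Milwaukee one
```

In the warehouse this logic would typically be an Informatica mapping or a MERGE statement; the sketch only shows the row-versioning mechanics.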
Technology/Software: Informatica 9.1.0, SAP BOXI 3.1 (Designer 12.3.0.601, Infoview 12.1.0, Web Intelligence), Oracle 10g/11g, Windows, BMC Control-M 7.0.000 (Control-M EM, Desktop, Reporting Facility), CA Erwin 7.3 (Data Modeler, Data Navigator), Tivoli 6.0NT, SQL Navigator Professional Edition 6.5, Agile Methodology, MS Visual Studio 2010 (TFS tools, Source Control)
Confidential, Austin, TX
Business Intelligence Analyst
Responsibilities:
- Solely responsible for maintaining the entire business objects environment.
- Experienced in the MicroStrategy suite of tools (Architect, Desktop, Web)
- Create MicroStrategy dashboards and scorecards
- Created reports per user requirements using Universes as the main data providers
- The reports were generated using Business Objects functionality such as Prompts, Filters, etc.
- Designing of MOLAP Cube to facilitate Drill down, Roll up etc.
- Responsible for testing the reports.
- Ensured the accuracy of data moved from source to target database in ETL and validated the results.
- Was involved in end-to-end development of ETL solution for this project.
- Closely work with business, delivery & technical stakeholders in developing requirements, assessing technology, and planning for implementation
- Understanding BRD thoroughly to Prepare SRS to seek approval on technical high level and low-level design of application.
- Involve in data modeling of the DB objects (Star Schema). Responsible for creating data flow diagrams of the to-be system, creating conceptual, logical and physical data model diagrams in Erwin.
- Work with DBA to get these models reviewed as per data standard definitions pertinent to organization’s data dictionary.
- Work with DBA to get the Data warehouse and Data Mart data models installed in Oracle database.
- Reconciliation and evidence creation for data movement from source systems to target database (DW).
- Work with Business users to establish data patterns - IDE
- Cleansing, fixing duplicate issues and bad data issues using IDQ
- Create and present High level & low-level ETL flow design.
- Design ETL framework and Lead ETL development efforts, represent in Architecture forums to seek approvals.
- Coordinate with IT Teams for successful migration of code to production environment. Responsible for production deployment and assist support team in quick defect identification and analysis if any within desired SLA.
- Involved in coding mappings, sessions, and workflows using Informatica 8.6 along with the rest of the 3-member team.
- Own Creation and use of control-m scripts to automate workflows on production for one-time/daily and weekly loads.
- Used Teradata utilities (TPump, FastLoad, MultiLoad) extensively to load data from flat files.
- Conducting reviews (design & code reviews) in ARB and DW/BI forums to seek consensus on design and code for GRR Application.
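The "reconciliation and evidence creation" activity above boils down to comparing control totals between source and target after each load. A minimal, hypothetical Python sketch (column names are illustrative) of the kind of evidence produced:

```python
# Source-to-target reconciliation sketch: compare row counts and a
# control total (sum of an amount column) between the extract and the
# loaded target, producing reconciliation evidence. Names are illustrative.

def reconcile(source_rows, target_rows, amount_col="amount"):
    evidence = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "source_total": round(sum(r[amount_col] for r in source_rows), 2),
        "target_total": round(sum(r[amount_col] for r in target_rows), 2),
    }
    evidence["balanced"] = (
        evidence["source_count"] == evidence["target_count"]
        and evidence["source_total"] == evidence["target_total"]
    )
    return evidence

src = [{"amount": 10.5}, {"amount": 4.5}]
tgt = [{"amount": 10.5}, {"amount": 4.5}]
evidence = reconcile(src, tgt)
# balanced: True, counts 2/2, totals 15.0/15.0
```

In practice the counts and totals would come from SQL against the source staging and target warehouse tables; the dict stands in for the evidence document.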
Technology/Software: Business Objects XI R3.2/3.1 (Designer, Web Intelligence, Crystal Xcelsius 4.5, Xcelsius 2008), SAP source systems, Agile Methodologies, Crystal Reports XI/10, MicroStrategy 9.2, Informatica 9.0, Teradata, UNIX, Windows, MDM, CA Erwin 7.1
Confidential, NY
Business Intelligence Analyst
Responsibilities:
- Interaction with client and business users for requirement gathering.
- High-level & low-level ETL flow design.
- Informatica, database and UNIX shell script generic design without any hard coding.
- Proper compliance to business logic.
- Receiving code acceptance from clients.
- Working on code review comments.
- Ensure Team cohesiveness and completeness in correct understanding of the requirements and design from ETL Perspective.
- Reusable component & detailed technical design
- Documenting the High and Low-level design.
- Conducting reviews (design & code reviews)
- Informatica coding, test cases preparation, unit & integration testing.
- Created new forms in Crystal Reports using stored procedures in SQL Server 2000
- Developed unlinked, on-demand sub reports and linked sub reports using shared variables and complex reports like cross-tab, drill down and hierarchical reports.
- Created reports include formatting options such as Grouping, Sorting, Drill-down, Parameter prompts
- Reconciliation and evidence creation.
- Involve in Dev & QA testing.
- Migration of the system to QA and UAT phase.
- Troubleshooting of problems in QA and UAT phase.
- Update all the project related documents in VSS, the project-tracking tool.
- Follow up constantly to ensure that the defects for each task have been logged in the defect-tracking system.
Technology/Software: Informatica 8.6, UNIX, DB2, Oracle9i, Perl, Crystal Reports 8.5
Confidential, Cincinnati, OH
ETL Developer
Responsibilities:
- Interaction with OSC and understand requirements.
- High level & low-level ETL flow design.
- Session restartability & orphan data management.
- UNIX shell script design.
- Control-M (Unix Scheduler) chart design.
- Discuss the efficiency and restartability of code design with TFG group.
- Ensure Team cohesiveness and completeness in correct understanding of the requirements and design.
- Ensure team coordination while development and review activity.
- Reusable component & detailed technical design
- Documenting the High and Low-level design.
- Conducting reviews (design & code reviews)
- Informatica coding, test cases preparation, unit & integration testing
- Used Teradata utilities (TPump, FastLoad, MultiLoad) extensively to load data from flat files
- Reconciliation and evidence creation.
- Involved in Dev & QA testing.
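The "session restartability & orphan data management" design above usually rests on a persisted checkpoint: a failed run resumes from the last committed batch instead of reloading everything. A hypothetical Python sketch of the checkpoint idea (file name and batch structure are illustrative, not the project's actual design):

```python
# Restartable-load sketch: persist the last successfully committed batch
# id so a failed or repeated run resumes where it stopped instead of
# reloading from scratch. The file name is an illustrative placeholder.
import json
import os

CHECKPOINT = "load_checkpoint.json"
if os.path.exists(CHECKPOINT):
    os.remove(CHECKPOINT)  # start clean for the demo

def read_checkpoint():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as fh:
            return json.load(fh)["last_batch"]
    return 0

def run_load(batches, load_fn):
    """Process (batch_id, rows) pairs in order, skipping committed batches."""
    start = read_checkpoint()
    for batch_id, rows in batches:
        if batch_id <= start:
            continue  # already loaded in a previous run
        load_fn(rows)
        with open(CHECKPOINT, "w") as fh:  # record progress after each batch
            json.dump({"last_batch": batch_id}, fh)

loaded = []
run_load([(1, ["a"]), (2, ["b"])], loaded.extend)
# a rerun with the same batches loads nothing: the checkpoint records batch 2
```

In the actual stack this role was played by Informatica session recovery and Control-M job restart points; the sketch only shows the checkpoint mechanics.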
Technology/Software: Informatica 8.1, Teradata, Oracle 10g, UNIX, Siebel Analytics Reports
Confidential
Developer/Analyst
Responsibilities:
- Interaction with business users and requirement gathering.
- High level & low-level ETL flow design.
- Understanding the functionality of Geneva Billing System and Siebel CRM.
- Involved in detailed technical design
- Coding the Mappings in OWB to load data into the Staging, ODS and ADS.
- Scheduling the Unix Scripts to Load data into staging.
- Creation of Sessions and Process flows to schedule these mappings.
- Design and Creation of Unit Test cases
- Performing Unit testing and Integrated Testing.
- Reconciliation and evidence creation.
- Designing the Reports for Business Analysis.
- Involved in QA & UAT testing. Handled issues, performed analysis on these issues and initiated interaction with client and business users to see that results cater to their requirements. If necessary, performed quick & minor code fixes upon Client’s requirement.
Technology/Software: PL/SQL, Oracle 9i, Siebel, Oracle Warehouse Builder, Crystal Reports
Confidential
ETL Designer
Responsibilities:
- Interaction with business users and requirement gathering.
- High level & low-level ETL flow design
- Unix shell script design
- Involved in detailed technical design
- Design of mappings employing various transformations (filters, joiners, SQL overrides, etc.) to load data from multiple databases into warehouses
- Scheduling the Unix Scripts to Load data into staging.
- Creation of Sessions and Workflows to schedule these mappings.
- Creation of Unit Test cases
- Perform unit testing of these Workflows from source to staging and staging to Warehouse of data warehouse
- Reconciliation and evidence creation.
Technology/Software: Informatica 6.2.2, SQL Server 2000, DB2 UDB version 7, UNIX shell scripts, Maestro scheduler.