- Data Modeling professional with around 9 years of total IT experience, with expertise in data modeling for data warehouse and data mart design and in the analysis of Online Transaction Processing (OLTP), data warehouse (OLAP) and Business Intelligence (BI) applications.
- Experience in requirement gathering, system analysis, handling business and technical issues, and communicating with both business and technical users.
- Expert in data analysis, design, development, implementation and testing using data conversion, Extraction, Transformation and Loading (ETL), SQL Server, Oracle and other relational and non-relational databases.
- Experience in designing star schemas (identification of facts, measures and dimensions) and snowflake schemas for data warehouse and ODS architectures using tools such as Erwin 9.6/8.2/7.0, Power Designer 15, Embarcadero ER Studio and Microsoft Visio.
- Well versed in Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments and have performed normalization up to 3NF.
- Experience in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.
- Experience in the latest version of QualityStage, a data cleansing tool that is part of IBM Information Server and a major component of IBM InfoSphere DataStage.
- Proficient in developing and maintaining packages, functions, stored procedures and indexes using the Oracle PL/SQL database programming language.
- Created PL/SQL packages, functions and procedures that load data into tables according to business requirements.
- Extensively developed data mapping and data conversion scripts using SQL and PL/SQL.
- Experienced with different Relational databases like Teradata, Oracle and SQL Server.
- Experienced in system design, data architecture and modeling: relational for OLTP systems and dimensional for data warehouses (Inmon methodology) and data marts (Kimball methodology), as well as data governance, master data and metadata management.
- Captured, validated and published metadata in accordance with enterprise data governance policies and MDM taxonomies.
- Extensive experience working in an Agile and Waterfall environment.
- Projects utilized Waterfall and Agile methodologies with an iterative development cycle to develop the product and new processes and to establish standards and procedures.
- Good working experience with Data Vault modeling, used to maintain historical data in the Enterprise Data Warehouse.
- Hands-on experience with ETL tools such as Informatica and SSIS.
- Expertise in implementing reverse engineering, model merge, model update, naming conventions and forward engineering using data modeling tools such as Erwin, ER Studio and Power Designer.
- Experience working with AWS, Python, ETL, SQL, data warehousing and similar systems/tools.
- Worked with various RDBMS like Oracle 9i/10g/11g, SQL Server 2005/2008, DB2 UDB and Teradata.
- Created STTM (source-to-target mapping) documents to load data into targets using different transformations, and executed workflows for data loads to target systems.
- Extensive experience building dashboards using Tableau.
- Well-versed in writing SQL queries to perform end-to-end ETL validations and support Ad-hoc business requests.
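As an illustration of the end-to-end ETL validation queries mentioned above, here is a minimal sketch in Python against an in-memory SQLite database; the table and column names are hypothetical, and in practice the same reconciliation queries would run against Oracle, SQL Server or Teradata through the appropriate driver.

```python
import sqlite3

# Hypothetical source and target tables for a load reconciliation check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def validate_load(conn, src, tgt):
    """Compare row counts and amount totals between source and target."""
    src_cnt, src_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_cnt, tgt_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_cnt == tgt_cnt and src_sum == tgt_sum

print(validate_load(conn, "src_orders", "tgt_orders"))  # True when counts and sums match
```

Count and sum checks like this are the usual first pass of an ETL validation; column-level checksums or full minus-query comparisons follow the same pattern.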
Data Modeling Tools: Erwin 9.5/8.2/7.3/7.0, Power Designer 15/12/11, Embarcadero ER Studio 6.6, IBM Rational Software Architect 7.1, MS Visio 2000/2007/2010.
Reporting Tools: SAS, SAS BI, Web Report Studio, Enterprise Guide, Business Objects XI, Crystal Reports 9/2008, SSRS, QlikView 9/10/11.x, Tableau, Cognos 8, MicroStrategy.
ETL Tools: Informatica 8.x/9.x, SSIS.
Databases: Oracle 11g/10g, MS SQL Server 2005/2008, MS Access, Teradata, DB2, Sybase 12.
Operating Systems: Microsoft Windows 10/2000/XP/Vista/7, UNIX, Linux.
Confidential, Chicago, IL
Sr. Data Modeler/Data Analyst
- Conducted a complete study of the in-house requirements for the data warehouse; analyzed the DW project database requirements from users in terms of the dimensions they want to measure and the facts against which those dimensions need to be analyzed.
- Prepared questionnaires for users and created various templates for dimensions and facts.
- Conducted user interviews and gathered and analyzed requirements using Rational Rose, RequisitePro and RUP.
- Created the logical data model from the conceptual model and converted it into the physical database design using Erwin.
- Created dimensional model based on star schemas and designed them using Erwin.
- Designed star and snowflake schemas for the data warehouse and ODS architecture.
- Designed 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Performed multidimensional data modeling, including star schemas, snowflake schemas, denormalized models and handling of slowly changing dimensions/attributes.
- Created PL/SQL Stored Procedures, Functions, Packages and Triggers, extensively used cursors, ref cursors, User defined object types, Records in PL/SQL Programming.
- Involved in Unit testing of PL/SQL Stored Procedures, Functions and Packages.
- Used Erwin for creating tables using Forward Engineering
- Wrote complex SQL queries for implementing business rules and transformations.
- Assisted with user testing of systems, developed and maintained quality procedures, and ensured that appropriate documentation was in place.
- Responsible for identifying and documenting business rules and creating detailed Use Cases
- Defined the naming standards and glossary for data warehouse
- Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and Bulk collects.
- Designed and built scalable, maintainable ETL for Netezza and SQL Server.
- Maintained warehouse metadata, naming standards and warehouse standards.
- Collected the information about the existing ODS by reverse engineering the ODS.
- Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.
- Assisted in designing the overall ETL strategy
- Generated DDL statements for the creation of new Sybase objects such as tables, views, indexes, packages and stored procedures.
Environment: Erwin 8.2, PL/SQL Developer, Teradata, TOAD Data Analyst 2.1, Oracle 11g, Informatica PowerCenter 9.1, TD SQL Assistant, Microsoft Visio
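The forward engineering work described above (generating tables from a model with Erwin) can be sketched as follows; this is a minimal, hypothetical illustration in Python of turning a logical-model description into CREATE TABLE DDL, not Erwin's actual output format.

```python
# Hypothetical logical model: table name -> list of (column, datatype) pairs.
logical_model = {
    "DIM_CUSTOMER": [("customer_key", "INTEGER PRIMARY KEY"),
                     ("customer_name", "VARCHAR(100)")],
    "FACT_SALES":   [("sales_key", "INTEGER PRIMARY KEY"),
                     ("customer_key", "INTEGER"),
                     ("amount", "DECIMAL(12,2)")],
}

def generate_ddl(model):
    """Emit CREATE TABLE statements for each table in the model."""
    stmts = []
    for table, cols in model.items():
        body = ",\n  ".join(f"{name} {dtype}" for name, dtype in cols)
        stmts.append(f"CREATE TABLE {table} (\n  {body}\n);")
    return "\n\n".join(stmts)

print(generate_ddl(logical_model))
```

A tool-driven forward engineer additionally emits constraints, indexes and naming-standard transformations, but the core step is the same model-to-DDL rendering.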
Confidential, Washington, D.C
Data Analyst / Data Modeler
- Conducted JAD sessions, wrote meeting minutes and documented the requirements.
- Collected requirements from business users and analyzed them.
- Designed and built Data marts by using Star Schema.
- Involved in designing Context Flow Diagrams, Structure Chart and ER- diagrams.
- Gathered and documented requirements of a Qlikview application from users.
- Implemented the full lifecycle of data warehouses and business data marts with star schemas, snowflake schemas, slowly changing dimensions and dimensional modeling.
- Strong understanding of data warehousing principles: fact tables, dimension tables, star and snowflake schema modeling, slowly changing dimensions, foreign key concepts and referential integrity.
- Extracted data from various sources (SQL Server, Oracle, text files and Excel sheets) and used ETL load scripts to manipulate, concatenate and clean source data.
- Involved in data transfer, creating tables from various tables and coding using PL/SQL stored procedures and packages.
- Involved in database development, creating PL/SQL functions, procedures, packages, cursors, error handling and views.
- Used QlikView as a data analyst to create custom reports, charts and bookmarks.
- Involved in creating scripts for data manipulation and management.
- Carried out extensive system study, design, development and testing in the Oracle environment to meet customer requirements.
- Served as a member of a development team providing business data requirements analysis services, producing logical and physical data models using Erwin 7.1.
- Created tables, views, procedures and SQL scripts.
- Utilized organizational standards for data naming, structuring and documentation.
- Responsible for defining database schemas to support business data entities and transaction processing requirements.
- Ensured the business metadata definitions of all data attributes and entities in a given data model were documented to meet standards.
- Ensured the first-cut physical data model included business definitions of the fields (columns) and records (tables).
- Closely worked with ETL process development team.
- Wrote and executed customized SQL code for ad hoc reporting duties and used other tools for routine report generation.
- Worked as part of a team of Data Management professionals supporting a Portfolio of development projects both regional and global in scope.
- Applied organizational best practices to enable application project teams to produce data structures that fully meet application needs for accurate, timely, and consistent data that fully meets its intended purposes.
- Conducted peer reviews of completed data models and plans to ensure quality and integrity from data capture through usage and archiving.
Environment: Erwin, Oracle 11g, SQL Server 2012, Informatica PowerCenter 9.1, Cognos
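The star-schema data mart loads described in this section hinge on resolving dimension surrogate keys before inserting fact rows. Below is a minimal Python sketch of that lookup step; the table and column names are hypothetical.

```python
# Hypothetical customer dimension: natural key -> surrogate key.
dim_customer = {"C001": 1, "C002": 2}

source_rows = [
    {"customer_id": "C001", "amount": 100.0},
    {"customer_id": "C002", "amount": 50.0},
    {"customer_id": "C999", "amount": 75.0},  # no matching dimension row
]

# Resolve surrogate keys; rows with no dimension match are routed to a
# reject set instead of silently violating referential integrity.
fact_rows, rejects = [], []
for row in source_rows:
    sk = dim_customer.get(row["customer_id"])
    if sk is None:
        rejects.append(row)
    else:
        fact_rows.append({"customer_key": sk, "amount": row["amount"]})

print(len(fact_rows), len(rejects))  # 2 1
```

In an ETL tool the same pattern appears as a lookup transformation with an error/reject path; handling the unmatched rows explicitly is what keeps fact-to-dimension foreign keys clean.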
Confidential, Philadelphia, PA
Data Modeler / Data Analyst
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Created OLTP, ODS and OLAP architectures
- Responsible for the analysis of business requirements and design implementation of the business solution.
- Developed Conceptual, Logical and Physical data models for central model consolidation.
- Developed logical and physical data models, including star schemas, snowflake schemas and highly normalized data models.
- Created and maintained data definitions and dimensional/snowflake data models, performed query optimization and worked with OLAP tools; familiar with metadata management, business semantics and metadata workflow management.
- Worked with DBAs to create a best fit physical data model from the logical data model.
- Conducted data modeling JAD sessions and communicated data-related standards.
- Used ER Studio for effective model management, sharing, dividing and reusing model information and designs for productivity improvement.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
- Developed process methodology for the Reverse Engineering phase of the project.
- Used the ETL tool Informatica to populate the database, transforming data from the old database to the new database using Oracle.
- Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
- Involved in the critical design review of the finalized database model.
- Involved in the study of the business logic and understanding the physical system and the terms and condition for database.
- Created documentation and test cases, worked with users for new module enhancements and testing.
Environment: ER Studio, Informatica, Teradata, Oracle 10g, SQL Server, MS Visio
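The reverse engineering work in this section (recovering a model from an existing database) can be sketched as reading the database catalog into model metadata. The example below uses SQLite's `PRAGMA table_info` purely for illustration; against Oracle or SQL Server the equivalent queries go to ALL_TAB_COLUMNS or INFORMATION_SCHEMA.COLUMNS.

```python
import sqlite3

# A hypothetical existing database to reverse engineer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

def reverse_engineer(conn):
    """Read the catalog into a model: table -> [(column, type), ...]."""
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        model[table] = [(c[1], c[2]) for c in cols]  # (name, declared type)
    return model

print(reverse_engineer(conn))  # {'customers': [('id', 'INTEGER'), ('name', 'TEXT')]}
```

Tools like ER Studio and Erwin do the same catalog read, then layer on keys, relationships and diagram layout.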
Confidential, Columbus, OH
Data Modeler/ Data analyst
- Participated in requirement gathering and JAD sessions with users, subject matter experts, architects and BAs.
- Participated in sprint testing, documented the project change control and impact analysis, and worked with BAs on gap analysis.
- Applied Normalization techniques and created Logical and Physical models based on the requirements.
- Conducted and participated in Database design review meetings.
- Prepared Enterprise Naming Standard files and also project specific naming standard files in some exception cases.
- Solid experience in relational, dimensional, conceptual, logical and physical modeling, star schemas, snowflake schemas, ER diagrams, granularity, cardinality and database reengineering.
- Captured demographic information from different applications by providing source application data dictionaries, identifying the demographic information and providing DDL.
- Involved in the redesigning of Legacy systems, Modifications, Enhancements and Break Fixes to existing systems and in integration of one system with the other.
- Worked on forward and reverse engineering using Erwin; reverse engineered XSD structures, Excel spreadsheets and copybooks.
- Compared different models and model versions using Complete Compare in Erwin, compared databases directly and produced alter scripts.
- Worked on upgrading Erwin Data Modeler from v7.3 to v9.5 and served as Erwin repository administrator.
- Worked with BI team in providing SQL queries, Data Dictionaries and mapping documents (Report attributes to Database columns).
- Acted as a strong data analyst, analyzing data at a low level in conversion projects and providing mapping documents between legacy, production and user interface systems.
- Extensively performed data profiling, data cleansing and de-duplication, with good knowledge of best practices.
- Worked with Architects in designing conceptual model for the Data warehouse and Identified Facts and Dimensions, designed Logical and Physical data models
- Designed different types of star schemas, such as detailed data marts, plan data marts and monthly data marts.
- Worked on Converting Physical only models to Logical and Physical models.
Environment: Oracle 10g, SQL Server, Erwin, Informatica, Sybase Power Designer.
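The model-compare-and-alter-script work described above can be sketched as a diff of two schema versions. This is a hypothetical, simplified illustration in Python of the idea behind Erwin's Complete Compare, handling only added columns.

```python
# Two hypothetical model versions: table -> {column: datatype}.
old_model = {"CUSTOMER": {"id": "INTEGER", "name": "VARCHAR(50)"}}
new_model = {"CUSTOMER": {"id": "INTEGER", "name": "VARCHAR(50)",
                          "email": "VARCHAR(100)"}}

def diff_models(old, new):
    """Emit ALTER TABLE ... ADD statements for columns new to each table."""
    stmts = []
    for table, cols in new.items():
        for col, dtype in cols.items():
            if col not in old.get(table, {}):
                stmts.append(f"ALTER TABLE {table} ADD {col} {dtype};")
    return stmts

print(diff_models(old_model, new_model))
```

A production compare also handles dropped and retyped columns, new tables, indexes and constraints, but each case is the same membership/difference check over the two catalogs.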
- Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin 7.0.
- Initiated Use Case Analysis using UML, which provided the framework for potential use case deliverables and their inter-relationships.
- Involved in the study of the business logic and understanding the physical system and the terms and condition for sales Data mart.
- Generated meta-data reports from data models.
- Used Erwin for creating logical and physical data models for Oracle and Teradata database design, star schema design, data analysis, documentation, implementation and support.
- Used SQL Server Integration Services (SSIS) for extraction, transformation and loading of data into the target system from multiple sources.
- Created SSIS packages to clean and load data to data warehouse.
- Created package to transfer data between OLTP and OLAP databases.
- Created SSIS packages using Pivot, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction and Aggregate transformations, along with Execute SQL Task, Data Flow Task and Execute Package Task, to generate underlying data for the reports and to export cleaned data from Excel spreadsheets, text files, MS Access and CSV files to the data warehouse.
- Created a table to save dirty records.
- Extracted data from XML into SQL Server.
- Created SSIS packages for data importing, cleansing and parsing; extracted, cleaned and validated the data.
- Created complex queries to automate data profiling process needed to define the structure of the pre staging and staging area.
- Struck a balance between scope of the project and user requirements by ensuring that it covered minimum user requirements without making it too complex to handle.
- Requirement gathering from the users. A series of meetings were conducted with the business system users to gather the requirements for reporting.
- Involved in mapping the load data to the target based on primary and foreign key relationships, considering the constraint-based load order.
- Designed the mapping with a Slowly Changing Dimension Type 2 to keep track of historical data.
- Designed a mapping to process the incremental changes that exist in the source table; whenever source data elements were missing in source tables, these were modified/added in consistency with the third-normal-form-based OLTP source database. Worked with end users to identify key dimensions and measures that were relevant and quantitative.
- Involved in logical and physical designs and transform logical models into physical implementations for Oracle and Teradata.
- Used the Erwin 7.0 tool for reverse engineering and the target database schema conversion process.
Environment: HP-UX 11.11, Oracle 10g, Teradata 12, MS SQL Server 2008, SQL Server 2008 Integration Services, Erwin 7.0, ER/Studio 8, XML, Microsoft Excel 2007.
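The Slowly Changing Dimension Type 2 mapping described above follows a standard pattern: when a tracked attribute changes, the current dimension row is end-dated and a new current row is inserted. Here is a minimal Python sketch of that logic; the attribute and column names are hypothetical.

```python
from datetime import date

# Hypothetical customer dimension with one current row.
dim = [{"customer_id": "C001", "city": "Chicago",
        "eff_date": date(2020, 1, 1), "end_date": None, "current": True}]

def apply_scd2(dim, customer_id, city, load_date):
    """Type 2 update: close out the changed row, insert a new current row."""
    for row in dim:
        if row["customer_id"] == customer_id and row["current"]:
            if row["city"] == city:
                return dim                 # no change: nothing to do
            row["end_date"] = load_date    # end-date the old version
            row["current"] = False
    dim.append({"customer_id": customer_id, "city": city,
                "eff_date": load_date, "end_date": None, "current": True})
    return dim

apply_scd2(dim, "C001", "Columbus", date(2021, 6, 1))
print(len(dim), dim[0]["current"], dim[1]["city"])  # 2 False Columbus
```

In Informatica or SSIS the same effect is achieved with a lookup against the current dimension rows, an update strategy (or SCD wizard) for the end-dating, and an insert path for the new version.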