- Over 9 years of experience as a Sr. Data Architect/Modeler/Data Analyst with an excellent understanding of Data Warehouse and Data Mart design.
- Experience in designing Star schema and Snowflake schema for Data Warehouse and ODS architectures.
- Excellent understanding of industry-standard methodologies such as the System Development Life Cycle (SDLC), the Rational Unified Process (RUP), Agile, and Waterfall.
- Experience in developing MapReduce programs using Apache Hadoop to analyze big data as per requirements.
- Good experience in Dimensional and Relational Data Modeling using Star and Snowflake schemas, OLTP/OLAP systems, Fact and Dimension tables, and Conceptual, Logical, and Physical data modeling using Erwin.
- Expertise in designing data warehouses using Ralph Kimball's and Bill Inmon's techniques.
- Strong background in various Data Modeling tools, including Erwin, ER/Studio, and Power Designer.
- Extensive experience in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
- Well-versed in designing Star and Snowflake Database schemas pertaining to relational and dimensional data modeling.
- Experienced in Client-Server application development using Oracle PL/SQL, SQL PLUS, SQL Developer, TOAD, SQL Loader.
- Experience in analyzing data using the Hadoop ecosystem, including MapReduce, HDFS, Hive, Spark, Spark Streaming, Elasticsearch, Kibana, Kafka, HBase, ZooKeeper, Pig, Sqoop, and Flume.
- Experience in cloud development architecture on Amazon AWS (EC2, S3, Elasticsearch, Elastic Load Balancing, Redshift, and AMIs) and basic knowledge of MS Azure.
- Experienced with SQL Server and T-SQL in constructing Temporary Tables, Table variables, Triggers, user-defined functions, views, and Stored Procedures.
- Extensive experience in shell scripting and in scripting languages such as Python, Perl, and Ruby.
- Strong experience architecting high-performing databases using PostgreSQL, MySQL, and Cassandra.
- Experience in writing complex SQL queries to perform end-to-end ETL validations and support ad-hoc business requests.
- Extensive experience in developing Stored Procedures, Triggers, Functions, and Packages using SQL/PL-SQL.
- Experience in database creation and maintenance of physical data models with Oracle, Teradata, Netezza, DB2, and SQL Server databases.
- Experienced in working with Teradata utilities such as FastLoad, MultiLoad, TPump, and FastExport, and with Teradata query submission and processing tools such as BTEQ and Teradata SQL Assistant.
- Well-versed in conducting Gap analysis, Joint Application Design (JAD) sessions, User Acceptance Testing (UAT), cost-benefit analysis, and ROI analysis.
- An excellent team player and technically strong person who has the capability to work with business users, project managers, team leads, architects, and peers, thus maintaining a healthy environment in the project.
Analysis and Modeling Tools: Erwin 9.6/9.5, Sybase Power Designer, Oracle Designer, ER/Studio 9.7, Star-Schema and Snowflake-Schema Modeling, Fact and Dimension tables, Pivot Tables.
Databases: Microsoft SQL Server 2014/2012, Teradata 15/14, Oracle 12c/11g, MS Access, PostgreSQL, Netezza.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.
ETL Tools: SSIS, Pentaho, Informatica PowerCenter 9.6
Reporting Tools: Business Objects, Crystal Reports
Operating Systems: Microsoft Windows 8/7, UNIX, Linux.
BigData: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume
Tools & Software: TOAD, MS Office, BTEQ, PL/SQL, SQL Assistant, SQL PLUS, SQL LOADER
Confidential, Centennial, CO
Sr. Data Modeler
- Developed full life-cycle software, including defining requirements, prototyping, designing, coding, testing, and maintaining the software.
- Interacted with Business Analysts to gather user requirements and participated in data modeling sessions.
- Worked with the DevOps team to ensure any necessary changes, such as modeling issues and enhancements, were made as part of the requirements.
- Worked within the Software Development Life Cycle (SDLC) with good working knowledge of testing, Agile methodology, disciplines, tasks, resources, and scheduling.
- Worked on NoSQL databases including HBase, MongoDB, and Cassandra.
- Implemented multi-data center and multi-rack Cassandra cluster.
- Worked on Amazon Redshift and AWS, architecting solutions to load data and create data models.
- Involved in source-to-target (MDM) data mapping sessions with IBM as they mastered the target.
- Created 3NF business area data models with a de-normalized physical implementation; performed data and information requirements analysis.
- Performed reverse engineering of the current application using Erwin and developed Logical and Physical data models for Central Model consolidation.
- Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
- Created E/R Diagrams and Data Flow Diagrams, grouped and created the tables, and validated the data for lookup tables.
- Involved with the non-transactional data entities of the organization's MDM, which has the objective of providing processes for collecting, aggregating, matching, consolidating, quality assurance, persistence, and distribution.
- Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
- Performed legacy application data cleansing, data anomaly resolution and developed cleansing rule sets for ongoing cleansing and data synchronization.
- Developed the data warehouse model (Kimball) with multiple data marts with conformed dimensions for the proposed central model of the project.
- Implemented Star Schema methodologies in modeling and designing the logical data model into Dimensional Models.
- Conducted design sessions with Business Analysts and ETL developers to arrive at a design that satisfies the organization's requirements.
- Worked with Database Administrators (DBAs) to finalize the physical properties of the tables, such as the partition key, based on volumetrics.
- Analyzed the data from the sources, designed the data models, and then generated scripts to create the necessary tables and corresponding records for DBAs using Informatica.
- Developed a number of data-correction scripts using PL/SQL to handle any manual adjustments/corrections in the system.
- Member of the Data Model and Design Review group that oversees data model changes and system design across the Enterprise Data Warehouse.
- Collaborated on the data mapping document from source to target and on the data quality assessments for the source data.
- Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
- Experienced with BI Reporting in Design and Development of Queries, Reports, Workbooks, Business Explorer Analyzer, Query Builder, Web Reporting.
- Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
- Designed OLTP system environment and maintained documentation of Metadata.
- Coordinated with Data Architects to design Big Data/Hadoop projects and provided an idea-driven approach to design.
- Configured Hadoop ecosystems to read transaction data from HDFS and Hive.
- Prepared reports to summarize the daily data quality status and work activities.
- Performed ad-hoc analyses as needed, with the ability to interpret the analysis as required.
Environment: Oracle 12c, Teradata 15.0, Teradata SQL Assistant, MDM, Informatica 9.6.1, Toad for Oracle 11.5 Expert, Erwin 9.7, MS Visio, OBIEE, Python, JIRA, AWS, Redshift, SSRS, Hadoop & Ad-Hoc
Confidential, Atlanta, GA
Sr. Data Architect/ Data Modeler
- Worked as a Data Modeler/Architect to generate Data Models using Erwin and developed relational database systems.
- Led Architectural Design in Big Data/Hadoop projects and provided an idea-driven approach to design.
- Involved in several facets of MDM implementations, including Data Profiling, Metadata acquisition, and data migration.
- Built relationships and trust with key stakeholders to support program delivery and adoption of enterprise architecture.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
- Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access to Hadoop data.
- Used Erwin for reverse engineering to connect to existing databases and the ODS to create graphical representations in the form of Entity Relationships and elicit more information.
- Architected Work Flows, Activity Hierarchies & Process Flows; documented them using Interface Diagrams, Flow Charts & Specification Documents.
- Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of Data Models.
- Developed Data Mapping, Data Governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
- Worked in NoSQL databases on simple queries and wrote Stored Procedures for Normalization and De-normalization.
- Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
- Worked with the ETL team to document the transformation rules for data migration from OLTP to the warehouse environment for reporting purposes.
Environment: Erwin r9.6, Oracle 12c, NoSQL, MDM, OLAP, OLTP, Star Schema, Snowflake Schema, Hadoop, Hive, Unix, MS Visio, ETL, HDFS, ODS, PL/SQL, Metadata, DBA.
Confidential - Washington, DC
Sr. Data Architect/ Data Modeler
- As an Architect, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
- Developed strategies for data acquisition, archive recovery, and implementation of databases, working in a data warehouse environment that includes data design, database architecture, and Metadata and repository creation.
- Involved in Big Data Analytics and Massively Parallel Processing (MPP) architectures such as Greenplum and Teradata.
- Implemented the dimensional model (logical and physical data modeling) in the existing architecture using Erwin 9.5.
- Created MDM and OLAP data architecture, analytical data marts, and cubes optimized for reporting.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
- Used external loaders such as MultiLoad, TPump, and FastLoad to load data into the Teradata 14.1 database.
- Worked on generating and documenting Metadata while designing OLTP and OLAP system environments.
- Maintained enterprise models in Model Mart and contributed to the Mart upgrade and other POC initiatives by the enterprise team (Data Lineage, Data Movement, and Erwin Web Portal).
- Worked on full-lifecycle implementation of Data Warehouses and Business Data Marts with Star Schemas, Snowflake Schemas, SCDs & Dimensional Modeling.
- Created, managed, and modified logical and physical data models using a variety of data modeling philosophies and techniques, including Inmon and Kimball.
- Worked on Teradata SQL queries, Teradata indexes, and MDM utilities such as MLoad, TPump, FastLoad, and FastExport.
- Designed and implemented basic PL/SQL queries for testing and sales report/data validation.
- Performed Data Mapping and data design (Data Modeling) to integrate the data across the multiple databases into the EDW.
- Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from the Teradata database.
- Normalized the database up to 3NF to fit it into the Star Schema of the data warehouse.
- Defined best practices for data modeling and extraction and ensured architectural alignment of the designs and development.
Environment: Erwin 9.5, Teradata 14.1, Hive, Star Schema, Snowflake Schema, Hadoop, ODS, T-SQL, SQL, ETL, MDM, PL/SQL, OLAP, OLTP
Confidential - Kansas, KS
Sr. Data Architect/ Data Modeler
- Developed Logical and Physical Data models using the ER/Studio tool across the subject areas based on the specifications and established referential integrity of the system.
- Worked closely with business, data governance, medical professionals, SMEs, and vendors to define data requirements.
- Developed various QlikView Data Models by extracting and using data from various sources: files, Excel, Big Data, and flat files.
- Implemented Python scripts to import/export JSON files, which contain customer survey information and/or asset information, to/from the database.
- Developed Data Mapping, Data Governance, transformation, and cleansing rules involving OLTP and ODS.
- Independently coded new programs and designed tables to load and test the programs effectively for the given POCs using Big Data/Hadoop.
- Worked on translating high-level business requirements into solution and infrastructure architectures, involving enterprise architects to align with the strategic view.
- Involved in OLAP modeling based on Dimension and Fact tables for efficient data loads, based on a Star Schema structure, at various report levels using multi-dimensional models such as Star and Snowflake schemas.
- Generated DDL statements for the creation of new ER/Studio objects such as tables, views, indexes, packages, and stored procedures.
- Developed, implemented & maintained the Conceptual, Logical & Physical Data models using ER/Studio, forward/reverse engineering databases (for the Staging, Normalized & Presentation layers).
- Involved in debugging and tuning PL/SQL code, and in tuning and optimizing queries for the Oracle and DB2 databases.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Worked on data from the DB2 database, using Informatica to load it into a single repository for data analysis.
- Worked on process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.
Environment: ER Studio, Star Schema, Snowflake Schema, Python, ODS, Hadoop, Spark, ETL, PL/SQL, MDM, OLTP, Oracle 10g/11g, DB2, MS Access and Excel.
Confidential - New York, NY
Sr. Data Modeler/ Data Analyst
- Created the Physical Data Model from the Logical Data Model using the Compare and Merge utility in ER/Studio and worked with the naming standards utility.
- Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
- Developed dimensional models for Data Warehouse/OLAP applications by identifying the required facts and dimensions.
- Created Entity Relationship Diagrams (ERD), functional diagrams, and data flow diagrams, and enforced referential integrity constraints.
- Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
- Used Oracle Data Mining (ODM) to perform thorough data analysis on multifamily house loans and government loans.
- Worked on designing Logical and Physical Data Models for different database applications using ER/Studio.
- Worked on client-server application development using Oracle PL/SQL, SQL PLUS, and SQL LOADER.
- Worked on importing and cleansing of data from various sources such as Netezza, Oracle, flat files, and SQL Server 2008, with high-volume data.
- Designed the Star schema for the detailed data marts and plan data marts consisting of conformed dimensions.
- Worked on performance tuning of SQL queries for a data warehouse consisting of many tables with large amounts of data.
- Participated in updating the dimensional model and identifying the Facts and Dimensions.
- Used T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into the data marts.
- Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links, and bulk collects.
- Assisted with DataStage Server jobs to load data from sequential files, flat files, and MS Access.
- Involved in the creation and maintenance of the Data Warehouse and repositories containing Metadata.
Environment: ER Studio, Oracle 10g, Netezza, OLAP, OLTP, ODS, ODM, T-SQL, SQL Server 2008, PL/SQL, Star Schema, Snowflake Schema, etc.
Sr. Data Modeler/ Data Analyst
- Gathered the various reporting requirements from the business analysts.
- Created physical and logical data models using Erwin.
- Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of data models.
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved on the basis of defect reports.
- Designed and developed Use Cases, Activity Diagrams, Sequence Diagrams, and OOD (Object-Oriented Design) using UML and Visio.
- Used Erwin's reverse engineering to connect to existing databases and the ODS to graphically represent the Entity Relationships.
- Involved in MultiLoad, FastLoad, and TPump loading to migrate data from Oracle to Teradata.
- Managed, updated, and manipulated report orientation and structures with the use of advanced Excel functions, including Pivot Tables and VLOOKUPs.
- Involved in data mining, transformation, and loading from the source systems to the target system.
- Facilitated Joint Application Development (JAD) sessions to identify business rules and requirements and documented them in a format that can be reviewed and understood by both business and technical people.
- Provided quality data review for completeness, inconsistencies, and erroneous and missing data according to the data review plan.
- Created and executed test scripts, cases, and scenarios that would determine optimal system performance according to specifications.
- Improved performance of SQL queries using explain plans, hints, and indexes for tuning; created DDL scripts for the database; created PL/SQL Procedures and Triggers.
- Defined the ETL architecture, which includes the load pattern for the staging and ODS layers using the Microsoft SSIS ETL tool, the file archival process, the data purging process, and the batch execution process.
- Created data masking mappings to mask the sensitive data between the production and test environments.
- Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Participated in performance management and tuning for stored procedures, tables, and database servers.
Environment: Erwin 7.0, Teradata, SSIS, T-SQL, Oracle 9i/8i, OLAP, OLTP, ODS, PL/SQL, OOD, MS Excel and MS Visio.