
Data Architect/Data Modeler Resume


Minneapolis, MN

SUMMARY:

  • More than 11 years of experience in data architecture, data modeling (both dimensional and relational models), data analysis, data warehousing, and database management. Areas of specialization include database design, data federation, data integration (ETL), metadata/semantic layers, and static and OLAP/cube reporting and testing.
  • Experienced in designing star schemas (identification of facts, measures, and dimensions) and snowflake schemas for data warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, ER/Studio, and Microsoft Visio (a minimal schema sketch follows this summary).
  • Extensive experience in normalization (1NF, 2NF, 3NF, and BCNF) and denormalization techniques for improved database performance in OLTP and data warehouse/data mart environments.
  • Experienced with Teradata Administrator, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, FastExport, PMON, Visual Explain, TPT, and OLE DB loads.
  • Experienced in dimensional and relational data modeling using ER/Studio, Erwin, and Sybase PowerDesigner, covering star-join/snowflake schema modeling, fact and dimension tables, and conceptual, logical, and physical data models.
  • Excellent knowledge of the Ralph Kimball and Bill Inmon approaches to data warehousing.
  • Expertise in database performance tuning using Oracle hints, EXPLAIN PLAN, TKPROF, partitioning, and indexes.
  • Solid, in-depth understanding of information security, data modeling, and RDBMS concepts.
  • Designed and developed data models for OLTP databases, the operational data store (ODS), the data warehouse (OLAP), and federated databases to support the client's enterprise information management strategy.
  • Experienced in integrating various relational and non-relational sources such as DB2, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
  • Experienced in designing DB2 architecture for data warehouse modeling using tools such as Erwin 9.6.1, PowerDesigner, and ER/Studio.
  • Good knowledge of OLAP, OLTP, business intelligence, and data warehousing concepts, with emphasis on ETL and business reporting needs.
  • Proficient in Oracle tools and utilities such as TOAD, SQL*Plus, and SQL Navigator.
  • Experienced in building logical data models (LDM) and physical data models (PDM) using the Erwin data modeling tool.
  • Worked with Tableau 9.1.2 and Tableau Server 9.1.1; created and published workbooks on Tableau Server.
  • Excellent understanding and working experience of industry-standard methodologies, including the Software Development Life Cycle (SDLC) per the Rational Unified Process (RUP), and Agile and Waterfall methodologies.
  • Experienced in data masking using various tools for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications.
  • Strong analytical and problem-solving skills, excellent communication and presentation skills, and a good team player.
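
To make the star schema bullet above concrete, here is a minimal sketch of one fact table keyed to two dimensions. The object names (fact_sales, dim_date, dim_product) and columns are illustrative assumptions, not objects from any engagement described below.

    -- Minimal star schema sketch; all object names are hypothetical.
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
        calendar_dt DATE NOT NULL,
        month_nm    VARCHAR(10),
        year_nbr    INTEGER
    );

    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,   -- surrogate key
        product_cd  VARCHAR(20) NOT NULL,  -- natural/business key
        product_nm  VARCHAR(100)
    );

    -- Fact rows reference the dimensions; measures are additive.
    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
        sales_amt   DECIMAL(12,2),
        units_sold  INTEGER
    );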

TECHNICAL SKILLS:

Databases: Microsoft SQL Server 2012/2014/2016, Teradata 15.0, Oracle 12c/11g/10g, Netezza, DB2, and NoSQL databases

Data Modeling Tools: Erwin r7.3/r8.6/r9.6.1, IBM InfoSphere Data Architect, ER/Studio, and PowerDesigner

BI Tools: Tableau, Tableau Server, Tableau Reader, SAP Business Objects

Programming Languages: Oracle PL/SQL, UNIX Shell Scripting 

Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX

Tools & Utilities: TOAD 9.6, Microsoft Visio 2010.

Version Control Tools: VSS, SVN, CVS, TFS

Development Tools: PL/SQL Developer, TOAD, SQL Developer

Scheduling: Control-M, cron jobs

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

Other: Performance tuning, High Volume Data Management

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Data Architect/Data Modeler

Responsibilities:

  • Designed the data architecture for the Intelligence Creation and Distribution data store and information flow.
  • Involved in planning, defining, and designing the database using Erwin based on business requirements, and provided documentation.
  • Involved in relational and dimensional data modeling to create the logical and physical database designs and ER diagrams, with all related entities and the relationships among them based on rules provided by the business manager, using Erwin r9.6.
  • Created ERD diagrams using the Open Sphere model and implemented concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.
  • Worked on databases including Oracle, XML, DB2, Teradata 14, Netezza, SQL Server, Big Data, and NoSQL.
  • Worked on normalization and denormalization concepts and design methodologies, including the Ralph Kimball and Bill Inmon data warehouse methodologies.
  • Executed Hive queries on Parquet tables to perform data analysis meeting the business requirements, and imported and exported data between Oracle/DB2 and HDFS/Hive using Sqoop (see the HiveQL sketch after this list).
  • Performed data migration from an RDBMS to a NoSQL database and provided the whole picture for data deployed across various data systems.
  • Gained good knowledge of Hadoop data lake implementation and Hadoop architecture for client business data management.
  • Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Worked on normalization and denormalization techniques for both OLTP and OLAP systems.
  • Analyzed source systems for data acquisition and architected data federation logic across different source systems.
  • Worked with data lake, Spark, and AWS Redshift technologies.
  • Designed the physical data model using the Erwin r9.6 data modeling tool and managed metadata for the data models.
  • Coordinated with data architects and data modelers to create new schemas and views in Netezza to improve report execution time, and worked on creating optimized data mart reports (a Netezza sketch also follows this list).
  • Created logical and physical data models using Erwin 9.6 for new requirements and existing databases; maintained database standards; and worked on data design, integration, and migration while analyzing data in different systems.
  • Gathered and analyzed existing physical data models for in-scope applications and proposed changes to the data models according to the requirements.
  • Worked with ETL team members to implement data acquisition logic and resolve data defects.
  • Used the Agile Scrum methodology to build out the different phases of the software development life cycle.
  • Used Tableau and Power BI to create dashboards and participated in choosing the right tool for dashboards and analytics.
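
As referenced in the Hive/Sqoop bullet above, a hedged sketch of that pattern: an external Hive table over Parquet files landed in HDFS by Sqoop, followed by an analysis query. The stg_orders table, its columns, and the HDFS path are assumptions for illustration, not the actual project objects.

    -- HiveQL: external table over Parquet data landed by Sqoop (path assumed).
    CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
        order_id    BIGINT,
        customer_id BIGINT,
        order_dt    STRING,
        order_amt   DECIMAL(12,2)
    )
    STORED AS PARQUET
    LOCATION '/data/staging/orders';

    -- Example analysis: daily order volume and revenue.
    SELECT order_dt,
           COUNT(*)       AS order_cnt,
           SUM(order_amt) AS revenue
    FROM   stg_orders
    GROUP  BY order_dt
    ORDER  BY order_dt;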
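
For the Netezza report-tuning bullet, one common technique (shown here as an assumption, since the actual objects are not named) is to pre-aggregate a fact table into a reporting table distributed on the main join key; this sketch reuses the hypothetical fact_sales table from the summary section.

    -- Netezza: pre-aggregated reporting table, co-located on the join key.
    CREATE TABLE rpt_sales_daily AS
    SELECT date_key,
           product_key,
           SUM(sales_amt)  AS sales_amt,
           SUM(units_sold) AS units_sold
    FROM   fact_sales
    GROUP BY date_key, product_key
    DISTRIBUTE ON (product_key);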

Environment: Erwin r9.6, Netezza, Oracle 12c, Teradata 15, T-SQL, SQL Server 2016, DB2, SSIS, SSRS, R, SAS, HTML, Python, JavaScript, UNIX, Tableau 9.1.2, MySQL, Hadoop, Hive, Pig, MapReduce, Spark, MongoDB, MDM, PL/SQL, ETL, etc.

Confidential, Cincinnati, OH

Data Architect/Data Modeler

Responsibilities:

  • Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and Big Data technologies.
  • Involved in normalization and denormalization for OLAP and OLTP systems, including relational databases, tables, constraints (primary key, foreign key, unique, and check), and indexes.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Strong knowledge of data warehousing concepts, including the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, star schemas, snowflake schemas, and fact and dimension tables.
  • Extensively used Erwin r9 for data modeling; created staging and target models for the enterprise data warehouse.
  • Excellent understanding and working experience of industry-standard methodologies such as the System Development Life Cycle (SDLC), the Rational Unified Process (RUP), and Agile methodologies.
  • Created an XML parser to convert the XML data to CSV and then loaded the data into the data model after validation (a database-side XML-flattening sketch follows this list).
  • Developed and changed source code, including packages, functions, reports, and UNIX shell scripts.
  • Involved in database development, creating Oracle PL/SQL functions, procedures, and collections (see the PL/SQL sketch after this list).
  • Used SQL tools such as Teradata SQL Assistant and TOAD to run SQL queries and validate the data in the warehouse.
  • Developed ETL processes for data extraction, data mapping, and data conversion using SQL, PL/SQL, and various shell scripts in DataStage.
  • Involved in normalization and denormalization of existing tables for faster query retrieval.
  • Designed and coded core logic in complex reports.
  • Extensively involved in analyzing various data formats using industry-standard tools and effectively communicating findings to business users and SMEs.
  • Worked with the UNIX team to install the TIDAL job scheduler on the QA and production Netezza environments.
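
The XML-to-CSV bullet above does not name the parser that was used; as one database-side illustration, Oracle's XMLTABLE can flatten an XML document into relational rows before validation and load. The xml_inbound table, its XMLTYPE column xml_doc, and the document shape are all assumed for this sketch.

    -- Oracle: flatten /orders/order elements into rows (names assumed).
    SELECT x.order_id,
           x.customer_nm,
           x.order_amt
    FROM   xml_inbound t,
           XMLTABLE('/orders/order'
                    PASSING t.xml_doc
                    COLUMNS order_id    NUMBER        PATH '@id',
                            customer_nm VARCHAR2(100) PATH 'customer',
                            order_amt   NUMBER        PATH 'amount') x;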
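
And a minimal sketch of the kind of PL/SQL development described above, using a collection with BULK COLLECT and FORALL; the customers and customer_stg tables are hypothetical.

    -- Oracle PL/SQL: bulk-copy ids for one region into a staging table.
    CREATE OR REPLACE PROCEDURE load_customer_stg (p_region IN VARCHAR2)
    IS
        TYPE t_id_tab IS TABLE OF customers.customer_id%TYPE;
        v_ids t_id_tab;
    BEGIN
        SELECT customer_id
        BULK COLLECT INTO v_ids
        FROM   customers
        WHERE  region = p_region;

        FORALL i IN 1 .. v_ids.COUNT
            INSERT INTO customer_stg (customer_id)
            VALUES (v_ids(i));

        COMMIT;
    END load_customer_stg;
    /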

Environment: Metadata, Netezza, Oracle 11g, Teradata 14.1, T-SQL, SQL Server 2014, DB2, SSIS, R, Python, Hadoop, Spark, MapReduce, UNIX, HTML, Java, MySQL, Erwin r9, MDM, PL/SQL, SPSS, ETL, Informatica PowerCenter, etc.

Confidential, Merrifield, VA

Sr. Data Analyst/Modeler

Responsibilities:

  • Performed analysis of all data into, out of, and within the company in support of data warehousing efforts.
  • Used complex Excel functionality such as pivot tables, pivot charts, VLOOKUPs, concatenation, and other functions to analyze and clean up traffic databases.
  • Worked on discovering entities, attributes, relationships, and business rules from functional requirements.
  • Contributed to delivering logical/physical data models (dimensional and relational), applying concepts such as star schema and snowflake schema modeling.
  • Used Erwin r9.0 to create conceptual, logical, and physical data models.
  • Performed data analysis and data profiling, and worked on data transformations and data quality rules.
  • Developed data mapping, data governance, transformation, and cleansing rules for the master data management architecture involving OLTP, ODS, and OLAP.
  • Worked on enhancements to the existing Teradata processes running on the enterprise data warehouse.
  • Created SQL scripts, analyzed the data in MS Access/Excel, and worked on SQL and SAS script mapping.
  • Wrote PL/SQL stored procedures, functions, cursors, triggers, and packages, as well as SQL queries using join conditions, in a UNIX environment.
  • Wrote ad hoc SQL queries and worked with SQL and Netezza databases.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Involved in loading data from various source systems such as Oracle, SQL Server, and flat files as per the requirements.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Extensively worked on Teradata tools and utilities such as FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), and BTEQ (a minimal BTEQ sketch follows this list).
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2008.
  • Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
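
A minimal BTEQ sketch of the Teradata utility work mentioned above; the logon string and the edw.customer_dim table are placeholders, and a real script would source credentials securely rather than embed them.

    .LOGON tdprod/etl_user,etl_password

    /* Illustrative check query; object names are assumed. */
    SELECT COUNT(*) AS row_cnt
    FROM   edw.customer_dim;

    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0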

Environment: Oracle 10g, Microsoft SQL Server 2008, Teradata 13.1, SQL Developer, SQL Manager, Erwin r9, SQL Developer Data Modeler, Visio, Informatica, Crystal Reports

Confidential, Philadelphia, PA

Sr. Data Analyst/Modeler

Responsibilities:

  • Involved in planning, defining, and designing the database using ER/Studio based on business requirements, and provided documentation.
  • Part of the team responsible for the analysis, design, and implementation of the business solution.
  • Forward-engineered the physical data model to generate DDL and worked with the DBA to implement it.
  • Developed the logical data models and physical data models that capture current-state and future-state data elements and data flows using ER/Studio.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Analyzed the weblog data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most-purchased products on the website (see the HiveQL sketch after this list).
  • Translated business requirements into conceptual, logical, and integration data models, and modeled databases for integration applications in a highly available, high-performance configuration using ER/Studio.
  • Reverse-engineered DB2 databases and then forward-engineered them to Teradata using ER/Studio.
  • Extensively used normalization techniques (up to 3NF).
  • Responsible for data modeling and building a star schema model in ER/Studio.
  • Created complex Informatica mappings in Informatica Designer using transformations such as Aggregator, Stored Procedure, SQL Transformation, Lookup, and Normalizer, as well as mapplets.
  • Designed the physical model for implementation in an Oracle 10g physical database.
  • Used Erwin for reverse engineering, connecting to the existing database and ODS to create graphical entity-relationship representations and elicit more information.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
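
A hedged HiveQL sketch of the weblog analysis described above; the weblogs table and its columns are assumptions for illustration.

    -- HiveQL: unique visitors and page views per day (names assumed).
    SELECT log_dt,
           COUNT(DISTINCT visitor_id) AS unique_visitors,
           COUNT(*)                   AS page_views
    FROM   weblogs
    GROUP  BY log_dt
    ORDER  BY log_dt;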

Environment: Erwin r8, SQL Server 2005, SQL Server Analysis Services 2008, SSIS 2008, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, Data stage, MS Office, MS Visio

Confidential

Sr. Data Analyst/Modeler

Responsibilities:

  • Extensively used TOAD and normalization techniques to design logical/physical data models and the relational database design.
  • Involved in creating physical and logical models using Erwin.
  • Involved in OLAP modeling based on dimensions and facts for efficient data loads onto star schema structures at different report levels, using multi-dimensional models such as star and snowflake schemas to develop cubes with MDM.
  • Created and maintained database objects (tables, views, indexes, partitions, synonyms, database triggers, and stored procedures) in the data model.
  • Involved in the SDLC, including requirements gathering, design, development, testing, and release to the working environment; developed and maintained the Requirements Traceability Matrix (RTM).
  • Worked on building the data model using ER/Studio per the requirements, with discussion and approval of the model by the BA.
  • Provided subject matter expertise as appropriate to ETL requirements, information analytics, modeling and design, development, and support activities.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design (a minimal constraint sketch follows this list).
  • Worked on Data Quality Framework design, the logical/physical model, and the architecture.
  • Worked with data warehouse extract and load developers to design mappings for data capture, staging, cleansing, loading, and auditing.
  • Developed, enhanced, and maintained snowflake and star schemas within data warehouse and data mart conceptual and logical data models.
  • Communicated data needs internally and externally to third parties; analyzed and understood the data options and documented the data dictionary.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
  • Prepared various test documents for the ETL process in Quality Center.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
  • Involved in extensive data validation by writing several complex SQL queries (a reconciliation sketch also follows this list), and involved in back-end testing and working through data quality issues.
  • Worked with data investigation, discovery, and mapping tools to scan every single data record from many sources.
  • Strong communication, interpersonal, intuitive, analytical, and problem-solving skills.
  • Excellent knowledge of Microsoft Office, with an emphasis on Excel.
  • Provided maintenance support for customized reports developed in Crystal Reports.
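
A minimal sketch of the referential-integrity enforcement mentioned above; the orders and customers tables are illustrative.

    -- Child rows may only reference an existing parent row.
    ALTER TABLE orders
        ADD CONSTRAINT fk_orders_customer
        FOREIGN KEY (customer_id)
        REFERENCES customers (customer_id);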
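
And a hedged example of the kind of validation SQL used in back-end testing: reconciling row counts and totals between an assumed source table and its warehouse target.

    -- Counts and sums should match side by side; names are assumed.
    SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(order_amt) AS total_amt
    FROM   src_orders
    UNION ALL
    SELECT 'target' AS side, COUNT(*) AS row_cnt, SUM(order_amt) AS total_amt
    FROM   dw_orders;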

Environment: Netezza, Erwin 9x, DB2, Information Analyzer, Informatica, MDM, Quality Center, Excel, MS Word, ETL tools (Informatica 9.5/8.6/9.1), Oracle 8i/10g, Teradata V2R13/R14.10, Teradata SQL Assistant
