
Sr. Data Architect/Modeler Resume


Westlake, TX

SUMMARY

  • Over 8 years of experience as a Senior Data Architect/Modeler/Analyst, an IT professional experienced in Data Analysis, Data Modeling, Data Architecture, and designing, developing, and implementing data models for enterprise-level applications and systems.
  • Data Warehousing: Full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODS, data marts, Inmon/Kimball methodology, Data Modeling for OLTP, canonical modeling, and Dimensional Modeling for data warehouse star/snowflake design.
  • Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
  • Expertise on Relational Data modeling (3NF) and Dimensional data modeling.
  • Worked on Informatica PowerCenter tools - Designer, Repository Manager, and Workflow Manager.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables.
  • Expertise in Data Governance, Collibra Software, and Business Analytics.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experience in BI/DW solutions (ETL, OLAP, Data marts), Informatica, and BI reporting tools like Tableau and QlikView; also experienced in leading teams of application, ETL, and BI developers and testers.
  • Logical and physical database designing like Tables, Constraints, Index, etc. using Erwin, ER Studio, TOAD Modeler and SQL Modeler.
  • Solid experience in Relational Modeling, Dimensional Modeling, Conceptual, Logical and Physical Modeling, Star Schema, Snowflake Schema, ER Diagrams (IDEF1X and IE notation), Granularity, Cardinality and Database Reengineering.
  • Implemented tools to extract, transform, and load data using Java, Hive, Pig, and HBase, with data mining and management of huge data sources (Oracle/Teradata, Hadoop, NoSQL, Netezza), and solid experience in working with and creating 3rd Normal Form (ODS) and Dimensional (OLAP) models.
  • Excellent knowledge in preparing required project documentation and in tracking and regularly reporting project status to all project stakeholders.
  • Extensive ETL testing experience using Informatica 9x/8x, Talend, Pentaho.
  • Worked on background processes in Oracle architecture, drilling down to the lowest levels of systems design and construction.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Heavy use of Access queries, VLOOKUP, formulas, Pivot Tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
  • Experienced in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML and Flat Files, to Netezza database.
  • Business Intelligence: Requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers. Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment.
  • Assist in creating communication materials based on data for key internal /external audiences.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
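As an illustrative sketch of the SQL-based data-movement validation mentioned above, the following uses an in-memory SQLite database; the staging/fact table names, columns, and sample rows are hypothetical, not taken from any actual project:

```python
import sqlite3

# Hypothetical layers: a staging table and the warehouse fact table it feeds.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE fact_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO fact_orders VALUES (1, 10.0), (2, 20.0);
""")

# Row-count reconciliation between layers.
stg_count = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
dw_count = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]

# Keys present in staging but missing from the warehouse.
missing = conn.execute("""
    SELECT s.order_id FROM stg_orders s
    LEFT JOIN fact_orders f ON f.order_id = s.order_id
    WHERE f.order_id IS NULL
""").fetchall()

print(stg_count, dw_count, [r[0] for r in missing])  # 3 2 [3]
```

The same count-and-antijoin pattern applies between any two layers (source to staging, staging to ODS, ODS to mart), regardless of the engine.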

TECHNICAL SKILLS

Data Modeling Tools: Erwin, Rational System Architect, IBM Infosphere Data Architect, ER Studio and Oracle Designer

Database Tools: Microsoft SQL Server 2012/2014, Teradata 15.0, Oracle 9i/11g/12c and MS Access.

ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XI R3.1/XI R2, Web Intelligence, Talend, Tableau 8.2.

Big Data: Hadoop, Hive, Pig, HBase, Spark.

Cloud: AWS.

BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Quality Assurance Tools: WinRunner, LoadRunner, TestDirector, QuickTest Pro, Quality Center, Rational Functional Tester.

Tools: OBIEE 10g/11g, SAP ECC6 EHP5, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

Version Tool: VSS, SVN, GIT.

Testing and defect tracking Tools: HP/Mercury (Quality Center), Quick Test Professional, Performance Center, Requisite, MS Visio.

Operating System: Windows, Unix, Sun Solaris

PROFESSIONAL EXPERIENCE

Confidential, Westlake, TX

Sr. Data Architect/Modeler

Responsibilities:

  • Heavily involved in a Data Architecture role to review business requirements and compose source-to-target data mapping documents.
  • Responsible for Data Architecture, Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW and Hadoop environment.
  • As an Architect, implemented an MDM hub to provide clean, consistent data for a SOA implementation.
  • Independently coded new programs and designed tables to load and test the program effectively for the given POCs using Big Data/Hadoop.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Defined and deployed monitoring, metrics, and logging systems on AWS.
  • Translated business and data requirements into Logical data models in support of Enterprise Data Models, ODS, OLAP, OLTP, Operational Data Structures and Analytical systems.
  • Full life cycle of Data Lake and Data Warehouse with big data technologies like Spark and Hadoop.
  • Responsible for technical Data governance, enterprise-wide Data modeling and Database design.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake Schemas.
  • Worked extensively on ER Studio for multiple Operations across Hartford in both OLAP and OLTP applications.
  • Involved in OLAP models based on Dimensions and FACTs for efficient loads of data, built on Star Schema structures at the reporting levels, using multi-dimensional models such as Star and Snowflake Schemas.
  • Performed Data Mapping and Data design (Data Modeling) to integrate data across multiple databases into the EDW.
  • Created Communities, Domains, Assets, hierarchies in Collibra.
  • Spearheaded the establishment of the Enterprise Business Glossary, including Business Terms, BT Descriptions, and Business Rules; the Tiering Criteria, encompassing Tiers 1, 2, and 3; and the Data Linkages between the Metadata and Lineage documents for the Collibra, IDQ, and IMM data governance tools. Integrated processes to manage data quality.
  • Prepared business (Collibra) and technical (IBM InfoSphere) metadata.
  • Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.
  • Work with Enterprise Data Governance team to review enterprise standards for data quality playbook.
  • Prepared Data Flows and wrote SQL queries for Data Profiling/Data Quality checklists on Source/Target data and formalized Data Governance Rules. Presented data profiling results from QlikView for decision making.
  • Participated in several project activities including Data Architecture design, ETL design, QA support, and code review.
  • Designed the data marts using Ralph Kimball's Dimensional Data Mart modeling methodology in ER Studio.
  • Experience using MapReduce and big data workloads on Hadoop and other NoSQL platforms.
  • Reviewed system architecture, data flow, data warehouse dimensional models, and DDL to identify areas for improvement to reduce loading & reporting time for a meter reading system.
  • Managed and worked with offshore teams to create workflows in Collibra.
  • Incorporated business requirements into quality conceptual and logical data models using ER Studio and created physical data models using forward engineering techniques to generate DDL scripts.
  • Designed normalized and star schema data architectures using ER Studio and forward engineered these structures into Teradata.
  • Responsible for big data initiatives and engagement including analysis, brainstorming, POCs, and architecture.
  • Worked with Data Transformation Services (DTS), DataStage and ETL package design, and RDBMS systems like SQL Server, Oracle, and DB2.
  • Identified Data Governance issues and formulated refined business processes and data flows for long-term solutions.
  • Installation and configuration of other open source software like Pig, Hive, HBase, Flume and Sqoop.
  • Prepared a web UI for the HBase database for CRUD operations like put, get, scan, delete and update.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra. Implemented a multi-datacenter and multi-rack Cassandra cluster.
  • Selected the appropriate AWS service based on data, compute, database, or security requirements.
  • Used Flume extensively in gathering and moving log data files from Application Servers to a central location in Hadoop Distributed File System (HDFS) for data science.
  • Worked on Amazon Redshift on AWS, architecting a solution to load data, create data models and run BI on it.
  • Migrated DTS packages to SSIS and modified using DTS editor wizard.
  • Business Data Lineage from Critical Data Elements to DQ Measures to Business Rules mapped in Collibra.
  • Developed and configured an Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI) and Data Warehousing platforms to meet business needs.
  • Worked with the Hadoop ecosystem covering HDFS, HBase, YARN and MapReduce.
  • Developed a dashboard solution for analyzing STD statistics by building SSIS cubes and Tableau.
  • Developed, implemented & maintained the Conceptual, Logical & Physical Data Models using ER/Studio 9 (ER Studio) - forward/reverse engineered databases.
  • Involved in Agile project management environment.
  • Developed PL/SQL scripts to validate and load data into interface tables.
  • Involved in maintaining data integrity between Oracle and SQL Server databases.
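The data profiling/quality checks described above (null rates, duplicate business keys) can be sketched in plain Python; the record layout, column names, and sample data are hypothetical:

```python
# Hypothetical extract of a source table as a list of records.
rows = [
    {"cust_id": 1, "email": "a@x.com"},
    {"cust_id": 2, "email": None},
    {"cust_id": 2, "email": "b@x.com"},   # duplicate business key
    {"cust_id": 3, "email": "c@x.com"},
]

def profile(rows, key, column):
    """Null rate for a column and duplicate count for the business key."""
    total = len(rows)
    nulls = sum(1 for r in rows if r[column] is None)
    seen, dups = set(), 0
    for r in rows:
        if r[key] in seen:
            dups += 1
        seen.add(r[key])
    return {"rows": total, "null_rate": nulls / total, "duplicate_keys": dups}

print(profile(rows, "cust_id", "email"))
# {'rows': 4, 'null_rate': 0.25, 'duplicate_keys': 1}
```

In practice the same measures would be computed in SQL against the source and target tables, then compared against data governance thresholds.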

Environment: ER Studio 9.7, Teradata 15, Star Schema, Snowflake Schema, AWS, HBase, Pig, Hive, Sqoop, OLAP, OLTP, Oracle 12c, ODS, Business Objects, MDM, Hadoop, SQL Server 2012, Spark, Cassandra.

Confidential, Tampa, FL

Sr. Data Architect/Modeler

Responsibilities:

  • Worked as a Data Modeler/Architect to generate Data Models using Erwin and developed relational database system.
  • Involved in requirement gathering and analysis with Business analyst, systems analysts, Developers and DBA and translated them into detailed reporting requirements.
  • Designed the Logical Data Model using ERWIN 9.64 with the entities and attributes for each subject area.
  • Developed the long-term data warehouse roadmap and architectures, and designed and built the data warehouse framework per the roadmap.
  • Specified the overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning, ETL, and BI.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL and Big Data technologies.
  • Designed, developed, and maintained Enterprise Data Architecture for enterprise data management including business intelligence systems, data governance, data quality, enterprise metadata tools, data modeling, data integration, operational data stores, data marts, data warehouses, and data standards.
  • Data modeling, Design, implement, and deploy high-performance, custom applications at scale on Hadoop /Spark.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Applied data analysis, data mining and data engineering to present data clearly.
  • Worked with multiple databases including RDBMS technologies (MySQL, Oracle) and NoSQL databases (Cassandra, HBase, Neo4j).
  • Ensured high-quality data and understood how data is generated out of experimental design and how these experiments can produce actionable, trustworthy conclusions.
  • Reverse engineered some of the databases using Erwin.
  • Knowledge of multiple Hadoop clusters using Kerberos and Sentry.
  • Proficiency in SQL across a number of dialects (we commonly write MySQL, PostgreSQL, Redshift, SQL Server, and Oracle).
  • Created data models for AWS Redshift and Hive from dimensional data models.
  • Good experience in creating cubes using Pentaho Schema Workbench.
  • Advised on and led projects involving ETL-related activities and the migration or conversion of data between enterprise data systems. Coordinated interactions between central IT, business units, and data stewards to achieve desired organizational outcomes.
  • Used the data integration tool Pentaho for designing ETL jobs in the process of building Data Warehouses and Data Marts.
  • Advised on and enforced data governance to improve the quality/integrity of data, with oversight of the collection and management of operational data.
  • Able to guide / partner with VPs / Directors in architecting solutions for the Big Data organization.
  • Integrated Crystal Reports using Erwin Data Modeler.
  • Used Erwin to support Teradata V15 and SSL.
  • Developed Data Mapping, Data Governance, Transformation and cleansing rules for the Master Data Management Architecture involving OLTP and ODS.
  • Involved in designing Logical and Physical data models for different database applications using Erwin.
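A minimal sketch of the kind of cleansing rules formalized for a Master Data Management integration layer, as mentioned above. The field names and the specific rules (trim/case-fold names, strip phone punctuation, normalize state codes) are illustrative assumptions:

```python
import re

# Hypothetical cleansing rules for a master customer record.
def cleanse(record):
    out = dict(record)
    out["name"] = " ".join(out["name"].split()).title()   # collapse spaces, title-case
    out["phone"] = re.sub(r"\D", "", out["phone"])        # keep digits only
    out["state"] = out["state"].strip().upper()[:2]       # 2-character state code
    return out

raw = {"name": "  john   SMITH ", "phone": "(402) 555-0100", "state": " ne "}
print(cleanse(raw))
# {'name': 'John Smith', 'phone': '4025550100', 'state': 'NE'}
```

A real MDM hub would drive such rules from metadata rather than hard-coding them, but the shape of each rule (standardize, then match on the standardized form) is the same.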

Environment: Oracle 12c, MS Office, SQL Architect, TOAD Benchmark Factory, Teradata v15, SQL Loader, Big Data, SharePoint, ERwin r9.64, DB2, SQL Server 2008/2012.

Confidential, Omaha, NE

Sr. Data Analyst/Modeler

Responsibilities:

  • Gathered and translated business requirements; worked with the Business Analyst and DBA for requirements gathering, business analysis, testing, and project coordination.
  • Extensively used Erwin 9.1 for developing data model using star schema methodologies
  • Experienced in using CA Erwin Data Modeler (Erwin) for Data Modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
  • Owned and managed all changes to teh data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Worked with reverse engineering Data Model from Database instance and Scripts.
  • Extensively used Erwin r9.1 for Data modeling. Created Staging and Target Models for teh Enterprise Data Warehouse.
  • Worked on sending data from HDFS (Hive DB) to Greenplum using Sqoop.
  • Developed MapReduce jobs to write User Profiles and Personalized Recommendation data into HBase.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data using Hadoop/Big Data concepts.
  • Worked on AWS Redshift and RDS for implementing models and data on RDS and Redshift.
  • Developed mapping spreadsheets for (ETL) team with source to target data mapping with physical naming standards, data types, volumetric, domain definitions, and corporate meta-data definitions.
  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization and advanced business intelligence.
  • Designed and troubleshot Pentaho Data Integration (Kettle) jobs and transformations.
  • Designed Star Schema and Snowflake Schema on Dimension and Fact Tables.
  • Expertise in Informatica, DB2, Microstrategy and UNIX Shell scripting
  • Worked with the Data Vault Methodology; developed normalized Logical and Physical database models.
  • Designed ER diagrams (Physical and Logical, using Erwin), mapped the data into database objects, identified the Facts and Dimensions from the business requirements, and developed the logical and physical models using Erwin.
  • Analysis, reporting and tracking of defects on a daily basis.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Implemented the Data Vault Modeling concept, which solved the problem of dealing with change in the environment by separating the business keys, and the associations between those business keys, from the descriptive attributes of those keys, using HUB and LINK tables and Satellites.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from VOD and ODOL.
  • Wrote and ran SQL, BI and other reports, analyzing data and creating metrics/dashboards/pivots, etc.
  • Gathered and analyzed business data requirements and modeled these needs, working closely with the users of the information and with application developers and architects to ensure the information models are capable of meeting their needs.
  • Worked along with the ETL team on documentation of transformation rules for data migration from OLTP to the warehouse for reporting purposes.
  • Transformed the Logical Data Model to the Physical Data Model, ensuring the Primary Key and Foreign Key relationships in the PDM, consistency of definitions of Data Attributes, and Primary Index considerations.
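The Data Vault separation described above (business keys in HUBs, associations in LINKs, descriptive attributes in Satellites, with change tracked by load timestamp) can be sketched with in-memory SQLite; all table and column names are illustrative, not from any real model:

```python
import sqlite3

# Minimal Data Vault sketch: HUB holds the business key, the Satellite holds
# descriptive attributes versioned by load_dts, the LINK holds associations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE hub_customer (customer_hk INTEGER PRIMARY KEY,
                               customer_bk TEXT UNIQUE, load_dts TEXT);
    CREATE TABLE hub_order (order_hk INTEGER PRIMARY KEY,
                            order_bk TEXT UNIQUE, load_dts TEXT);
    CREATE TABLE link_customer_order (
        customer_hk INTEGER REFERENCES hub_customer(customer_hk),
        order_hk INTEGER REFERENCES hub_order(order_hk),
        load_dts TEXT);
    CREATE TABLE sat_customer (
        customer_hk INTEGER REFERENCES hub_customer(customer_hk),
        name TEXT, city TEXT, load_dts TEXT);
""")
conn.execute("INSERT INTO hub_customer VALUES (1, 'CUST-001', '2020-01-01')")
conn.execute("INSERT INTO sat_customer VALUES (1, 'Acme', 'Omaha', '2020-01-01')")
# A change in a descriptive attribute becomes a new satellite row, not an update:
conn.execute("INSERT INTO sat_customer VALUES (1, 'Acme', 'Tampa', '2020-06-01')")

# Current view: latest satellite row per hub key.
row = conn.execute("""
    SELECT name, city FROM sat_customer
    WHERE customer_hk = 1 ORDER BY load_dts DESC LIMIT 1
""").fetchone()
print(row)  # ('Acme', 'Tampa')
```

Because the HUB row never changes, history accumulates only in the Satellites, which is what insulates the business keys from change in the environment.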

Environment: Python, MySQL, PostgreSQL, SQL Server, Erwin, Informatica, AWS Redshift, RDS, Big Data, JDBC, NoSQL, Spark, Scala, Star Schema, Snowflake Schema.

Confidential, Bronx, NY

Sr. Data Analyst/Modeler

Responsibilities:

  • Developed teh logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.
  • Reverse engineered DB2 databases and then forward engineered them to Teradata using ER Studio.
  • Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
  • Involved in meetings with SMEs (subject matter experts) for analyzing the multiple sources.
  • Involved in writing and optimizing SQL queries in Teradata.
  • Created DDL scripts using ER Studio and source-to-target mappings to bring the data from the source to the warehouse.
  • Wrote Sybase stored procedures to retrieve data from Facets database application for Web interface member eligibility inquiry and Web interface claim status inquiry.
  • Developed the design & process flow to ensure that the process is repeatable.
  • Performed analysis of the existing source systems (transaction databases).
  • Involved in maintaining and updating the Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys) and physical database (capacity planning, object creation and aggregation strategies) for Oracle and Teradata.
  • Worked on importing and cleansing of data from various sources like Teradata, Oracle, flat files, and MS SQL Server with high-volume data.
  • Designed Logical & Physical Data Models / Metadata / data dictionary using Erwin for both OLTP and OLAP based systems.
  • Identified, assessed and communicated potential risks associated with testing scope, product quality and schedule.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Worked on importing and cleansing of data from various sources like Teradata, Oracle, flat files, and SQL Server 2005 with high-volume data.
  • Worked extensively on ER Studio for multiple Operations across Atlas Copco in both OLAP and OLTP applications.
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
  • Coordinated all teams to centralize Metadata Management updates and follow the standard Naming Standards and Attribute Standards for Data & ETL Jobs.
  • Finalized the naming standards for Data Elements and ETL Jobs and created a Data Dictionary for Metadata Management.
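A naming-standards check of the kind used when finalizing data element standards can be sketched as follows; the snake_case convention, the class-word suffixes, and the sample column names are assumptions for illustration, not an actual project standard:

```python
import re

# Hypothetical standard: lowercase snake_case, each element ending in a
# class word such as _id, _cd, _dt, _amt, or _nm.
STANDARD = re.compile(r"^[a-z][a-z0-9]*(_[a-z0-9]+)*_(id|cd|dt|amt|nm)$")

def check_names(columns):
    """Return the columns that violate the naming standard."""
    return [c for c in columns if not STANDARD.match(c)]

cols = ["customer_id", "ClaimStatus", "eligibility_dt", "member nm", "paid_amt"]
print(check_names(cols))  # ['ClaimStatus', 'member nm']
```

Run against the data dictionary export, a check like this turns a written standard into something enforceable during model review.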

Environment: ER Studio, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio, SQL, SQL Server 2008, SQL Server Analysis Services, SSIS, Crystal Reports 9, Oracle 10g.

Confidential

Data Analyst/Modeler

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Wrote PL/SQL statement, stored procedures and Triggers in DB2 for extracting as well as writing data.
  • Attended and participated in information and requirements gathering sessions
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Validated and updated the appropriate LDMs against process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Created business requirement documents and integrated the requirements with the underlying platform functionality.
  • Excellent knowledge and experience in Technical Design and Documentation.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Involved in preparing the design flow for the DataStage objects to pull the data from various upstream applications, perform the required transformations, and load the data into various downstream applications.
  • Performed logical data modeling and physical data modeling (including reverse engineering) using the Erwin Data Modeling tool.
  • Experience in developing dashboards and client-specific tools in Microsoft Excel and PowerPoint.
  • Responsible for the development and maintenance of Logical and Physical data models, along with corresponding metadata, to support applications.
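The forward-engineering step above (Logical Data Model to physical DDL) can be illustrated with a toy generator; the entity layout, type map, and the sample `member` entity are hypothetical, standing in for what a tool like Erwin or ER Studio produces:

```python
# Toy forward-engineering step: generate DDL from a small logical model.
TYPE_MAP = {"int": "INTEGER", "str": "VARCHAR(255)", "date": "DATE"}

def to_ddl(entity):
    """Render one logical entity as a CREATE TABLE statement."""
    cols = []
    for name, (dtype, is_pk) in entity["attributes"].items():
        col = f"    {name} {TYPE_MAP[dtype]}"
        if is_pk:
            col += " PRIMARY KEY"
        cols.append(col)
    return f"CREATE TABLE {entity['name']} (\n" + ",\n".join(cols) + "\n);"

member = {
    "name": "member",
    "attributes": {
        "member_id": ("int", True),
        "member_nm": ("str", False),
        "birth_dt": ("date", False),
    },
}
print(to_ddl(member))
```

Real modeling tools add target-specific types, constraints, and indexes, but the core mapping from logical attribute to physical column is the same shape.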

Environment: Erwin 8.0, ER Studio 6.0/6.5, Toad 8.6, Informatica 8.0, IBM OS/390 (V6.0), DB2 V7.1, Oracle 9i, PL/SQL, Solaris 9/10, Windows Server 2003 & 2008, NZSQL.
