
Data Architect/Data Modeler Resume


Baltimore, MD

SUMMARY:

  • Over 10 years of experience working as a Data Architect/Data Analyst/Data Modeler with emphasis on Data Mapping and Data Validation in Data Warehousing environments.
  • Solid, in-depth understanding of information security concepts, data modeling, and RDBMS concepts.
  • Experienced in all phases of the SDLC: analysis, design, coding, unit testing, system testing, and user acceptance testing.
  • Extensive experience in development of T-SQL, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Responsible for architecture design, data modeling, and implementation of Big Data platforms and analytic applications.
  • Hands-on in Hadoop, MemSQL, Hive, HBase, Sqoop, Spark Streaming, Spark SQL and Impala
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.
  • Proficient in handling complex processes using SAS/ Base, SAS/ SQL, SAS/ STAT SAS/Graph, Merge, Join and Set statements, SAS/ ODS.
  • Experience in creating audit control systems for ETL processes for Big Data and Data Warehouse applications.
  • Experience in designing online transactional processing (OLTP), operational data store (ODS), and decision support system (DSS) (e.g., Data Warehouse) databases, utilizing Data Vault (hub and spoke), dimensional, and normalized data designs as appropriate for enterprise-wide solutions.
  • Experience in developing MapReduce programs using Apache Hadoop for analyzing big data as per the requirement.
  • Experience in handling huge volumes of data in/out of Teradata/Big Data.
  • Worked with and extracted data from various database sources such as Oracle, SQL Server, DB2, and Teradata.
  • Hands on experience with modeling using ERWIN in both forward and reverse engineering cases.
  • Extensive experience with business intelligence (BI) tools such as OLAP, data warehousing, reporting and querying tools, data mining, and spreadsheets.
  • Implemented Ralph Kimball's data warehouse methodologies (Star schema & Dimensional Modeling) and aware of Data Vault.
  • Experienced in the Big Data Hadoop ecosystem including MapReduce, MapReduce 2, YARN, Flume, Sqoop, Hive, Apache Spark, and Scala.
  • Created DDL scripts for implementing data modeling changes. Created Erwin crystal reports in HTML and RTF formats depending upon the requirement, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Solid experience in the System Development Life Cycle (SDLC), as per Rational Unified Process (RUP), Agile & Waterfall methodologies.
  • Good understanding and exposure to SQL queries and PL/SQL stored procedures, triggers, functions and packages.
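The MapReduce experience above follows the classic map/shuffle/reduce pattern; a minimal, framework-free Python sketch of that pattern (a word count over hypothetical input, not tied to any project listed here):

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for each word in an input line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all counts for a single key.
    return word, sum(counts)

def run_wordcount(lines):
    # Shuffle phase: group mapper output by key before reducing.
    groups = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            groups[word].append(count)
    return dict(reducer(w, c) for w, c in groups.items())

print(run_wordcount(["big data big insights", "big wins"]))
# {'big': 3, 'data': 1, 'insights': 1, 'wins': 1}
```

In Hadoop the shuffle/group step is handled by the framework between the map and reduce tasks; this sketch only makes that flow visible.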

TECHNICAL SKILLS:

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED

Databases: Teradata R12/R13/R14.10, MS SQL Server, DB2, Netezza

Tools: MS-Office suite (Word, Excel, MS Project and Outlook), VSS

Testing and defect tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe

Operating System: Windows, UNIX, Sun Solaris

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

ETL/Data warehouse Tools: Informatica, SAP Business Objects XIR3.1/XIR2, Web Intelligence, Talend, Tableau, Pentaho

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant.

PROFESSIONAL EXPERIENCE:

Confidential, Baltimore, MD

Data Architect/Data Modeler

Responsibilities:

  • Created and maintained Logical and Physical models for the data mart and created partitions and indexes for the tables in the data mart.
  • Performed data profiling and analysis, applied various data cleansing rules, designed data standards, and architected/designed the relational models.
  • Created new data designs and ensured that they fall within the realm of the overall Enterprise BI Architecture.
  • Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
  • Created the logical data model from the conceptual model and converted it into the physical database design using Erwin 9.6.
  • Worked extensively on Big Data; used DataStage, an ETL tool, to design mappings to move data from source to target databases using Stages.
  • Maintained meta data (data definitions of table structures) and version controlling for the data model.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Built Cloud architecture on Amazon AWS.
  • Responsible for the development of target data architecture, design principals, quality control, and data standards for the organization
  • Worked with DBAs to create a best-fit physical data model from the logical data model using forward engineering in Erwin.
  • Performed ELT using Talend Big Data to process the data; transformations included joins, reformats, sorts, and if-else conditional transforms.
  • Developed and maintained data models, data dictionaries, data maps, and other artifacts across the organization, including the conceptual and physical models as well as the metadata repository.
  • Defined and implemented data quality maintenance processes
  • Developed SQL scripts for creating tables, Sequences, Triggers, views and materialized views.
  • Conducted performance analysis and created partitions, indexes and Aggregate tables.
  • Utilized Erwin's forward/reverse engineering tools and target database schema conversion process.
  • Good working knowledge in normalizing tables up to 1NF, 2NF & 3NF.
  • Performed data profiling and data analysis to identify data gaps and become familiar with new source system data.
  • Developed test plans and test cases for QA Unit Testing, System Testing and Enterprise testing.
  • Analyzed the existing source system with the help of data profiling and source system data models, thus creating individual data models for various domains/subject areas for the proposed data warehouse solution.
  • Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
  • Proposed the EDW data design to centralize the data scattered across multiple datasets.
  • Worked on the development of a Data Warehouse and Business Intelligence architecture that involved data integration and the conversion of data from multiple sources and platforms.
  • Followed the Type 2 slowly changing dimension methodology for designing and maintaining historical data.
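The Type 2 dimension approach above keeps history by expiring the current row and inserting a new version rather than overwriting it; a minimal illustration using Python's built-in sqlite3 (the schema and values are hypothetical, not project data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id  INTEGER,
        city         TEXT,
        valid_from   TEXT,
        valid_to     TEXT,
        is_current   INTEGER
    )""")
conn.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
             "VALUES (101, 'Baltimore', '2015-01-01', '9999-12-31', 1)")

# Customer 101 moves: expire the current row, then insert the new version.
conn.execute("UPDATE dim_customer SET valid_to = '2016-06-30', is_current = 0 "
             "WHERE customer_id = 101 AND is_current = 1")
conn.execute("INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
             "VALUES (101, 'Charlotte', '2016-07-01', '9999-12-31', 1)")

# Both versions survive, preserving the attribute's history.
rows = conn.execute("SELECT city, is_current FROM dim_customer "
                    "WHERE customer_id = 101 ORDER BY valid_from").fetchall()
print(rows)  # [('Baltimore', 0), ('Charlotte', 1)]
```

The surrogate key (customer_key) rather than the natural key (customer_id) is what fact tables join on, so old facts keep pointing at the version of the customer that was current when they occurred.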

Environment: Oracle 12c, SQL Plus, Erwin 9.6, MS Visio, Source Offsite (SOS), Windows XP, QC Explorer, Share point workspace, Metadata, Datastage, SQL, PL/SQL, Business Objects XI3.5, COBOL, Quick Data.

Confidential, Charlotte, NC

Sr. Data Architect/Data Modeler

Responsibilities:

  • Played key role in defining all aspects of Data Governance - data architecture, data security, master data management, data archival & purging and meta data.
  • Created a Big Data framework on Hadoop.
  • Defined and modeled financial logical and physical data models.
  • Analyzed the associated requirements for the data management initiative.
  • Used Teradata OLAP functions like RANK, ROW_NUMBER, QUALIFY, CSUM and SAMPLE.
  • Performed Data analysis to ensure Data quality and the appropriate data needed for a given initiative
  • Delivered the full life cycle of a Data Lake and Data Warehouse with Big Data technologies like Spark, Hadoop, and Cassandra.
  • Conducted source to target mapping analysis
  • Analyzed database detailed requirements with the data stewards.
  • Used Erwin to develop logical and physical models for the data warehouse (though FSLDM was not implemented at this organization, a 3NF model was developed drawing on prior knowledge of FSLDM).
  • Identified required facts and dimensions, created dimensional model for the reporting system (Operations Performance) using Erwin.
  • Collaborated with DBAs, Business Analysts, and Data Stewards; conducted design review sessions to validate the developed models and logical mappings.
  • Developed and maintained Data Dictionary to create Metadata Reports for technical and business purpose
  • Responsible for different Data mapping activities from Source systems to Teradata.
  • Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation.
  • Point of contact for collecting and documenting, business and technical metadata.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT)
  • Integrated the packaging tasks with relevant teams for seamless transition from testing to implementation
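The Teradata OLAP functions mentioned above (RANK, ROW_NUMBER, QUALIFY, CSUM) belong to Teradata's ordered analytical syntax; QUALIFY has no ANSI equivalent but can be rewritten as a windowed subquery. A sketch of that rewrite, runnable in SQLite for illustration (requires SQLite 3.25+ for window functions; the sample data is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "Ann", 500), ("East", "Bob", 700),
    ("West", "Cam", 300), ("West", "Dee", 900),
])

# Top rep per region. In Teradata this would be a single SELECT ending in:
#   QUALIFY RANK() OVER (PARTITION BY region ORDER BY amount DESC) = 1
# The ANSI rewrite pushes the window function into a subquery and filters it.
top = conn.execute("""
    SELECT region, rep, amount FROM (
        SELECT region, rep, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM sales
    ) WHERE rnk = 1 ORDER BY region
""").fetchall()
print(top)  # [('East', 'Bob', 700), ('West', 'Dee', 900)]
```

CSUM similarly maps to the ANSI `SUM(...) OVER (ORDER BY ...)` running-total form.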

Environment: Informatica, Teradata 14, Erwin, Linux, Mac OS, UNIX, Oracle 11g, Big Data, Hadoop, Windows Server 2015.

Confidential - Dayton, OH

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Participated in all phases including Analysis, Design, Coding, Testing and Documentation. Gathered Requirements and performed Business Analysis.
  • Extensively used SQL for Data Analysis.
  • Involved in Reverse engineering on existing Data model to understand the data flow and business flows.
  • Experienced in developing Entity-Relationship diagrams and modeling Transactional Databases and Data Warehouse using ER/Studio and Power Designer.
  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys) and the physical database (capacity planning, object creation and aggregation strategies) as per business requirements.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Designed the high-level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
  • Involved in designing the DB2 process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Conducted team meetings and proposed ETL strategy.
  • Facilitated meetings with the business and technical team to gather necessary analytical data requirements.
  • Involved in reverse engineering the existing Stored Procedures and writing Mapping Documents for them.
  • Extensively worked on documentation of Data Model, Mapping Transformation and Scheduling jobs.
  • Worked extensively with Business Objects XI Report Developers in solving critical issues of defining hierarchy, loops and Contexts.
  • Deployed naming standard to the Data Model and followed company standard for Project Documentation.
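A source-to-target mapping document of the kind described above essentially pairs each source column with a target column and a transformation rule; a hedged Python sketch of applying one in an ETL step (the column names and rules are hypothetical, not from any project here):

```python
# Each entry: source column -> (target column, transformation rule).
mapping = {
    "CUST_NM":  ("customer_name", str.strip),
    "CUST_DOB": ("birth_date",    lambda v: v.replace("/", "-")),
    "BAL_AMT":  ("balance",       float),
}

def transform(source_row):
    # Apply the mapping: rename each source column and run its rule.
    return {target: rule(source_row[src])
            for src, (target, rule) in mapping.items()}

row = {"CUST_NM": "  Jane Doe ", "CUST_DOB": "1980/04/02", "BAL_AMT": "1200.50"}
print(transform(row))
# {'customer_name': 'Jane Doe', 'birth_date': '1980-04-02', 'balance': 1200.5}
```

In practice the mapping table itself is what the spreadsheet deliverable captures (logical names, physical names, data types, domain definitions), and the ETL tool generates the equivalent of `transform` from it.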

Environment: DB2, SSIS/SSAS/SSRS, ER/Studio, ETL, PL/SQL, SAP BO, MS Excel, MS VISIO

Confidential - Denver, CO

Data Modeler/Data Analyst

Responsibilities:

  • Performed Data Analysis and profiling of source data to better understand the sources.
  • Performed Data Modeling like Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, and FACT and Dimension tables using Erwin and Analysis Services.
  • Managed, updated and manipulated report orientation and structures with the use of advanced Excel functions including Pivot Tables and VLOOKUPs.
  • Used data cleansing techniques, Excel pivot tables, formulas, and charts.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Analysis included data reports, generalized reports, SQL queries across Teradata, Netezza, and DB2 platforms, and variable distribution reports for clients as well as vendors.
  • Involved in mapping spreadsheets that provided the Data Warehouse Development (ETL) team with source-to-target data mapping, inclusive of logical names, physical names, data types, domain definitions, and corporate metadata definitions.
  • Extensively used SQL for Data Analysis and to understand the data behavior.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Created Schema objects like Indexes, Views, and Sequences, triggers, grants, roles, Snapshots.
  • Designed the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units in tables.
  • Developed Star Schema and Snowflake Schema in designing the Logical Model into Dimensional Model.
  • Conducted several Physical Data Model training sessions with the ETL Developers. Worked with them on day-to-day basis to resolve any questions on Physical Model.
  • Extensively worked on documentation of Data Model, Mapping, Transformations and Scheduling jobs.
  • Involved in creating ETL mapping documents for data warehouse projects.

Environment: Teradata, Erwin, SAS, Quality Center, ETL, MS Excel 2007, PL/SQL, Business Objects XIR2, Teradata R12, Teradata SQL Assistant.

Confidential - Overland Park, KS

Data Analyst

Responsibilities:

  • Collaborated with Data Modelers and ETL developers in creating the Data Functional Design documents.
  • Performed Data analysis and Data profiling using complex SQL on various sources systems including Oracle.
  • Create various Data Mapping Repository documents as part of Metadata services (EMR).
  • Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
  • Provide inputs to development team in performing extraction, transformation and load for data marts and data warehouses.
  • Documented the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Documented various data quality mapping documents, and audit and security compliance adherence.
  • Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varying data sources.
  • Communicated the results of analyses in a clear and effective manner.
  • Applied an understanding of basic statistical calculations.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any change control in requirements leads to test case updates.
  • Interact with Business System Analysts and Software Developers to transform business requirements and application requirements into appropriate data model solutions
  • Work with the business and the ETL developers in the analysis and resolution of data related problem tickets.
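The data profiling described above typically starts with row, null, and distinct counts per column before deeper rule-based checks; a minimal sqlite3 sketch (hypothetical table and data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", [
    (1, "a@x.com"), (2, None), (3, "a@x.com"), (4, "b@x.com"),
])

# COUNT(col) skips NULLs, so COUNT(*) - COUNT(col) is the null count;
# COUNT(DISTINCT col) measures cardinality.
profile = conn.execute("""
    SELECT COUNT(*)                AS row_count,
           COUNT(*) - COUNT(email) AS null_emails,
           COUNT(DISTINCT email)   AS distinct_emails
    FROM customers
""").fetchone()
print(profile)  # (4, 1, 2)
```

Profiling output like this (completeness and cardinality per column) is what feeds the data gap analysis and data quality mapping documents mentioned above.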

Environment: Talend, Oracle9i, ETL, MS Excel, MS Access, UNIX, Windows XP, SQL, PL/SQL, Talend Data Quality, SQL scripts, Power Designer, VBA.
