Sr. Data Architect/data Modeler Resume

Malvern, PA

SUMMARY

  • Over 10+ years of experience as a Data Architect/Modeler and Data Analyst, with high proficiency in requirements gathering and data modeling, including design and support of various applications in OLTP, Data Warehousing, OLAP and ETL environments.
  • Experienced in designing Star schemas (identification of facts, measures and dimensions) and Snowflake schemas for Data Warehouse and ODS architecture, using tools such as Erwin Data Modeler, Power Designer, ER/Studio and Microsoft Visio.
  • Experienced in writing and optimizing SQL queries in Oracle, SQL Server 2014, DB2, Netezza and Teradata.
  • Experienced in metadata design and real-time BI architecture, including Data Governance for greater ROI.
  • Experienced Data Modeler with conceptual, logical and physical data modeling skills, data profiling, data quality maintenance and Teradata 15/14; experienced with JAD sessions for requirements gathering, and with creating data mapping documents, functional specifications and queries.
  • Excellent experience with SQL*Loader, SQL data modeling, reporting and SQL database development; loaded data from legacy systems into Oracle databases using control files and used the Oracle External Tables feature to read data from flat files into Oracle staging tables (a minimal sketch follows this list). Used the Oracle EXPORT/IMPORT utilities to help the DBA migrate databases across Oracle 12c/11g/10g/9i.
  • Extensive knowledge of data modeling for relational and non-relational databases, using industry best-practice techniques and tools such as Erwin, Embarcadero and similar data modeling tools.
  • Experienced in data modeling, covering RDBMS concepts, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling schemas (Star schema, Snowflake modeling, facts and dimensions).
  • Experienced in designing the architecture for modeling a Data Warehouse using tools such as Erwin r9.6/r9.5/r9.1/8.x, Power Designer and ER/Studio.
  • Experienced with databases including Oracle, XML, DB2, Teradata, Netezza, SQL Server, Big Data and NoSQL.
  • Good knowledge of database creation and maintenance of physical data models with Oracle, Teradata, Netezza, DB2 and SQL Server databases.
  • Experienced in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
  • Experienced in SQL with good knowledge of PL/SQL programming; developed stored procedures and triggers; worked with DataStage, DB2, UNIX, Cognos, MDM, Hadoop and Pig.
  • Experienced in Data Modeling including Data Validation/Scrubbing and Operational assumptions.
  • Very good knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and Identifying Data Mismatch.
  • Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Experienced in Normalization and De-normalization processes, and in logical and physical data modeling techniques.
  • Experienced in database performance tuning and data access optimization, writing complex SQL queries and PL/SQL blocks such as stored procedures, functions, triggers, cursors and ETL packages.
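A minimal sketch of the external-tables approach described above; the directory, table, column and file names are hypothetical, not taken from any engagement:

    -- Directory object pointing at the folder holding the legacy flat files
    CREATE OR REPLACE DIRECTORY legacy_dir AS '/data/legacy/extracts';

    -- External table that reads the flat file in place
    CREATE TABLE stg_customer_ext (
        customer_id   NUMBER,
        customer_name VARCHAR2(100),
        created_date  VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY legacy_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('customers.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Copy the rows into a regular Oracle staging table, converting the date string
    INSERT INTO stg_customer (customer_id, customer_name, created_date)
    SELECT customer_id, customer_name, TO_DATE(created_date, 'YYYY-MM-DD')
    FROM   stg_customer_ext;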

TECHNICAL SKILLS

Data Modeling Tools: Erwin r9.6/r9.5/r9.1/8.x, ER/Studio and Oracle Designer.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

Oracle: Oracle 12c/11g/10g/9i (R2) database servers with RAC, ASM, Data Guard, Grid Control, Oracle GoldenGate, Oracle Enterprise Manager, SQL*Net, SQL*Loader, SQL*Plus, AWR, ASH, ADDM and Explain Plan.

ETL Tools: SSIS, Pentaho, Informatica 9.6.

Programming Languages: Java, Base SAS and SAS/SQL, SQL, T-SQL, HTML, JavaScript, CSS, UNIX shell scripting, PL/SQL.

Database Tools: Oracle, Teradata, Netezza, Microsoft SQL Server 2014/2012/2008/2005, MS Access and PostgreSQL.

Web technologies: HTML, DHTML, XML, JavaScript

Reporting Tools: Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 95/98/NT/2000/XP/Vista/7 and UNIX.

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.

Other Tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio and MS Office; have also worked with C++, UNIX and PL/SQL.

PROFESSIONAL EXPERIENCE

Sr. Data Architect/Data Modeler

Confidential, Malvern, PA

Responsibilities:

  • Working with business and IT teams to develop and maintain mappings and rules between business applications, business data and interfaces; setting standards for data management (master and transaction), analyzing the current state and conceiving the desired future state.
  • Provides architectural leadership in shaping strategic business technology projects, with an emphasis on application architecture. Utilizes domain knowledge and application portfolio knowledge to play a key role in defining the future state of large business technology programs.
  • Creates ecosystem models (e.g. conceptual, logical, physical, canonical) that are required for supporting services within the enterprise data architecture (conceptual data model for defining the major subject areas used, ecosystem logical model for defining standard business meaning for entities and fields, and an ecosystem canonical model for defining the standard messages and formats to be used in data integration services throughout the ecosystem).
  • Demonstrated experience in design and implementation of an enterprise data model, metadata solution and data life cycle management in both RDBMS, Big Data environments.
  • Identifies user requirements by researching and analyzing user needs, preferences, objectives, and working methods and studies how users consume content, including data categorization and labeling.
  • Created logical/physical data models using Erwin for new requirements and existing databases, maintained database standards, provided architectural guidance for various data design/integration/migration efforts and analyzed data in different systems.
  • Applies architectural and technology concepts to address scalability, security, reliability, maintainability and sharing of enterprise data.
  • Involved in logical modeling using dimensional modeling techniques such as Star schema and Snowflake schema.
  • Worked on database design, relational integrity constraints, OLAP, OLTP, cubes, and Normalization (3NF) and De-normalization of databases.
  • Worked with various Teradata 15 tools and utilities such as Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator, BTEQ and other Teradata utilities.
  • Extensively used the Aginity Netezza workbench to perform various DML and DDL operations on the Netezza database (an illustrative DDL sketch follows this list).
  • Developed logical and physical models for schemas and created standard data documentation such as ERDs and flow diagrams.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
  • Developed Linux shell scripts using the NZSQL/NZLOAD utilities to load data from flat files into the Netezza database.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
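An illustrative sketch of the kind of Netezza DDL referenced above; the table and column names are hypothetical:

    -- Fact table distributed on the join key to spread rows evenly across SPUs
    CREATE TABLE sales_fact (
        sale_id      BIGINT        NOT NULL,
        customer_id  INTEGER       NOT NULL,
        product_id   INTEGER       NOT NULL,
        sale_date    DATE          NOT NULL,
        sale_amount  NUMERIC(12,2)
    )
    DISTRIBUTE ON (customer_id);

    -- Refresh optimizer statistics after a bulk load
    GENERATE STATISTICS ON sales_fact;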

Environment: Erwin r9.6, Normalization and De-normalization, UNIX, Teradata SQL Assistant, MDM/Activators, Netezza, Aginity, Star and Snowflake schema DDL, PL/SQL, ETL, DB2, associated data marts, data stores, Oracle 12c and Teradata 15.

Sr. Data Architect/Data modeler

Confidential, Radnor, PA

Responsibilities:

  • Work with data modeling and repository toolsets to support the development of data models and related work products, including reports and briefings.
  • Collaborate with business and technical subject matter experts to develop graphical representations of data and work flows from a business and system level perspective.
  • Accountable for defining database physical structure, functional capabilities, and security, backup, and recovery specifications and for the installation of database systems.
  • Maintain database performance through the resolution of application development and production issues.
  • Presented the data scenarios via Erwin 9.5 logical models and Excel mockups to visualize the data better.
  • Loaded data directly from Oracle to Netezza without any intermediate files.
  • Designed the layout for Extraction, Data Pump and Replication parameters for data replication/data distribution.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Worked on importing and cleansing of data from various sources such as Teradata, Oracle 12c, Netezza, flat files and SQL Server, with high-volume data.
  • Worked on Data Warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, Star schema, Snowflake schema, fact tables and dimension tables.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Performed data analysis tasks on warehouses fed from several sources such as Oracle 11g, Teradata, DB2 and XML, and generated various reports and documents.
  • Worked with ETL developers on the design and development of Extract, Transform and Load processes for data integration projects to build data marts.
  • Performed data analysis and statistical analysis, and generated reports, listings and graphs using SAS tools: SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL, SAS/Connect and SAS/Access.
  • Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI) / PL/SQL) to migrate data from different sources into the target Oracle Data Warehouse.
  • Used external loaders such as MultiLoad, TPump and FastLoad to load data into the Teradata 14.1 database.
  • Involved in troubleshooting and quality control of data transformations and loading during migration from Oracle systems into the Netezza EDW.
  • Strong knowledge of data modeling concepts including Star schema/Snowflake modeling, fact and dimension tables, and logical and physical data modeling (a generic star-schema sketch follows this list).
  • Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
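A generic, ANSI-style sketch of a star-schema layout of the kind referenced above; table and column names are hypothetical:

    -- Dimension table: one row per customer, keyed by a surrogate key
    CREATE TABLE dim_customer (
        customer_key   INTEGER PRIMARY KEY,
        customer_id    VARCHAR(20),
        customer_name  VARCHAR(100),
        region         VARCHAR(50)
    );

    -- Fact table: grain = one row per sale line; measures plus foreign keys to dimensions
    CREATE TABLE fact_sales (
        date_key       INTEGER NOT NULL,
        customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        product_key    INTEGER NOT NULL,
        quantity       INTEGER,
        sales_amount   DECIMAL(12,2)
    );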

Environment: Erwin 9.5, PL/SQL, ETL, Netezza, MySQL, Oracle 10g, DB2, logical/physical data models, ER/Studio, OLTP/OLAP, UNIX, Star schema and Snowflake schema, Teradata 14.1, data architecture, data profiling, data analysis and data mapping.

Sr. Data Modeler/Data Analyst

Confidential - Cincinnati, OH

Responsibilities:

  • Worked with data compliance and data governance teams to maintain data models, metadata and data dictionaries, and to define source fields and their definitions.
  • Performed source system analysis, database design and data modeling for the warehouse layer using MLDM concepts, and for the package layer using dimensional modeling.
  • Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Transformed the logical data model into an Erwin physical data model, ensuring primary key and foreign key relationships in the PDM, consistency of data attribute definitions and primary index considerations.
  • Extensively developed Oracle 10g stored packages, procedures, functions and database triggers using PL/SQL for ETL processing, data handling, logging, archiving and Oracle back-end validations for batch processes (a minimal logging sketch follows this list).
  • Used Netezza SQL, Stored Procedures, and NZload utilities as part of the DWH appliance framework.
  • Worked with the UNIX team and installed the TIDAL job scheduler on the QA and production Netezza environments.
  • Worked with DBAs to create a best-fit physical data model from the logical data model.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Worked on development and maintenance using Oracle SQL, PL/SQL, SQL*Loader and Informatica PowerCenter 9.1.
  • Worked on the Netezza database, creating tables, sequences, synonyms, joins, functions and operators.
  • Created and implemented an MDM data model for Consumer/Provider for a healthcare MDM product from Variant.
  • Used Erwin 9.1 for effective model management, sharing, dividing and reusing model information and designs for productivity improvement.
  • Coordinated with data architects and data modelers to create new schemas and views in Netezza to improve report execution time, and worked on creating optimized data mart reports.
  • Extensively used SQL*Loader to load data from legacy systems into Oracle databases using control files, and used the Oracle External Tables feature to read data from flat files into Oracle staging tables.
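A minimal sketch of the PL/SQL logging pattern mentioned above; the table, sequence and procedure names are hypothetical:

    -- Sequence and audit table for batch/ETL runs
    CREATE SEQUENCE etl_run_seq;

    CREATE TABLE etl_run_log (
        run_id      NUMBER,
        job_name    VARCHAR2(100),
        status      VARCHAR2(20),
        rows_loaded NUMBER,
        run_date    DATE DEFAULT SYSDATE
    );

    CREATE OR REPLACE PROCEDURE log_etl_run (
        p_job_name    IN VARCHAR2,
        p_status      IN VARCHAR2,
        p_rows_loaded IN NUMBER
    ) AS
        PRAGMA AUTONOMOUS_TRANSACTION;   -- the log entry persists even if the main load rolls back
    BEGIN
        INSERT INTO etl_run_log (run_id, job_name, status, rows_loaded)
        VALUES (etl_run_seq.NEXTVAL, p_job_name, p_status, p_rows_loaded);
        COMMIT;
    END log_etl_run;
    /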

Environment: Erwin 9.1, Teradata, Oracle 10g, PL/SQL, UNIX, TIDAL, Normalization and De-normalization, Informatica PowerCenter, MDM, SQL Server, Netezza, DB2, Star schema and Snowflake schema, Aginity, Data Architecture, SAS/Graph, SAS/SQL, SAS/Connect and SAS/Access.

Sr. Data Modeler/Data Analyst

Confidential, Jersey City, NJ

Responsibilities:

  • Designed ER diagrams (physical and logical, using Erwin), mapped the data into database objects and produced logical/physical data models.
  • Performed data analysis, data migration and data profiling using complex SQL on various source systems including Oracle and Teradata 13.1 (a minimal profiling query follows this list).
  • Involved in Normalization/De-normalization, normal forms and database design methodology; expertise in using data modeling tools such as MS Visio and Erwin for logical and physical database design.
  • Involved in dimensional modeling (Star schema) of the data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Used Base SAS, SAS/Macro, SAS/SQL and Excel extensively to develop code and generate various analytical reports.
  • Used the Model Manager option in Erwin to synchronize the data models through the Model Mart approach.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Involved in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, SFDC, Netezza, SQL Server, COBOL, XML and flat files.
  • Worked on importing and cleansing of data from various sources such as Teradata, Oracle, flat files and SQL Server 2008, with high-volume data.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Developed code as per the client's requirements using SQL, PL/SQL and data warehousing concepts.
  • Involved in several facets of MDM implementations, including data profiling, metadata acquisition and data migration.
  • Involved in migrating the current Optum Rx Data Warehouse from the iSeries database environment to a Netezza appliance.
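A minimal sketch of the kind of profiling query used in this analysis; the table and column names are hypothetical:

    -- Basic column profile: row count, null rate and cardinality for a candidate key
    SELECT COUNT(*)                    AS total_rows,
           COUNT(member_id)            AS non_null_member_id,
           COUNT(*) - COUNT(member_id) AS null_member_id,
           COUNT(DISTINCT member_id)   AS distinct_member_id,
           MIN(enroll_date)            AS min_enroll_date,
           MAX(enroll_date)            AS max_enroll_date
    FROM   src_member;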

Environment: Erwin, data modeling, Star schema, Informatica, Teradata 13.1, PL/SQL, BTEQ, DB2, Oracle, SQL, Netezza, Normalization and De-normalization, conceptual/logical/extended logical models, ETL, UNIX, Microsoft SQL Server, TOAD.

Data Analyst/Modeler

Confidential, Anaheim, CA

Responsibilities:

  • Developed SQL and PL/SQL scripts for migration of data between databases (a minimal sketch follows this list).
  • Involved in requirements analysis, ETL design and development for extracting data from source systems such as DB2, Sybase, Oracle and flat files, and loading it into Netezza.
  • Worked on the MySQL database, writing simple queries and stored procedures for Normalization/De-normalization.
  • Developed logical and physical data models that capture current-state and future-state data elements and data flows using Erwin and Star schema design.
  • Generated customized reports using the SAS/MACRO facility, PROC REPORT, PROC TABULATE and PROC SQL.
  • Involved in building mapping spreadsheets that provide the Data Warehouse development (ETL) team with source-to-target data mappings, inclusive of logical names, physical names, data types, domain definitions and corporate metadata definitions.
  • Converted logical models into physical database models to build/generate DDL scripts.
  • Created mapping documents and mappings in the MDM Hub ORS using various cleanse lists and cleanse functions.
  • Strong knowledge of databases such as Oracle, Microsoft SQL Server, DB2 and Netezza.
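A minimal sketch of the kind of PL/SQL migration script mentioned above; the source table, target table and database link are hypothetical:

    -- Copy rows from a source table (over a database link) into a target staging table in batches
    DECLARE
        CURSOR c_src IS
            SELECT customer_id, customer_name, created_date
            FROM   customers@legacy_db;                       -- hypothetical database link
        TYPE t_rows IS TABLE OF c_src%ROWTYPE;
        l_rows t_rows;
    BEGIN
        OPEN c_src;
        LOOP
            FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches of 1000 rows
            EXIT WHEN l_rows.COUNT = 0;
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO customers_stg VALUES l_rows(i);
            COMMIT;                                           -- commit each batch
        END LOOP;
        CLOSE c_src;
    END;
    /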

Environment: Erwin, Informatica ETL, Teradata SQL, PL/SQL, Netezza, DB2, Oracle, Normalization/De-normalization, Star schema and Snowflake schema, TOAD, ETL, metadata, MySQL, Sybase.
