
Data Architect / Data Modeler Resume

Minneapolis, MN

SUMMARY

  • More than 11 years of experience in Data Architecture, Data Modeling (both dimensional and relational models), Data Analysis, Data Warehousing and Database Management. Areas of specialization include Database Design, Data Federation, Data Integration (ETL), Metadata/Semantic Layers, Static and OLAP/Cube Reporting, and Testing.
  • Experienced in designing star schemas (identification of facts, measures and dimensions) and snowflake schemas for Data Warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, ER/Studio and Microsoft Visio (a minimal schema sketch follows this summary).
  • Extensive experience in normalization (1NF, 2NF, 3NF and BCNF) and de-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
  • Experienced with Teradata Administrator, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, FastExport, PMON, Visual Explain, TPT and OleLoad.
  • Experienced in dimensional and relational data modeling using ER/Studio, Erwin and Sybase PowerDesigner, including star join schema/snowflake modeling, fact and dimension tables, and conceptual, logical and physical data models.
  • Excellent knowledge of the Ralph Kimball and Bill Inmon approaches to Data Warehousing.
  • Expertise in database performance tuning using Oracle hints, EXPLAIN PLAN, TKPROF, partitioning and indexes.
  • Solid in-depth understanding of information security, data modeling and RDBMS concepts.
  • Experienced in designing and developing data models for OLTP databases, the Operational Data Store (ODS), the Data Warehouse (OLAP), and federated databases to support the client's enterprise Information Management strategy.
  • Experienced in integrating various relational and non-relational sources such as DB2, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML and flat files into a Netezza database.
  • Experienced in designing DB2 architecture for Data Warehouse modeling using tools such as Erwin 9.6.1, PowerDesigner and ER/Studio.
  • Good knowledge of OLAP, OLTP, Business Intelligence and Data Warehousing concepts with emphasis on ETL and business reporting needs.
  • Proficient in Oracle tools and utilities such as TOAD, SQL*Plus and SQL Navigator.
  • Experienced in building Logical Data Models (LDM) and Physical Data Models (PDM) using the Erwin data modeling tool.
  • Worked with Tableau 9.1.2 and Tableau Server 9.1.1; created and published workbooks on Tableau Server.
  • Excellent understanding of and working experience with industry-standard methodologies such as the Software Development Life Cycle (SDLC), Rational Unified Process (RUP), Agile and Waterfall.
  • Experienced in data masking using various tools for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications.
  • Strong analytical and problem-solving skills, excellent communication and presentation skills, and a good team player.
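
A minimal, hypothetical star schema sketch illustrating the fact/dimension design work summarized above; all table and column names (FACT_SALES, DIM_DATE, DIM_PRODUCT) are illustrative placeholders rather than artifacts of any client engagement:

    CREATE TABLE DIM_DATE (
        DATE_KEY      NUMBER(8)     PRIMARY KEY,   -- surrogate key, e.g. 20240131
        CALENDAR_DATE DATE          NOT NULL,
        FISCAL_MONTH  VARCHAR2(20)
    );

    CREATE TABLE DIM_PRODUCT (
        PRODUCT_KEY   NUMBER(10)    PRIMARY KEY,   -- surrogate key
        PRODUCT_CODE  VARCHAR2(30)  NOT NULL,      -- natural/business key
        PRODUCT_NAME  VARCHAR2(100)
    );

    -- Fact table: one row per product per day; measures are additive
    CREATE TABLE FACT_SALES (
        DATE_KEY      NUMBER(8)     NOT NULL REFERENCES DIM_DATE (DATE_KEY),
        PRODUCT_KEY   NUMBER(10)    NOT NULL REFERENCES DIM_PRODUCT (PRODUCT_KEY),
        SALES_AMOUNT  NUMBER(12,2),
        UNITS_SOLD    NUMBER(10),
        CONSTRAINT PK_FACT_SALES PRIMARY KEY (DATE_KEY, PRODUCT_KEY)
    );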

TECHNICAL SKILLS

Databases: Microsoft SQL Server 2012/2014/2016, Teradata 15.0, Oracle 12c/11g/10g, Netezza, DB2 and NoSQL databases

Data Modeling Tools: Erwin r7.3/r8.6/r9.6.1, IBM InfoSphere Data Architect, ER/Studio and PowerDesigner

BI Tools: Tableau, Tableau Server, Tableau Reader, SAP BusinessObjects

Programming Languages: Oracle PL/SQL, UNIX Shell Scripting

Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX.

Tools & Utilities: TOAD 9.6, Microsoft Visio 2010.

Version Control Tools: VSS, SVN, CVS, TFS

Development Tools: PL/SQL Developer, TOAD, SQL Developer

Scheduling: Control-M, Cron job

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

Other: Performance Tuning, High-Volume Data Management

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

Data Architect/ Data Modeler

Responsibilities:

  • Designed data architecture for the Intelligence Creation and Distribution data store and information flow.
  • Involved in planning, defining and designing the database using Erwin based on business requirements, and provided documentation.
  • Involved in relational and dimensional data modeling to create the logical and physical database design and ER diagrams, with all related entities and relationships defined according to the rules provided by the business manager, using Erwin r9.6.
  • Created ER diagrams using the Open Sphere model and implemented concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables.
  • Worked on databases using Oracle, XML, DB2, Teradata 14, Netezza, SQL Server, Big Data and NoSQL technologies.
  • Worked on normalization and de-normalization concepts and on design methodologies such as the Ralph Kimball and Bill Inmon data warehouse approaches.
  • Executed Hive queries on Parquet tables to perform data analysis that met business requirements, and imported and exported data between Oracle/DB2 and HDFS/Hive using Sqoop (a minimal Hive query sketch follows this list).
  • Performed data migration from an RDBMS to a NoSQL database and provided an overall picture of the data deployed across the various data systems.
  • Attained good knowledge of Hadoop Data Lake implementation and Hadoop architecture for client business data management.
  • Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, Data Warehouse, data provisioning, ETL and BI.
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems.
  • Analyzed source systems for data acquisition and architected data federation logic across the different source systems.
  • Worked with Data Lake, Spark and AWS Redshift.
  • Designed the Physical Data Model using the Erwin r9.6 data modeling tool and managed metadata for the data models.
  • Coordinated with Data Architects and Data Modelers to create new schemas and views in Netezza to improve report execution time, and worked on creating optimized Data Mart reports.
  • Created logical and physical data models using Erwin 9.6 for new requirements and existing databases; maintained database standards and data design; and handled integration, migration and analysis of data across different systems.
  • Gathered and analyzed existing physical data models for in-scope applications and proposed changes to the data models according to the requirements.
  • Worked with ETL team members to implement data acquisition logic and resolve data defects.
  • Used the Agile Scrum methodology across the different phases of the software development life cycle.
  • Used Tableau and Power BI to create dashboards and participated in selecting the right tool for dashboards and analytics.
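
As noted in the Hive/Sqoop bullet above, a minimal HiveQL sketch of analysis over a Parquet-backed table; the table names (sales_raw, sales_daily) and their columns are hypothetical placeholders:

    -- Hypothetical Parquet-backed target table in Hive
    CREATE TABLE IF NOT EXISTS sales_daily (
        product_id   STRING,
        total_amount DOUBLE,
        txn_count    BIGINT
    )
    PARTITIONED BY (txn_date STRING)
    STORED AS PARQUET;

    -- Aggregate raw transactions into the partitioned table
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;

    INSERT OVERWRITE TABLE sales_daily PARTITION (txn_date)
    SELECT product_id,
           SUM(amount) AS total_amount,
           COUNT(*)    AS txn_count,
           txn_date
    FROM   sales_raw
    GROUP BY product_id, txn_date;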

Environment: Erwin r9.6, Netezza, Oracle 12c, Teradata 15, T-SQL, SQL Server 2016, DB2, SSIS, SSRS, R, SAS, HTML, Python, JavaScript, UNIX, Tableau 9.1.2, MySQL, Hadoop, Hive, Pig, MapReduce, Spark, MongoDB, MDM, PL/SQL, ETL, etc.

Confidential, Cincinnati, OH

Data Architect/Data Modeler

Responsibilities:

  • Designed and developed architecture for a data services ecosystem spanning relational, NoSQL and Big Data technologies.
  • Involved in normalization and de-normalization of OLAP and OLTP systems, covering relational databases, tables, constraints (primary key, foreign key, unique and check) and indexes.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Strong knowledge of Data Warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, star schemas, snowflake schemas, fact tables and dimension tables.
  • Extensively used Erwin r9 for data modeling; created staging and target models for the Enterprise Data Warehouse.
  • Excellent understanding of and working experience with industry-standard methodologies such as the System Development Life Cycle (SDLC), Rational Unified Process (RUP) and Agile.
  • Created an XML parser to convert XML data to CSV and then load the data into the data model after validation.
  • Developed and modified source code, including packages, functions, reports and UNIX shell scripts.
  • Involved in database development, creating Oracle PL/SQL functions, procedures and collections (a minimal procedure sketch follows this list).
  • Used SQL tools such as Teradata SQL Assistant and TOAD to run SQL queries and validate the data in the warehouse.
  • Developed ETL processes for data extraction, data mapping and data conversion using SQL, PL/SQL and various shell scripts in DataStage.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Designed and coded the core logic of complex reports.
  • Extensively involved in analyzing various data formats using industry-standard tools and effectively communicating findings to business users and SMEs.
  • Worked with the UNIX team to install the TIDAL job scheduler on the QA and production Netezza environments.
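
A minimal Oracle PL/SQL sketch of the kind of load procedure described above; the table and column names (stg_orders, orders_target) are hypothetical placeholders:

    -- Move validated rows from a staging table into a target table
    CREATE OR REPLACE PROCEDURE load_orders (p_batch_id IN NUMBER) IS
        v_rows PLS_INTEGER;
    BEGIN
        INSERT INTO orders_target (order_id, customer_id, order_amount)
        SELECT order_id, customer_id, order_amount
        FROM   stg_orders
        WHERE  batch_id = p_batch_id
        AND    order_amount IS NOT NULL;        -- simple validation rule

        v_rows := SQL%ROWCOUNT;
        DBMS_OUTPUT.PUT_LINE('Rows loaded: ' || v_rows);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END load_orders;
    /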

Environment: Metadata, Netezza, Oracle 11g, Teradata 14.1, T-SQL, SQL Server 2014, DB2, SSIS, R, Python, Hadoop, Spark, MapReduce, UNIX, HTML, Java, MySQL, Erwin r9, MDM, PL/SQL, SPSS, ETL, Informatica PowerCenter, etc.

Confidential, Merrifield, VA

Sr. Data Analyst/Modeler

Responsibilities:

  • Performed analysis of all data flowing into, out of and within the company in support of Data Warehousing efforts.
  • Used complex Excel functionality such as pivot tables, pivot charts, VLOOKUPs and concatenation to analyze and clean up traffic databases.
  • Worked on discovering entities, attributes, relationships and business rules from functional requirements
  • Contributed to delivering logical/physical data models (dimensional and relational) using concepts such as star schema and snowflake schema modeling.
  • Used Erwin r9.0 to create conceptual, logical and physical data models.
  • Performed data analysis and data profiling, and worked on data transformations and data quality rules.
  • Developed data mapping, data governance, transformation and cleansing rules for the Master Data Management architecture involving OLTP, ODS and OLAP.
  • Worked on enhancing the existing Teradata processes running on the Enterprise Data Warehouse.
  • Created SQL scripts, analyzed the data in MS Access/Excel, and worked on SQL and SAS script mapping.
  • Wrote PL/SQL stored procedures, functions, cursors, triggers and packages, as well as SQL queries using join conditions, in a UNIX environment.
  • Wrote ad-hoc SQL queries and worked with SQL Server and Netezza databases (a minimal validation query sketch follows this list).
  • Designed both 3NF data models for ODS/OLTP systems and dimensional data models using star and snowflake schemas.
  • Involved in working with data from various source systems such as Oracle, SQL Server and flat files, as per the requirements.
  • Extensively used ETL methodology to support data extraction, transformation and loading in a complex EDW using Informatica.
  • Extensively worked on Teradata tools and utilities such as FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT) and BTEQ.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files and SQL Server 2008.
  • Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data and relational data (Oracle, DB2 UDB, MS SQL Server).
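
A minimal sketch of the ad-hoc validation SQL referenced above, reconciling row counts between a staging extract and its warehouse target; the table names (stg_customer, dw_customer) are hypothetical placeholders:

    -- Flag load dates where source and target row counts disagree
    SELECT s.load_date,
           s.src_rows,
           t.tgt_rows,
           s.src_rows - t.tgt_rows AS row_diff
    FROM  (SELECT load_date, COUNT(*) AS src_rows
           FROM   stg_customer
           GROUP BY load_date) s
    LEFT JOIN
          (SELECT load_date, COUNT(*) AS tgt_rows
           FROM   dw_customer
           GROUP BY load_date) t
    ON     s.load_date = t.load_date
    WHERE  t.tgt_rows IS NULL
       OR  s.src_rows <> t.tgt_rows;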

Environment: Oracle 10g, Microsoft SQL Server 2008, Teradata 13.1, SQL Developer, SQL Manager, Erwin r9, SQL Developer Data Modeler, Visio, Informatica, Crystal Reports

Confidential, Philadelphia, PA

Sr. Data Analyst/Modeler

Responsibilities:

  • Involved in planning, defining and designing the database using ER/Studio based on business requirements, and provided documentation.
  • Part of the team responsible for the analysis, design and implementation of the business solution.
  • Forward engineered the physical data model to generate DDL and worked with the DBA to implement it.
  • Developed the logical and physical data models that capture current-state/future-state data elements and data flows using ER/Studio.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Analyzed the weblog data using HiveQL to extract the number of unique visitors per day, page views, visit duration and the most purchased product on the website (a minimal query sketch follows this list).
  • Translated business requirements into conceptual, logical and integration data models, and modeled databases for integration applications in a highly available, high-performance configuration using ER/Studio.
  • Reverse engineered DB2 databases and then forward engineered them to Teradata using ER/Studio.
  • Extensively used Normalization techniques (up to 3NF).
  • Responsible for data modeling and building a star schema model in ER/Studio.
  • Created complex Informatica mappings in Informatica Designer using various transformations such as Aggregator, Stored Procedure, SQL, Lookup and Normalizer, as well as mapplets.
  • Designed the physical model for implementation in an Oracle 10g physical database.
  • Used Erwin reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity relationships, and elicit more information.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
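
A minimal HiveQL sketch of the weblog analysis referenced above; the weblog table and its columns (visitor_id, visit_ts) are hypothetical placeholders:

    -- Unique visitors and page views per day from a hypothetical weblog table
    SELECT to_date(visit_ts)          AS visit_day,
           COUNT(DISTINCT visitor_id) AS unique_visitors,
           COUNT(*)                   AS page_views
    FROM   weblog
    GROUP BY to_date(visit_ts)
    ORDER BY visit_day;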

Environment: Erwin r8, SQL Server 2005, SQL Server Analysis Services 2008, SSIS 2008, SSRS 2008, Oracle 10g, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio

Confidential

Sr. Data Analyst/Modeler

Responsibilities:

  • Extensively used TOAD and normalization techniques to design logical/physical data models and relational database designs.
  • Involved in creating Physical and Logical models using Erwin.
  • Involved in building an OLAP model based on dimensions and facts for efficient data loads, using a star schema structure across report levels and multi-dimensional models such as star and snowflake schemas for developing cubes with MDM.
  • Created and maintained Database Objects (Tables, Views, Indexes, Partitions, Synonyms, Database triggers, Stored Procedures) in the data model.
  • Involved in SDLC including requirements gathering, designing, developing, testing, and release to the working environment. Developed and maintained Requirement Traceability Matrix (RTM).
  • Worked on building the data model in ER/Studio as per the requirements, with discussion and approval of the model from the BA.
  • Provided subject matter expertise as appropriate to ETL requirements, information analytics, modeling & design, development, and support activities.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Worked on Data Quality Framework design, logical/physical models and architecture.
  • Worked with Data Warehouse extract-and-load developers to design mappings for data capture, staging, cleansing, loading and auditing.
  • Developed, enhanced and maintained snowflake and star schemas within data warehouse and data mart conceptual and logical data models.
  • Communicated data needs internally and externally to third parties; analyzed and understood the data options and documented the data dictionary.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2 (a minimal validation query sketch follows this list).
  • Prepared various test documents for the ETL process in Quality Center.
  • Prepared in-depth data analysis reports weekly, biweekly and monthly using MS Excel, SQL and UNIX.
  • Involved in extensive data validation by writing several complex SQL queries, and involved in back-end testing and resolving data quality issues.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Strong communication, interpersonal, intuitive, analytical and problem solving skills.
  • Excellent knowledge of Microsoft Office with an emphasis on Excel.
  • Provided maintenance support for customized reports developed in Crystal Reports.
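
A minimal sketch of the report-validation SQL referenced above, comparing a report's monthly totals against the underlying fact table; the table and column names (report_extract, fact_transactions) are hypothetical placeholders:

    -- Flag months where the report total drifts from the warehouse total
    SELECT r.report_month,
           r.report_total,
           f.fact_total,
           r.report_total - f.fact_total AS variance
    FROM   report_extract r
    JOIN  (SELECT TO_CHAR(txn_date, 'YYYY-MM') AS report_month,
                  SUM(txn_amount)              AS fact_total
           FROM   fact_transactions
           GROUP BY TO_CHAR(txn_date, 'YYYY-MM')) f
    ON     r.report_month = f.report_month
    WHERE  ABS(r.report_total - f.fact_total) > 0.01;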

Environment: Netezza, Erwin 9x, DB2, Information Analyzer, Informatica, MDM, Quality Center, Excel, MS Word, ETL tools (Informatica 9.5/8.6/9.1), Oracle 8i/10g, Teradata V2R13/R14.10, Teradata SQL Assistant
