
Sr. Data Architect/Data Modeler Resume


Boston, MA

SUMMARY

  • Over nine years of experience as a Data Analysis, Data Modeling, Data Architecture, Data Warehouse, and Business Intelligence professional with applied information technology.
  • Experienced with Data Conversion, Data Quality, Data Profiling, Performance Tuning, System Testing, and implementing RDBMS features.
  • Expertise in Data Analysis using SQL on Oracle, MS SQL Server, DB2, Teradata, and Netezza.
  • Experience in writing and testing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers, and Packages (see the sketch after this summary).
  • Solid hands-on experience with administration of data model repositories and documentation in metadata portals using tools such as Erwin, ER/Studio, and Power Designer.
  • Expert in writing and optimizing SQL queries in Oracle, SQL Server, Netezza, Hive, and Teradata.
  • Experience in analyzing data using the Hadoop ecosystem, including HDFS, Hive, Spark, Spark Streaming, Elasticsearch, Kafka, HBase, ZooKeeper, Pig, Sqoop, and Flume.
  • Experienced in various Teradata utilities such as FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • In-depth knowledge of SSAS, SSRS, SSIS, T-SQL Reporting and Analytics.
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
  • Logical and physical database design of tables, constraints, indexes, etc. using Erwin, ER/Studio, Toad Data Modeler, and SQL Modeler.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Worked in ETL and data integration, developing ETL mappings and scripts, guiding the team on transformations, and covering all aspects of the SDLC, including requirements gathering, analysis, design, and development.
  • Experienced in setting up connections to different databases such as Oracle, SQL Server, DB2, Hadoop, Teradata, and Netezza according to user requirements.
  • Solid understanding of Unified Modeling Language (UML), Object Modeling Technique (OMT), Extreme Programming (XP), and Object-Oriented Analysis (OOA).
  • Experienced in data analysis and data profiling using complex SQL on various source systems, including Oracle, Teradata, and Hive platforms.
  • Extensive experience using data modeling tools such as Erwin, ER/Studio, IBM InfoSphere, Power Designer, and Microsoft Visio.
  • Experience in designing data warehouse architectures and models using tools such as Erwin r9.7/r9.6, Sybase Power Designer, and ER/Studio.
  • Strong experience using Excel and MS Access to extract and analyze data based on business needs.
  • Excellent at creating project artifacts, including specification documents, data mapping, and data analysis documents.
  • Expertise in Physical Modeling for multiple platforms such as Oracle, Teradata, SQL Server, and DB2.
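
For illustration, a minimal sketch of the kind of PL/SQL stored procedure and set-based SQL referenced in the summary above (schema, table, and column names are hypothetical):

    -- Hypothetical procedure merging staged rows into a customer dimension
    CREATE OR REPLACE PROCEDURE load_customer_dim (p_batch_id IN NUMBER) AS
    BEGIN
      MERGE INTO dw.customer_dim d
      USING (SELECT customer_id, customer_name, region
               FROM stg.customer_stg
              WHERE batch_id = p_batch_id) s
         ON (d.customer_id = s.customer_id)
       WHEN MATCHED THEN
            UPDATE SET d.customer_name = s.customer_name,
                       d.region        = s.region
       WHEN NOT MATCHED THEN
            INSERT (customer_id, customer_name, region)
            VALUES (s.customer_id, s.customer_name, s.region);
      COMMIT;
    END load_customer_dim;
    /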

TECHNICAL SKILLS

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, MS Access 2007/2005, Netezza.

Data Modeling: Erwin 9.6.4/9.5, ER/Studio 9.7, Power Designer.

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant, Netezza Aginity.

Version Control Tools: VSS, SVN, CVS.

Cloud Technology: AWS, AWS Redshift, Microsoft Azure.

Big Data Technologies: Hadoop, Hive, HDFS, HBase, Flume, Sqoop, Spark, Pig, Impala, MapReduce.

Other Tools: SQL*Plus, SQL*Loader, MS Project, MS Visio, Unix, PL/SQL, Crystal Reports XI.

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rapid Application Development (RAD), Joint Application Development (JAD).

PROFESSIONAL EXPERIENCE

Confidential, Boston, MA

Sr. Data Architect/Data Modeler

Responsibilities:

  • Implemented the ODS in increments (Agile approach to project management).
  • Extensively used Agile methodology as the Organization Standard to implement the data models.
  • Worked on Data Warehousing concepts such as Ralph Kimball Methodology, Bill Inmon Methodology, OLAP, OLTP, Star Schema, Snowflake Schema, Fact Tables, and Dimension Tables.
  • Analyzed the source data and worked with the Data Architect in designing and developing the logical and physical data models for the Enterprise Data Warehouse.
  • Created Conceptual and Logical data models using Erwin for both relational and dimensional models.
  • Worked with Netezza and Oracle databases and implemented various logical and physical data models for them.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data models.
  • Developed, deployed, and managed several MongoDB clusters, implementing robustness and scalability via sharding and replication, and automating tasks with custom scripts and open-source tools for performance tuning and system monitoring.
  • Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
  • Developed enhancements to the MongoDB architecture to improve performance and scalability.
  • Experienced in managing MongoDB environments from availability, performance, and scalability perspectives.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Involved in designing Logical, Physical, and Conceptual data models using Erwin.
  • Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using Erwin.
  • Performed Data Mapping and Data Design (Data Modeling) to integrate data across multiple databases into the EDW.
  • Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality.
  • Involved in software development with Big Data technologies such as Hadoop, Sqoop, Flume, Kafka, Hive, Pig, Oozie, Storm, Cassandra, & Apache Nifi.
  • Created data models for AWS Redshift and Hive from dimensional data models (see the sketch after this list).
  • Worked on data modeling and advanced SQL with columnar databases using AWS.
  • Migrated reference data from an existing product into the Informatica MDM hub.
  • Worked with Hadoop eco system covering HDFS, HBase, Yarn and MapReduce.
  • Managed, designed, and created the Star Schema and Snowflake Schema for a financial data mart in Erwin and DB2, applying Ralph Kimball dimensional modeling techniques.
  • Worked with MapReduce frameworks such as Hadoop and associated tools (Hive, Pig, Sqoop, etc.).
  • Designed facts and dimension tables and defined relationship between facts and dimensions with Star Schema and Snowflake Schema in SSAS.
  • Worked on a Business Intelligence solution using Redshift and Tableau.
  • Worked with Netezza and Teradata databases and implemented various logical and physical data models for them.
  • Performance-tuned heavy queries and optimized Informatica MDM jobs.
  • Extensively used ETL methodology to support data extraction, transformation, and loading processes in a complex EDW using Informatica.
  • Extracted data from various sources such as Oracle, Mainframes, and flat files and loaded it into the target Netezza and Teradata databases.
  • Worked on Tableau for insight reporting and data visualization.
  • Designed and developed the architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed Star and Snowflake data models for the Enterprise Data Warehouse using Erwin.
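
For illustration, a minimal sketch of the kind of Redshift DDL derived from a dimensional model, as noted above (table, column, and key choices are hypothetical):

    -- Hypothetical fact table for a star schema on Amazon Redshift
    CREATE TABLE dw.fact_transactions (
        transaction_id BIGINT        NOT NULL,
        date_key       INTEGER       NOT NULL,
        account_key    INTEGER       NOT NULL,
        product_key    INTEGER       NOT NULL,
        amount         DECIMAL(18,2)
    )
    DISTSTYLE KEY
    DISTKEY (account_key)   -- co-locate rows that join on account
    SORTKEY (date_key);     -- favor range-restricted scans by date

    -- Small conformed date dimension, replicated to every node
    CREATE TABLE dw.dim_date (
        date_key       INTEGER NOT NULL,
        calendar_date  DATE    NOT NULL,
        fiscal_quarter VARCHAR(6)
    )
    DISTSTYLE ALL
    SORTKEY (date_key);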

Environment: Erwin 9.5, NoSQL, AWS, Tableau, HBase, Hadoop, ODS, Oracle 12c, ETL, MDM, PL/SQL, OLAP, OLTP, Netezza, Teradata, Data Mapping, Oozie, Cassandra.

Confidential, Dearborn, MI

Sr. Data Architect/Data Modeler

Responsibilities:

  • Extensively used Agile methodology as the Organization Standard to implement the data models.
  • Excellent experience and knowledge of Data Warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
  • Developed the long-term data warehouse roadmap and architectures, and designed and built the data warehouse framework per the roadmap.
  • Involved in designing Logical and Physical data models for different database applications using the Erwin 9.6.
  • Translated business and data requirements into Logical data models in support of Enterprise Data models, ODS, OLAP, OLTP, Operational Data Structures, and Analytical systems.
  • Developed Extraction, Transformation and Loading (ETL) processes to acquire and load data from internal and external sources.
  • Created several Master Data Management (MDM) models that unify data for critical concepts across applications.
  • Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Designed 3NF data models for ODS and OLTP systems as well as dimensional data models using Star and Snowflake schemas.
  • Responsible for Metadata Management, keeping centralized metadata repositories up to date using Erwin modeling tools.
  • Involved in the Analysis, design, testing and Implementation of Business Intelligence solutions using Data Warehouse, ETL, OLAP, Client/Server applications.
  • Involved in AWS architecture design, modification, and gap analysis.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data (see the sketch after this list).
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Created the data model and imported data using mongoimport.
  • Backed up databases using the MongoDB backup facility in Ops Manager.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Implemented strong referential integrity and auditing by the use of triggers and SQL Scripts.
  • Produced and enforced data standards and maintained a repository of data architecture artifacts and procedures.
  • Provided architectures, patterns, tooling choices, and standards for master data and hierarchy life cycle management.
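
For illustration, a minimal sketch of the kind of HiveQL used to expose HDFS claim extracts through Hive, as noted above (paths, table names, and file layout are hypothetical):

    -- Hypothetical external table over raw claim files landed in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw (
        claim_id      STRING,
        member_id     STRING,
        claim_amount  DECIMAL(18,2),
        service_date  STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/raw/claims';

    -- Materialize a processed, columnar copy for downstream SQL access
    CREATE TABLE IF NOT EXISTS claims_processed
    STORED AS ORC
    AS
    SELECT claim_id, member_id, claim_amount, service_date
      FROM claims_raw
     WHERE claim_amount IS NOT NULL;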

Environment: Erwin 9.6, MS SQL Server 2016, AWS, Oracle 12c, SQL, Hive, MapReduce, HDFS, Hadoop, Teradata, Netezza, PL/SQL, Informatica, SSIS, SSRS.

Confidential, Chicago, IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Designed and implemented data processing pipelines with a combination of the following technologies: Hadoop, MapReduce, Spark, Hive, Kafka, Avro, SQL, and NoSQL data warehouses.
  • Developed 3NF normalized Logical and Physical database models to design OLTP systems.
  • Created Logical and Physical models using Erwin for Oracle and Teradata database designs.
  • Documented logical, physical, relational and dimensional data models.
  • Involved in relational and dimensional Data Modeling, creating Logical and Physical designs of the database and ER diagrams using data modeling tools such as Erwin.
  • Performed Data Analysis and data profiling using complex SQL on various source systems, including Oracle 10g and Teradata.
  • Extracted data from Oracle and loaded it into Teradata tables using the Teradata FastLoad and MultiLoad utilities (see the sketch after this list).
  • Worked on AWS technologies such as Redshift, RDS, and EMR.
  • Used reverse engineering for a wide variety of RDBMSs, including MS Access, Oracle, and Teradata, to connect to existing databases and create graphical representations using Erwin.
  • Created Logical and Physical Designs in Erwin and used Erwin to develop Data Models using Star Schema methodologies.
  • Established process & framework to facilitate knowledge sharing and enable operational transparency across Big Data teams.
  • Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Identified entities, attributes, and the relationships between entities, and developed Conceptual, Logical, and Physical Models using Erwin.
  • Used Teradata 14 utilities such as FastExport and MultiLoad for data migration/ETL tasks from OLTP source systems to OLAP target systems.
  • Involved in Netezza administration activities such as backup/restore, performance tuning, and security configuration.
  • Worked on designing the whole data warehouse architecture from scratch, from the ODS to the data marts.
  • Created dimensional model for the reporting system by identifying required dimensions and facts using ER Studio 9.5.
  • Worked with the Business Analysts and QA team on their testing, and with the DBAs on deployment of databases to the correct Teradata environment, along with business analysis, testing, and project coordination.
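
For illustration, a minimal sketch of the kind of Teradata FastLoad script referenced above (logon details, file paths, and table/column names are hypothetical):

    /* Hypothetical FastLoad of an Oracle extract into a Teradata staging table */
    LOGON tdprod/etl_user,etl_password;

    SET RECORD VARTEXT "|";

    DEFINE customer_id   (VARCHAR(18)),
           customer_name (VARCHAR(100)),
           region        (VARCHAR(30))
    FILE = /data/extracts/oracle_customer.dat;

    BEGIN LOADING edw_stg.customer_stg
        ERRORFILES edw_stg.customer_err1, edw_stg.customer_err2
        CHECKPOINT 100000;

    INSERT INTO edw_stg.customer_stg (customer_id, customer_name, region)
    VALUES (:customer_id, :customer_name, :region);

    END LOADING;
    LOGOFF;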

Environment: Erwin 9.1, Teradata 14, Data Modeler, Oracle 10g, T-SQL, SQL Server, MDM, PL/SQL, ETL, Informatica PowerCenter 9.6, etc.
