
Sr. Data Architect/Data Modeler Resume


Greensboro, NC

SUMMARY

  • Over 9 years of IT experience as a Data Architect/Data Modeler and Data Analyst in architecture, design and development.
  • Strong experience with different project methodologies including Agile Scrum Methodology and Waterfall methodology.
  • Excellent experience with Big Data technologies like Hadoop, BigQuery, MongoDB, Hive, HBase, Pig and Cassandra
  • Good understanding and hands-on experience in setting up and maintaining NoSQL databases like Cassandra, MongoDB and HBase
  • Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframe and vice versa.
  • Excellent experience in creating cloud-based solutions and architectures using Amazon Web Services and Microsoft Azure.
  • Experienced in designing Conceptual, Logical and Physical data models using the Erwin, ER/Studio and Sybase PowerDesigner tools.
  • Experience in Data Architecture, Data Modeling, designing and Data Analysis with Conceptual, Logical and Physical Modeling for Online Transaction Processing (OLTP), Online Analytical Processing (OLAP) and Data Warehousing.
  • Strong experience in Normalization (1NF, 2NF, 3NF and BCNF) and Denormalization techniques for optimal performance
  • Good knowledge of Data Marts, Operational Data Store (ODS), Dimensional Data Modeling with Ralph Kimball Methodology using Analysis Services.
  • Experience in working with Business Intelligence and Enterprise Data Warehouse (EDW) including SSAS, Amazon Redshift and Azure Data Warehouse.
  • Good working experience with AWS Redshift database design and development, and AWS S3 development.
  • Experienced in setting up connections to different databases like Oracle, SQL, DB2, Teradata and Netezza according to user requirements.
  • Extensive experience using Excel pivot tables to run and analyze result data sets, and in UNIX scripting.
  • Extensive experience on usage of ETL & Reporting tools like SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS)
  • Experience in designing star schema, Snowflake schema for Data Warehouse, ODS architecture.
  • Strong experience in writing SQL and PL/SQL, Transact SQL programs for Stored Procedures, Triggers and Functions.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import and Data Export.
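
As a minimal illustration of the dimensional-modeling work summarized above, a star schema pairs a central fact table with surrogate-keyed dimension tables. All table and column names below are hypothetical, not taken from any actual engagement:

```sql
-- Hypothetical star schema sketch: dimensions carry surrogate keys,
-- the fact table references them and holds additive measures.
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,   -- surrogate key
    customer_id   VARCHAR(20),           -- natural key from the OLTP source
    customer_name VARCHAR(100)
);

CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- e.g. 20240131
    calendar_date DATE
);

CREATE TABLE fact_sales (
    date_key      INTEGER REFERENCES dim_date (date_key),
    customer_key  INTEGER REFERENCES dim_customer (customer_key),
    sales_amount  DECIMAL(12,2),         -- additive measure
    quantity      INTEGER
);
```

A snowflake variant would further normalize the dimensions (e.g. splitting customer geography into its own table) at the cost of extra joins.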

TECHNICAL SKILLS

Data Modeling Tools: Erwin R9.6.1/9.5, Sybase Power Designer, ER Studio and Oracle Designer

Big Data & NoSQL: Hadoop, YARN, Sqoop, Flume, Kafka, Splunk, Hive, Pig, Oozie, Storm, Cassandra, HBase, MapReduce, Impala.

Database Tools: Microsoft SQL Server 2016/2014, Teradata 15.0, Oracle 12c/11g, DB2.

BI Tools: Tableau, Tableau server, Tableau Reader, SAP Business Objects, Crystal Reports

Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server

Operating Systems: Microsoft Windows 7/8/XP, Linux, UNIX.

Version Tool: VSS, SVN, CVS.

Tools & Utilities: TOAD 9.6, Microsoft Visio 2010.

Quality Assurance Tools: Win Runner, Load Runner, Test Director, Quick Test Pro, Quality Center, Rational Functional Tester.

Methodologies: Agile, RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

PROFESSIONAL EXPERIENCE

Confidential - Greensboro, NC

Sr. Data Architect/Data Modeler

Responsibilities:

  • Developing full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Interacted with Business Analysts, SMEs and other Data Architects to understand business needs and functionality for various project solutions
  • As an Architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Used Agile Methodology of Data Warehouse development using Kanbanize.
  • Designed the Logical Data Model using Erwin 9.6 with the entities and attributes for each subject area
  • Implemented logical and physical relational database and maintained Database Objects in the data model using Erwin.
  • Designed both 3NF data models for ODS and OLTP systems, and dimensional data models using Star and Snowflake schemas
  • Involved in Data Architecture, Data profiling, Data analysis, data mapping and Data architecture artifacts design.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Extracted large data sets from Amazon Redshift, AWS, and the Elasticsearch engine using SQL queries to create reports.
  • Participated in Performance Tuning using Explain Plan and TKPROF.
  • Identified and evaluated current and emerging data management trends and technologies.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data
  • Designed and developed Oracle 12c PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Used ETL methodology for supporting data extraction, transformations and loading processing, in a complex MDM using Informatica.
  • Produced and enforced data standards and maintain a repository of data architecture artifacts and procedures.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial reports.
  • Worked on physical design for both SMP and MPP RDBMS, with an understanding of RDBMS scaling features.
  • Responsible for technical data governance, enterprise wide data modeling and database design.
  • Involved in using Excel and MS Access to dump the data and analyze it based on business needs.
  • Used Star Schema and Snowflake Schema methodologies in building and designing the Logical Data Model into Dimensional Models.
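
The HDFS-to-Hive loads mentioned above (providing SQL access over raw Hadoop data) typically follow one of two patterns. This is a sketch with hypothetical table and path names, not the actual project code:

```sql
-- Hypothetical external Hive table over an HDFS landing directory:
-- the files stay in place, and Hive layers a schema on top of them.
CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
    txn_id   STRING,
    txn_date STRING,
    amount   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/landing/sales';

-- Alternatively, move landed files into an existing managed table
-- (sales_managed is assumed to have a matching schema):
LOAD DATA INPATH '/data/landing/sales/part-00000'
INTO TABLE sales_managed;
```

The external-table route is usually preferred for raw landing zones, since dropping the table does not delete the underlying HDFS files.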

Environment: Erwin 9.6, Oracle 12c, Hive, ODS, OLTP, Hadoop, MapReduce, HDFS, MDM, NoSQL, Business Objects, Agile, Unix, Spark, Cassandra, OLAP, Tableau

Confidential - Chicago, IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Worked with DBAs to create a best-fit Physical Data Model from the Logical Data Model using Erwin 9.5
  • Involved in technical consulting and end-to-end delivery, with architecture, data modeling, data governance, and design, development and implementation of solutions.
  • Applied Master Data Management to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders.
  • Designed ER diagrams (Physical and Logical using Erwin) and mapping the data into database objects.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
  • Designed and developed Data Marts by following Star Schema and Snowflake Schema Methodology using Erwin
  • Involved in reviewing business requirements and analyzing data sources from Excel, Oracle and SQL Server 2014 for design, development, testing, and production rollover of reporting and analysis projects within Tableau Desktop to a Netezza database.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Performed the Data Mapping and Data design (Data Modeling) to integrate data across multiple databases into the EDW.
  • Created MDM base objects and Landing and Staging tables to follow the comprehensive data model in MDM.
  • Worked on Amazon Redshift and AWS and architecting a solution to load data, create data models.
  • Worked with Hadoop eco system covering HDFS, HBase, YARN and Map Reduce.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Involved in key-value data modeling and the data load process, and classified the key business drivers for the data management initiative.
  • Ensured high-quality data and understood how data is generated from experimental design and how these experiments can produce actionable, trustworthy conclusions.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Involved in the validation of the OLAP, Unit testing and System Testing of the OLAP Report Functionality and data displayed in the reports.
  • Worked on a NoSQL database with simple queries, and wrote stored procedures for normalization and denormalization.
  • Successfully loaded files from Oracle into Hive and HDFS, and was involved in loading data from the UNIX file system into HDFS.
  • Involved in translating business needs into long-term architecture solutions and reviewing object models, data models and metadata.
  • Defined metadata business-level (logical) terms through interactions with project teams, business subject matter experts, and data analysis.
  • Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.
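
Transformation rules of the kind documented above (OLTP to warehouse) are often captured as set-based INSERT...SELECT statements. The following is a hypothetical sketch in Oracle syntax; schema, table and sequence names are invented for illustration:

```sql
-- Hypothetical OLTP-to-EDW transformation rule:
-- trim the natural key, standardize case, derive a surrogate date key,
-- and reject rows that fail the null-key rule.
INSERT INTO edw.customer_dim
       (customer_key, customer_id, customer_name, load_date_key)
SELECT seq_customer_key.NEXTVAL,               -- surrogate key via sequence
       TRIM(c.cust_id),                        -- cleanse: strip padding
       UPPER(c.cust_name),                     -- standardize: force upper case
       TO_NUMBER(TO_CHAR(SYSDATE, 'YYYYMMDD')) -- derived load-date key
FROM   oltp.customer c
WHERE  c.cust_id IS NOT NULL;                  -- reject rule: drop null keys
```

Documenting each rule in this form gives the ETL team an unambiguous, testable specification for the mapping.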

Environment: Erwin 9.1, Teradata 14, Oracle 10g, PL/SQL, Unix, Agile, TIDAL, MDM, ETL, BTEQ, SQL Server 2008, Netezza, DB2, SAS, Tableau, SSRS, SSIS, T-SQL, Informatica, SQL.

Confidential - Malvern, PA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Worked on the team responsible for the analysis of business requirements and design implementation of the business solution.
  • Analyzed the physical data model to understand the relationship between existing tables.
  • Developed logical and physical data models for central model consolidation.
  • Developed a data mart for the base data using Star and Snowflake schemas, and was involved in developing the data warehouse for the database.
  • Extensively used Erwin as the main tool for modeling along with Visio
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Created 3NF business-area data models with denormalized physical implementations, and performed data and information requirements analysis using the Erwin tool.
  • Performed forward engineering of data models, reverse engineering of existing data models, and data model updates.
  • Worked on the Metadata Repository (MRM) to keep the definitions and mapping rules up to date.
  • Trained a couple of colleagues on the Spotfire tool and guided them in creating Spotfire visualizations
  • Developed Contracting Business Process Model Workflows (current / future state) using Process Modeler software.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required
  • Configured and developed triggers, workflows and validation rules, and handled the deployment process from one sandbox to another.
  • Effectively used triggers and stored procedures necessary to meet specific application's requirements.
  • Created SQL scripts for database modification and performed multiple data modeling tasks at the same time under tight schedules.
  • Worked on PL/SQL collections, index by table, arrays, bulk collect, FOR ALL, etc.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Cleansed the unwanted tables and columns as per the requirements as part of the duty being a Data Analyst.
  • Analyzed and understood the architectural design of the project in a step by step process along with the data flow
  • Created DDL scripts for implementing Data Modeling changes.
  • Created Erwin reports in HTML and RTF formats depending on the requirement, published the data model in Model Mart, created naming-convention files, and coordinated with DBAs to apply the data model changes.
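
The PL/SQL collections and bulk operations mentioned above replace row-by-row loops with set-based processing. A minimal sketch, with hypothetical table and column names:

```sql
-- Hypothetical PL/SQL block: BULK COLLECT fetches the working set
-- in one round trip, FORALL applies the update in one batch.
DECLARE
    TYPE t_ids IS TABLE OF customers.customer_id%TYPE;
    v_ids t_ids;
BEGIN
    SELECT customer_id
    BULK COLLECT INTO v_ids
    FROM   customers
    WHERE  status = 'INACTIVE';

    FORALL i IN 1 .. v_ids.COUNT
        UPDATE customers
        SET    archived_flag = 'Y'
        WHERE  customer_id = v_ids(i);

    COMMIT;
END;
/
```

On large tables this cuts context switches between the PL/SQL and SQL engines, which is the main reason to prefer BULK COLLECT/FORALL over a cursor FOR loop.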

Environment: Erwin r8.2, Oracle SQL Developer, Oracle Data Modeler, Teradata 14, SSIS, Business Objects, SQL Server 2008, ER/Studio, Windows XP, MS Excel.

Confidential - Cherry Hill, NJ

Sr. Data Modeler

Responsibilities:

  • Attended and participated in information and Requirements Gathering sessions.
  • Ensured that Business Requirements can be translated into Data Requirements.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created data trace map and data quality mapping documents.
  • Created Use Case Diagrams using UML to define the functional requirements of the application.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Worked on Erwin Data Modeler to design and maintain the logical/physical dimensional data models and generate DDL statements, and worked with the database team to create the tables, views and keys in the database.
  • Created physical data models with the details necessary for a complete physical data model, including the appropriate specifications for keys, constraints, Indexes and other physical model attributes.
  • Reverse engineered physical data models from databases and SQL scripts.
  • Compared data models and physical databases and kept changes in sync.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Developed and maintained Enterprise data naming standards by interacting with data steward, data governance team.
  • Divided model into subject areas for reflecting understandable view to business as well as data model reviewers.
  • Performed UAT testing before Production Phase of the database components being built.
  • Introduced a Data Dictionary for the process, which simplified a lot of the work around the project.
  • Worked with the DBA extensively and recommended Oracle bitmap indexes and bitmap join indexes for star schema optimization, and Oracle table partitioning for performance. Implemented materialized views.
  • Identified and tracked the slowly changing dimensions (SCD I, II, III & Hybrid/6) and determined the hierarchies in dimensions.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Assisted the ETL Developers and Testers during the development and testing phases.
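
The SCD Type 2 tracking mentioned above is commonly implemented in two set-based steps: expire the current dimension row when a tracked attribute changes, then insert a new current version. A hypothetical sketch in Oracle syntax (staging and dimension names are invented):

```sql
-- Step 1 (hypothetical): expire the current row for customers whose
-- tracked attribute (address) differs from the incoming staging row.
UPDATE dim_customer d
SET    d.current_flag  = 'N',
       d.effective_end = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

-- Step 2: insert a new current version for any staged customer
-- that now has no current row (changed customers and brand-new ones).
INSERT INTO dim_customer
       (customer_key, customer_id, address,
        effective_start, effective_end, current_flag)
SELECT seq_dim_customer.NEXTVAL, s.customer_id, s.address,
       SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```

Type 1 would simply overwrite the attribute in place, and Type 3 would keep the prior value in a dedicated "previous" column; Type 2 is the variant that preserves full history.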

Environment: Erwin Data Modeler r 7.3, Erwin Model Manager, TOAD, Oracle9i/10g, XML Files, Flat files, SQL/PL SQL and UNIX Shell Scripts

Confidential

Data Analyst/Data Modeler

Responsibilities:

  • Designed Logical and Physical Data Models, metadata and the data dictionary using Erwin for development teams both offshore and on site.
  • Provided conceptual/functional/business analysis assistance to developers and DBAs using Erwin; validated data models with client developers
  • Created various Physical Data Models based on discussions with DBAs and ETL developers.
  • Extensively used Star and Snowflake Schema methodologies.
  • Designed data models to GE-ERC standards up to 3NF (OLTP/ODS), and denormalized (OLAP) data marts with Star and Snowflake schemas.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Improved SQL query performance using explain plans, hints and indexes for tuning; created DDL scripts for the database; created PL/SQL procedures and triggers.
  • Created more than 20 new models with more than 100 tables; used Star and Snowflake schemas for the data marts and data warehouse.
  • Created tables, views, sequences, triggers, table spaces, constraints and generated DDL scripts for physical implementation.
  • Worked at conceptual/logical/physical data model level using Erwin according to requirements.
  • Performed data mining using very complex SQL queries and discovered patterns.
  • Used SQL for Querying the database in UNIX environment.
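
The tables, sequences and triggers generated from the physical models above typically fit the pre-identity-column Oracle pattern of that era (8i/9i): a sequence plus a before-insert trigger assigning the surrogate key. A hypothetical sketch:

```sql
-- Hypothetical Oracle DDL of the kind generated from the models:
-- table, sequence, and a trigger that assigns the surrogate key.
CREATE TABLE product (
    product_key  NUMBER PRIMARY KEY,
    product_name VARCHAR2(100) NOT NULL
);

CREATE SEQUENCE seq_product_key START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE TRIGGER trg_product_key
BEFORE INSERT ON product
FOR EACH ROW
BEGIN
    -- Assign the next surrogate key before the row is inserted.
    SELECT seq_product_key.NEXTVAL INTO :NEW.product_key FROM dual;
END;
/
```

Modern Oracle (12c and later) would use an identity column instead, but on 8i/9i the sequence-plus-trigger pair was the standard way to populate surrogate keys.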

Environment: Oracle 8i/9i, SQL, PL/SQL Developer 5.1, Sun Solaris 8.0, Erwin 4.0, ER/Studio 6.0/6.5, Toad 7.6, Informatica 7.0, IBM OS/390 (V6.0), DB2 V7.1.
