
Sr. Data Modeler/Data Architect Resume


Miami, FL

SUMMARY

  • Over 9 years of IT experience as a Data Modeler/Data Architect and Data Analyst, proficient in gathering business requirements and handling requirements management.
  • Extensive knowledge and hands-on experience implementing cloud data lakes such as Azure Data Lake and Azure Databricks.
  • Excellent knowledge of AWS infrastructure services, including Amazon Simple Storage Service (Amazon S3), EMR, and Amazon Elastic Compute Cloud (Amazon EC2).
  • Experience working with data modeling tools like Erwin, Power Designer and ER/Studio.
  • Experience in designing conceptual, logical, and physical data models to build the data warehouse.
  • Experience in creating models for Oracle/Teradata/SQL Server/DB2.
  • Excellent understanding of MDM approaches, including creating a data dictionary and using Informatica or other tools to map from sources to the target MDM data model.
  • Experience in Agile Methodologies and SCRUM Process.
  • Experience using JIRA for planning, tracking, reporting, and release management.
  • Experience in developing data models which will serve both OLTP and OLAP functionality as per business needs.
  • Experience with the Hadoop big data ecosystem for ingestion, storage, querying, processing, and analysis of big data.
  • Sound knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Efficient with normalization (1NF, 2NF, and 3NF) and de-normalization techniques for improved database performance.
  • Extensive experience with Excel and pivot tables to run and analyze result data sets, and with UNIX scripting.
  • Expertise in Python-based environments for analytics, data wrangling, and Excel data extracts.
  • Experience in rendering and delivering reports in desired formats by using reporting tools such as Tableau.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
  • Experience with Tableau for analysis and creation of dashboards and user stories.
  • Well versed in creating process flow diagrams and entity relationship diagrams during analysis and design using Visio.
  • Excellent problem-solving and analytical skills with an exceptional ability to learn and master new technologies efficiently.

TECHNICAL SKILLS

Data Modeling Tools: Erwin R2Sp2, Power Designer, ER/Studio.

Cloud Services: Azure, GCP and AWS.

Databases: RDBMS (Oracle, SQL Server, DB2, Teradata)

Big Data: Hadoop 3.3, Hive 2.3, Sqoop 1.4, Pig 0.17

Development Methodologies: Agile, Scrum, Waterfall

OLAP & ETL Tools: Tableau, SSIS, Talend, Informatica Power Center

Reporting Tools: MS Excel, Tableau, Power BI, SSRS

Programming languages: SQL, Python

PROFESSIONAL EXPERIENCE

Confidential

Sr. Data Modeler/Data Architect

Responsibilities:

  • As a Data Modeler/Data Architect, performed business area analysis and logical and physical data modeling using Erwin for data warehouse applications.
  • Participated in design discussions and assured functional specifications are delivered in all phases of SDLC in an Agile Environment.
  • Performed the Data Modeling effort for the gaps identified while data mapping.
  • Involved in planning, defining, and designing databases using Erwin based on business requirements, and provided documentation.
  • Worked on Azure, architecting a solution to load data, create data models, and run BI on it.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management (MDM) process.
  • Performed data integrity checks, data cleansing, exploratory data analysis, and feature engineering using Python.
  • Modified existing Talend mappings to load to Snowflake DB.
  • Created Logical and Physical Data Models using Erwin.
  • Worked with big data on-premises and in the cloud, Master Data Management, and Data Governance.
  • Worked on data loads using Azure Data Factory with an external table approach.
  • Used Azure Data Factory extensively for ingesting data from disparate source systems.
  • Designed and built a Data Discovery Platform for a large system integrator using Azure HDInsight components.
  • Used U-SQL to interact with multiple source streams within Azure Data Lake.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Resolved multiple Data Governance issues to support data consistency at the enterprise level.
  • Worked in the Microsoft Azure environment (Blob Storage, Data Lake, AzCopy) using Hive as the extraction language (see the Hive sketch after this list).
  • Captured, stored, and processed both structured and unstructured data using Python and SQL.
  • Worked on big data technologies such as HDFS, Spark, MapReduce, and Hive to extract, migrate, and load data from various source systems.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Involved in loading data into Snowflake tables from an internal stage using SnowSQL (see the SnowSQL sketch after this list).
  • Built a solution using Azure managed services StorSimple and Blob storage to archive on-premises data to cloud.
  • Designed and developed user defined functions, stored procedures, triggers for Cosmos DB.
  • Worked on streaming pipelines to consume data from Kafka topics and load the data into the landing area for near real-time reporting.
  • Designed and implemented a REST API to access the Snowflake DB platform.
  • Provided suggestions to implement multitasking for the existing Hive architecture in Hadoop.
  • Worked with Data Lake, Spark, and Azure.
  • Analyzed data using Hadoop components Hive and Pig.
  • Involved in capturing data lineage, table and column definitions, valid values, and other necessary information in the data models.
  • Created and Configured Azure Cosmos DB.
  • Generated JSON files from the JSON models created for Zip Code, Group, and Claims using Snowflake DB.
  • Used Power BI to create dashboards and participated in the process of choosing the right tool for dashboards and analytics.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
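
A minimal HiveQL sketch of the Azure Blob extraction pattern referenced above; the database, table, container, and storage account names are hypothetical placeholders, not the actual project objects.

```sql
-- Hypothetical HiveQL sketch: expose raw claim extracts stored in Azure Blob Storage
-- as an external Hive table. Container, account, and path are placeholders.
CREATE EXTERNAL TABLE IF NOT EXISTS staging.claims_ext (
    claim_id     STRING,
    member_id    STRING,
    claim_amount DECIMAL(12,2),
    service_date DATE
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'wasbs://claims@exampleaccount.blob.core.windows.net/raw/claims/';

-- Simple extraction query over the external table.
SELECT service_date, SUM(claim_amount) AS total_claim_amount
FROM staging.claims_ext
GROUP BY service_date;
```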
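
A minimal SnowSQL sketch of the internal-stage load referenced above, assuming the target table and its table stage already exist; the warehouse, database, table, and file names are hypothetical placeholders.

```sql
-- Hypothetical SnowSQL sketch: stage a local file and copy it into a Snowflake table.
-- All object and file names are illustrative placeholders.
USE WAREHOUSE LOAD_WH;
USE DATABASE CLAIMS_DB;
USE SCHEMA STAGING;

-- Upload the file to the table's internal stage (run from the SnowSQL client).
PUT file:///data/extracts/claims_2021_06.csv @%CLAIMS_RAW AUTO_COMPRESS = TRUE;

-- Load the staged file into the target table.
COPY INTO CLAIMS_RAW
  FROM @%CLAIMS_RAW
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';
```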

Tools: Erwin R2Sp2, Agile, Snowflake DB, MDM, Talend, Hadoop 3.3, Python 3.5, Azure, Cosmos DB, Power BI, REST API.

Confidential - Miami, FL

Data Modeler/Data Architect

Responsibilities:

  • Worked as a Data Modeler/Data Architect to implement data model changes in databases across all environments.
  • Led the development and implementation of the logical data model and physical data design utilizing data architecture and modeling standards.
  • Conducted design discussions and meetings to arrive at the appropriate data model.
  • Participated in gathering and defining business requirements to support the team across all projects.
  • Worked with Data Stewards and Business analysts to gather requirements for MDM Project.
  • Extensively used Agile methodology with daily scrums to discuss project-related information.
  • Managed deployments and migrations of services to MS Azure.
  • Worked on data integrity and query execution efficiency, applying knowledge of Azure Databricks.
  • Created logical/physical data models in Erwin and worked on loading the tables into the data warehouse.
  • Performed data analysis on source data to be transferred into an MDM structure.
  • Designed and implemented a fully operational production grade large scale data solution on Snowflake Data Warehouse.
  • Worked on Azure Power BI Embedded to integrate the reports into the application.
  • Used Azure reporting services to upload and download reports.
  • Participated in performance management and tuning for stored procedures, tables and database servers.
  • Applied data governance rules for primary qualifiers, class words, and valid abbreviations in table and column names.
  • Developed Python utility to validate HDFS tables with source tables.
  • Worked with Azure Blob and Data Lake storage and loaded data into Azure Synapse Analytics (SQL DW).
  • Used the ETL tool Informatica to populate the database and transform data from the old database to the new Oracle database.
  • Used Normalization (1NF, 2NF & 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
  • Analyzed Azure Data Factory and Azure Data Bricks to build new ETL process in Azure.
  • Wrote complex SQL scripts to analyze data present in different data warehouses such as Snowflake.
  • Created complex program units using PL/SQL records and collection types.
  • Created and maintained metadata, including table and column definitions.
  • Worked on MapReduce and query optimization for the Hadoop Hive and HBase architecture.
  • Worked extensively on importing and exporting data into HDFS using Sqoop.
  • Architected and documented Azure SQL Data Warehouse (Synapse Analytics).
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
  • Worked with the DBA to create the physical model and tables.
  • Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
  • Created designs and process flows on how to standardize Power BI dashboards to meet the business requirements.
  • Designed and Configured Azure Cloud relational servers and databases analyzing current and future business requirements.
  • Created and modified various stored procedures used in the application using T-SQL (a T-SQL sketch follows this list).
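
A minimal T-SQL sketch of the stored-procedure work noted in the last bullet; the procedure, table, and column names are hypothetical placeholders rather than the actual application objects.

```sql
-- Hypothetical T-SQL sketch: upsert a member record from the application layer.
-- Object and column names are illustrative placeholders.
CREATE OR ALTER PROCEDURE dbo.usp_UpsertMember
    @MemberId   INT,
    @MemberName NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    IF EXISTS (SELECT 1 FROM dbo.Member WHERE MemberId = @MemberId)
        UPDATE dbo.Member
        SET    MemberName = @MemberName,
               UpdatedAt  = SYSUTCDATETIME()
        WHERE  MemberId = @MemberId;
    ELSE
        INSERT INTO dbo.Member (MemberId, MemberName, UpdatedAt)
        VALUES (@MemberId, @MemberName, SYSUTCDATETIME());
END;
GO
```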

Tools: Erwin 9.8, Oracle, PL/SQL, Agile, Azure, Snowflake DW, Power BI, HBase 1.2, ETL, MDM, Python, T-SQL.

Confidential, Boston, MA

Data Modeler

Responsibilities:

  • Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
  • Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in the physical implementation of data models.
  • Created conceptual, logical, and physical models for the Data Warehouse, Data Vault, and Data Mart.
  • Used JIRA as an agile tool to keep track of the stories that were worked on using the agile methodology.
  • Implemented a proof of concept deploying the product in Amazon Web Services (AWS).
  • Created FastExport, MultiLoad, and FastLoad jobs for batch processing.
  • Worked on cloud technologies involving Amazon EC2 and S3, supporting both the development and production environments.
  • Worked on migrating the EDW to AWS using EMR and various other technologies.
  • Responsible for full data loads from production to the AWS Redshift staging environment (see the Redshift COPY sketch after this list).
  • Worked on both OLTP and OLAP databases and creating performance benchmarks and generating SQL*Plus reports.
  • Implemented AWS Lambda functions operating on S3.
  • Documented Informatica mappings in Excel spreadsheets.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Designed the Data Marts in dimensional Data modeling using star and snowflake schemas.
  • Developed ETL data mapping and loading logic for MDM loading from internal and external sources.
  • Defined and deployed monitoring, metrics, and logging systems on AWS.
  • Extensively used Star and Snowflake Schema methodologies.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
  • Used the AWS Glue catalog with a crawler to get data from S3 and perform SQL query operations.
  • Gathered requirements, analyzed and wrote the design documents.
  • Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
  • Used GIT for code reviews during stages of branching, merging and staging.
  • Designed data process flows using Informatica to source data into the Statements database on the Oracle platform.
  • Prepared Dashboards using calculations, parameters in Tableau.
  • Planned and defined system requirements to Use Case Scenario and Use Case Narrative using the UML methodologies.
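
A minimal sketch of the Redshift staging load referenced above; the bucket, IAM role, and table names are hypothetical placeholders.

```sql
-- Hypothetical sketch: bulk-load delimited extracts from S3 into a Redshift staging table.
-- Bucket, role ARN, and table names are placeholders.
COPY staging.claims_raw
FROM 's3://example-edw-extracts/claims/2021/06/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV
IGNOREHEADER 1
TIMEFORMAT 'auto'
REGION 'us-east-1';

-- Quick row-count check after the load.
SELECT COUNT(*) FROM staging.claims_raw;
```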

Tools: Erwin 9.7, Oracle PL/SQL, Agile, AWS, JIRA, OLAP, OLTP, GIT, Informatica, Tableau, MDM, MS Excel.

Confidential - San Jose, CA

Data Analyst/Data Modeler

Responsibilities:

  • As a Data Analyst/Data Modeler, I was responsible for all data related aspects of a project.
  • Involved in the data analysis and database modeling of both OLTP and OLAP environments.
  • Provided data sourcing methodology, resource management and performance monitoring for data acquisition.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Gathered various reporting requirements from the business analysts.
  • Created the Physical Data Model from the Logical Data Model using the Compare and Merge utility in ER/Studio.
  • Developed Snowflake Schemas by normalizing the dimension tables as appropriate (see the schema sketch after this list).
  • Developed the MDM metadata dictionary and naming conventions across the enterprise.
  • Used the Teradata FastLoad utility to load large volumes of data into empty Teradata tables in the data mart for the initial load.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Worked on data validation to ensure the accuracy of the data between the warehouses.
  • Utilized forward/reverse engineering tools and target database schema conversion process.
  • Worked on Key performance Indicators (KPIs), design of star schema and snowflake schema in Analysis Services (SSAS).
  • Performed data reconciliation between integrated systems.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Generated DDL scripts for database modifications, Teradata macros, views, and SET tables.
  • Created complex SQL queries using views, indexes, triggers, roles, stored procedures, and user-defined functions, and worked with different methods of logging in SSIS.
  • Involved in writing stored procedures and packages using PL/SQL.
  • Designed and developed dynamic advanced T-SQL, stored procedures, XML, user defined functions, parameters, views, tables, triggers and indexes.
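
A minimal DDL sketch of snowflaking a dimension as referenced above, normalizing a product dimension's category attributes into their own table; all table and column names are hypothetical placeholders.

```sql
-- Hypothetical sketch: snowflake a product dimension by normalizing its
-- category attributes into a separate table. Names are placeholders.
CREATE TABLE dim_product_category (
    category_key  INT          NOT NULL PRIMARY KEY,
    category_name VARCHAR(50)  NOT NULL
);

CREATE TABLE dim_product (
    product_key   INT          NOT NULL PRIMARY KEY,
    product_name  VARCHAR(100) NOT NULL,
    category_key  INT          NOT NULL REFERENCES dim_product_category (category_key)
);

-- The fact table joins only to dim_product; category attributes are reached
-- through the snowflaked dim_product_category table.
CREATE TABLE fact_sales (
    sale_id      INT            NOT NULL PRIMARY KEY,
    product_key  INT            NOT NULL REFERENCES dim_product (product_key),
    sale_amount  DECIMAL(12,2)  NOT NULL
);
```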

Tools: SQL, SSIS, SSAS, OLAP, OLTP, ER/Studio, Teradata, MDM, T-SQL, PL/SQL.
