
Data Architect Resume


Lowell, AR

SUMMARY

  • 9 years of experience in Data Modeling, Data Architecture, Data Cloud Migration, Data Analysis and Data Governance.
  • Expertise in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for effective and optimal performance in OLTP and OLAP environments.
  • Experienced in building Enterprise Data Warehouses and Data Lakes from scratch using both the Kimball and Inmon approaches.
  • Strong understanding of data warehousing principles, fact tables and dimension tables.
  • Expertise in designing, implementing and monitoring MongoDB clusters, databases and collections.
  • Expertise in designing relational databases using the Erwin Data Modeler, ER/Studio and SAP PowerDesigner tools.
  • Extensive experience in designing non-relational, document-based NoSQL databases using the Hackolade tool.
  • Expertise in designing, implementing and maintaining Elasticsearch indexes for centralized logging and application search functionality.
  • Experience in data migration from on-premises servers to the cloud.
  • Migrated DB2 databases from on-premises servers to Azure SQL Server on PaaS and IaaS platforms.
  • Strong experience migrating legacy database systems to cloud database solutions.
  • Worked with the Data Governance team on information classification, data purging and data archiving needs.
  • Extensive experience managing PII data and masking it using SQL queries (see the masking sketch after this list).
  • Designed Azure Blob Storage containers to store non-textual binary data.
  • Good experience in performance tuning, optimizing SQL queries, and writing complex SQL queries using functions and stored procedures.
  • Extensive experience with the Software Development Life Cycle (SDLC) using Agile development methodologies.
  • Extensive experience working with multiple teams in an onsite-offshore delivery model, including global service delivery coordination.
  • Experienced in creating project artifacts, including specification documents, data mapping documents and data analysis documents.
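
For context on the PII-masking bullet above, here is a minimal sketch of the kind of SQL-driven masking described. The table and column names (dbo.customer, email, ssn) are hypothetical; real rules would follow the Data Privacy team's standards.

```python
# Minimal PII-masking sketch. Table/column names are hypothetical;
# real rules would follow the privacy team's masking standards.
import pyodbc

MASK_SQL = """
UPDATE dbo.customer
SET email = CONCAT(LEFT(email, 1), '*****',
                   SUBSTRING(email, CHARINDEX('@', email), 100)),
    ssn   = CONCAT('***-**-', RIGHT(ssn, 4))
WHERE is_masked = 0;
"""

conn = pyodbc.connect("DSN=warehouse")  # placeholder connection
with conn:                              # commits on clean exit
    conn.execute(MASK_SQL)
```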

TECHNICAL SKILLS

Relational Databases: Netezza, Teradata, Oracle, MySQL, PostgreSQL, MS SQL Server, Vertica

Non-Relational Databases: Azure Cosmos DB, MongoDB, Elasticsearch, Azure Blob Storage, Neo4j, Databricks

ETL Tools: Informatica PowerCenter, SSIS, ADLS, Apache Airflow

Scheduler/Workflow: Airflow, Jenkins, Windows Task Scheduler

Data Modeling Tools: Erwin, Microsoft Visio 2003, ER/Studio, SAP PowerDesigner, Hackolade

Languages: PL/SQL, T-SQL, PySQL, Java (basics)

PROFESSIONAL EXPERIENCE

Confidential, Lowell, AR

Data Architect

Responsibilities:

  • Responsible for providing transactional and dimensional data warehouse solutions to various projects.
  • Designed, implemented and deployed multiple OLTP databases to support various applications.
  • Worked on normalization techniques and normalized data to Third Normal Form (3NF) for OLTP requirements.
  • Involved in designing, implementing and maintaining Enterprise Data Warehouse facts and dimensions using star schema concepts.
  • Designed and maintained many fact and dimension tables to support reporting requirements.
  • Defined standards and best practices for enterprise data modeling.
  • Worked on migrating data from an IBM mainframe DB2 server to the Microsoft Azure cloud platform.
  • Implemented Apache Airflow DAGs for data orchestration (a minimal DAG sketch follows this list).
  • Involved in managing and maintaining CI/CD pipelines using the DevOps toolset: Jenkins, GitHub and Azure DevOps.
  • Analyzed existing PII data and implemented data masking strategies to protect user-sensitive data.
  • Worked with the Data Privacy team on writing standards for PII data masking.
  • Worked extensively with the ETL team to retire legacy systems and migrate to the cloud.
  • Designed, implemented and monitored MongoDB clusters, databases and collections.
  • Worked with business stakeholders to understand their needs and provided guidance accordingly.
  • Created database artifacts such as logical and physical data models, data dictionaries and ER diagrams.
  • Provided Data Architecture solutions covering data modeling, data placement strategy, data integration strategy and reference data management.
  • Designed, implemented and maintained Elasticsearch indexes for centralized logging and application search functionality (an example index mapping follows this list).
  • Involved in Kafka PR reviews and mapped data elements to source data fields.
  • Worked with application teams to understand requirements before creating Azure SQL Server, Cosmos DB and Azure Blob Storage accounts.
  • Reviewed Elasticsearch index change requests and reviewed Elasticsearch mappings to meet standards across applications and teams.
  • Gathered data retention requirements and assisted the Information Lifecycle Governance team in setting archive and purge policies for data.
  • Worked on a Geographic Information Systems (GIS) project to design raster and vector models for geo-fence data modeling requirements.
  • Did a PoC on Azure Data Factory for migrating data from on-premises servers to the Microsoft Azure cloud platform.
  • Worked on improving database performance and query execution through indexing and partitioning.
  • Worked with the ETL team to learn Azure Databricks; created a Databricks workspace and workflows as part of the PoC.
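
As a companion to the Airflow bullet above, this is a minimal sketch of a daily extract-and-load DAG; the DAG id, task names and callables are hypothetical stand-ins for the real orchestration logic.

```python
# Minimal Airflow 2.x DAG sketch; the DAG id, tasks and callables
# are hypothetical stand-ins for the real orchestration logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")

def load():
    print("write rows to the warehouse")

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```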
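And for the Elasticsearch bullets, a sketch of the kind of index mapping reviewed for centralized logging; the index name and field set are hypothetical, and the 8.x Python client is assumed.

```python
# Sketch of a logging-index mapping (elasticsearch-py 8.x assumed;
# the index name and fields are hypothetical).
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

es.indices.create(
    index="app-logs",
    mappings={
        "properties": {
            "timestamp": {"type": "date"},
            "level":     {"type": "keyword"},  # exact-match filtering
            "service":   {"type": "keyword"},
            "message":   {"type": "text"},     # full-text search
        }
    },
)
```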

Confidential, Nashville, TN

Data Modeler/Architect

Responsibilities:

  • Developed logical data models and physical database designs, and generated database schemas using SAP PowerDesigner.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Developed data mapping, data governance, and transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Developed enhancements to the MongoDB architecture to improve performance and scalability (see the indexing sketch after this list).
  • Involved in writing T-SQL and working with SSIS, SSRS and SSAS for data cleansing, data scrubbing and data migration.
  • Provided guidance and solution concepts for multiple projects focused on data governance and master data management.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data using Hadoop/Big Data concepts.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Developed and maintained data dictionaries, naming conventions, standards and class word standards documents.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Did PoCs on Logstash and Elasticsearch for ETL processes.
  • Did a PoC on Octopus Deploy for code deployment.
  • Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to a staging area and then into data marts, handling XML data as part of the process.
  • Backed up databases using the MongoDB backup facility in Ops Manager.
  • Performed performance tuning and stress-testing of NoSQL database environments to ensure acceptable database performance in production.
  • Worked on a log4-style T-SQL framework for event logging and exception handling.
  • Implemented strong referential integrity and auditing through triggers and SQL scripts.
  • Trained junior data modelers on data architecture standards and data modeling standards.
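
As an illustration of the MongoDB performance work above, here is a minimal indexing sketch; the database, collection and field names are hypothetical.

```python
# Minimal MongoDB indexing sketch (pymongo); the database,
# collection and field names are hypothetical.
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
orders = client["sales"]["orders"]

# Compound index matching the dominant query pattern:
# filter by customer_id, sort by order_date descending.
orders.create_index([("customer_id", ASCENDING),
                     ("order_date", DESCENDING)])

# Verify the planner picks the index instead of a collection scan.
plan = orders.find({"customer_id": 42}).sort("order_date", -1).explain()
print(plan["queryPlanner"]["winningPlan"])
```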

Confidential, Nashville, TN

Data Modeler

Responsibilities:

  • Understood business needs and project requirements for a large data analytics product.
  • Designed and developed architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Worked on documenting data modeling standards for the client.
  • Worked closely with Data Governance team to help them understand the retention policy for databases.
  • Built a relational database using ER/Studio to capture source data files coming from CMS (Centers for Medicare and Medicaid Services).
  • Built conceptual, logical, and physical data model for unstructured data.
  • Migrated data to the AWS cloud using S3 buckets.
  • Completed ASPIRE healthcare Certification provided by EMIDS.
  • Attended numerous trainings to understand the Healthcare Domain and the concepts related to the project (Healthcare Informatics).
  • Collaborated with other data modelers to understand and implement best practices within the organization.
  • Used the ER/Studio tool extensively for database design and modeling.
  • Used the Liquibase tool effectively for managing and maintaining SQL scripts.
  • Took basic training on Amazon Web Services provided by the client.
  • Worked with technical leads to implement an Amazon Aurora DB cluster and manage the databases required by the project.
  • Conducted numerous PoCs (proofs of concept) to efficiently import large data sets into the database from AWS S3 buckets.
  • Improved database performance and query execution by indexing and partitioning.
  • Completed a PoC to efficiently convert CSV files to the Parquet file format (a minimal conversion sketch follows this list).
  • Learned about columnar data formats as part of the above PoC.
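
A minimal version of the CSV-to-Parquet PoC mentioned above might look like the following; the file names are placeholders, and pyarrow is assumed as the conversion library.

```python
# Minimal CSV-to-Parquet conversion sketch (pyarrow assumed;
# the file names are placeholders).
import pyarrow.csv as pv
import pyarrow.parquet as pq

table = pv.read_csv("claims.csv")        # schema inferred from the CSV
pq.write_table(table, "claims.parquet")  # columnar, compressed output
print(table.schema)
```

Parquet's columnar layout is what makes the format efficient for analytics: queries read only the columns they need rather than whole rows.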

Confidential, Atlanta, GA

Data Modeler

Responsibilities:

  • Worked on subject areas such as SSR, Trauma, NSQIP-Adult, NSQIP-Peds, Bariatrics and Cancer.
  • Worked with the C&R team and directly with clients to gather business requirements and convert them into technical requirements.
  • Worked on Aptify front end for various data migration, data validation and testing activities.
  • Researched, evaluated, architected and deployed new tools, frameworks and patterns to build sustainable cloud data platforms for clients.
  • Managed two offshore teams responsible for data modeling.
  • Created user stories for offshore modelers working on the transactional system and the data warehouse.
  • Led weekly early-morning team meetings with the onshore and offshore teams, along with the Product Owner, to collect weekly updates.
  • Conducted many Joint Application Development (JAD) sessions with SMEs and stakeholders on product performance and new requirements.
  • Worked closely with the Director and managers to schedule scrum calls and provide updates on daily activities.
  • Designed the Enterprise Logical Data Model (ELDM) for specific subject areas based on requirements.
  • Assisted data analysts with source-to-target mapping documents and API specification documents.
  • Responsible for all metadata relating to the EDW’s overall data architecture, descriptions of data objects, access methods and security requirements.
  • Worked on ELDM maturity and enhancements to existing subject areas based on new entities and attributes.
  • Worked closely with Data Analysts and Senior ETL staff on the design of database load strategies and Data Profiling.
  • Designed the logical data model from the technical design documents and translated it to a physical data model using Erwin/ER Studio.
  • Worked directly with the Emids SVT and Migration teams to understand SSR, Trauma and NSQIP variables in Aptify.
  • Worked on data mapping and data profiling activities for data moving from source to RDC.
  • Worked on various activities such as extracting data from different databases using SQL queries.
  • Updated Trauma variables per the new Trauma data dictionary releases.
  • Helped the team understand SSR.
  • Used the SoapUI web service testing tool to test web services.
  • Designed a data model for Trauma using Erwin 9.6 and created conceptual, logical and physical data models.
  • Involved in data profiling for multiple sources and answered complex business questions by providing data to business users (a minimal profiling sketch follows this list).
  • Worked with data investigation, discovery and mapping tools to scan every data record from many sources.
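
To ground the data-profiling bullets above, here is a minimal profiling sketch; the table name and connection string are hypothetical.

```python
# Minimal data-profiling sketch: null rates and distinct counts
# per column (the table name and connection are hypothetical).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@host/db")  # placeholder
df = pd.read_sql("SELECT * FROM trauma_registry LIMIT 100000", engine)

profile = pd.DataFrame({
    "null_rate": df.isna().mean(),  # share of missing values
    "distinct":  df.nunique(),      # cardinality per column
})
print(f"rows sampled: {len(df)}")
print(profile.sort_values("null_rate", ascending=False))
```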

Confidential, Chesterton, IN

ETL Developer Intern

Responsibilities:

  • Worked with the Informatica PowerCenter tool: Source Analyzer, Data Warehouse Designer, Mapping and Mapplet Designer, and Transformation Designer.
  • Extracted data from an SAP system to a staging area and loaded it into the target database through an ETL process using Informatica PowerCenter.
  • Did PoCs on various Informatica transformations, including Source Qualifier, Aggregator, Router, Joiner, Rank, Sequence Generator, Transaction Control and Lookup.
  • Designed and developed various PL/SQL stored procedures to perform calculations related to fact measures (a simplified sketch follows this list).
  • Converted the PL/SQL procedures to Informatica mappings while also creating database-level procedures for optimal mapping performance.
  • Performed Unit testing and maintained test logs and test cases for all the mappings.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Parsed high-level design specifications into simple ETL code along with mapping standards.
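
As a simplified illustration of the fact-measure calculations mentioned above, here is a self-contained aggregation sketch; it uses plain SQL from Python rather than PL/SQL, and the table and measures are hypothetical.

```python
# Simplified fact-measure sketch: plain SQL from Python rather than
# PL/SQL; the table and measures are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales_fact (product_id INT, qty INT, unit_price REAL);
INSERT INTO sales_fact VALUES (1, 3, 9.99), (1, 1, 9.99), (2, 5, 4.50);
""")

# Derived measures: total revenue and average quantity per product.
for row in conn.execute("""
    SELECT product_id,
           SUM(qty * unit_price) AS revenue,
           AVG(qty)              AS avg_qty
    FROM sales_fact
    GROUP BY product_id
"""):
    print(row)
```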
