Sr. Data Architect Resume

Alexandria, VA

PROFESSIONAL SUMMARY:

  • Over 10 years of strong IT experience in Data Architecture, Data Modeling, Design, and Development.
  • Good knowledge of cloud services such as Amazon EC2, Amazon S3, and MS Azure.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements.
  • Experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Spark, Spark Streaming, Elastic Search, Kafka, HBase, Sqoop, and Flume.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Management Systems (RDBMS) and from RDBMS to HDFS.
  • Experience in migrating an on-premises enterprise data warehouse to a cloud-based Snowflake data warehousing solution.
  • Good working knowledge of Snowpipe, Snowflake DB, and the Snowflake cloud DWH architecture.
  • Expert knowledge of and experience in Master Data/Metadata Management (MDM), Data Quality, and Data Governance implementation.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export.
  • Designed data marts in dimensional data modeling using star and snowflake schemas (a minimal schema sketch follows this list).
  • Well versed in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Experienced in various Teradata utilities like FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant.
  • Experienced in enhancing complex enterprise data models, accurately representing the logical and DB nuances in physical data models.
  • Designed data models (ER and dimensional) for database platforms such as Oracle, SQL Server, and DB2.
  • Experience working with data modeling tools like Erwin, PowerDesigner, and ER/Studio.
  • Experience in data analysis using Hive, Pig Latin, and Impala.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Expertise in designing star and snowflake schemas for Data Warehouse and ODS architectures using tools like Erwin Data Modeler, PowerDesigner, and ER/Studio.
  • Experience in setting up connections to different RDBMS databases like Oracle, SQL Server, DB2, and Teradata according to user requirements.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Good knowledge of Data Marts, Operational Data Store (ODS), Dimensional Data Modeling with Ralph Kimball Methodology using Analysis Services.
  • Experience in working with Business Intelligence and Enterprise Data Warehouse (EDW) platforms including SSAS, QlikView, Greenplum, Amazon Redshift, and Azure Data Warehouse.
  • Strong experience in architecting high-performance databases using PostgreSQL, PostGIS, MySQL, and Cassandra.
  • Good understanding and hands-on experience in setting up and maintaining NoSQL databases like Cassandra, MongoDB, and HBase.
  • Experience with SQL Server and T-SQL in constructing temporary tables, table variables, triggers, user functions, views, and stored procedures.
  • Expert knowledge in Oracle Forms/Reports, IBM Informix, MongoDB & Teradata.
  • Expert knowledge in BI tools like Cognos, Business Objects, Oracle BI Suite, SSRS & Tableau.
  • Experience in upstream data modeling using Erwin, Embarcadero ER/Studio, Oracle SQL Developer Data Modeler, and MS Visio data modeling tools.
  • Extensive experience in UNIX shell scripting and Perl scripting.

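For illustration, a minimal star-schema sketch of the kind of dimensional data mart design described above; every table and column name here is a hypothetical placeholder, not an artifact from any engagement:

    -- Illustrative star schema DDL; all names are placeholders.
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,   -- e.g. 20180131
        calendar_dt DATE NOT NULL,
        month_nm    VARCHAR(10),
        year_nbr    SMALLINT
    );

    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,   -- surrogate key
        product_cd  VARCHAR(20) NOT NULL,  -- natural/business key
        product_nm  VARCHAR(100),
        category_nm VARCHAR(50)
    );

    -- Fact table referencing the conformed dimensions
    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
        qty_sold    INTEGER,
        sale_amt    DECIMAL(18,2)
    );
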
TECHNICAL SKILLS:

Cloud Management: Amazon Web Services (AWS), Amazon Redshift, Azure Data Lake, Azure Cloud, Azure Data Factory, Snowflake Cloud (SnowSQL & Snowpipe)

Cloud & NoSQL Databases: Snowflake DB, Azure SQL DB, HBase, Cassandra.

Big Data: Hadoop 3.3, Hive, HDFS, Sqoop, Kafka

Data Modeling Tools: ER/Studio V17, Erwin 9.7, SAP (Sybase) PowerDesigner 16.6.

OLAP Tools: Tableau, SAP BusinessObjects, SSAS, and Crystal Reports 9

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, Perl, AWK, Sed

Databases: Oracle 12c, Teradata R15, MS SQL Server 2017, DB2.

Testing and Defect Tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio 2016 & Visual SourceSafe

Operating System: Windows 10/8, Unix, Sun Solaris

ETL/Data Warehouse Tools: Informatica 10, SAP BusinessObjects XI R3.1/XI R2, Talend, Tableau

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

WORK EXPERIENCE:

Confidential - Alexandria, VA

Sr. Data Architect

Responsibilities:

  • Working as an architect to develop scalable, highly available, fault-tolerant, secure systems for on-premises, hybrid, and cloud-based data systems that meet client business needs.
  • As an architect, implemented an MDM hub to provide clean, consistent data for an SOA implementation.
  • Implemented Agile methodology for building an integrated data warehouse; involved in multiple sprints for various tracks throughout the project lifecycle.
  • Involved in developing the Database Design Document, including conceptual, logical, and physical data models, using Erwin 9.64.
  • Responsible for analysis of massive and highly complex data sets, performing ad-hoc analysis and data manipulation for data integration.
  • Designed and implemented scalable cloud data and analytical architecture solutions for various public and private cloud platforms using AWS.
  • Involved in analysis and solution design for migrating an on-prem Oracle database to the Snowflake cloud data warehouse.
  • Designed and developed data architecture solutions for big data architecture and data analytics.
  • Built Snowpipe pipelines for continuous data loads from AWS S3 into the Snowflake data warehouse.
  • Built data pipelines using Python for handling JSON files from AWS S3 external storage into Snowflake.
  • Set up a Snowflake stage and Snowpipe for continuous loading of data from S3 buckets into a landing table (see the sketch after this list).
  • Evaluated architecture patterns, defined best patterns for data usage, data security, and data compliance, and defined concept models and logical & physical data models.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Applied Data Governance rules (primary qualifier, Class words and valid abbreviation in Table name and Column names).
  • Designed and documented logical and physical database designs for Enterprise Applications (OLTP), Data Warehouses (OLAP), and NoSQL databases.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Developed and presented data flow diagrams, conceptual diagrams, UML diagrams, ER flow diagrams, creating the ETL Source to Target mapping specifications and supporting documentation.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Designed and developed Databricks notebooks to read data from different sources like S3, perform transformations, and load into Snowflake.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date.
  • Independently coded new programs and designed tables to load and effectively test the programs for the given POCs using Big Data/Hadoop.
  • Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Designed and developed Python scripts that run on AWS EMR to load data into Snowflake.
  • Created views in Snowflake to support reporting requirements.
  • Designed and developed Matillion ETL jobs against the Snowflake cloud database.
  • Scheduled file-loading processes via Snowflake tasks.
  • Created Snowpipe for continuous data loads.
  • Converted and reviewed code migrated from Oracle PL/SQL to Snowflake, made performance changes, and tested.
  • Involved in and coordinated the setup of AWS SNS notifications to auto-trigger Snowpipe at regular intervals for continuous loads.
  • Trained other developers on Snowflake and provided support as needed.
  • Supported the Cloud Strategy team to integrate analytical capabilities into the overall cloud architecture and business case development.

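The stage/Snowpipe/task pattern referenced in the bullets above, sketched in Snowflake SQL. The object names, S3 URL, and storage integration are illustrative assumptions, not actual project artifacts:

    -- Illustrative Snowflake SQL; all names, the URL, and the integration are placeholders.
    CREATE TABLE raw_landing (payload VARIANT, load_ts TIMESTAMP_NTZ);
    CREATE TABLE rpt_events (id NUMBER, event_type STRING, load_ts TIMESTAMP_NTZ);

    CREATE FILE FORMAT json_fmt TYPE = JSON STRIP_OUTER_ARRAY = TRUE;

    CREATE STAGE landing_stage
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int                 -- assumes an existing integration
      FILE_FORMAT = (FORMAT_NAME = 'json_fmt');

    -- Snowpipe: AUTO_INGEST loads new files as S3 event notifications arrive
    CREATE PIPE landing_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_landing (payload, load_ts)
      FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @landing_stage);

    -- Task: scheduled transformation from the landing table into a reporting table
    CREATE TASK refresh_reporting
      WAREHOUSE = etl_wh
      SCHEDULE = 'USING CRON 0 * * * * UTC'
    AS
      INSERT INTO rpt_events
      SELECT payload:id::NUMBER, payload:event_type::STRING, load_ts
      FROM raw_landing;

    ALTER TASK refresh_reporting RESUME;           -- tasks are created suspended
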
Environment: Erwin r9.6, Oracle 12c, OLAP, OLTP, T-SQL, SQL, Linux, MDM, Hadoop, MapReduce, Snowflake DB, PL/SQL.

Confidential - Watertown, MA

Sr. Data Architect/Data Modeler

Responsibilities:

  • As a Sr. Data Architect/Modeler, worked collaboratively with the data modeling architects and other data modelers on the team to design the enterprise-level standard data model.
  • Interacted with users for verifying User Requirements, managing Change Control Process, updating existing Documentation.
  • Working with the architecture and development teams to help choose data-related technologies, design architectures, and model data in a manner that is efficient, scalable, and supportable.
  • Worked closely with the development and database administrators to guide the development of the physical data model and database design.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Worked on designing Conceptual, Logical and Physical data models and performed data design reviews with the Project team members.
  • Involved in the creation and maintenance of the Snowflake Data Warehouse and repositories containing metadata.
  • Ensure the cloud data warehouse and data mart designs to efficiently support the reporting and BI team requirements.
  • Developed Snowflake procedures to perform transformations, load data into target tables, and purge the stage tables.
  • Designed a data model on the Snowflake database to support reporting requirements.
  • Involved in versioning and saving the models to the data mart and maintaining the Data mart Repository.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) using Erwin Data Modeler.
  • Worked on building the logical data model from scratch using XMLs as the data source.
  • Worked on building data models to convert data from one application to another in a way that suits the needs of the target database.
  • Redefined many attributes and relationships in the reverse-engineered model and removed unwanted tables/columns.
  • Built Data Lake in Azure using Hadoop (HDInsight clusters) and migrated Data using Azure Data Factory pipeline.
  • Designed Lambda architecture to process streaming data using Spark. Data was ingested using Sqoop for structured data and Kafka for unstructured data.
  • Created Azure Event Hubs, Azure Service Bus, Azure Stream Analytics, and Power BI for handling IoT messages.
  • Performed Hive programming for applications that were migrated to big data using Hadoop.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries.
  • Executed Hive queries on Parquet tables stored in Hive to perform data analysis and meet the business requirements (a brief sketch follows this list).
  • Produced 3NF data models for OLTP designs using data modeling best practices and modeling skills.
  • Worked with Data Stewards and Business analysts to gather requirements for MDM Project.
  • Reverse-engineered the data model from database instances and scripts.
  • Responsible for defining the naming standards for data warehouse.
  • Enforced Referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Created Source to Target Mapping Documents to help guide the data model design from the Data source to the data model.
  • Involved in the validation of OLAP unit testing and system testing of the OLAP report functionality and the data displayed in the reports.
  • Created tables in Snowflake DB to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
  • Performed data governance of the RAW, Staging, Curated, and Presentation layers in Azure Data Lake Store.
  • Conducted and participated in JAD sessions with the users, modelers, and developers for resolving issues.
  • Applied data naming standards, created the data dictionary and documented data model translation decisions and also maintained DW metadata.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Participated in performance tuning using Explain Plan and TKPROF.
  • Created Snowflake tasks for scheduling the jobs.
  • Involved in the hands-on technical delivery of customer projects related to Azure.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.

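A brief HiveQL sketch of the Parquet-table analysis pattern mentioned above; the database, table, columns, and HDFS location are hypothetical:

    -- Illustrative HiveQL; names and paths are placeholders.
    CREATE EXTERNAL TABLE IF NOT EXISTS sales.events (
        event_id   BIGINT,
        event_type STRING,
        amount     DECIMAL(18,2)
    )
    PARTITIONED BY (event_date STRING)
    STORED AS PARQUET
    LOCATION '/data/curated/events';

    -- Register newly landed partitions, then analyze
    MSCK REPAIR TABLE sales.events;

    SELECT event_type, COUNT(*) AS cnt, SUM(amount) AS total_amount
    FROM sales.events
    WHERE event_date BETWEEN '2018-01-01' AND '2018-01-31'
    GROUP BY event_type;
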
Environment: Erwin 9.7, Hadoop 3.0, PL/SQL, UNIX, Spark, Azure Data Lake, OLTP, Snowflake DB and Snowflake DW.

Confidential - San Antonio, TX

Sr. Data Modeler/Data Architect

Responsibilities:

  • Understand the high-level design choices and the defined technical standards for software coding, tools and platforms and ensure adherence to the same.
  • Used Agile Methodology of Data Warehouse development using Kanbanize.
  • Analyzed business requirements and built logical data models that describe all the data and the relationships between the data.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Suggested implementing multitasking for the existing Hive architecture in Hadoop, and also suggested UI customization in Hadoop.
  • Architected and led significant data initiatives across various data dimensions: Master Data, Metadata, and Big Data & Analytics.
  • Involved in planning, defining, and designing the database using ER/Studio based on business requirements, and provided documentation.
  • Translated business and data requirements into logical data models in support of Enterprise Data Models, Operational Data Structures, and Analytical systems.
  • Partnered with DBAs to transform logical data models into physical database designs while optimizing the performance and maintainability of the physical database.
  • Work with Data Management to establish governance processes around metadata to ensure an integrated definition of data for enterprise information, and to ensure the accuracy, validity, and reusability of metadata.
  • Migrated a SQL Server database to Microsoft Azure SQL Database.
  • Developed the full life cycle of a Data Lake and Data Warehouse with big data technologies like Spark and Hadoop.
  • Applied all phases of the Software Development Life Cycle, which include requirements definition, analysis, review of design and development, and integration and test of solution into the operational environment
  • Worked on Azure Power BI Embedded to integrate reports into the application.
  • Developed MapReduce programs to cleanse the data in HDFS obtained from heterogeneous data sources to make it consistent for analysis.
  • Lead database level tuning and optimization in support of application development teams on an ad-hoc basis.
  • Created the data schema and data warehouse architecture for standardized data storage and access.
  • Used data profiling automation to uncover the characteristics of the data and the relationships between data sources before any data-driven initiatives.
  • Used Azure reporting services to upload and download reports.
  • Developed test scripts for testing sourced data and their validation and transformation when persisted in data stores that are physical representations of the data models.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Completed enhancements for MDM (Master Data Management) and suggested the implementation of a hybrid MDM.
  • Designed processes and jobs to source data from mainframe sources to an HDFS staging zone.
  • Integrated data from multiple sources, including HDFS, into the Hive data warehouse (sketched after this list).

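A short HiveQL sketch of the mainframe-to-HDFS-staging-to-Hive-warehouse flow described in the last two bullets; the tables, delimiter, and partition value are assumptions for illustration:

    -- Illustrative HiveQL; all names and values are placeholders.
    CREATE EXTERNAL TABLE IF NOT EXISTS staging.policy_raw (
        policy_id STRING,
        holder    STRING,
        premium   STRING      -- mainframe extracts often land as text
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    LOCATION '/data/staging/policy';

    -- Conform types and move the data into the Hive warehouse table
    INSERT OVERWRITE TABLE warehouse.policy PARTITION (load_dt = '2017-06-01')
    SELECT policy_id, holder, CAST(premium AS DECIMAL(12,2))
    FROM staging.policy_raw;
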
Environment: ER/Studio 9.0, Hive, Hadoop, MDM, MS Azure, HDFS, PL/SQL, SQL Server, UNIX

Confidential - Newport Beach, CA

Sr. Data Modeler / Data Analyst

Responsibilities:

  • Heavily involved in the Sr. Data Modeler/Data Analyst role to review business requirements and compose source-to-target data mapping documents.
  • Analyzed business requirements, system requirements, and data mapping specification requirements.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Presented the data scenarios via ER/Studio logical models and Excel mockups to visualize the data better.
  • Designed and developed logical & physical data models and metadata to support the requirements.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Generated DDL statements for the creation of new ER/Studio objects like tables, views, indexes, packages, and stored procedures.
  • Used SQL for querying the database in a UNIX environment.
  • Worked on AWS Redshift and RDS for implementing models and data on RDS and Redshift.
  • Designed star and snowflake data models for the Enterprise Data Warehouse using ER/Studio.
  • Documented data dictionaries and business requirements for key workflows and process points.
  • Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology with ER/Studio.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models.
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems.
  • Designed ER diagrams and mapped the data into database objects.
  • Monitored data quality and maintained the integrity of data to ensure the effective functioning of the department.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality.
  • Developed complex T-SQL code such as stored procedures, functions, triggers, indexes, and views for the business application (a sketch follows this list).
  • Designed and developed cubes using SQL Server Analysis Services (SSAS) in Microsoft Visual Studio.
  • Involved in the complete SSIS life cycle: creating SSIS packages, and building, deploying, and executing the packages in all environments.
  • Developed reports for users in different departments in the organization using SQL Server Reporting Services (SSRS).
  • Used ER/Studio for reverse engineering to connect to existing databases and the ODS.
  • Created graphical representations in the form of entity-relationship diagrams to elicit more information.

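A compact T-SQL sketch of the kind of stored procedure development noted above; the schema, tables, and parameters are invented for illustration:

    -- Illustrative T-SQL; assumes hypothetical dbo.Orders and dbo.Customers tables.
    CREATE PROCEDURE dbo.usp_GetOrdersByRegion
        @Region    NVARCHAR(50),
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT o.OrderID, o.OrderDate, o.Amount, c.CustomerName
        FROM dbo.Orders AS o
        JOIN dbo.Customers AS c
            ON c.CustomerID = o.CustomerID
        WHERE c.Region = @Region
          AND o.OrderDate BETWEEN @StartDate AND @EndDate;
    END;
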
Environment: Agile, ER/Studio v17, Oracle 12c, UNIX, AWS, Amazon Redshift, OLTP, OLAP, NoSQL, MDM, T-SQL, SSAS, Microsoft Visual Studio 2016, SSIS, SSRS

Confidential

Data Analyst/Data Modeler

Responsibilities:

  • Worked as a Data Analyst/Modeler to generate data models using SAP PowerDesigner and developed the relational database system.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Conducted user interviews and gathered and analyzed requirements using Rational Rose, RequisitePro, and RUP.
  • Developed logical data models and physical database designs and generated database schemas using SAP PowerDesigner.
  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Extensively worked with the SAP PowerDesigner data modeling tool to design the data models.
  • Created ER diagrams using the PowerDesigner modeling tool for relational and dimensional data modeling.
  • Involved in the source-to-target data mapping document and the data quality assessments for the source data.
  • Responsible for data profiling and data quality checks to satisfy the report requirements.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Developed and maintained the data dictionary to create metadata reports for technical and business purposes.
  • Created SQL tables with referential integrity and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Designed the database tables and created table- and column-level constraints using the suggested naming conventions for constraint keys.
  • Reverse-engineered the existing database structure to understand the existing data models so that any corporate changes would synchronize with the current model.
  • Involved in Normalization / De-normalization, Normal Form and database design methodology.
  • Involved in data analysis and modeling for the OLAP and OLTP environments.
  • Conducted JAD sessions with the SMEs, stakeholders, and other management teams to finalize the User Requirement Documentation.
  • Wrote T-SQL statements for retrieval of data and was involved in performance tuning of T-SQL queries and stored procedures.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversions, and data cleansing (a sketch follows this list).
  • Created queries using BIR reporting variables, navigational attributes, and filters.
  • Used Excel sheets, flat files, and CSV files to generate Tableau ad-hoc reports.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed data mapping, data governance, transformation, and cleansing rules for Data Management involving OLTP, ODS, and OLAP.

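A minimal PL/SQL sketch of the data cleansing routines described above; the staging table and columns are hypothetical:

    -- Illustrative PL/SQL; assumes a hypothetical staging table stg_customer(cust_name, phone).
    DECLARE
        v_rows PLS_INTEGER := 0;
    BEGIN
        -- Trim stray whitespace picked up from flat-file loads
        UPDATE stg_customer
           SET cust_name = TRIM(cust_name)
         WHERE cust_name <> TRIM(cust_name);
        v_rows := SQL%ROWCOUNT;

        -- Standardize phone numbers to digits only
        UPDATE stg_customer
           SET phone = REGEXP_REPLACE(phone, '[^0-9]', '');

        COMMIT;
        DBMS_OUTPUT.PUT_LINE('Names trimmed: ' || v_rows);
    END;
    /
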
Environment: SAP PowerDesigner 16.6, OLTP, OLAP, T-SQL, SSIS, SQL Server, SQL, PL/SQL, Rational Rose, ODS
